You could simply let the DB Poller poll all day long for changes produced by the batch. It will only pick up messages after the batch has run, or while the batch is running.
You could also have it poll a view that only delivers rows under certain conditions (for example, when the system datetime falls within certain times of day), but I don't see the point in that.
In my opinion there is no point in bringing down the poller. Instead, define your adapter so that it only selects rows that are produced by the batch and that have a state denoting they are ready to be processed, even while the batch is still processing other rows.
I guess your concern might be processing entities that have multiple updated rows spread out through the batch run, for instance multiple updates for a specific customer, order, etc. In that case you should find out whether you can determine that the batch job has processed all the rows for that specific customer.
You could, for instance, define a new table that gathers the IDs of all entities 'touched' by the batch process, and join it to a table that holds the running and finished batch job IDs. The running job has a state of 'Running'; all finished jobs have a state of 'Finished'.
Now create a view on the first table that only selects rows with state 'Initial' (or whatever state you define for rows that are ready to submit) and whose job, identified by its job ID, has the state 'Finished'. Only once you update the state of the running batch job to 'Finished' will the view deliver rows to the DB Adapter.
Base your polling adapter on that view.
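To make the idea concrete, here is a minimal sketch of that scheme using SQLite from Python. All table, column, and state names (`batch_jobs`, `touched_entities`, `pollable_entities`, `'Initial'`, `'Finished'`) are illustrative assumptions, not part of any specific schema; the point is only to show how the view hides rows until their batch job is marked finished.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

cur.executescript("""
CREATE TABLE batch_jobs (
    job_id INTEGER PRIMARY KEY,
    state  TEXT NOT NULL            -- 'Running' or 'Finished'
);

CREATE TABLE touched_entities (
    entity_id INTEGER,
    job_id    INTEGER REFERENCES batch_jobs(job_id),
    state     TEXT NOT NULL         -- 'Initial' = ready to submit
);

-- The view only exposes rows whose batch job has completed.
CREATE VIEW pollable_entities AS
SELECT t.entity_id, t.job_id
FROM touched_entities t
JOIN batch_jobs b ON b.job_id = t.job_id
WHERE t.state = 'Initial'
  AND b.state = 'Finished';
""")

# Job 1 finished earlier; job 2 is still running.
cur.execute("INSERT INTO batch_jobs VALUES (1, 'Finished'), (2, 'Running')")
cur.executemany("INSERT INTO touched_entities VALUES (?, ?, 'Initial')",
                [(100, 1), (101, 2)])

# The poller only sees entity 100; entity 101 belongs to the running job.
print(cur.execute("SELECT entity_id FROM pollable_entities").fetchall())

# Once job 2 is flipped to 'Finished', its rows become pollable too.
cur.execute("UPDATE batch_jobs SET state = 'Finished' WHERE job_id = 2")
print(cur.execute("SELECT entity_id FROM pollable_entities").fetchall())
```

With a view like this, the polling adapter never needs to be stopped or started around the batch window; flipping the job's state is the single switch that releases its rows.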