Well - you are executing a query against a table, so of course it will have some performance impact, because you're using resources. That's the same as executing ANY query on your system, though. However, it depends on how they are planning to do it. If you join all those tables together on the R12 instance, you're using its resources to do those hash joins etc. ETL processes typically take the tables as-is and dump them into the warehouse (or wherever the target is). That is generally pretty efficient - you're just pulling rows from the table. Then, on the target instance, you do the data transformations into your warehouse structure - utilising the power of the warehouse, which is designed for OLAP analytical-style processing rather than OLTP transactional processing. That's why it's called ELT now - Extract, Load, Transform.
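As a rough sketch of the difference (the GL table names are just illustrative examples from the standard EBS GL schema - substitute whatever your extract actually needs):

```sql
-- Heavier option: transform on the R12 source -
-- the hash joins consume the source instance's resources
SELECT h.je_header_id, h.period_name,
       l.entered_dr, l.entered_cr,
       c.segment1
FROM   gl.gl_je_headers        h
JOIN   gl.gl_je_lines          l ON l.je_header_id = h.je_header_id
JOIN   gl.gl_code_combinations c ON c.code_combination_id = l.code_combination_id;

-- Lighter option: extract each table as-is, and do the
-- joins/transformations later on the warehouse side
SELECT * FROM gl.gl_je_headers;
SELECT * FROM gl.gl_je_lines;
SELECT * FROM gl.gl_code_combinations;
```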
As for it being against Oracle standards - which standards are they? Updating base tables directly is not supported, but for querying them I've not seen any such constraint.
If I were you, I'd ask them to explain where they envisage the performance issue arising - and to what degree. People too often throw around statements like "it'll affect performance" without any justifiable foundation whatsoever. Your database is designed to be multi-user, and you're running outside of business hours - ask them to justify it! Do it in a development instance first and monitor for any resource-utilisation spikes. If it's done right, you should see very little for something as simple as lifting a few GL tables.
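If you want something concrete to point at after the dev-instance test, you can check how expensive the extraction SQL actually was from V$SQL (the LIKE filter below is just an illustration - match it to your own extract statements, and run it as a suitably privileged user):

```sql
-- How expensive were the extraction queries?
SELECT sql_id,
       executions,
       buffer_gets,              -- logical I/O
       disk_reads,               -- physical I/O
       elapsed_time / 1e6 AS elapsed_secs   -- elapsed_time is in microseconds
FROM   v$sql
WHERE  sql_text LIKE '%gl_je_%'  -- illustrative filter for the GL extracts
ORDER  BY buffer_gets DESC;
```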
In any case, if reading the data is going to give performance issues, how the hell do they expect to get access to it at all!??
The only thing I'd say is: create a dedicated database user for this and grant SELECT only on the tables you require to that user - i.e. don't connect as the APPS user.
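Something along these lines - the user name, password and table list are just examples; grant on whichever tables your extract actually reads:

```sql
-- Dedicated read-only account for the extract - NOT the APPS user
CREATE USER etl_reader IDENTIFIED BY "a_strong_password";
GRANT CREATE SESSION TO etl_reader;

-- SELECT only, and only on the specific tables the extract needs
GRANT SELECT ON gl.gl_je_headers        TO etl_reader;
GRANT SELECT ON gl.gl_je_lines          TO etl_reader;
GRANT SELECT ON gl.gl_code_combinations TO etl_reader;
```

That way the account can't touch anything it shouldn't, and it's easy to audit exactly what the ETL has access to.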