I have been working in R, interfacing with my Oracle DB using the DBI package. I read that preparing a query is often good practice when you run the same statement several times.
My question is: assuming infinite RAM to accommodate the downloaded data, which factors may influence the difference in run times between two scenarios: running a prepared query N times, or using a single WHERE ... BETWEEN filter?
Let's say I have to run a query to analyze some time series data between 2012 and 2018. I have observed different download times between running a prepared query for each month within my analysis window and simply filtering the whole window at once.
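To make the two scenarios concrete, here is a minimal sketch with DBI. The connection details, the sales table and its sale_date column are made-up placeholders, and the ? placeholder syntax assumes an odbc-style driver (ROracle uses :1, :2 instead):

    library(DBI)

    ## Hypothetical connection; DSN, table and column names are placeholders.
    con <- dbConnect(odbc::odbc(), dsn = "my_oracle_dsn")

    ## Scenario 1: one prepared statement, bound and executed once per month.
    months <- seq(as.Date("2012-01-01"), as.Date("2019-01-01"), by = "month")
    res <- dbSendQuery(con, "SELECT * FROM sales WHERE sale_date >= ? AND sale_date < ?")
    chunks <- lapply(seq_len(length(months) - 1), function(i) {
      dbBind(res, list(months[i], months[i + 1]))  # re-bind and re-execute per month
      dbFetch(res)
    })
    dbClearResult(res)
    monthly_data <- do.call(rbind, chunks)

    ## Scenario 2: a single query filtering the whole window at once.
    whole_window <- dbGetQuery(
      con,
      "SELECT * FROM sales
       WHERE sale_date BETWEEN DATE '2012-01-01' AND DATE '2018-12-31'"
    )

    dbDisconnect(con)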
It depends on how the database optimizes your query. Maybe it chooses to use an index when selecting just a single month; maybe it chooses a full table scan to retrieve the whole window at once.
Usually I would expect the query that retrieves the entire dataset at once to be more efficient than breaking it up into several parts, one per month.
Factors that play a role are, among others: the access path the optimizer picks (index range scan versus full table scan), the overhead of parsing and executing N statements instead of one, and the number of round trips needed to transfer the results over the network. One way to check which plan Oracle actually chooses is sketched below.
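A rough sketch of inspecting the execution plan from R, reusing the hypothetical con, sales and sale_date from above (EXPLAIN PLAN and DBMS_XPLAN.DISPLAY are standard Oracle features; whether dbExecute accepts the statement can depend on the driver):

    ## Ask Oracle which plan it would use for the single-window query.
    dbExecute(con, "
      EXPLAIN PLAN FOR
      SELECT * FROM sales
      WHERE sale_date BETWEEN DATE '2012-01-01' AND DATE '2018-12-31'")

    ## Read back the plan; DBMS_XPLAN.DISPLAY returns a PLAN_TABLE_OUTPUT column.
    plan <- dbGetQuery(con, "SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY())")
    cat(plan$PLAN_TABLE_OUTPUT, sep = "\n")

Comparing the plan for a single month against the plan for the whole window shows whether the optimizer switches between an index range scan and a full table scan, which is usually the main driver of the timing difference you observed.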