Thanks. Basically, a report is generated, and I want to develop a process that will import the file's data and run checks: e.g. filter for duplicates, run a SQL query, etc. Do you think this is feasible for a collection this size? I don't think I can use OLEDB, as I need to read the report data?

For such huge data, the Data - OLEDB VBO is useful. By the way, why do you want to get such huge data into a collection?
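To make the "import the data and run SQL checks" idea concrete, here is a minimal sketch outside of Blue Prism: load the report rows into an in-memory SQLite table and run a duplicate-check query against it. The column names and sample rows are hypothetical, standing in for the real report fields.

```python
import sqlite3

# Hypothetical report rows; in practice these would come from the file.
rows = [
    {"id": "1", "amount": "10.00"},
    {"id": "2", "amount": "5.50"},
    {"id": "1", "amount": "10.00"},  # duplicate record
]

# Load the rows into an in-memory SQLite table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE report (id TEXT, amount TEXT)")
conn.executemany(
    "INSERT INTO report (id, amount) VALUES (:id, :amount)", rows
)

# SQL check: which ids appear more than once?
dupes = conn.execute(
    "SELECT id, COUNT(*) AS n FROM report GROUP BY id HAVING n > 1"
).fetchall()
print(dupes)  # [('1', 2)]
```

The same duplicate check works at database scale, which is the appeal of pushing the data through SQL rather than looping over a huge collection row by row.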
How about an Excel macro?

A macro could in theory work; however, it would be difficult to track exceptions.
Yes, that is a huge amount of data to load into a collection.
I was hoping Blue Prism could handle this, but since it's probably going to cause an issue, I think I may be able to develop a macro to cover the large-data side and then potentially load in a thousand records to do various checks on.

Write the exceptions to a file and read that into a collection for BP to continue the processing.
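The exceptions-to-file handover suggested above can be sketched as follows: the check step writes exception rows to a CSV file, and Blue Prism later reads that small file into a collection. The field names are hypothetical, and `io.StringIO` stands in for a real file on disk.

```python
import csv
import io

# Hypothetical exception records produced by the checking step.
exceptions = [
    {"row": 42, "reason": "duplicate id"},
    {"row": 77, "reason": "missing amount"},
]

# Write the exceptions out as CSV (here to an in-memory buffer).
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=["row", "reason"])
writer.writeheader()
writer.writerows(exceptions)

# Later, the file is read back row by row, e.g. into a BP collection.
buf.seek(0)
loaded = list(csv.DictReader(buf))
print(loaded[0]["reason"])  # duplicate id
```

Because only the exceptions cross the boundary, the collection Blue Prism has to hold stays small even when the source report is huge.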
I know it is easy to sit on the other end and throw out ideas, but it could be an option to fall back on.