stored procedures - Which approach promises better performance: a megaquery or several targeted queries?


I am creating an SSRS report that returns data for several "units", displayed in a row: unit 1 first, then unit 2's data to its right, and so on.

I can either get the data with a single stored procedure that queries the database using an "IN" clause, or with multiple targeted ("unit = bla") queries.
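To make the two options concrete, here is a minimal sketch of what each query shape might look like. The table `dbo.UnitMetrics` and its columns are hypothetical stand-ins for whatever the real stored procedure selects from:

```sql
-- Hypothetical schema: dbo.UnitMetrics(Unit, Metric, Value)

-- Option 1: one "megaquery" dataset covering every unit,
-- filtered afterwards inside the report
SELECT Unit, Metric, Value
FROM dbo.UnitMetrics
WHERE Unit IN ('Unit1', 'Unit2', 'Unit3');

-- Option 2: one targeted dataset per report segment,
-- repeated once per unit with its own parameter value
SELECT Unit, Metric, Value
FROM dbo.UnitMetrics
WHERE Unit = 'Unit1';
```

With option 1 the database is hit once and SSRS does the per-segment filtering; with option 2 the database is hit once per unit but each dataset returns only the rows its segment needs.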

So I'm thinking I can either filter each "unit" segment on the report side (e.g. a filter like "=unit:[unit1]") or assign a different dataset to each segment (with its own targeted data).

Which way is more "performant": getting one big chunk of data and filtering it the same way in various locations, or getting several instances/datasets of targeted data?

My guess is the latter, but I don't know if maybe SSRS is smart enough to make the former approach work as well or better by doing some optimizing "behind the scenes".

I think it depends on how big the big chunk of data is. My experience has been that SSRS can process quite a large amount of data after it comes back from the database, and quickly. If the report is going to aggregate the data in the end, try to do as much of that as you can on the database end, for the simple reason that the database server has more resources to work with. But if the detail is needed, and aggregating what remains on the report server end is fast enough, then pull the 10k records and do it there.
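The "aggregate on the database end" advice above can be sketched as a GROUP BY in the dataset query, so the report receives one summary row per unit instead of thousands of detail rows. Again, `dbo.UnitMetrics` and its columns are hypothetical:

```sql
-- Push the aggregation to the database: the report gets
-- one summary row per unit instead of raw detail rows
SELECT
    Unit,
    COUNT(*)   AS RecordCount,
    SUM(Value) AS TotalValue
FROM dbo.UnitMetrics
GROUP BY Unit;
```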

I lean toward hitting the database as few times as possible, but sometimes it makes sense for data to need individual queries. I have built reports on 20 datasets, each with specific measures that didn't union well. Breaking it up took the report's run time from 3 minutes down to 20 seconds.

Not a great answer if you are looking for an exact solution to go with; it depends on the situation. Often, trial and error gets you the answer for the report in question.

