Huge dataset load and calculation issue
So basically I am working on a new cloud application that will need to load huge datasets (millions of records) and carry out multiple calculations on those datasets in the service layer. I am trying to research the possible avenues we can take.
Here is the use case:
- User opens the web application
- User initiates a service call
- service loads all the data from the db
- service carries out calculations and then generates a resultset
- User may need to rerun the calculations if results are not as expected
That raises several questions.
Is there a way I can compress the data coming from SQL Server so that I don't overload the service's memory when lots of users are using it at once?
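To make the memory concern concrete, here is roughly the kind of chunked read I had in mind as an alternative to loading everything at once; a minimal sketch assuming pyodbc (the connection string, table, and column names are placeholders, and the "calculation" is just a running sum):

```python
import pyodbc

# Placeholder connection string, table, and column names.
CONN_STR = ("DRIVER={ODBC Driver 17 for SQL Server};"
            "SERVER=my-server;DATABASE=my-db;UID=user;PWD=secret")

def sum_amounts_in_chunks(chunk_size=50_000):
    """Stream rows in fixed-size chunks and keep only a running aggregate."""
    conn = pyodbc.connect(CONN_STR)
    try:
        cursor = conn.cursor()
        cursor.execute("SELECT Amount FROM dbo.Transactions")
        total = 0.0
        while True:
            rows = cursor.fetchmany(chunk_size)
            if not rows:
                break
            total += sum(row.Amount for row in rows)  # stand-in calculation
        return total
    finally:
        conn.close()
```

This keeps only one chunk in memory at a time, but it only works if the calculations can be expressed incrementally, which is part of what I am unsure about.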
This service may be used by multiple people at the same time; will load balancing be enough to handle that?
Is there a way I can maintain state after the calculation is complete, i.e. keep the data in the service in case the user wants to rerun the calculations?
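For example, would it be reasonable to keep the result set in a shared cache keyed by session, so a rerun doesn't hit the database again? A minimal sketch assuming Redis via redis-py (the host, key scheme, and TTL are placeholders):

```python
import pickle
import redis

# Placeholder host/port; assumes a shared Redis instance so any
# load-balanced service node can see the cached data.
cache = redis.Redis(host="my-cache-host", port=6379)

def save_result(session_id: str, result, ttl_seconds: int = 3600):
    """Keep the calculation result for an hour so the user can rerun quickly."""
    cache.setex(f"calc-result:{session_id}", ttl_seconds, pickle.dumps(result))

def load_result(session_id: str):
    """Return the cached result, or None if it expired or was never stored."""
    raw = cache.get(f"calc-result:{session_id}")
    return pickle.loads(raw) if raw is not None else None
```

I am not sure whether caching millions of records per user this way is sensible, or whether the state belongs somewhere else entirely.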
Or is there a better way to architect what we are proposing?
Any help is greatly appreciated.
Thanks