algorithm - How can I correlate pageviews with memory spikes?
I am having some memory problems with an application, but it is proving difficult to track down where they come from. I have two sets of data:
Pageviews
- The page that was requested
- The time the page was requested
Memory usage
- The amount of memory being used
- The time the memory usage was recorded
I would like to see which pageviews are associated with high memory usage. In other words, I want to determine which pages are correlated with increased memory usage, probably using a t-test of some sort. However, I am unsure which kind of t-test to use. Can someone point me in the right direction?
I suggest creating a data set with two columns: the first holding the memory readings corresponding to each view of the page in question, and the second holding matched reference values drawn from the remaining memory readings.
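The pairing step might look like the sketch below. The frame layout, the column names (page, ts, mem_mb) and the nearest-timestamp matching are my assumptions for illustration, not something given in the question.

```python
import pandas as pd

# Hypothetical data: each pageview has a timestamp, each memory reading has a timestamp.
pageviews = pd.DataFrame({
    "page": ["/report", "/home", "/report"],
    "ts": pd.to_datetime(["2023-01-01 10:00", "2023-01-01 10:05", "2023-01-01 10:20"]),
})
memory = pd.DataFrame({
    "ts": pd.to_datetime(["2023-01-01 10:01", "2023-01-01 10:06",
                          "2023-01-01 10:12", "2023-01-01 10:21"]),
    "mem_mb": [512, 300, 280, 640],
})

page_of_interest = "/report"
views = pageviews[pageviews["page"] == page_of_interest].sort_values("ts")
memory = memory.sort_values("ts")

# First column: the memory reading closest in time to each view of the page.
near_view = pd.merge_asof(views, memory, on="ts", direction="nearest")["mem_mb"].to_numpy()

# Second column: matched reference values drawn from the remaining memory readings.
rest = memory.loc[~memory["mem_mb"].isin(near_view), "mem_mb"]
reference = rest.sample(n=len(near_view), replace=True, random_state=0).to_numpy()

paired = pd.DataFrame({"near_view": near_view, "reference": reference})
print(paired)
```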
Then test the null hypothesis (H0) that the mean of the paired differences (page minus reference) is less than or equal to zero, against the alternative that the mean difference is greater than zero. I recommend the non-parametric Wilcoxon signed-rank test, which is distinct from the two-sample Mann-Whitney test. It also uses the magnitude of the difference in each pair, unlike some other tests that ignore it (such as the sign test).
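Continuing with the hypothetical paired frame above, a minimal sketch of the one-sided test using scipy (the toy data above is far too small for a meaningful p-value; this only shows the mechanics):

```python
from scipy.stats import wilcoxon

# Paired differences: memory near a view of the page minus the matched reference value.
diff = paired["near_view"] - paired["reference"]

# One-sided Wilcoxon signed-rank test: H0: median difference <= 0, H1: median difference > 0.
stat, p_value = wilcoxon(diff, alternative="greater")
print(f"W = {stat}, p = {p_value:.4f}")
```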
Keep in mind that ties (zero differences) cause many problems in the derivation of non-parametric methods and should be avoided. A good way to deal with them is to add a little "noise" to the data: modify the tied values by adding a random variable small enough that it does not affect the ranking of the differences, then run the test.
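One way to implement that jitter is sketched below, again against the hypothetical paired data; the noise scale (a hundredth of the smallest nonzero absolute difference) is an arbitrary choice that keeps the ranking of the real differences intact.

```python
import numpy as np
from scipy.stats import wilcoxon

rng = np.random.default_rng(0)
diff = (paired["near_view"] - paired["reference"]).to_numpy(dtype=float)

# Noise scale kept well below the smallest nonzero absolute difference,
# so the ranking of the genuine differences is not disturbed.
nonzero = np.abs(diff[diff != 0])
scale = nonzero.min() / 100 if nonzero.size else 1e-9

# Break only the zero differences (the ties) with tiny random noise.
ties = diff == 0
diff[ties] += rng.uniform(-scale, scale, size=ties.sum())

stat, p_value = wilcoxon(diff, alternative="greater")
```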
I hope that plotting the test results and the distribution of the differences will give some insight into where the problem is.
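For the plot, something as simple as a histogram of the differences (using the names from the hypothetical sketches above) may already show whether the page stands out:

```python
import matplotlib.pyplot as plt

# Histogram of the paired differences: a clear shift to the right suggests that
# views of this page coincide with higher memory usage.
plt.hist(diff, bins=20)
plt.axvline(0, linestyle="--")
plt.xlabel("memory near view - reference memory (MB)")
plt.ylabel("count")
plt.title(f"Paired memory differences for {page_of_interest}")
plt.show()
```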
The test is implemented in most common statistics packages.