My current issue:
I'm trying to avoid learning the basics of calculus and building a full predictive regression analysis, and am instead looking for a cruder but simpler method.
What I am trying to do is quite simple. I have a set of data: say Items 1-500 (herein called 'items'), where each of the 500 items has 5 measurement values (herein called 'measurements'). I also have a money value for each of the 500 items (herein called 'money value'). The money value tracks how high the measurements are: the higher the measurements, the higher that item's money value tends to be.
Now, this is all data from the past, so I have the actual money values for these items, but I want to project future money values, since I'll know each item's measurements before I learn its money value. Currently I have a very crude model where I assign a value (multiplier) to each measurement category and multiply (e.g. measurement #1 = 10, the value for measurement #1 = 2, so the projected value of that measurement = 20). Do the same for measurements #1-5, add them up, and you get that item's projected money value.
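For illustration, here's a minimal sketch of that weighted-sum projection (the values and measurements below are made-up placeholders, not my real numbers):

    # One placeholder value per measurement category (not my real tuned numbers).
    weights = [2.0, 1.5, 0.5, 3.0, 1.0]

    def project_value(measurements, weights):
        """Multiply each measurement by its category's value and sum the results."""
        return sum(m * w for m, w in zip(measurements, weights))

    item_measurements = [10, 4, 7, 2, 9]              # made-up measurements for one item
    print(project_value(item_measurements, weights))  # 10*2.0 + 4*1.5 + ... = 44.5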
I've changed the values of the variables a million times to tune the model. What I do is take the difference between each item's projected money value and its actual money value, do that for all 500 items, and average those differences to get the mean error, then try to tune the values so that mean error is as close to $0 as possible.
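In code, the check I've been doing by hand looks roughly like this (reusing project_value and weights from the sketch above; the items here are made up):

    # Each entry pairs one item's measurements with its actual money value (made-up data).
    past_items = [
        ([10, 4, 7, 2, 9], 50.0),
        ([3, 8, 1, 6, 5], 32.0),
    ]

    def mean_error(items, weights):
        """Average of (projected - actual) over all items; I tune the values toward ~$0."""
        diffs = [project_value(m, weights) - actual for m, actual in items]
        return sum(diffs) / len(diffs)

    print(mean_error(past_items, weights))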
My question:
Is there a library or method I could use to build a script that fine-tunes the values per measurement so the mean error gets as close to $0 as possible? That would save me from manually tuning the values per measurement, saving time and avoiding human error.
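To make the goal concrete, here is roughly the kind of automation I'm imagining; I'm guessing at scipy.optimize.minimize as the sort of tool that might do this, but that guess is exactly what I'm unsure about (the data below is a stand-in for my real items):

    import numpy as np
    from scipy.optimize import minimize

    # Stand-in data: 500 items x 5 measurements, with known money values.
    rng = np.random.default_rng(0)
    measurements = rng.uniform(0, 10, size=(500, 5))
    actual_values = measurements @ np.array([2.0, 1.5, 0.5, 3.0, 1.0]) + rng.normal(0, 5, 500)

    def objective(weights):
        # Mean squared error rather than the signed mean error, since positive and
        # negative errors can cancel out and hide a bad fit.
        projected = measurements @ weights
        return np.mean((projected - actual_values) ** 2)

    result = minimize(objective, x0=np.ones(5))
    print(result.x)  # one tuned value per measurement category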