Discussion:
[bottlepy] question: shall I keep a model as global variable in a machine learning application?
neverfly
2014-10-30 14:26:54 UTC
Permalink
I am a newbie to bottlepy, although I loved its elegance at first glance.

The problem comes when I use bottle to build a machine learning
application. The application relies on an existing "machine learning"
model and serves different users: every user may provide some inputs, and
the big model will reply to those inputs. (To improve performance, we also
want to update the model using some user inputs, although we haven't
implemented that part yet.)

My question is: how do I keep the "model" alive throughout the application?
The only solution I have now is to keep it as a global variable on the
server, since it would be very time-consuming to load the model from disk
for every user. But I am afraid a global is not the best option, so I am
posting here to look for advice.

Best
neverfly.
--
--
You are member of the "bottlepy" group at google groups.
See http://groups.google.de/group/bottlepy for mailing list options.
See http://bottlepy.org/ for news and documentation.

---
You received this message because you are subscribed to the Google Groups "bottlepy" group.
To unsubscribe from this group and stop receiving emails from it, send an email to bottlepy+***@googlegroups.com.
For more options, visit https://groups.google.com/d/optout.
fackler
2014-10-31 16:20:32 UTC
Permalink
There are probably tons of ways to accomplish what you want to do. It goes
without saying that allowing user input to affect a model definition, rather
than just its property values, can be risky. Do what you must to keep it
sane and safe.

I think if I were going to attempt something like this, I'd do it by way of
a class factory. You could use the factory to load all previous iterative
changes, if that's something you need, from a datastore like sqlite in
in-memory mode if you don't want to hit the disk. The factory could then
instantiate a model and apply the changes to the instance before returning
it. You could roll all your machine-learning-specific requirements up
inside the factory, which would keep the original base model intact. You
could even take a user parameter and return model instances with changes
specific to a given user.
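A rough sketch of that factory idea, using the stdlib `sqlite3` module in `:memory:` mode. All class, table, and column names here are made up for illustration; the point is that the base model is deep-copied and never mutated:

```python
import copy
import sqlite3


class Model:
    """Stand-in for the real ML model; only property values vary per user."""

    def __init__(self):
        self.weights = {"bias": 0.0}


class ModelFactory:
    """Keeps one pristine base model plus an in-memory change log.

    Each call to for_user() deep-copies the base model and replays that
    user's recorded changes onto the copy, so the original base model
    stays intact no matter what users do to their instances.
    """

    def __init__(self, base_model):
        self._base = base_model
        # ':memory:' keeps the change log off disk entirely.  Note that a
        # sqlite3 connection is not shareable across threads by default.
        self._db = sqlite3.connect(":memory:")
        self._db.execute(
            "CREATE TABLE changes (user TEXT, key TEXT, value REAL)")

    def record_change(self, user, key, value):
        """Log one per-user change instead of mutating any model."""
        self._db.execute(
            "INSERT INTO changes VALUES (?, ?, ?)", (user, key, value))

    def for_user(self, user):
        """Return a fresh instance with this user's changes applied."""
        instance = copy.deepcopy(self._base)  # base model is never touched
        rows = self._db.execute(
            "SELECT key, value FROM changes WHERE user = ?", (user,))
        for key, value in rows:
            instance.weights[key] = value
        return instance
```

Usage would be something like `factory.for_user("alice")` inside a request handler; each user gets an instance reflecting only their own recorded changes.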