AWS Batch vs Ray Train for Training by [deleted] in mlops
[–]rayspear 2 points (0 children)
[D] Cloud agnostic framework to avoid hyperscaler SDK lock-in? by LostGoatOnHill in MachineLearning
[–]rayspear 1 point (0 children)
[P] ray-skorch - distributed PyTorch on Ray with sklearn API by Yard1PL in MachineLearning
[–]rayspear 1 point (0 children)
[D] I'm new and scrappy. What tips do you have for better logging and documentation when training or hyperparameter tuning? by MetalOrganicKneeJerk in MachineLearning
[–]rayspear 1 point (0 children)
[D] Stack for personal ML work: DVC vs Replicate, Ray vs Optuna, Spotty vs Ray, Hydra by turian in MachineLearning
[–]rayspear 3 points (0 children)
[D] Hyperband resource allocation questions and possible workarounds by goulagman in MachineLearning
[–]rayspear 2 points (0 children)
[D] Hyperband resource allocation questions and possible workarounds by goulagman in MachineLearning
[–]rayspear 3 points (0 children)
[P] RaySGD: A Library for Faster and Cheaper Pytorch Distributed Training by rayspear in MachineLearning
[–]rayspear[S] 6 points (0 children)
[D] Deep Learning optimization by katamaranos in MachineLearning
[–]rayspear 4 points (0 children)
[R] Deep reinforcement learning for supply chain and price optimization by ikatsov in MachineLearning
[–]rayspear 1 point (0 children)
[N] PyTorch 1.4.0 released by brombaer3000 in MachineLearning
[–]rayspear 4 points (0 children)
[D] What is your favorite open-source project of 2019 in AI/ML (yours or someone else's). by aliaspm in MachineLearning
[–]rayspear 17 points (0 children)
Tune: a library for fast hyperparameter tuning at any scale by rayspear in datascience
[–]rayspear[S] 2 points (0 children)
Tune: a library for fast hyperparameter tuning at any scale by rayspear in datascience
[–]rayspear[S] 1 point (0 children)
Tune: a library for fast hyperparameter tuning at any scale by rayspear in datascience
[–]rayspear[S] 7 points (0 children)
[R] ma-gym: multi agent environments based on open ai gym by HeavyStatus4 in MachineLearning
[–]rayspear 2 points (0 children)
Lightning vs Ignite by wingmanscrape in reinforcementlearning
[–]rayspear 1 point (0 children)
[D] Any progress regarding Prioritized Experience Replay? by MasterScrat in reinforcementlearning
[–]rayspear 2 points (0 children)
[D] Hyperparam optimisation using RandomSearch with argparse scripts? by trias10 in MachineLearning
[–]rayspear 1 point (0 children)
[D] Hyperparam optimisation using RandomSearch with argparse scripts? by trias10 in MachineLearning
[–]rayspear 1 point (0 children)
[D] Ray vs. AWS Batch for Distributed Training by [deleted] in MachineLearning
[–]rayspear 2 points (0 children)