0

struggling with neural nets

Guys, I've hit a wall with neural networks. Like, why is tuning hyperparameters such a pain? I've spent days adjusting learning rate, batch size, number of layers, etc. Can someone explain how you figure out the best params without wanting to smash your computer??? I feel like I'm just randomly trying things.

Submitted 6 months, 1 week ago by codedreamer


0

just slap more layers on it, nothing can go wrong xD If that doesn’t work, turn it off and on again!

6 months, 1 week ago by GradienTroll

0

It got better for me when I started using more systematic approaches. Research papers in your problem area are actually super helpful for hints on hyperparameters. Transfer learning can be a godsend too if you're working with limited data on a problem that's been tackled before.
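To make the transfer learning part concrete, here's a rough sketch with PyTorch/torchvision. It assumes an image classification task; the ResNet-18 backbone, the class count, and the Adam learning rate are just placeholders, not recommendations:

```python
import torch.nn as nn
import torch.optim as optim
from torchvision import models

num_classes = 10  # placeholder: set this to your own dataset's class count

# Load a backbone pretrained on ImageNet (the weights= API needs torchvision >= 0.13)
model = models.resnet18(weights="DEFAULT")

# Freeze the pretrained feature extractor so only the new head gets trained
for param in model.parameters():
    param.requires_grad = False

# Swap in a fresh classification head sized for your problem
model.fc = nn.Linear(model.fc.in_features, num_classes)

# Optimize only the parameters that still require gradients (the new head)
optimizer = optim.Adam(model.fc.parameters(), lr=1e-3)
```

The upside is that the frozen backbone already carries useful features, so you're only tuning hyperparameters for a much smaller piece of the model.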

6 months, 1 week ago by RandomInit

0

What's up with the need for hyperparameter tuning anyway? 😂 Can't these networks learn to tune themselves by now? Jokes aside, patience is key. Start with known architectures similar to your problem domain and tweak from there. And use validation sets to stop you from going down the rabbit hole.
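Since I mentioned validation sets: a tiny early-stopping helper is one concrete way to enforce that. Minimal sketch, nothing framework-specific; `train_one_epoch` and `evaluate` in the usage comment are hypothetical functions you'd supply yourself:

```python
class EarlyStopping:
    """Stop training when validation loss hasn't improved for `patience` epochs."""

    def __init__(self, patience=5, min_delta=0.0):
        self.patience = patience      # epochs to wait after the last improvement
        self.min_delta = min_delta    # minimum change that counts as an improvement
        self.best = float("inf")
        self.bad_epochs = 0

    def step(self, val_loss):
        """Return True if training should stop."""
        if val_loss < self.best - self.min_delta:
            self.best = val_loss
            self.bad_epochs = 0
        else:
            self.bad_epochs += 1
        return self.bad_epochs >= self.patience


# Usage inside a training loop (train_one_epoch / evaluate are your own functions):
# stopper = EarlyStopping(patience=5)
# for epoch in range(max_epochs):
#     train_one_epoch(model, train_loader)
#     val_loss = evaluate(model, val_loader)
#     if stopper.step(val_loss):
#         break
```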

6 months, 1 week ago by overfitting_genius

0

Just started learning myself, but I heard from a friend that using pretrained models could help. They said that sometimes you don’t need to start from scratch. Not sure if it applies to ur problem tho.

6 months, 1 week ago by DeepLearner

0

LOL, welcome to the grind. We've all been there, randomly poking around to see what sticks. Sometimes it feels like my network performs better just based on luck...

6 months, 1 week ago by bruteforceRNG

0

Hyperparameters can indeed be frustrating. Consider using automated hyperparameter optimization techniques like Grid Search, Random Search, or Bayesian Optimization. These methods systematically explore the hyperparameter space and can save you a ton of time compared to manual tuning. Remember, there's often a trade-off between the time spent searching for optimal hyperparameters and the performance gains you actually get. Also, check out learning rate schedulers; they can really help.
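A bare-bones random search is only a few lines of plain Python. Just a sketch: the ranges are illustrative, and `evaluate` is a stub you'd replace with your actual train-and-validate routine (libraries like Optuna or scikit-optimize give you the Bayesian version of this loop):

```python
import math
import random

def sample_config():
    """Draw one random hyperparameter configuration.
    Ranges here are illustrative, not recommendations for any specific model."""
    return {
        "lr": 10 ** random.uniform(-5, -1),            # log-uniform between 1e-5 and 1e-1
        "batch_size": random.choice([16, 32, 64, 128]),
        "hidden_units": random.choice([64, 128, 256, 512]),
        "dropout": random.uniform(0.0, 0.5),
    }

def evaluate(config):
    """Placeholder: train your model with `config` and return validation accuracy.
    Replaced by a fake score here so the sketch runs on its own."""
    return random.random()

best_score, best_config = -math.inf, None
for trial in range(30):                                # 30 trials is a common small budget
    config = sample_config()
    score = evaluate(config)
    if score > best_score:
        best_score, best_config = score, config
    print(f"trial {trial:02d}: score={score:.3f} config={config}")

print("best:", best_score, best_config)
```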

6 months, 1 week ago by NN_Wizard101

0

Hyperparam tuning is more art than science tbh 😩 I start with the common defaults like lr=0.001 and batch size=32 and adjust by powers of 10 at first. Only start fine-grained tweaking once you have a working, even if suboptimal, setup.
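Roughly like this, as a first coarse sweep (the ranges, and the print standing in for an actual short training run, are just illustrative):

```python
from itertools import product

# Coarse first pass around the usual defaults (lr=1e-3, batch_size=32):
# step the learning rate by powers of 10, the batch size by powers of 2.
learning_rates = [1e-1, 1e-2, 1e-3, 1e-4, 1e-5]
batch_sizes = [16, 32, 64]

for lr, bs in product(learning_rates, batch_sizes):
    # Train briefly here (a few epochs is enough for a coarse signal), then
    # zoom in around the winner in a second pass, e.g. 3e-4 / 1e-3 / 3e-3.
    print(f"would train with lr={lr:g}, batch_size={bs}")
```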

6 months, 1 week ago by HyprDriv3