Keras broke the ice for me. The design of NNs used to take me a while to understand. It felt mechanical and meaningless. I was struggling hard to understand why adding or subtracting layers would help or hurt my models. I was trudging through the tf documentation and honestly… I was very close to giving up.
I built my first ANN, got better with Keras, graduated to tf, built my first U-net and gained confidence. I think anyone who really criticizes Keras doesn't understand that doing so is like criticizing training wheels on a bike.
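To make the "training wheels" point concrete, here's a minimal sketch of the kind of starter net Keras makes approachable. The layer sizes and input shape are just placeholders, not anything from my actual models:

```python
# A minimal "baby net" in Keras: adding or removing a layer is one line,
# which is what makes experimenting with architecture so approachable.
from tensorflow import keras
from tensorflow.keras import layers

model = keras.Sequential([
    keras.Input(shape=(784,)),             # e.g. a flattened 28x28 image
    layers.Dense(64, activation="relu"),   # try adding/removing Dense layers here
    layers.Dense(10, activation="softmax"),
])

model.compile(
    optimizer="adam",
    loss="sparse_categorical_crossentropy",
    metrics=["accuracy"],
)
model.summary()
```

Once that clicks, the jump to the raw tf APIs is a lot less intimidating.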
You gotta learn to walk before you can run. You gotta learn baby nets before you build monster segmentation models on recurrent convolutional neural nets. It takes time to understand the concepts and the data flow.
So, either you have not recently taught a kid to ride a bike or you are just trolling.
So, I will counter your high-ceiling argument with the low-floor one. The more a person rides a bike, training wheels or not, the better they will be at riding a bike. The training wheels get you riding more often and logging the hours.
You may be right that balance is a skill you develop without training wheels, but the hours kids will spend failing and falling down discourage them, and then they don't want to play anymore.