Designers often talk about the pre- and post-iPod era. Touch-sensitive controls, reduced to a single jog wheel that intuitively gives you all the control you need on a mobile music device, became the role model: a blueprint for how content should be made accessible to the user on a sleek device (even though the iPod has some control quirks).
Tomorrow you will not see many keyboard- or knob-based interfaces any more. Everything is becoming gestural, multitouch, intuitive, and customizable.
The crucial question is: how will content like music be consumed and handled in a touchscreen environment?