This Wired article, though it focuses on using the "flipped classroom" format with students, applies just as easily to adult learning. The biggest drawback to asking people to do the information transfer first, then use face-to-face time to apply or deepen that knowledge, is exactly what one of the commenters points out: people won't do their homework, so they arrive unprepared to apply the information in a way that turns it into real learning. I've seen that happen over and over at conferences—well, OK, not that often, since so few people ask participants to actually invest time in their learning ahead of a conference—but the few times I've experienced it, pretty much no one did the prep work. So the educators had to turn themselves into presenters after all.
Why is that the case? I think part of it is that we've become accustomed to being lazy learners. It's easy to sit and listen to a lecture; not so easy to find an hour before you leave to watch that video, think about it, and identify ways you'd like to apply it—and then show up ready to roll up your sleeves and work through an actual application of some kind. Yet doesn't that make far more sense than traveling a thousand miles to learn what you could have Googled from the comfort of your desktop, then going home and trying to figure out on your own how to make that information work for your specific challenges? The flipped approach is a whole lot harder, which is probably a significant factor in why, in my experience, the vast majority of what's learned at conferences never gets put into practice.
Though we all tend to say we go to conferences to learn (and network), I can't help but wonder how many of us, if given the option to turn the info-dump into a true learning experience, would think it worth the effort involved. And if we don't, why are we surprised that kids don't, either?