At this time of the year, as we count our blessings and give thanks, I suddenly found myself thinking: "Thanks for the learning experiences." I find real delight in learning and exploring, and I'm sure many of you do, too. Learning is interwoven with the challenges we encounter as IT professionals, no doubt. We have to learn, learn and learn to do our work well.
If we replace "running" with "learning" in the famous quote from Through the Looking-Glass, it describes our situation perfectly:
"A slow sort of country!" said the Queen, "Now, here, you see, it takes all the learning you can do, to keep in the same place. If you want to get somewhere else, you must learn at least twice as fast as that!"
Let's now consider what people normally do to learn. Do they weigh mainstream education techniques against their individual needs? Apparently not. Well-intentioned corporate managers send employees to conferences believing that's the best thing they can do to help people grow professionally. Have you ever been to a conference or a training, wanting to learn something new, only to find out that you've spent several days listening to truisms? The real outcome of such an event might be the feeling that you're not alone, that you've mingled with people who have similar problems. Unfortunately, a solution to a real problem is rarely among the takeaways from a conference. That's the bottom line: we often mistake studying for learning.
Most conferences and educational events focus on making people feel that they are studying, rather than helping them learn meaningful things.
What are the signs that someone is studying? It's usually rote memorization, passively absorbing a lecture, or doing abstract assignments. Learning, in contrast, is about taking lessons from your own challenges, both personal and professional, and exploring and finding the answers to your own questions.
This awkward feeling about conferences has its roots in the mistaken belief that techniques used for educating kids and teens can be copy-pasted to professionals. This grave misconception costs thousands of wasted hours that adults could use for real learning. Studying does work well for kids and teens, no question. Youngsters have to acquire some boilerplate knowledge to get started, and rote learning is great if a kid has to memorize the multiplication table. It's all different for adults. Mature brains and mature judgement excel at recognizing and enhancing the patterns they have internalized from life experience. That's why adults are even biologically less prone to memorizing things they don't really need, and why we skip information that has nothing to do with what we're currently preoccupied with.
Instead of taking a formal stance and spending corporate budgets on conferences, executives would be much better off encouraging a culture of on-the-job learning. If I need to learn something, I will learn it. That's how adults think. Another good scenario for learning is when one becomes genuinely interested in some subject and wants to dig deep into it. This is what I call self-powered learning and exploring. It's a lot more exciting and rewarding (and I speak from my own experience here) than sitting through a conference. I also find that mutual mentoring (a peer-to-peer exchange of insights, thoughts and opinions) between professionals is a great way to learn. That's where conferences might actually help, as I've mentioned above. They are not that hopeless, after all 🙂
We need to be clear about what we expect from such events. We cannot afford to attend a conference that only touches the tip of the iceberg of our real problem. Do we want to socialize with the like-minded, to hear their stories, and to share something with them? Or are we treating the conference as a mere formality, a sit-in event? If it's the latter, why not spend that time on intense research instead, to get some real clues to what we want to achieve?