by Ken Trufitt - Professional Training Institute
The following observations flowed from an illuminating discussion I had recently with my fellow trainer, Jim.
For the past few years, we have been training in IBM mainframe technology. We could rattle off a long list of topics that had to be covered in our mainframe courses: DB2, SQL, QMF, TSO, ISPF, JCL, MVS, VTAM, VSAM, CICS ... ad nauseam. Just alphabet soup to the average punter.
With less development taking place on the mainframe and more on the PC platform, we made a strategic decision to move into the world of PC development tools, both as a matter of economic survival and because Jim is basically a propeller-head and loves these little machines anyway.
At the beginning it was fun and not too demanding. But things change, new technologies emerge and the ground shifts continuously. The battle to keep up to speed in a huge range of development products became ever more demanding. Not only are there many products available, but their complexity is astounding. Witness Excel 5's implementation of the Visual Basic for Applications language.
Both Jim and I like to appear at least reasonably competent in the products we train in. As both trainers and developers, we are continually blown away by the versatility, power and features that the latest software tools provide, no doubt due to the flourishing, competitive environment in which software is produced these days.
The problem is, things are changing just too fast! Only a few years ago, you could spend six months getting up to speed in a particular development environment, then look forward to several years at the leading edge, with only a modest amount of reading needed to keep up with the latest developments.
Today, any environment that is not upgraded annually is considered to be languishing in the wings. And we don't mean minor upgrades, either. VB 2.0 introduced ODBC; barely six months later VB 3.0 introduced the JET engine. Before this year is out, VB 4.0 will see significant changes in the way OLE is implemented. And that's just VB. Try keeping up to date with C/C++ (including MFC and OWL)!
The problem is not our brains' capacity to store this information; it is the rate at which the information must be absorbed.
We can draw an analogy between the process of learning new technical information and the process of sending data over a communications channel.
If the data being sent is totally predictable to the receiver, then no new information is being transmitted. Conversely, if the data is entirely unpredictable, then every bit received is new information.
We are all told that we use only 10% of our brain's capacity (although I would like to see how "they" arrived at this figure — Ed.). With some exceptions, and ignoring alcohol abuse and the effects of aging, the average person has a fairly constant ability to absorb and process information over their working life. The amount of new data a person can absorb in a given time frame could be likened to the bandwidth of a communications channel.
Any communications channel has a physical limit as to how much data can be meaningfully transmitted across it. Similarly, we humans may have a limit as to how much data we can readily absorb and process in a given time. It doesn't matter how much we might tinker with the brain's capacity to store and process information; the bandwidth of the input channel to the brain remains a crucial bottleneck.
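The predictability argument above is, in essence, Claude Shannon's notion of information entropy: the less predictable each symbol, the more new information it carries. A minimal sketch (in modern Python, chosen purely for illustration; the messages are invented):

```python
from collections import Counter
from math import log2

def entropy(msg):
    """Average bits of new information per symbol (Shannon entropy)."""
    counts = Counter(msg)
    total = len(msg)
    return -sum((c / total) * log2(c / total) for c in counts.values())

# A totally predictable stream carries no new information ...
print(entropy("AAAAAAAA"))   # 0.0 bits per symbol
# ... while a stream of equally likely symbols is all new information.
print(entropy("ABCDABCD"))   # 2.0 bits per symbol
```

On this view, a channel's capacity bounds how many of those bits per second can get through — which is exactly the bottleneck being described for the brain.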
With modern circumstances dictating that more and more data be pushed across this communications channel to the brain, in a seemingly ever-decreasing amount of time, it is not hard to see that a limit will be reached. I think I'm close to mine already.
It seems that the bandwidth of the channel must be increased. New learning techniques, genetic engineering or opening up as yet undiscovered communication channels may yet help solve the problem.
Or maybe, just maybe, we ought to look at the problem from another perspective and simply reduce the amount of data that we have to cope with.
Perhaps Microsoft is beginning at last to address the problem they played a significant role in creating. With the introduction of Visual Basic for Applications, we have a language that will eventually do away with the need to work with half a dozen different macro languages. Such re-usability of information that has already been sent "across the link" to the brain is to be applauded.
In time, maybe, the technologies involved will merge to such a degree that we will be using just one or two major integrated software products running on a common operating system.
Or maybe we need to adopt a client/server approach to learning, where we "learn" only general information, and access the details only as required. Perhaps we can make the "server" use the information for us, by telling it what to do rather than how to do it. Maybe this is exactly what OLE will do for us — encapsulate complexity in simple objects.
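The "what to do rather than how to do it" idea maps onto what we would now call declarative programming. A small sketch in modern Python (the names and data here are invented for illustration, not from the column):

```python
# Hypothetical example: selecting the names of people over 50, two ways.
records = [("Jim", 52), ("Ken", 47), ("Sue", 61)]

# "How": spell out every step of the selection and ordering ourselves.
over_50 = []
for name, age in records:
    if age > 50:
        over_50.append(name)
over_50.sort()

# "What": state the result we want and let the runtime do the work.
over_50_declarative = sorted(name for name, age in records if age > 50)

assert over_50 == over_50_declarative == ["Jim", "Sue"]
```

The second form is the client/server style of learning the author describes: you hold only the general request, and the "server" supplies the detailed mechanics on demand.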
I sure hope so, because I'm beginning to feel the need to get my eight hours of sleep every night (which just proves you aren't a real hacker — Ed.).