> I'm just frustrated by the macro-trend. Let us project out another
> generation, when you 20-somethings know everything I think you should,
> and the latest and greatest Something-Oriented Programming is the
> current trend. That generation (the ones being born now) won't
> know the basics of even OOP, much less even simpler things like
> Boolean Algebra, powers of 2, or binary math, and of course, buffer
> overflows and how to prevent them. Hell, they probably won't even
> know what a keyboard is used for. At what point will four years of
> study be not enough? I fear we are approaching that point now, and
> the trend is only going to get worse. The curriculum I was teaching
> was designed for High School students, so we are already getting
> to the point that we are turning High Schools into tech schools so
> the students can complete 5-6 years of education by the time they
> graduate the 4 year college.

I doubt 4 years ever really was "enough". Von Neumann and Turing were
hardly 4-year wonders. Industry needs have driven the lack of training
in things like binary math, buffer overflows and so on. But even that
needs to be taken in context. In a CS degree you will get that as
*theory*. This is part of why CS is not always appropriate. They "waste"
time learning about binary math, unsolvability, Turing machines, etc.
They basically learn one language, and are expected to learn others on
their own, and so on.

CIS takes a very different tack. They assume that you can't learn
languages on your own and "waste" time teaching you things like RPG
(@#$% that makes COBOL look cool). To my way of thinking you can finish
a CIS program and learn nothing at all of value, except maybe
accounting. At least you finish CS with esoteric knowledge. (What
surprises me is that no one has mentioned abstract algebra, graph
theory, and numerical methods as CS requirements.)

===

Anyway, particularly in CIS the emphasis is on *productivity*. The
entire idea is to insulate the McProgrammer from the crazy mathematician
who has to program in machine code. CIS students aren't taught about
buffer overflows because they aren't supposed to be working at a level,
or with a language, that lets them cause buffer overflows. Someone with
an MS in comp sci was supposed to hand them debugged tools. (There's a
small illustration of what a buffer overflow looks like at the end of
this post.)

From a business and engineering perspective, programming is bad because
it is expensive. And it is expensive because it starts off hard and gets
more difficult from there.

> Will we be like Psychologists, where you have to have a PhD before
> you can get a job that pays more than $20K/year?

I wouldn't worry about this. People are attracted to Psychology because
it's fun, prestigious, and once paid well. It used to be hard to get
into a program, but private, for-profit programs have proliferated to
meet demand for Psy.D.'s and flooded the market. (Ph.D.'s are still
moderately difficult to earn.) If you look at math Ph.D.'s there aren't
so many, because the pay usually ain't great and few people think math
is fun. Besides, any math program is still elitist.

I don't see people flocking to computers because they love them. If the
financial incentive isn't there, many of us would look elsewhere. (Like
me: I have an MA in Anthropology. I'd like to do behavioral science, but
there's no market. I'd like to do sales, but am socially incompetent and
*can't* do that. I'm ok with computers and like the work ok; result,
let's try to find any kind of work at all in technology.)
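
P.S. Since buffer overflows came up a couple of times above, here is a
minimal sketch of my own (in C; none of this comes from the original
posts) of what one looks like and the usual way to avoid it. The first
function copies with no bounds check and can overrun its buffer; the
second tells the copy how big the destination is.

#include <stdio.h>
#include <string.h>

/* Illustrative only: copying untrusted input into a fixed-size buffer. */
void risky(const char *input)
{
    char buf[16];
    /* strcpy() does no bounds checking: if input is longer than 15
       characters plus the terminating NUL, it writes past the end of
       buf and corrupts adjacent memory. That is a buffer overflow. */
    strcpy(buf, input);
    printf("risky copied: %s\n", buf);
}

void safer(const char *input)
{
    char buf[16];
    /* snprintf() is told the size of the destination, so a too-long
       input is truncated instead of overrunning the buffer. */
    snprintf(buf, sizeof buf, "%s", input);
    printf("safer copied: %s\n", buf);
}

int main(void)
{
    /* Short input is fine either way; long input is only safe in safer(). */
    safer("this string is much longer than sixteen characters");
    return 0;
}

This is exactly the kind of thing a CIS curriculum sidesteps by keeping
people in languages and tools where the bounds checking has already been
done for them.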