How to make computer science courses better
As I sit here working on a project in lieu of going to class for the umpteenth time, it strikes me with sudden clarity that perhaps the current way Computer Science is taught in universities is not the optimal way. Large lecture halls, clunky languages: it's a bad sign when even a die-hard "learning for learning's sake" student skips class on a regular basis.
Posted by Jim Zellmer at January 1, 2013 2:59 AM
Anyway, a few thoughts on how I would change the structure of Computer Science courses to maximize efficiency and interest while minimizing pain:
STOP TEACHING JAVA.
Seriously. I have never, ever been required to interview in a specific language at any company (even the "big names"), and if the people you're talking to are dead set on Java, you probably don't want to work there anyway. C is a much, much better lower-level language for really grasping the way programming works, and Python is a much, much more fun language if you want to lower the barriers to entry and get students making things right away.
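To make the "making things right away" point concrete, here is a sketch (not from the original post) of the kind of complete, working program a Python beginner can write in the first hour, with no class declarations or compile step in the way:

```python
# A plausible first-day exercise: build and print a multiplication table.
# (Illustrative example only; the exercise itself is an assumption.)

def times_table(n):
    """Return the n-by-n multiplication table as a list of rows."""
    return [[row * col for col in range(1, n + 1)] for row in range(1, n + 1)]

# Print the table with aligned columns.
for row in times_table(5):
    print(" ".join(f"{x:3d}" for x in row))
```

A student can run this immediately and see output, then tweak the range or formatting and see the effect, which is exactly the tight feedback loop the post is arguing for.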
This poor student seems quite lost. Not necessarily her fault, since there is ongoing debate and discussion within organizations like the ACM about what education in this area should be. I certainly don't have an answer.
There are significant questions being bandied about. They go something like this: What should be learned? Is Computer Science a science? Is there a difference between a computer science education and learning to program? If computers and software are to be used as tools of the trade, what should be learned? What existing software systems should be taught as tools of a trade? Which trades are relevant?
If the T in STEM is to be taught in, say, middle and high schools, what should be learned?
There are certainly cognitive components that need to be learned that cross all the above questions. Some have called this "process learning". How do you teach this? How do you learn this?
Then there are the many languages available to learn, each modeling some worldview that requires thinking differently from one language to the next, and each solving common problems in conceptually different ways.
Then one must ask: how much of the "magic" of computers and software should be elucidated so that it is no longer seen as magic? Shouldn't this be taught to everyone?
One problem I've seen with teaching a language like Java in the first CS course is that the science piece of CS is often overwhelmed by a thicket of little things, and the big ideas go untaught and unlearned. The problem is identical to a middle school history course where, after learning a little about the Civil War, the students are assigned a one-week group project to create a poster on some Civil War topic. Most of the time is spent on layout, pictures, limiting what information is covered, scheduling meetings, assigning tasks, and determining accountability. Precious little Civil War substance is learned in a poster project; it's make-work, no different from repeatedly digging a hole just to fill it up again.
Writing a program in Java is like that: the student spends an inordinate amount of time on overhead tasks such as using a computer, using an editor, debugging arcane language syntax, and trying to decipher arcane stack traces from the Java runtime. None of this is useful in the big cognitive picture, though it is certainly instructive in one respect: programming languages are not forgiving, and at every turn they tell you that you have done something wrong.
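The overhead described above is easy to see in the very first program a student writes. As a sketch (not from the original comment), here is the one-line Python version of "hello, world," with the equivalent Java reproduced in a comment for contrast:

```python
# The complete program a Python beginner writes on day one:
greeting = "Hello, world!"
print(greeting)

# The equivalent Java program (shown as a comment for contrast) requires
# a class declaration, a main method signature, and a compile step before
# a single line of the student's own logic runs:
#
#   public class Hello {
#       public static void main(String[] args) {
#           System.out.println("Hello, world!");
#       }
#   }
```

Every token of the Java ceremony (`public`, `static`, `void`, `String[] args`) is something the first-week student must type, and be told errors about, before understanding any of it.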