Programming C, C++, Java, PHP, Ruby, Turing, VB
Computer Science Canada 
 Information request; Programming Languages

PostPosted: Mon Mar 04, 2013 1:48 am   Post subject: Information request; Programming Languages


Old guy; looking into some trends, and I'd like confirmation.

It seems that universities are using the following (I'm not going to say teaching, as the language itself is unlikely to be a core topic of lectures), with the disclaimer that I'm only scanning for mainstream usage patterns (in the main, the common denominator that I could google):

x86 Assembler

Is this wrong, so far off base as to need correction? Can I go about the intertubes spewing this information without it being misinformation?


PostPosted: Mon Mar 04, 2013 3:25 am   Post subject: RE:Information request; Programming Languages

I've seen MIPS used more than x86 for assembler, though they'll probably both show up in some form. Also I think it's common to see something like ML or Haskell in a functional programming course. I've also seen Prolog used in AI and programming languages courses.

But really... If I were to only go for the most common languages, I'd probably just use your list with maybe MIPS/x86 assembler listed instead of just x86.

PostPosted: Mon Mar 04, 2013 11:34 am   Post subject: RE:Information request; Programming Languages

Based on my extensive time at UW, I would say: C++, Java, Scheme, and MIPS for assembly-level stuff.

Notable Exceptions:
- CS442 Programming Languages course uses some more exotic languages (supposedly "Lisp, Prolog, ML, Ada, Smalltalk, Icon, APL, and Lucid").
- CS486 Artificial Intelligence let you use "any language you want".
- CS450 Computer Architecture uses Verilog and seems to be targeting a custom ISA (similar to MIPS)
- CS343 Parallel and Concurrent uses a specific extension of C++ called uC++ ("micro C++").

I haven't seen any courses that use Python specifically, though some (like AI) allow it. Most of the courses that allow C also allow C++, with the exception of CS240 Data Structures and Data Management, which may well have changed since I took it 5+ years ago. However, they frequently also require that you compile with outdated (sometimes ancient) compilers, so I haven't yet been able to use any C++11 features.

PostPosted: Mon Mar 04, 2013 7:41 pm   Post subject: RE:Information request; Programming Languages

Hmm, I think SE 101 started using Python just last fall for the SE class of 2017, but I'm not totally sure on that one.

Also new (I don't know whether it was SE or all of engineering that had curriculum changes this year) is the ECE 222 course, which now teaches ARM instead of some Motorola architecture (not sure which, though).

CS 241 also uses MIPS.

PostPosted: Tue Mar 05, 2013 2:19 am   Post subject: Re: Information request; Programming Languages

Thanks for the facts folks.

And just for a little backfill: when I started at Waterloo in 1982, we were using Pascal (programming integral solutions) and COBOL (straight-up transaction processing and page-break problems). Second year was 6502 assembler and Fortran (again, leaning towards programming higher-level mathematics).

C was still pretty new, the white book only a few years old, and the mainframe was not really its target audience. The spanking-new personal computers seemed to like it, though. ALL our labs were mainframe-style timeshare logins on 3270 terminals. Even the 6502 assembler lessons.

Being old now, I think it's pretty safe saying that you will all grow older and still hear about C (or C++47, etc.), and COBOL will still be floating around unless someone designs the Better Integrated Business Language Environment. I have no feel for where Java may end up; its history is sadly spoiled by politics and power struggles. I probably won't care, unless the head-holding cryogenifridge is acting up with unhandled exceptions.

One caveat on libc. libc and the entire C application binary interface stand to take some very bad press with the epoch rollover in 25 years. COBOL took that heat with y2k, and that was only paper reports and some easily tracked wonky interest calculations. COBOL is still in use, but if there had been an alternative, corporations would have bailed. Embedded C hitting the epoch rollover will be a lot harder on society than the y2k issue was. Or there will be a cost to the big players that will trigger another 'dotcom' bust.

[old guy rambling now, sorry - and we wore onions on our belts]

My take on y2k and the dotcom bust: through computing history, a business player (and these were big in the early days, akin to particle accelerators; not everyone needed one) could spend a million dollars and save 50 million three years later in paper-entry clerks and auditors. So nerds got handed research dollars, pretty much as much as they wanted. Spend a million, save 10.

Then we get to PCs and ubiquitous computing. Now a large business was getting into the spend-a-million, save-only-two years. But the internet started up and gave people a reason to want a machine capable of entertainment. Boom time for the web, and e-commerce. Everyone was still in the head space of spend a million, save two, but now it was spend a million, make two. It might have continued.

Enter 1998-99 and serious realizations of fiduciary responsibility, with very big players on the hook for pocket money if things went bad. In '98 and '99, development on large systems nearly halted, with a back-audit pass of the code base. Come 2000, the big players look at their books. "Let me see here, my 48 billion dollar tech investment..." (usually good for a return bumping into 100 billion) "... COST ME WHAT? What did I get for this? NOTHING? What?" "Screw those computer guys, where did all my money go?" The fun money dried up. The web had to implode. Most business models vapourized, but a few had struck gold, so the web stayed.

[old guy walking off to the next town to get a better belt onion, the yellow ones that they only get over Merickville way, and then maybe we'll stop for an egg. We had eggs back then ya know, and sometimes the eggs were hardboiled and sometimes scramb....]

Anyway, sorry, and thank you for the current events update. Seems like the short list is fairly accurate (with the edge cases that make programming interesting of course). I'll be able to spew opinion and sound like I still know that yellow onions make the best belt onions.

Oh, crap, I have another question now.

Which interface? Shell, IDE, both? Do professors care?


PostPosted: Tue Mar 05, 2013 3:19 pm   Post subject: Re: Information request; Programming Languages

I'm only in 2B, so what I say might not be true in upper years.

For CS135 and CS136 we were instructed to use the Racket IDE for Scheme, and for C we used something called RunC with gedit (although I just used gcc and vim and never had any problems). Then in CS246 we were introduced to the bash shell and encouraged to use either vim or emacs over ssh (vim > emacs), although pico and nano were also available. After that, professors haven't seemed to care about our development environments or editors, but running programs on our student environments over ssh has been required.

PostPosted: Tue Mar 05, 2013 5:51 pm   Post subject: RE:Information request; Programming Languages

After the very first CS class (CS134 in my case, which was using Java, about 7 years ago), they don't care. In that first class they suggest a specific IDE (DrJava), but frankly it's not very good, so a lot of students used something else. Marks were unaffected.

Most classes just say "it has to compile when I type 'make' at the command line". Even classes that use specifically Java (why would you ever build Java with make?).
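For readers who haven't seen that convention: "it has to compile when I type 'make'" just means the assignment ships with a Makefile that builds everything with default flags. A minimal sketch, with hypothetical file names (the actual targets and flags any given course requires will differ):

```makefile
# Hypothetical layout: assignment source a1.cc builds executable a1.
CXX      = g++
CXXFLAGS = -Wall -g

a1: a1.cc
	$(CXX) $(CXXFLAGS) -o a1 a1.cc

.PHONY: clean
clean:
	rm -f a1
```

The marker then only ever runs `make` followed by `./a1`, regardless of what editor or IDE the student used.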

Some classes, most recently my CS456 class, provide native command-line executables that you're supposed to use when writing your assignment, which more or less require that you be on their Linux systems or a compatible distribution (I think most of the Debian-based ones are fine, but I don't know for sure). They have a "cluster" of systems that you can log into through a single URL.

In a few classes, they made use of a system helpfully named "Marmoset". Rather than being a silly-looking monkey, this was some online system that could automatically build and mark your assignments "quickly", meaning you had a turn-around time of maybe 5-30 minutes instead of "no turn around, we hand your marked assignments back weeks later". Students were encouraged to "submit early, submit often", then take into account the errors and failed test cases to improve their programs. I loved it, but I haven't heard anything about it in years (since I took CS241, Foundations of Sequential Programs, aka "baby compilers", back in probably 2008 or 2009).

Marmoset was probably not only the best teaching tool UW has ever had, but also their best-implemented computer system.
Page 1 of 1  [ 7 Posts ]