The Beginner's Crash Course on Computer Programming

97 Comments


AFDozerman

You aren't a real programmer until you master BrainFuck:

http://en.m.wikipedia.org/wiki/Brainfuck

LatiosXT

Whitespace is the master race.

H1N1theI

You aren't a man until you've INTERCAL'd.

trgz

Sound idea for an article, but perhaps you ought to have used pseudocode rather than what might at first appear to be 'mumbo-jumbo' to the uninitiated for your examples? (says the guy who's had a go at coding in C#, VB, Java, JavaScript, Python, SQL, ASP, Forth, assembly language and Z80 machine code, and maybe some others). It's far less intimidating and (apparently) even has a practical use once you can code.

Shagwell73

You guys made my brain ache....can I just see some spinny fans and flashy lights please?!?!?!

maleficarus™

Well, I'm a hardware guy and not a software guy, so this is all Mars to me. But man, you guys really do sound like a bunch of geeks! LOL Kidding aside, for me the excitement has always been the hardware. I love the smell of silicon. I love playing around with circuit boards and trying to figure out why something works the way it does. I absolutely love troubleshooting problems, as I crave the challenge. The software side of it I find boring and dull. Looking at a bunch of code on your monitor has as much appeal to me as watching snow melt in spring...

The idea of making a programme is tempting to me but I know myself and I would lose interest after a day or two...

goku_dsv

I'm with you on that, I've programmed in C++, Basic, Java, SQL and Assembly but never quite enjoyed it as much as I enjoy working with hardware. And face it guys if you asked a woman which word she'd prefer software or hardware, which do you think she'd pick more often? Lol case closed!

AFDozerman

You consider troubleshooting fun? Maybe you should help me out with my freakin 290x, then. This thing has been a pain in the ass for weeks now.

maleficarus™

Fun unless it is my PC LOL

Actually my PC talks to me with a HAL 9000 sound theme. My real name is Dave, so it works well. My GF started laughing when I booted up my PC and it said "good morning, Dave".

AFDozerman

I wish I had a significant other who even knew what 2001: A Space Odyssey was. Kinda feels bad to be the nerd of the relationship... :I

Morete

No love for Delphi XE5? With the starter edition for only $200, you can't go wrong. Remember, the best things in life don't always come free.

Chris04920

This statement on page 2 is not correct.
"Some languages, such as Java, are not compiled, but rather interpreted, which means that the source code itself is distributed, and then compiled on the end user’s machine."

Java is compiled down to byte code, then interpreted by a JVM when executed. No source code needs to be distributed for Java to work.

Also, there is no such thing as a "Java-Script applet"; there is a Java applet, which is Java and has nothing to do with JavaScript.

This article is not very accurate at all.

LatiosXT

The definition of an interpreted language is that you can essentially run the source code directly. Java still fits this regardless. If you have the JVM installed, you can run a Java source file more or less directly. If you have a C source file, you can't find anything that'll run it directly (http://codepad.org/ arguably "interprets" C, but it's more of a scratch pad than a real programming interface). It has to be compiled, assembled, and linked.

The whole bit about "compiling into bytecode" is an interpreted-language scheme to help with performance. I don't believe there's a programming language in wide usage today that is directly interpreted. It's mostly scripting languages that are.

LatiosXT

I'd kind of argue against the claim that C and C++ are "low level languages". Sure, they're used for system-level software and allow more direct access to hardware (at least in the form of pointers), but to me a low level language is machine-specific, like assembly or machine code. C and C++ can work on any machine provided the compiler and appropriate libraries are supplied for it. The only reason C/C++ has great performance is that it's been standardized for years (I mean, it has an ANSI and ISO standard), and compilers for it have gotten really good.

MaximumMike

C and C++ are third generation languages, making them high level languages. But they demonstrate a level of power and control not found in other 3GLs like, say, COBOL, or even Java for that matter. So they are often misconstrued as lower level languages, when in fact they are not.

Perl, Python, Ruby, PL/SQL, and PowerBuilder are some examples of popular 4GLs. In comparison to writing code in those languages, C++ can certainly seem like a low level language. But C++'s level of abstraction from the physical hardware is far too great for it to be considered a low level language.

Hey.That_Dude

You can perform bitwise operations and talk directly to memory addresses with C and C++... I don't know how much lower you want to get before you're in assembly.

MaximumMike

C++ is the most powerful 3GL I know of; you can't get any closer to assembly. But that doesn't change the fact that C++ is a 3GL (high level language) and assembly is a 2GL (low level language). This is terminology defined by some very intelligent computer scientists before most of us started programming. It's concrete, unlike the technical jargon Apple's marketing department likes to play with (3G and 4G cellular networks come to mind). A good bit of literature has been written on the differences between language generations. Anyone who loves math, science, and computers should find it interesting.

LatiosXT

I feel like the only thing that really separates 3GL and 4GL ultimately, is how you treat data (and possibly how easy it is to get started, as I'm starting to see the charms of Python).

I feel like whenever I work in C, I treat data as something in memory. That I can manipulate it very easily as long as I treat it with care. There's a floating point number that needs to be transferred out of a serial port? Cast it as a byte pointer and put each byte into the buffer.

If I want to do the same thing in say C#, I can't. Data in this case is treated more like an immutable object of sorts. I have to convert it into another object that I can use to stuff a serial port buffer with, because the buffer only understands byte-array objects.

goku_dsv

Sometimes you have to treat data like a psycho ex-gf: return it to HEAP and free a block of your memory (your sanity =))

H1N1theI

Uh, I think C's type system is more a matter of paradigm than generation. I mean, Ada has some serious memory BS going on... And personally, I don't like loosely typed systems; I welcome our new static_cast/dynamic_cast/reinterpret_cast overlords.

Also, if you want a kick: the C standard doesn't define char as a byte. Just that sizeof(char) == 1, and that sizeof(short) <= sizeof(int) <= sizeof(long). So I could make a standards-compliant compiler with a short size of a KiB, an int of a KiB + 1 byte, and a long of 1 TiB. :U

MaximumMike

You're not far off there. Stuff like this is why C++ code doesn't port well from one environment to another, like say from Linux to Windows. Not every compiler treats data types exactly the same way, and there are good arguments for this being the case. Linux and Windows programmers are often critical of one another because neither considers that someone might actually try to compile their C++ code on the other platform. But neither considers that there are other, smaller platforms that also run C or C++ and don't have the abundance of memory resources of a PC. This was especially the case when the original C++ standard was written, and even more so when C was standardized. So it makes sense in the standard that someone writing a compiler for another platform might not choose the same sizes for their data structures. Enforcing that the basic relationships between the data types remain the same, without defining the actual sizes, allows for both flexibility and data integrity. It's not perfect, but considering the vast number of computing architectures in existence, it never will be.

That's why it's important to know both your environment and your compiler. One of the worst assumptions made by any programmer is that his code will run somewhere else. The most portable code is usually the most costly and slowest code, altogether poor code notwithstanding. The most efficient code is usually barely portable at all. It all boils down to the fact that machine language is different on different architectures.

There is no way to say something once and immediately have everyone in the world understand it. You either have to say it in every language or you have to have an interpreter for every language. Computers are the same way.

LatiosXT

But then you'll be lynched for it. :3

Also, I feel any C programmer worth his/her salt, especially in embedded situations, uses the stdint.h fixed-width definitions for variable declarations. This takes care of the "how much space does your int take up?" problem.

Hey.That_Dude

Any embedded programmer worth their salt doesn't use "int". PERIOD. You use char, short, and long. There is far too much ambiguity in how the code is read, even with that library definition (which is an extra link assignment and more space, for something as simple as just saying what you mean the first time).

H1N1theI

Yeaaaah, I have a problem of not caring, mostly because of C++/Java/Desktop development.

I should fix that at some point.

Also, I demand that everyone uses __char8. :P

LatiosXT

I can't tell if you're serious or not xP (you're not allowed to use leading underscores in identifiers like that; those are reserved for the implementation and compiler intrinsics).

Also I've managed to have portable code while minimizing the horrors of porting. A project at work at some point switched MCUs (only because we thought we were getting the one feature we didn't have). All I had to do was build the board support package (BSP) file for that part. The rest of the code plugged right in just fine.

Abstracting your low-level functions will go a long way...

H1N1theI

MSVC compiles with __char8 as a type. MSVC also accepts ClassName::FunctionName as a valid declaration in header files. It also changes how it name-mangles every generation...

How I hate MSVC.

From desktop development, I haven't had any issues porting yet. My code compiles fine so far on Cygwin, Mingw, and native G++, so I think I'm fine there. Linking is another story though. XD

ShyLinuxGuy

Initially, I wanted to code, but I don't think I am creative enough to pull it off. I would say coding should be an art--I believe creativity is a crucial have-to-have component in computer programming. I know the concepts of C++, Java, etc., it's just that I'm not "good" enough to do that as a full-time job.

On that note, because I am not creative in that sense, I went into systems administration. I code to the extent of occasionally writing a script, both under Windows Server and the Linux shell, and for a few things on the client side (i.e., automating tasks or deploying a feature or app). I do use the command line on both platforms extensively, but that's as close as I get to coding, and that's *not* coding =).

wolfing

The trick to programming is to break problems into parts (basically what an algorithm does).

Many times you have a problem and you have no idea how to tackle it. Simple, break it down into smaller problems. If you still don't know how to tackle these, then break these into even smaller problems. Eventually you'll get to things that you can grasp and resolve.

Another trick is to have some results early. Some people can work it all out of their brain, but some others (like me) need to see something happening. Maybe as simple as "here would go the list of products", so you can see your code do something incrementally as you finish different parts, if that makes sense.

MaximumMike

The real secret to coding lies somewhere between mathematics and dry literature with only enough artistry to keep the programmers from beating their heads on the walls of their dungeons.

trgz

For me, the secret to coding/programming/automation is the sheer boredom of doing repetitive work and 'knowing' there must be a better way, armed with whatever tools are to hand - it's fundamentally laziness combined with an opportunity to play and create.

nadako

The for loop and while loop make my head hurt a little to look at. What it should look like is:

int x = 0;
while (x < 10)
{
    ++x;
    print("hello");
}

and your code will not run correctly if you do

for (int x; x < 10; x = x + 1)
{
    print("hello");
}

x is uninitialized. It should be:

for (int x = 0; x < 10; ++x)
{
    print("hello");
}

MaximumMike

Really, I think it's just supposed to be pseudocode. For and while loops exist in most languages, not just C++. Most newbies will understand what x = x + 1 means, but anyone who understands x++ or ++x has likely already mastered enough of C++ that they wouldn't be interested in the section on loops anyway. But if you want to be anal about syntax, they should be using cout as well.

theotherguy

After 4 years of college programming, the course flow at the college level is normally: introduction to concepts, basic programming, some HTML, more basic programming, then C# with a side of SQL, and then Java/PHP followed by ASP.NET web programming. Visual Basic is taught as an entry-level business application development language. C# is taught on top of your VB experience.
Java and PHP build on your understanding of C#, which after the C# class you think can do it all. PHP gives you a glimmer in your eye about building 600 forms on the internet for data entry. Then along comes ASP.NET, just when you realize that PHP can't do it all.
HTML is a language (HyperText Markup Language); yes, the primary job of the language is to allow people to correctly present things on a webpage, but it is a language. CSS, or cascading style sheets, is now used to provide the formatting of the page. Adding JavaScript to this enables interactivity between the user and the pages.
My honest thoughts are that HTML/PHP & JavaScript can really be taught at the same time (including CSS by default).
If one wants to start out constructing things all at once, the individual should look into Visual Studio to start off.

This is my two cents, and it differs from other people's experiences; I don't know your hardships, I only have my own. I welcome people to advise on their experiences, but do so as if I had a baseball bat and was right behind you...

MaximumMike

Maybe where you went to school. But where I went to school computer science taught very little programming beyond the basics and left it up to the student to figure out the rest in whatever language he liked. I'll never forget the time in my senior year when one of my classmates asked me if I thought it would be a good idea if he learned how to program. I almost fell out when he said that. Then I thanked God that I had learned to program in a technical school and not at the University.

At the university CS curriculum was mostly theory and mathematics. Most professors didn't really care what language you completed your assignments in as long as you had a compiler installed and could show that your code worked. When I graduated in CS, I was 3 classes away from a double major in Mathematics and 5 classes away from a triple in Physics.

At the technical school it was all assembler, c++, cobol, jcl, cics, and linux script... with a side of VB and Perl just for kicks.

I think you'll find most people's experience varies from yours as much as mine does. Programming curricula are all over the place depending on where you go.

theotherguy

My ITEC degree program is for my BS in IT Management. The core classes are the standard History of Computers, HCI, SAD, IT in business decisions, project management, and (the greatest) ethics in computers. In the upper level, I took both software development and IT management: database programming, web development for business, C++, VB for client servers, and database security; plus the management side, which covers decisions, business tech, electronic commerce, globalization, and others. During my associate's degree, my random electives were also in SQL, HTML, and VB; I worked these classes in to avoid a delay in getting to more advanced-level classes. Once I transferred from community college to a four-year, I was already taking junior and senior classes to complete the degree programs.
Similarly to your experience, I work in a company that has an in-house programming team. I often asked them for advice on what classes I should take, and early on they assured me that they did not take the classes that I have a choice to take.

cstmtrk9706

When I first learned about computers in '76 for the Army, we could only work in hexadecimal. Now that's lots of fun.

timmyw

You got to use hexadecimal? Wow! High-tech. We had binary switches. You haven't had fun till you've programmed a machine where your only interface is a series of switches.

01101000 01100001 01101110 01100100 00100000 01101100 01101111 01100001 01100100 01101001 01101110 01100111 00100000 01100001 00100000 01100010 01101111 01101111 01110100 01110011 01110100 01110010 01100001 01110000 00100000 01110010 01101111 01110101 01110100 01101001 01101110 01100101

For the time it was cutting edge. 4KB of magnetic core memory! Once you got the boot loader programmed in by hand, you could use punch tape to load the rest. I think that ran at the blazing speed of 300 characters a minute!

Good times.

MaximumMike

I am not envious. I have heard the horror stories of exploding tape drives and walking hard drives. I am honestly glad I did not start programming in that era. Learning JCL was aggravating enough; I do not care to program in binary.

Hey.That_Dude

Meh, that's not that bad. He used 8-bit instructions. Try doing 32-bit instructions in pure binary. That is hell.
Although, that mag toroid memory does make me cringe.

wolfing

I find game programming a bit like professional sports. When you're a teenager you dream of making the best game ever, truly revolutionary, and learn low level programming, 3d graphics, all that jazz. Then reality bites you in the ass and you end up learning SQL and C# which is what will give you a steady salary.

thematejka

"it makes it possible for you to truly understand what’s going on underneath your desktop."

I remember getting this feeling when I began getting comfortable with programming. However, I must object to the above quote. Once you get comfortable with programming you will be wondering: "but how does the hardware work?"

You have three levels or so:
1) people who can manipulate the front-end (stuff you can see).
2) people who can manipulate the back-end (programming).
3) and finally, those who build and program the hardware itself.

You do not really understand what is going on until you get to the very bottom.

-----

When I began programming, I started with BASIC, moved to Java, then entered the world of C/C#/C++. I moved this way because you start with intuitive languages and end with more esoteric ones. "Intuitiveness" is basically a non-issue for me now, but BASIC was almost like writing English.

H1N1theI

I object (heh). One shouldn't be concerned about how the hardware works when starting out programming. I barely understand assembly at all, but I still understand calling conventions and basic instructions (it does come in handy when I'm doing certain things involving linking and scripting BS). Having a general idea does help, but knowing every single logic gate in a CPU does you no good. It's simply too low level to consider.

Also, BASIC is one hell of a drug.

dgrmouse

If it weren't for the contradictions in your argument, you'd be proving the other guy's point; e.g., you can't understand calling conventions unless you understand assembly (which register holds the return values? Which values are passed in registers and which on the stack? Who is responsible for the stack frame?).

H1N1theI

I never said I didn't understand it at all. Return value in eax; push the arguments in reverse order.

I just said that I don't know much. Also, I disagree with the idea that one doesn't understand what's happening if one doesn't know the atomic (heh) components of something. Abstract-ions (I should stop) are there for a reason. I view ASM and the physical gates as too far removed from the high-level langs used today.

dgrmouse

Sure, if you say so.

I promise you that the guy who knows assembler is going to, for example, have an easier time understanding things like C's ability for functions to take a variable number of arguments. It's not a failing of the abstraction; it's just that a little a priori knowledge can go a long way.

H1N1theI

Yes, I'm not disagreeing with you there; I just don't view it as necessary for understanding what code does at the design level. It's more useful for understanding what the code does underneath and why it does it that way.

Also, sorry for the late reply, I was away for a few days.

iheartpcs

10 a$="My Uncle made $53627 using his computer at home"
20 open "maximumpc.com",8,1
30 goto comments
40 print a$
50 goto 40

H1N1theI

Heretic! Your use of basic shall not be tolerated!

Every1 no tat al te 1337 hax0r @ teh internewt us perl cus kool.

stradric

Good article, except the "What makes one programming language different from another?" section is mostly inaccurate. C and C++ *are* high level languages like Java. Python is an interpreted scripting language.

Language features of C mean that the programmer has more control over how memory is allocated and whatnot. But that in and of itself does not make it a low level language.

C and C++ applications will beat out Java or Python in performance simply because they have less complexity in execution. They compile directly to machine code, while Java compiles to Java bytecode and runs in the JVM, and Python is interpreted by the Python interpreter. So the latter two languages require additional layers of abstraction that introduce some overhead and thus reduce performance.

Assembly is a low level language that uses instructions specific to the architecture and CPU.

QuantumCD

C/C++ are *generally* considered lower level languages than Java, C#, etc. because you (can) work directly with memory in the form of raw pointers, meaning that if you don't understand memory/pointers/references, you will probably screw something up without a GC.