

Separating TV from Reality

Have you ever watched one of those TV crime shows where they run some computer software to do face recognition or to compare fingerprints? Every one I see shows the images being compared flashing across the screen. Wow, look at how fast the comparisons are going! Sigh. Leaving aside for the moment that facial recognition software is nowhere near as good as the TV shows would have you believe, can you imagine writing a program that actually needed to show the images on the screen in order to compare them? Of course not.

Displaying the images is a huge waste of computer time. In actual fact, a time-critical program would never display images until a match was found. Input and output are just too slow to do where they are not necessary. A good programmer might do it during debugging, with a small data set. That would make sense. But for a production program? Not so much.
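
To make that concrete, here is a minimal C# sketch (hypothetical data, nothing like real fingerprint matching) of the pattern a time-critical search would use: compare records at full speed and only touch the slow screen when a match turns up.

```csharp
// Sketch: search a set of records, writing to the screen only on a match,
// never once per comparison. The data here is made up for illustration.
using System;
using System.Linq;

class MatchSearch
{
    static void Main()
    {
        byte[] target = { 4, 8, 15, 16, 23, 42 };
        byte[][] database =
        {
            new byte[] { 1, 2, 3, 4, 5, 6 },
            new byte[] { 4, 8, 15, 16, 23, 42 },
            new byte[] { 9, 9, 9, 9, 9, 9 },
        };

        for (int i = 0; i < database.Length; i++)
        {
            // No display inside the loop; the comparisons run at full speed.
            if (database[i].SequenceEqual(target))
            {
                Console.WriteLine("Match found at record " + i); // the only output
                break;
            }
        }
    }
}
```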

I was experimenting with the Stopwatch class in a program recently. I was primarily interested in the speed of the calculations the program was doing, so at one point I commented out a line that printed results. One statement that called the ToString method and added an item to a list box was removed. Oh, and this line was only executed about one out of every 333 times the main loop ran. Commenting out that one output statement sped up the program by approximately 10%. Compared to all the addition, multiplication, and raising numbers to the fourth power that the program was doing, this display appears trivial. But any time the computer has to go to a screen, a keyboard, or any other human interface device, things are going to slow down.
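
Here is a rough sketch of that kind of experiment using the .NET Stopwatch class. The loop body and the one-in-333 output are stand-ins for whatever the real program computed; exact numbers will vary by machine, but the pass with output should be measurably slower.

```csharp
// Time the same computation twice: once pure, once with occasional output.
using System;
using System.Diagnostics;

class IoTimingDemo
{
    static void Main()
    {
        const int iterations = 1000000;

        // Pass 1: pure computation, no output inside the loop.
        Stopwatch sw = Stopwatch.StartNew();
        double total = 0;
        for (int i = 1; i <= iterations; i++)
        {
            total += Math.Pow(i % 100, 4); // stand-in for the real math
        }
        sw.Stop();
        Console.WriteLine("Compute only:        " + sw.ElapsedMilliseconds
                          + " ms (total " + total + ")");

        // Pass 2: same computation, printing one time in every 333 iterations.
        sw.Restart();
        total = 0;
        for (int i = 1; i <= iterations; i++)
        {
            total += Math.Pow(i % 100, 4);
            if (i % 333 == 0)
            {
                Console.WriteLine(total.ToString()); // the occasional output
            }
        }
        sw.Stop();
        Console.WriteLine("Compute plus output: " + sw.ElapsedMilliseconds + " ms");
    }
}
```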

In programs we assign students we often see a lot of extra I/O. Sometimes we ask for it because we want to see what the program is doing. Other times we really don't care because the program is going to complete in a second or less. I think, though, that we need to discuss this issue with students. It is important that they understand optimization at some level. We should discuss the speed of I/O devices, for example. Why is cache faster than main memory? Why is memory faster than disk? Why do we want to avoid extra I/O to screens or other devices? I don't think we have to get hung up on it, but it should be talked about.
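
For a classroom demonstration, a sketch like this one could make the memory-versus-disk gap visible: sum an array in memory, then round-trip the same bytes through a temporary file and compare the times. (The array size is an arbitrary choice, and the operating system's file cache will soften the difference, so treat the numbers as illustrative.)

```csharp
// Compare touching bytes in memory against round-tripping them through disk.
using System;
using System.Diagnostics;
using System.IO;

class HierarchyDemo
{
    static void Main()
    {
        byte[] data = new byte[10000000]; // about 10 MB of random bytes
        new Random(1).NextBytes(data);

        // Sum the bytes straight from memory.
        Stopwatch sw = Stopwatch.StartNew();
        long sum = 0;
        foreach (byte b in data) sum += b;
        sw.Stop();
        Console.WriteLine("Memory pass:     " + sw.ElapsedMilliseconds
                          + " ms (sum " + sum + ")");

        // Round-trip the same bytes through a temporary file on disk.
        string path = Path.GetTempFileName();
        sw.Restart();
        File.WriteAllBytes(path, data);
        byte[] back = File.ReadAllBytes(path);
        sw.Stop();
        Console.WriteLine("Disk round trip: " + sw.ElapsedMilliseconds
                          + " ms (read " + back.Length + " bytes)");
        File.Delete(path);
    }
}
```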

We talk about Big-O notation and using it to look at algorithm complexity. I think that is a good thing. A great thing, in fact. We can probably wait until college to get serious about instruction execution speed. That's very hardware dependent, of course. But would it hurt to tell students that a multiplication can be several times as time consuming as an addition? What's the fastest way to compute 10 * 2? Is it 10 + 10 or 10 * 2, or is there an even faster way? (Shift left, anyone? :-) ) The problem, of course, is that so much of this discussion will not seem real to students until later in life. Only then are they likely to see a real need for what we used to call bit fiddling. We tend to avoid the hardware in computer classes. It's all about the software and concepts in the abstract. I think that we do students a disservice by completely ignoring the hardware. Especially in high school, I believe that teaching something about the hardware might inspire some students to look into computer engineering. Maybe one of them will work on the big problems of speeding information into and out of computers. Or improve the state of micro hardware and microcode. We do need the hardware to run the software, after all.
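
For what it's worth, here is what those three options look like in code. Modern compilers generally pick the fast form for you, so this is a teaching illustration rather than an optimization anyone should hand-write today.

```csharp
// Three ways to double a value.
using System;

class DoubleIt
{
    static void Main()
    {
        int n = 10;
        int byAdd   = n + n;   // one addition
        int byMul   = n * 2;   // one multiplication
        int byShift = n << 1;  // shift left one bit = multiply by two
        Console.WriteLine(byAdd + " " + byMul + " " + byShift); // 20 20 20
    }
}
```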

BTW, you've probably also seen someone delete a file on a computer on TV and watched as either a progress bar is displayed or, my favorite, an image gradually gets pixelated and falls apart on the screen. I always wonder what operating system does that. The reason they do this on TV is that it makes for a good visual, which is about all that matters on TV.

Sometimes I think it would be fun to be a programmer whose job it was to write software that fakes (or should I be more polite and say simulates) some computer activity on the screen for movies and TV. I wonder if there is a student project in that? Put up an image and then randomly blank pixels until they are all black, then close the image. Could be fun.
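
Just for fun, here is one way such an effect might be sketched in C# with System.Drawing. The input file name is made up, and a real effect would redraw the image on screen frame by frame rather than saving a final file; this just shows the randomized blanking idea.

```csharp
// Playful sketch of the "Hollywood delete" effect: blank every pixel of a
// bitmap in random order. Requires System.Drawing (Windows, or the
// System.Drawing.Common package). The file name is a placeholder.
using System;
using System.Collections.Generic;
using System.Drawing;

class HollywoodDelete
{
    static void Main()
    {
        using (var image = new Bitmap("photo.jpg")) // hypothetical input file
        {
            // Collect every pixel coordinate.
            var points = new List<Point>();
            for (int x = 0; x < image.Width; x++)
                for (int y = 0; y < image.Height; y++)
                    points.Add(new Point(x, y));

            // Fisher-Yates shuffle so the pixels disappear in random order.
            var rand = new Random();
            for (int i = points.Count - 1; i > 0; i--)
            {
                int j = rand.Next(i + 1);
                Point tmp = points[i]; points[i] = points[j]; points[j] = tmp;
            }

            // Blank the pixels; an on-screen version would repaint in batches.
            foreach (Point p in points)
                image.SetPixel(p.X, p.Y, Color.Black);

            image.Save("deleted.png");
        }
    }
}
```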

Comments

  • Anonymous
    December 05, 2008
    Why don't you take inspiration from the demoscene, where every machine cycle MATTERS? The examples could be REALLY short, but could give instant gratification. my 2c ;)
