
Retro Computing (on Steroids)

Colour Maximite 2 - 3D BASIC Engine

Peter G. Shaw28/04/2023 15:59:09
1531 forum posts
44 photos

I’ve just read through this thread, and very interesting it was too. But what seems to be missing is the “user experience”. It’s all very well going on about the differences between C, C+ & C++, but what about the poor old end user who has to put up with the cockups made by so-called professional programmers (and their advisors!!!). So what about the following?

Let’s start with Dijkstra’s famous dictum: "It is practically impossible to teach good programming to students that have had a prior exposure to BASIC: as potential programmers they are mentally mutilated beyond hope of regeneration." Who the heck is he to make such a wide-ranging, nasty comment about BASIC programmers? To me it smacks of one-upmanship, or perhaps pure arrogance, suggesting that I, with a smattering of BASIC experience, can never become a programmer to his standards. Ok, then what about these examples of so-called professional programming!

Example 1. I was completing an application form – it doesn’t matter what for, but there was a drop-down list of occupations. So, ok, I chose retired. Two questions further on, there was a question “Employer’s Business”. Eh! What? I’m retired, hence I don’t have an employer. How difficult is it to write the programme such that, if my occupation was shown as “retired”, the question about the employer’s business would be skipped?

Example 2. A different company – not too sure this one wasn’t a jumped-up Building Society – but again a drop-down list of occupations, so I immediately went down about 2/3 and started looking for “Retired”, only to discover that there was no order in the list, none whatsoever. Again, how difficult is it to put the list into alphabetical order?
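
For illustration only – a minimal Python sketch (the field names are invented; a real form would be written in whatever the site actually uses) showing that neither fix needs more than a couple of lines:

    occupations = ["Teacher", "Retired", "Engineer", "Nurse"]

    # Example 2: present the drop-down list in alphabetical order.
    for occupation in sorted(occupations):
        print(occupation)

    # Example 1: only ask about the employer when the applicant is not retired.
    occupation = input("Occupation: ")
    if occupation.strip().lower() == "retired":
        employer_business = None              # question skipped entirely
    else:
        employer_business = input("Employer's business: ")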

Example 3. My doctor’s this time. In order to request repeat prescriptions via the internet there is an online system. It requires certain information, including a mobile ‘phone number. Now, I do have a mobile ‘phone, but it’s for MY purposes, not anyone else’s. E.g. I have had a heart attack, so I always take the ‘phone with me when I’m out, just in case I need to call for assistance. But, and it’s a big but, I absolutely do not want all the world being able to contact me when I’m away from home – I have a landline for that. So I did not enter the mobile ‘phone number. All was well, and I was able to request repeat prescriptions ok. But then they decided to update/improve security. Fair enough, but I couldn’t comply. Why? Because I had not, and now could not, enter a mobile ‘phone number. And so I had to use the old method of telephoning and leaving a message on the answering machine. Which, of course, was quicker, easier, faster – you name it, the ‘phone beat t’internet hands down.

Example 4. My dentist this time. A nice easy method of inputting personal & medical data via t’internet. But one question was “Do I require antibiotic insurance?” Or something similar. Eh! What the h*** is that? Ok, ignore it and carry on. Then insert my DOB, something like 01 Jan 2001. (Ok, not really but it’ll do.) So that was what I entered. Failed – I had to enter Jan 01, 2001. And then it dawned on me – blasted American software! At which point I went to bed. The following day I carried on, but I couldn’t – the system locked me out, so I had to ‘phone the dental surgery to get it re-opened. Fortunately all my previously entered data was still there, but if it had not….?
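
For what it’s worth, accepting both date orders is only a few lines of work; a minimal Python sketch, assuming the form takes the date as free text (the function name parse_dob is invented for illustration):

    from datetime import datetime

    def parse_dob(text):
        # Try the British day-first form (01 Jan 2001) before the American
        # month-first form (Jan 01, 2001).
        for fmt in ("%d %b %Y", "%b %d, %Y"):
            try:
                return datetime.strptime(text, fmt)
            except ValueError:
                pass
        raise ValueError("Unrecognised date: " + text)

    print(parse_dob("01 Jan 2001"))    # 2001-01-01 00:00:00
    print(parse_dob("Jan 01, 2001"))   # the same date, either way round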

As a consequence, programmers and their advisors, huh, I’ve absolutely no faith in them.

Now ok, I accept that there are cost implications, but why, oh why do we have to use software that does, or doesn’t do, what it is supposed to do? How difficult is it to retain at the back of your mind that this is a list, and should be in alphabetical order? Does everyone have a mobile ‘phone? (Answer here is no.) Surely American software should be checked for UK usage?

So, discuss the merits of C, C+, C++ to your heart’s content, but please, please, consider the end user!

Oh, incidentally, my programming experience is limited to the SC/MP (INS 8060), which taught me about machine code & buses, Z80 machine code, Sinclair Basic for the ZX80, ZX81 & Spectrum, Basic 80 (which we called Mbasic) and Tiny Basic, from which I managed to rewrite a Startrek programme for the ZX81 and then the Spectrum. So not really very much. And frankly, at my age, nearly 80, I really can’t be bothered anymore.

Cheers,

Peter G. Shaw

Frances IoM28/04/2023 16:11:24
1395 forum posts
30 photos
There used to be a question often put to programmers who made too many assumptions about the users forced to use their software: 'Have you tried to eat your own dog food?' (bowdlerised to suit today's woke snowflakes).
SillyOldDuffer28/04/2023 19:21:49
10668 forum posts
2415 photos
Posted by Frances IoM on 28/04/2023 16:11:24:
There used to be a question often put to programmers who made too many assumptions about the users forced to use their software: 'Have you tried to eat your own dog food?' (bowdlerised to suit today's woke snowflakes).

Peter's probably aware the contempt is mutual:

  • The only good user is a dead user
  • Code 18 - the problem is 18" in front of the screen
  • User is spelt luser (with a silent L when one is in the room).
  • Wetware
  • Picnic - Problem in Chair Not in Computer

As for programmers getting the blame for poorly designed systems, they're rarely responsible. They implement whatever is asked for. Or the business buys a package and configures it themselves.

Users are found to have computers full of porn, and - apparently - have no idea who loaded it and the malware. Need I go on...

Dave

PS.

Dijkstra was right about BASIC, COBOL, FORTRAN and most other early computer languages. They were all badly flawed and he deserves full credit for pointing out what was wrong. In consequence, modern computer languages are much better thought out. Modern BASIC is considerably different from the original. There is of course no connection between Dijkstra's insight into computer language design issues and the date format used by Peter's dentist!

Fulmen28/04/2023 19:36:19
120 forum posts
11 photos

I had a friend who programmed in straight hex. I'm not totally convinced he was human.

Peter G. Shaw28/04/2023 20:31:48
1531 forum posts
44 photos

Dave,

Unfortunately, my admittedly limited knowledge of programming etc does not insulate me from some of the things I have experienced. And some of that experience involved a specific sysop who was known for trying to prevent people who knew what they were doing, from doing it. This was when I was in fulltime employment as a telephone engineering manager.

In the instance I am thinking of, our group was given one, yes a single, access circuit to a small mainframe computer. (Actually, I don't know what the machine should be called, but hey ho...) Now, in our group there were six of us who needed access to this machine at various times. We were all 'phone engineers and we found out that this was a 2-wire circuit, so the obvious answer, to us, was to use a 2-pole, 6-way switch and wire all our individual machines to this switch, and hence to the mainframe, but we weren't sure. Needless to say, the sysop was not really forthcoming, so I went to see him, told him to shut up, listen, and answer our/my questions (yes, I pulled rank): e.g. we understand it is a 2-wire circuit – yes or no? Is it possible to switch it between six PCs using a switch box? It turned out that there was no problem at all with switching the circuit, but he, the sysop, would accept no responsibility for lost data due to incorrect switching. Fair enough, but why couldn't he tell us that straight away? Needless to say, we wired it up ourselves and it was dead simple: "Anyone using the mainframe?" If no, then "Peter (because it was nearest to me), can you switch it please?" "Right – done". We never had a problem.

Another problem was that I came across a colleague who was struggling with three databases - he had to shut down his computer to change to another database. Apparently he was told by his support group that it was not possible to merge the three into one! I took a copy of one of the files, and experimented, and discovered that there were two bytes which held the number of records in hex, in the usual method. (I must admit I've forgotten which way round it was, Hi-Lo or Lo-Hi). I then took a copy of another of the databases and managed to successfully merge them. As a result, my colleague and I agreed a day for the merging and in the morning, I copied all three files, merged them, and my colleague then successfully ran the new file. Of course, the original files were kept just in case.
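
For the curious, the "which way round" question is what programmers call byte order, or endianness. A minimal Python sketch of the idea – the two-byte value here is invented for illustration and is not taken from the actual file format Peter worked on:

    import struct

    # Suppose the first two bytes of the file hold the record count: 0x01 0x2C.
    header = bytes([0x01, 0x2C])

    print("Read Hi-Lo (big-endian):   ", struct.unpack(">H", header)[0])   # 300
    print("Read Lo-Hi (little-endian):", struct.unpack("<H", header)[0])   # 11265

    # Merging two such files would then mean appending the second file's records
    # and rewriting the count, e.g. struct.pack("<H", count_a + count_b).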

In another instance, we had to use the electronic equivalent of punched cards, each one requiring a specific header appropriate to the exchange concerned. This was on a mainframe elsewhere in the country. I discovered that there was a batch processing system available and managed to write a batch program to create a file and insert the appropriate 1st line data. Unfortunately, I made the grave mistake of leaving my name within the code, and consequently some years later was asked about it. Having forgotten all about it, I denied all knowledge until shown the evidence.

I do understand why some of the comments above have arisen, indeed at one time I was tasked with installing one of the early Windows versions on all of our computers. I was surprised to discover how many were set up as if in America! I also know of a clerical assistant who adamantly refused to set her VDU at the recognised HS&E approved height.

I won't go any further, but as I said, I do understand how some of these comments have arisen, especially having seen the apparent lack of ability displayed by some users. It still doesn't excuse the poor programming I discussed above.

Dijkstra may well have been right, but what little I read suggested he was a high-handed twerp who thought that he, and he alone, knew what was going on; to come out with that statement simply shows a deep disdain for other people and a total lack of understanding of them.

Cheers,

Peter G. Shaw

IanT28/04/2023 22:33:46
2147 forum posts
222 photos

Well, I started this thread about the very powerful CMM2 (MMBasic-based) 'Retro' computer, but it seems we have had a fair amount of topic drift (once again) into the realms of what programming language is best etc. etc. – so I'll throw another two pennies on the fire (again).

The very fast chip that powers the CMM2 is still not currently available, so my industrious fellow 'Mite' enthusiasts have been busy focusing on the RPi Pico chip. As I've mentioned before, you can build a very powerful (and self-contained) EMBEDDED computer for just a few pounds with a Pico and MMBasic. Note the 'Embedded' – I'm not trying (and never will try) to build a large database server or similar, and certainly not an AI! Just simple little programmes that can turn things on and off, record data and run motors or servos etc. Very simple but useful things....

For example, recently I've been somewhat distracted with a Gauge '3' version of an L&YR Battery Electric Locomotive that you can 3D print for very little money. In my version (for eldest Grandson) it has a PicoMite controlling the two DC motors (via a Kitronics Pico Controller) with an InfraRed (TV) controller, costing in total about £15. It's only about 12-15 lines of actual code plus some commentary (I'm also looking at a version using an HC-12 link for more range).

PicoMite Controller

So I'm not at all interested in building huge apps, just my simple little programmes where the PicoMite has far more power than I need in practice, is very quick & simple to programme & debug, and costs just a few pounds in hardware. I don't want to learn new IDEs or languages (e.g. Python). I have everything I need in one place, with no external 'libraries' to worry about either!

As for "Retro" (which is where this thread started) I've also now got the parts to build my PicoMite "VGAs", which use the 2nd ARM CPU (& one PIO) to generate the VGA signal, whilst leaving the first CPU free to run MMBasic at full pelt. Again, eldest Grandson is going to be the lucky recipient of one and Grandad is going to use another as a development system (for add-on 'STEM' goodies for him) as well as a Logic Analyser (for me!) when required. The other PCBs will most likely be configured as just 'simple' Picomites (e.g. no VGA) systems. The PCBs shown have two Pico pin compatible 'slots' as well as SD Card and RTC facilities (so are still very useful even without the VGA parts installed). The full PicoMite VGAs are costing about £20 each to build btw, less if no VGA parts are required. I'm also looking at using the latest (PicoW-based) version of the PicoMite (PicoWeb) to set up a monitoring system which can send me an email (when triggered) but I've got some reading to do first.

So a 'Retro' VGA computer if you want one (or alternatively a large control panel display?) as well as a simple and inexpensive embedded controller for anything else you might need. I've tried Arduino but much prefer the Mite for ease of programming and (as importantly) debugging. If you need a simple 'controller' then I'd suggest that the combination of the RPi Pico + MMBasic is very hard to beat.

PicoMite VGA Boards

Got a 'Control' problem? Download the 140+ page PicoMite manual and just take a look. Comms, I/O, File System, Graphics, Sound & Editor. There's just a huge amount of functionality packed into the PicoMite package, all for free plus £3-4 for a Pico to run it on!

You can find more details of the basic PicoMite here: PicoMite and the PicoMiteVGA here: PicoMite VGA

Regards,

IanT

Rooossone19/05/2023 10:14:17
95 forum posts
50 photos

I am not sure how much use this would be, but it is still interesting for the subject of retro computing. There is a guy called Ben Eater who walks you through making a retro 6502 breadboard computer.

I'll let you all discover his content without me ruining any description of it...

Ben Eater

He also has a lot of YouTube content that goes through the implementation and theory of building a very basic RISC-style CPU.

David Taylor08/07/2023 12:38:09
144 forum posts
39 photos

An interesting thread!

I wouldn't take offence at Dijkstra. He knew what he was about, but loved trolling.

The chip powering the CMM2 is quite a beast. Putting a quick booting BASIC on there might give some flavour of the retro experience but given it's a system-on-a-chip I feel the fun of learning and being able to understand the computer at a hardware level, like you could an 80s micro, will be missing.

Unfortunately I don't think you can get that experience now – even 8-bit retro kits don't have the cool sound and video ICs we used to have in our 6502/Z80/68000 micros, and stop at a serial terminal or perhaps an FPGA VGA generator.

FWIW, I hate Python's significant indentation idea. I also take exception to the idea it's a great beginner's language. It's the most complicated language I know. I like it, and it's powerful, but it's *not* simple. It also seems to still have numerous ways to shoot yourself in the foot left over from when it was one guy's plaything. I've been using it for about 2 years full time and feel I've barely scratched the surface of what it offers.

I'm not sure I agree C is a good language for large code bases, despite the fact there are many large C code bases. Its preprocessor and simplistic include file system would not be tolerated in any modern language - most of which have tried to learn from the pain C inflicts in this regard!

Nigel Graham 208/07/2023 18:30:30
3293 forum posts
112 photos

Above my grade to use, but I am interested to see these little computers and their applications; and impressed by their users' skill with building and programming the systems.

.

When I first joined the firm that employed me till I retired, MS still used DOS, and our laboratory programmes were written on site, by the scientists themselves, not some IT department. They used Hewlett-Packard BASIC to drive various HP and Solartron electronic measuring instruments, printers and plotters.

In my lab, until they started to make the equipment computer-driven, each standard test involved lots and lots and lots of individual readings (about 6 values per iteration for some tens of iterations), writing them on a pro-forma sheet, then typing the numbers into a BBC 'Acorn' with a locally-written BASIC calculator that printed the results on a dot-matrix printer.

With one exception these programmes were all quite easy to use and worked very well - perhaps because they were written by their main users not a separate department, let alone some remote "head-office" wallah or contractor.

The exception was down to its writer neglecting to provide an entry line for one, albeit important, instrument-setting variable (e.g. test volts or frequency), so the user had to edit the internal line instead. That is poor programming, not a criticism of BASIC (or any language).

.

When I bought my first PC, an Amstrad PCW9512, I started teaching myself its own form of BASIC, using a text-book.

I don't think there was anything intrinsically wrong with BASIC, apart perhaps from being character-heavy in order to be readable, but its bad reputation stems, as I was told at the time, from it being too easy to fill with thickets of over-nested loops and the like. That is a programming, not programme, problem though; and I gather you can write much better-constructed programmes than that in it.

.

That Amstrad's auxiliaries included the compiler for a horror called "DR [Digital Research] LOGO". It was claimed to be designed to help school-children learn something called "Structured Programming" – as clumsy and tautological as calling a house a "Structured Building". In fact it was so telegrammatic and abbreviated that it was not at all intuitive or easy to read; and despite using a printed manual containing examples such as list-sorters, I failed to make any headway with it.

Around the same time a friend gave me a Sinclair ZX plus a box of its enthusiasts' magazines, but that was beyond me! Its version of BASIC was hard enough, but to get the best out of it you also needed to understand Assembly-language programming.

.

Not sure if it will run on WIN11 but I do have an old compiler for POV-Ray, which uses command-lines to draw pretty pictures. It is not a CAD application but an artistic one, still fun though!

'

I have seen C++ code and managed to see roughly what some of it does, though I have not tried to learn it. Once I saw a screenful one of the scientists at work was writing. Among it was a mathematical line followed by a comment:

" ! This is the clever bit. "

Further down, more hard sums were introduced by,

" ! This bit's even cleverer! "

SillyOldDuffer08/07/2023 19:33:04
10668 forum posts
2415 photos
Posted by David Taylor on 08/07/2023 12:38:09:...

FWIW, I hate Python's significant indentation idea. I also take exception to the idea it's a great beginner's language. It's the most complicated language I know. I like it, and it's powerful, but it's *not* simple. It also seems to still have numerous ways to shoot yourself in the foot left over from when it was one guy's plaything. I've been using it for about 2 years full time and feel I've barely scratched the surface of what it offers.

I'm not sure I agree C is a good language for large code bases, despite the fact there are many large C code bases. Its preprocessor and simplistic include file system would not be tolerated in any modern language - most of which have tried to learn from the pain C inflicts in this regard!

Oh, go on then, I'll take the bait! Python may not be simple in the 'Up Jumped Baby Bunny' sense, but I can't think of a real language that's easier to learn, especially good when taken with a small dose of Computer Science. What languages are easier to learn than Python and why? What are these other languages good and bad at?

Hating Python's indentation rule suggests a misunderstanding. All computer programs are more readable when their structure is made clear, and indenting is good for readability. Highlighting the structure is vital as soon as someone else has to read my horrible code, and the concept is so important that Python enforces it. Learning with Python avoids a common bad habit, which is writing compressed code that only the author comprehends – and even he can't decode it when he comes back to the mess after a long gap.

How do you 'shoot yourself in the foot' with Python? My toes are intact!

It's straightforward for individuals to write short simple programs in most languages, but doing that is a very poor test. The real trouble starts when programs get so big that teams have to manage hundreds of thousands, or maybe millions, of lines of code. C was designed from the outset to play in that space, which is why it has a complex pre-processor and linkage system. The C environment allows teams to work on complex debug, test, and live versions, and it can also target multiple platforms from the same code base – Windows, Linux, Mac, Android and others. C is also very good for microcontrollers and other tiny computers.

It may not be the best of all possible languages for all time, but so far C has proved a hard act to follow, and in many ways C++ is C on steroids. The two are close relatives.

Tracking the rise and fall of computer languages over time is 'quite interesting'. Hugely popular big hitters like perl, Ruby, PHP, Scala, Rust, Objective-C, Visual Basic, COBOL, Pascal, FORTRAN and BASIC have peaked and waned. A few bombed! Despite many likeable features and having the full support of the US DoD behind it, ADA failed to get traction.

If you want a programming job in 2023, learn C#, Javascript, Java, C/C++, Python, and SQL. C/C++ and Python are predicted to be in hot demand next year, but who knows. Whatever their warts, C and C++ surely deserve the endurance prize - C was much in demand back when COBOL and FORTRAN dominated the industry.

Dave

Dave S08/07/2023 20:02:20
433 forum posts
95 photos
Posted by IanT on 28/04/2023

For example, recently I've been somewhat distracted with a Gauge '3' version of an L&YR Battery Electric Locomotive that you can 3D print for very little money. In my version (for eldest Grandson) it has a PicoMite controlling the two DC motors (via a Kitronics Pico Controller) with an InfraRed (TV) controller, costing in total about £15. It's only about 12-15 lines of actual code plus some commentary (I'm also looking at a version using an HC-12 link for more range).

PicoMite Controller

So I'm not at all interested in building huge apps, just my simple little programmes where the PicoMite has far more power than I need in practice, is very quick & simple to programme & debug, and costs just a few pounds in hardware. I don't want to learn new IDEs or languages (e.g. Python). I have everything I need in one place, with no external 'libraries' to worry about either!

It’s nice to see one of my PCBs in the wild.

Dave

IanT08/07/2023 21:47:43
2147 forum posts
222 photos

Well then Dave, I guess I am a happy customer of yours!

It's a very neat way to drive one or two small DC motors, so thank you.

Regards,

IanT

Bazyle08/07/2023 22:06:58
6956 forum posts
229 photos

I was talking to an American University professor who mentioned that he and his department still used Fortran programming because it was simple and did the job. They used computers to do calculations not generate pretty pictures and fancy human-machine interfaces.

If Python indentation is an irritation, just use Perl. Although not admitted, Python is just a mix of Basic and Perl.
Basic has always been good for beginners and children because it is interpreted, not compiled, making for a quick 'change and retest' environment, which is what beginners need and children like. 99% of the reason Python has taken off is that it is interpreted, so has instant availability.

Weird programmer fact: The female Chinese hotel receptionist in the background in "Die Another Day" when Bond enters soaking wet was our Perl programmer who did some work as an Extra at Elstree Studios.

Edited By Bazyle on 08/07/2023 22:21:43

David Taylor09/07/2023 01:04:37
144 forum posts
39 photos

I'll start by emphasising that I am a fan of Python, despite it having a lot of features I don't like, and I think it's an excellent language for modern programming idioms. I've been programming full time for money since I was 16, 36 years ago. I've written programs in C for Windows 3, dev tools in VB6 (!), COBOL on various minicomputers, lots of C/C++ on *NIX systems, a lot of Java, a lot of server-side stuff, distributed stuff, remote sensing, and embedded systems. I am hopeless at web programming and hate it with a passion. I'm nowhere near as good as I used to think I was, but I don't think I'm the worst programmer around. My first language was BASIC on the C64 and I don't think it hurt me.

I understand structured code, but forcing structure at that level just seems petty and not something a language should be concerned with. If the code 'in the large' is badly structured, some local indenting rules are not going to help. It can be confusing for IDEs because if you're pasting code at the end of an indented block the IDE can't tell if the code should be part of the indented block or part of the next piece of code at the next level out. It also means you can have code in a block that you didn't mean to be there. Delimiters at the start and end of code blocks avoid these problems. It's the same reason I *always* use curly braces in C/C++ code, even for single line blocks - then you don't end up with problems of code being in the wrong block.
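
A minimal sketch of the hazard described here – whether the last line sits inside or outside the loop is decided purely by its indentation, so a careless paste can silently change the program (this is my own toy example, not David's code):

    items = [1, 2, 3]
    total = 0
    for n in items:
        total += n
        print("running total:", total)   # indented: runs on every pass of the loop

    print("final total:", total)         # dedented one level: runs once, after the loop
    # Paste that last print one level too deep and it quietly joins the loop;
    # in a brace-delimited language the block membership is spelt out by the braces.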

Some of the problems with Python:

  • You don't need a 'main', but if you don't have one you might get unexpected results, so they should have just required it. It's nice that a beginner can write a one-line program to print something on the console, but if that causes grief in the majority of cases then it's a dumb idea. E.g., when importing a module, any code outside of a function in that file is executed. I guess you can argue this is good for initialisation code, but it just seems wrong to me – it's like a side-effect.
  • Default argument values that are lists or maps are initialised *once*, when the function definition is first executed, rather than taking that value every time the function is called (see the sketch after this list). That's insane and completely unexpected behaviour IMO.
  • No real threading, and now, worse, the modern curse of 'async' code. As with JavaScript this only came about because the original implementation was as a toy language/quick hack with no thought of support for threads. And now programmers are starting to think this is normal and it's infecting languages that *do* have real threads. If you want a tight select() loop for I/O like in C, then just give support for that.
  • Not requiring types for function arguments and returns. Argument names do *not* sufficiently describe their types, and the language is perfectly happy for you to give *no* indication of what a function returns, if anything. There is *optional* typing but it should be a thing you opt out of, not in to. Note that annotating types for functions the way it is done in Python does not force those types, it just tells the caller what to expect and what will work.
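
The default-argument surprise mentioned above is easiest to see running; a minimal sketch (nothing here is David's own code):

    def append_item(item, bucket=[]):     # the [] is created once, when the def runs
        bucket.append(item)
        return bucket

    print(append_item("a"))   # ['a']
    print(append_item("b"))   # ['a', 'b'] - the same list again, not a fresh one

    # The usual work-around: default to None and build the list on each call.
    def append_item_fixed(item, bucket=None):
        if bucket is None:
            bucket = []
        bucket.append(item)
        return bucket

    print(append_item_fixed("a"))   # ['a']
    print(append_item_fixed("b"))   # ['b']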

However, Python's strengths outweigh all of that. It has some awesome and powerful features these days, which as I said, I'm only just getting to grips with.

I completely disagree that Python is a child of BASIC and perl - I don't see any inherited traits from either! And no, I will not 'just use perl'. I have never willingly written a perl program, and only debugged them when there was no other choice

Finally, I hate C++! I think it started badly and only ever gets worse. Java was a much better effort at a better C, except it's not much good for low level coding.

So people can more easily tell me I'm wrong, my favourite languages are C, Python, and Java. I think C# has a lot going for it too and would probably choose it over Java these days if I knew it better. Swift looks pretty good too.

I would replace C in a second with a better low level language. Rust, zig, and mojo are all languages I'd be happy to take on if I was paid to learn and use them.

C was not designed for large code bases; it was designed more like a very fancy macro-assembler. It has innumerable faults that have cost untold amounts of money. Its linkage system is painful and you get two choices – the symbol is global (even if not accessible, it still causes name clashes) or it's private to that compilation unit. The preprocessor is not a feature, it's a bug, and is increasingly seen as such. C is actually a pretty nasty language where you need to be exceedingly careful not to stuff up, but sometimes it's the only thing that will do the job. Luckily, nearly 50 years of C has exposed its faults and allows newer languages to try to avoid them. Rust is catching up quickly, even for embedded programming. I don't know what caused the great C explosion in the 80s, or why something more suited to high-level programming didn't take off instead. But for code heads like me, C still has a weird attraction.

For the recreational coders out there, the annual Advent of Code challenge is a great way to get into a new language. We used it at work to all get into something different and as a good morning conversation topic.

Robert Atkinson 209/07/2023 08:08:41
1891 forum posts
37 photos

Why do people keep saying BASIC is interpreted? While many BASIC implementations do use interpreters, compilers are available and some versions are solely compiled. And if interpreted languages are so bad, why is Python so good? Python needs huge resources. OK, silicon is cheap now, but more still costs more and may be less reliable. Even MicroPython needs a processor with 256 kB of program memory and 16 kB of RAM. I've written functional code used in professional commercial products in BASIC (Pic Basic Pro) that ran in 256 words (12-bit) of program space and 25 bytes of RAM. Yes, that is 409 bytes of total memory (256 × 12 bits = 384 bytes of program store, plus the 25 bytes of RAM).

Robert

Ady109/07/2023 09:37:48
6137 forum posts
893 photos

Most of it took ages to learn and forever to program

and if it was easy to learn it ran too slow

It was fun but it was very rarely done properly by "professionals"

The NHS got humped for billions by the computer guys who promised the moon and delivered a horse and cart

At times there seemed to be more lines of legal get-out-of-jail-free code than computer code when you looked at a user's EULA

The computer industry was the first modern industrial supplier that could deliver a product which didn't actually work and not have to give the money back

It was fun though!

An Other09/07/2023 10:43:34
327 forum posts
1 photos

Just for information:

LINK

Interesting reading the diversion into programming. My first attempts at programming began in around 1965, using Elliott Autocode on Elliott 803 computers – hand-punching paper tape – and moved later onto stuff like Fortran and Cobol, at the same time getting into more exotic machine languages as a result of involvement in research into fully steerable satellite tracking systems.

Working both as a 'programmer' and an 'engineer' over the years (whatever they are), it always struck me that programming languages can never provide everything to everybody. I found I was using BASIC (in various forms) to produce quick and dirty 'programs' or solutions to immediate engineering problems, yet at times getting involved with in-depth programming to produce 'operational' software, and in my opinion they differed tremendously.

I didn't care how it was done in BASIC, and nor did I want to mess about with compiling/linking etc -- the good thing was to be able to write/edit something quickly and see it working - our focus was on the equipment, not the software - it was just a tool. Yet BASIC for all the reasons listed in this post, was hopeless for 'serious' software production.

I suppose we now recognise the shortcomings of all software, but we should remember that the languages we now use were written at least some years ago – times, equipment and usage change. In 1965 we had no idea of the scourge modern hacking would grow into, forcing software programmers (in theory) to use more secure languages (Rust et al.). C and its variants were considered perfectly safe for use some years ago – security was left to the programmer's competence, if it was considered at all – but not any more.

Introduction of more efficient and secure languages has brought its own problems. Python was/is often touted as "the" language to use, yet that too has brought along its own problems. As someone already said on this thread, it is horrendous to learn - I believe I read recently there is even a move now to simplify Python, and reduce the use of the multiplicity of tools associated with it that do the same job.

I think software will always carry these problems - and different people will always use what they are familiar with, and like to use.

SillyOldDuffer09/07/2023 13:38:13
10668 forum posts
2415 photos
Posted by Robert Atkinson 2 on 09/07/2023 08:08:41:

Why do people keep saying BASIC is interpreted? ...

And if interpreted languages are so bad why is Python so good?

...

Robert

People usually think BASIC is interpreted because that's what they learned. It's very common, and it was unusual for amateurs to invest in a compiler. If they'd done so, they'd have found compiled BASIC isn't as cuddly as interpreted!

But, the assumption misses an important point, which is computer languages and their implementations aren't the same. Interpreted, compiled, and combinations are all possible, with pros & cons.

Python, like Java and many others, compiles to an intermediate byte code, which is then interpreted (see the sketch after the list below). This approach combines many of the advantages of both implementations:

  • The compile phase error-checks and optimises the code – loop unrolling, moving stuff that doesn't change outside the loop, minimising branching, reordering, factoring out duplicate code, removing dead code, keeping busy variables in registers, and many other tricks.
  • The interpreter provides efficient base functionality and memory management. It's fed clean fast byte code, without the complexities implicit to machine code.
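
You can see the intermediate form for yourself with Python's standard-library dis module; a minimal sketch (the example function is invented):

    import dis

    def add_vat(price, rate=0.2):
        return price * (1 + rate)

    # Prints the byte code the CPython interpreter actually executes -
    # instructions such as LOAD_FAST, BINARY_OP and RETURN_VALUE
    # (the exact opcode names vary between Python versions).
    dis.dis(add_vat)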

Although the approach works well, it means Python is a hefty beast, too big in full form to fit on smaller microcontrollers, and so not a good choice for embedded computing. However, there are Python compilers that do generate machine code, the main problem being that the implementations aren't identical.

As with all tools, the best computer language is determined by what it's for. Most languages have glass ceilings that stop the programmer doing what he needs. In my experience:

  • The original interpreted BASIC has a low glass ceiling. Fine and dandy up to a point, then a shattering stop, maybe forcing the whole program to be rewritten in something else. Compiled BASIC removes some obstacles, but not all, and converting people and code causes expensive delay. Worse, having to switch to a compiler suggests a need to think again: having failed to choose the right tool once, it would be foolish to do it again! Not the sort of language problem the average internet Joe can advise on. If the code is focussed on a system-like problem, then C/C++ is a good choice. But if the code is focussed on efficient number crunching, then maybe the answer is FORTRAN. Back in the day, even horrible old COBOL was massively better at data processing than BASIC. Today there are many alternatives. If it were a road car, original interpreted BASIC would be a Citroen 2CV.
  • C/C++ probably doesn't have a glass ceiling! It's a high-performance system language, close to the machine, used to develop other software tools. Chances are BASIC is written in C, as is much of Python, and many other languages. The downside is a lot to learn, and much of the work is a slow, low-level plod. The programmer is responsible for almost everything. If it were a road vehicle, C would be an MV Agusta Brutale 1000 Nürburgring with bald tyres.
  • Python has a high glass ceiling, and is extensible. Although it can do system work, the main focus is productivity. For example, classic BASIC only had two data-structures, strings and arrays, which aren't enough for advanced work. Say a program is needed to count how many times each word occurs in a book. An array can be used to keep tally, but arrays are fixed size and we don't know how many unique words are in a book. A different data-structure is needed, one that can grow, and BASIC doesn't have one. Python provides sets, dictionaries, deques, counters, maps, lists and tuples, and their availability makes Python highly productive (see the sketch after this list). If Python were a car it would be something like a high-end SUV. Comfy cabin in the front, big carrying capacity, reasonable on- and off-road performance, winch on the back, and the driver is an ordinary chap with a plain licence.
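
A minimal sketch of that word-count example, using the Counter from Python's standard library – it grows to fit however many distinct words turn up, which is exactly what a fixed-size BASIC array could not do:

    from collections import Counter

    text = "the cat sat on the mat and the cat slept"
    counts = Counter(text.split())        # one tally per distinct word, added as needed

    for word, n in counts.most_common(3):
        print(word, n)                    # the 3, cat 2, then one of the single-use words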

In the distant past I wrote BASIC for money and it caused many problems. Later, perl was wonderful for several years, but it couldn't keep up with Python. I also wrote C/C++. Now retired, I find Python and C/C++ complement each other delightfully. Python is good for rapid development of big complex general purpose programs that don't need space/time optimisation. C/C++ is excellent when space/time and performance are vital, for operating system level work, and embedded code (Arduino and friends).

Anyone remember Filetab? This is/was a decision table language used for report writing. Back in the 70's it was brilliant, far better than COBOL, apart from its glass ceilings! Really hot at reading files, but not for writing them - this limited what it was useful for. Not good at maths. A more serious shortcoming appeared as the code grew in size. Up to about 2 pages, decision tables sparkle, but after that people start having trouble following the logic. Above a certain level of complexity it was easier to write COBOL, even though COBOL is clunky.

My advice, look for language limitations! Likes are secondary.

Dave

IanT09/07/2023 13:40:17
2147 forum posts
222 photos

ANO – I'm not sure linking two Unos to provide VGA o/p is the best way to do things. More of a "look what I can do" techie thing. The PicomiteVGA is a far more elegant solution to this requirement, with excellent integration of graphics into the overall system. I haven't looked too much further (into Tiny Basic), but the simple 'Hello World' example uses line numbers and a GOTO statement – so I guess it is certainly "Retro" from that point of view.

Nigel (and your Amstrad memories) – I suspect many still remember Basic as being like that – with line numbers, PEEKs & POKEs and nested GOTO loops – so-called Spaghetti Code. If you really need them, they are still available in MMB, but their use is really not recommended. I have access to named Subroutines and Functions, CASE statements and simple access to input/output functionality (no need to manipulate memory to access hardware).

However, I think ease of use is really the best feature of all. When I'm writing programmes and make a typo or mistake (a frequent occurrence, I'm afraid), MMB detects my error and drops me straight into the Editor at the faulty statement so it can be edited. Having made changes, I simply hit 'F3' and MMB saves and runs the programme again. It's very quick and intuitive to use – which for me far outweighs any issues around interpreters, speed etc. Modern micros are very (very) much faster than the 8-bit ones of forty years ago, so processing speed isn't a practical problem in most of the applications I write. Much of the time the system is just sat waiting for something to happen...

Regards,

IanT

Edited By IanT on 09/07/2023 13:41:42

Nigel Graham 209/07/2023 15:55:12
3293 forum posts
112 photos

Ian -

I do not recall ever using the commands PEEK and POKE, but I do remember being told that "Spaghetti Programming" was not down to a bad language but to programmers' bad use of the language – i.e. poor design and not using the language to its best.

That suggests that if the language itself does have a weakness, it is in allowing that to happen.

However – and this is what seems thin in this thread – the programming is not done because one has a computer and knows how to programme it, but to perform some intended work. You don't make a model engine because you have a lathe, for example; you have the lathe because you want to make the engine – but the related parallel is that both need plenty of special skill.

So my superiors did not learn to write in BASIC just to fill up computers (and floppy discs), but to perform real work, and their programmes worked very well whatever the purists might sniff at. I dare say later languages might have been more efficient or whatever, and the users were constrained by the computers available, but that misses the point.

Our workhorse office PCs all had MS Win-x loaded but the lab computers were HP ones without MS software and which matched the measuring analysers made by HP, Solartron, etc, with appropriate data connectors. HP also sold its own version of BASIC for this, using simple command-lines to pass instructions such as frequency range and steps, and receive the results, to and from the analyser.

Our department needed hefty great number-mills that the contemporary PCs, and anything beginning with 'W', could not handle, so it built a small local network of an internal server and PCs, all Sun Microsystems machines. These had their own stripped-down windowing system and entailed a lot of DOS-like command-writing to use, let alone programme.

In later years the firm installed circuit-diagrammatic applications like "Labview", but these are not programming languages as such. Rather, they are a sort of laboratory 2D CAD to create a single test and analysis system from the PC and the attached sensors, etc.; and like CAD, do not need users to learn programming.

.

I learnt BASIC to a moderate level (ASCII-value based, string-handling routines were fun!) but could never learn creating and reading data files, despite having a real text-book to use. So my programmes could only print the result of each entry-set one at a time, on screen and paper. I have not attempted C++ although WIN-11 includes it in the applications directory - dangerous if the MS software itself is all in C++? My interest is now simply as observer - I have no engineering purpose for programming.

.

I am intrigued though that even allowing for the development of the electronics (to fill with even more bells and whistles?), there are so many different languages for basically the same circuits.

The computer can only really work in one way, often with circuits common to many makes, so whatever its make, whatever application and language you throw at it still needs translating to what its little blocks of transistors understand. (I don't know the differences between assembling, compiling and interpreting; only that it all needs reducing to 1s and 0s.) It's like having a dozen different books with different styles of drawings on making that same cited model engine, but whatever order you make the parts in, whichever holding methods or tools, the lathe still only works in one way!

So apart from up-dating to exploit electronics development, since any one PC will run various languages within its power if loaded with the right translators, is the Babel-esque range of languages genuinely technical or merely commercial?
