Any computer whizzes here?


Recommended Posts

13 hours ago, Jerry_Atrick said:

A coder? tsk tsk tsk... I was a software engineer, or developer, or systems programmer - but coder???!! 😉

 

A few on here are, or were.

 

Surprised you don't have COBOL from university in that era...

 

I still dabble - mainly Java and C/C++ these days...

 

Though I will have to learn Rust (not the kind that corrodes metal!!) as, although it is already over 8 years old, it is slowly going to overtake C and C++ because it is both a low-level language and memory safe - allegedly, anyway.

 

I meant coder: someone who codes or can code, independent of a university degree. If you're a software engineer, I would still call you a coder - one who should know what he's doing. I had friends who hadn't completed formal qualifications before they were hired by Google and other companies because of their open-source experience (coders). I like that it sounds like you know what you're doing.

 

I'm interested in knowing more about Rust now. Thanks for that. I'll have a look.

 

Do you do parallel programming? That's like the creme caramel of programming for me. I guess all software engineers do (but not all coders).


2 hours ago, pmccarthy said:

Those of us who studied through UNSW around 1970 learned FORTRAN 4 using the Watfor compiler, which I think was created by one of their researchers. We then moved on to BASIC. Fortunately, at the time there were not many languages in use for the non-machine language peasants.

I'm grateful to be here with people who were coding in the seventies. I was only a decade later, but I know what it was like. I appreciate the older technologies.


I've forgotten most of what I knew - and I didn't know much. I never had any formal training.

 

I taught myself some simple Basic programming from the courses(?) in computer magazines, with some software on 3.5 inch floppies stuck on the front.

 

I was employed in the data centre of the Bank as a reconciliation clerk, balancing the batches and re-entering rejected vouchers. I moved on to operating an old Burroughs B300 computer, which you controlled by entering instructions one character at a time in binary, by pressing buttons on the console.

 

[Image: B300 operator console]

[Image: Operator panel]

[Image: A complete B300 system]

 

I progressed to managing a data centre in NSW as a satellite to the main operations centre in Melbourne. After a couple of years, I was transferred to the systems development department. As I had no training or qualifications in this area, I was designated as a systems analyst purely on the basis that the salary was the closest to what I had been on as centre manager.

 

I was given a binder, about 6 inches thick, containing a program listing in COBOL. I was told there was a problem with the program and asked to read the code and locate the error. I had access to the programmers when I needed assistance. Basically, that was my training. Later, the bank merged with another and changed its systems to run on PL/1. I was required to write specifications for merging the systems. In another job a few years later, I had to take over maintenance of a reference system written in HTML for an intranet used by call centre operators. I ended up rewriting the system when the company was sold. That was over 15 years ago, and I haven't touched any programming since.

 

 

 


On 17/09/2024 at 10:34 AM, red750 said:

I've forgotten most of what I knew - and I didn't know much. I never had any formal training. ...

Wow. There's a lot of history there. I'm fascinated by the old machines. I liked it when you didn't have to have a degree for everything too. 


On 17/09/2024 at 12:36 AM, newsaroundme said:

Do you do parallel programming? That's like the creme caramel of programming for me. I guess all software engineers do (but not all coders).

There are essentially three models of concurrent processing (excluding quantum computing): multithreading, fork/exec, and now GPU programming.

 

I am learning the latter but, yes, have previously been involved in the former two.

 

The best way to learn it is to join an open source project as a contributor - but be prepared, they are full of egos.

 

Years ago I joined the Linux kernel team and mainly focused on bringing the ISO network stack - for me, network address translation - to the kernel. But I did dabble in kernel process dispatchers.. all written in C then.

 

 

If you want to learn Rust, there is a project starting to natively support Rust in the Linux kernel.


Was in London the last couple of days, and my fingers aren't designed for phones...


To expand the above post...

 

Yes.. I have had experience with parallel programming, and this is the book that started me off on it: https://www.amazon.co.uk/dp/0134437063?linkCode=gs2&tag=uuid07-21

 

As mentioned above, there are three main types of parallel programming (ignoring quantum computing theory, which is heading towards commercialisation):

  • Fork/Join: This is effectively multi-processing. A controller-type program will fork a process off, and that process will have its own memory space. All master/slave process communication is performed using inter-process communication, which is available between otherwise unrelated processes. An example of this is, say, Microsoft Word: you have a document open, you open a new document from within Word, and it forks a new Word process. The new Word process has total control of that document and the original one doesn't see it at all. This example is a little contrived, as it is not actually what is commonly referred to as multi-processing, but it is illustrative. Network processors, operating systems, etc. use fork/join a lot. Processes are considered heavyweight because with each new process spawned, the execution and address space, as well as memory, queues, etc., have to be set up for the new process. A real example would be an electronic trading system: the master process will fork off a pricing process, a signal-watching process, and an order placement/execution process. They will use inter-process communication to send messages between them to facilitate trading. The master process may be watching over them, making sure things are operating smoothly, and kill processes if, for example, the market watcher sees the prices going crazy or something. (A minimal sketch of this pattern appears just after this list.)
  • Multi-threading: This allows a single process to branch off into multiple execution threads. It is considered lightweight, as each thread shares the memory and address space of the main process and only needs to set up state local to that thread. So threads are much faster to initiate and use fewer system resources. They do, however, suffer from race conditions, where different threads alter the main process's memory contents unexpectedly to other threads and cause bugs that are notoriously difficult to track down; this is a consequence of the way multi-threading works. This form of parallel or concurrent processing is used for sub-tasks of a program. For example, your browser uses threads because it is downloading at the same time as loading a page on Tab 2 while you are reading this post. Also, due to their lightweight nature, threads are used where one would normally use multi-processing but performance is required. Going back to electronic trading, I can have a thread for each of the items listed above, which will run faster and, as the memory is shared, will allow much faster communication than inter-process communication. So, if I am building a low-latency (microsecond actions) high-frequency trading platform, I would go to the bother of using threads and writing extra code to protect against race conditions. (See the second sketch after this list.)
  • GPU programming: This is basically combining multi-processing and multi-threading in one. The overheads of setting up a GPU process are minimal, because GPUs have thousands of cores and their own address space, execution space, etc. You effectively set up the memory areas and attach a thread to a core. Sort of - I am going through it slowly. However, it is not simple programming in higher-level languages like Java or C# or Python, because you need to understand the chipset language or follow a standard such as CUDA and embed that in your programming. The cores in a GPU have a very limited instruction set and are designed to manipulate bit streams (streams of 1s and 0s) into a signal for the monitor, but for simple arithmetic operations they are perfect, as you can run massively parallel simple computations extremely quickly and chain them to apply more advanced algorithms, still a lot faster than on the CPU.
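To make the fork/join point concrete, here is a minimal sketch (my own illustration, not taken from any real trading system) of a master process forking a child and receiving a message from it over a pipe. It assumes a POSIX system such as Linux, and the "pricing" label is purely hypothetical.

```cpp
#include <cstdio>
#include <cstring>
#include <sys/wait.h>
#include <unistd.h>

int main() {
    int fd[2];
    if (pipe(fd) == -1) { perror("pipe"); return 1; }   // channel for IPC

    pid_t pid = fork();                                  // spawn the child process
    if (pid == -1) { perror("fork"); return 1; }

    if (pid == 0) {
        // Child: a hypothetical "pricing" process. It has its own copy of the
        // parent's memory; the pipe is the only shared channel.
        close(fd[0]);                                    // child only writes
        const char* quote = "AUD/USD 0.6745";            // made-up illustrative data
        write(fd[1], quote, strlen(quote) + 1);
        close(fd[1]);
        _exit(0);
    }

    // Parent: the "master" process reads the quote via inter-process
    // communication, then reaps the child.
    close(fd[1]);                                        // parent only reads
    char buf[64] = {0};
    read(fd[0], buf, sizeof(buf) - 1);
    close(fd[0]);
    printf("master received: %s\n", buf);

    int status = 0;
    waitpid(pid, &status, 0);
    return 0;
}
```

A real master/slave setup would keep the child alive and exchange a stream of messages (or use shared memory or sockets), but the fork + IPC + wait shape is the same.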

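And a companion sketch of the multi-threading point, in plain standard C++ (C++11 or later): several threads update a shared counter in the one address space, with a mutex guarding against the race condition described above. The figures are arbitrary.

```cpp
#include <iostream>
#include <mutex>
#include <thread>
#include <vector>

int main() {
    long long counter = 0;   // shared state: lives in the one process address space
    std::mutex m;            // protects counter from concurrent updates

    auto worker = [&](int iterations) {
        for (int i = 0; i < iterations; ++i) {
            std::lock_guard<std::mutex> lock(m);   // remove this and the updates race
            ++counter;
        }
    };

    std::vector<std::thread> threads;
    for (int t = 0; t < 4; ++t)
        threads.emplace_back(worker, 100000);      // lightweight: no new address space
    for (auto& t : threads)
        t.join();

    // With the mutex the result is deterministic: 4 * 100000 = 400000.
    // Without it, the total will usually come up short - a classic race condition.
    std::cout << "counter = " << counter << '\n';
    return 0;
}
```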
It's great fun, and a challenge - that's for sure.

 

 

 

 

