pmccarthy Posted September 16
Those of us who studied through UNSW around 1970 learned FORTRAN IV using the WATFOR compiler, which I think came from the University of Waterloo. We then moved on to BASIC. Fortunately, at the time there were not many languages in use for the non-machine-language peasants.
newsaroundme Posted September 16
13 hours ago, Jerry_Atrick said: A coder? tsk tsk tsk... I was a software engineer, or developer, or systems programmer - but coder???!! 😉 A few on here are, or were. Surprised you don't have COBOL from university in that era... I still dabble - mainly Java and C/C++ these days... Though I will have to learn Rust (not of the kind that corrodes metal!!) as, although it is already over 8 years old, it is slowly going to overtake C and C++ because it is both a low-level language and memory safe - allegedly, anyway.

I meant coder: someone who codes, or can code, independent of a university degree. If you're a software engineer, I'd call you a coder too - one who should know what he's doing. I had friends who were hired by Google and other companies without completing formal qualifications, because of their open-source experience (coders). I like that it sounds like you know what you're doing. I'm interested in knowing more about Rust now. Thanks for that; I'll have a look. Do you do parallel programming? That's like the crème caramel of programming for me. I guess all software engineers do (but not all coders).
newsaroundme Posted September 16
2 hours ago, pmccarthy said: Those of us who studied through UNSW around 1970 learned FORTRAN IV using the WATFOR compiler...

I'm grateful to be here with people who were coding in the seventies. I was only a decade later, but I know what it was like. I appreciate the older technologies.
newsaroundme Posted September 16
Without those older technologies we might not have these newer technologies.
red750 (Author) Posted September 17
I've forgotten most of what I knew - and I didn't know much. I never had any formal training. I taught myself some simple BASIC programming from the courses(?) in computer magazines, with some software on 3.5-inch floppies stuck on the front. I was employed in the data centre of the Bank as a reconciliation clerk, balancing the batches and re-entering rejected vouchers. I moved on to operating an old Burroughs B300 computer, which you controlled by entering instructions one character at a time in binary, by pressing buttons on the console. [Photos: the B300 operator console, the operator panel, and a complete B300 system.]

I progressed to managing a data centre in NSW as a satellite to the main operations centre in Melbourne. After a couple of years, I was transferred to the systems development department. As I had no training or qualifications in this area, I was designated a systems analyst purely on the basis that the salary was the closest to what I had been on as centre manager. I was given a binder, about 6 inches thick, of a program listing in COBOL, told there was a problem with the program, and told to read the code and locate the error. I had access to the programmers when I needed assistance. Basically, that was my training.

Later, the bank merged with another and changed its systems to run on PL/I. I was required to write specifications for merging the systems. In another job a few years later, I had to take over maintenance of a reference system written in HTML for an intranet used by call centre operators. I ended up rewriting the system when the company was sold. That was over 15 years ago, and I haven't touched any programming since.
newsaroundme Posted September 18
On 17/09/2024 at 10:34 AM, red750 said: I've forgotten most of what I knew - and I didn't know much. I never had any formal training...

Wow, there's a lot of history there. I'm fascinated by the old machines. I liked it when you didn't need a degree for everything, too.
Jerry_Atrick Posted September 18
On 17/09/2024 at 12:36 AM, newsaroundme said: Do you do parallel programming? That's like the crème caramel of programming for me. I guess all software engineers do (but not all coders).

There are essentially three models of concurrent processing (excluding quantum computing): multithreading, fork/exec, and now GPU programming. I am still learning the last of these but, yes, I have previously worked with the first two. The best way to learn is to join an open-source project as a contributor - but be prepared: they are full of egos. Years ago I joined the Linux kernel team and mainly focused on bringing the ISO network stack - for me, network address translation - to the kernel. But I did dabble in kernel process dispatchers... all written in C back then. If you want to learn Rust, there is a project underway to natively support Rust in the Linux kernel.
Jerry_Atrick Posted September 19
Was in London the last couple of days, and my fingers aren't designed for phones... To expand the above post: yes, I have had experience with parallel programming, and this is the book that started me off on it: https://www.amazon.co.uk/dp/0134437063?linkCode=gs2&tag=uuid07-21

As mentioned above, there are three main types of parallel programming (ignoring quantum computing theory, which is heading towards commercialisation):

Fork/Join: This is effectively multi-processing. A controller-type program will fork a process off, and that process will have its own memory space. All master/slave process communication is performed using inter-process communication, which is available between otherwise unrelated processes. An example is, say, Microsoft Word: you have a document open, you open a new document from within Word, and it forks a new Word process. The new process has total control of the document, and the original one doesn't see it at all. This example is a little contrived, as it is not actually what is commonly referred to as multi-processing, but it is illustrative. Network processors, operating systems, etc. use fork/join a lot. Processes are considered heavyweight because with each new process spawned, the execution and address space, as well as memory, queues, etc., have to be set up for the new process. A real example would be an electronic trading system: the master process will fork off a pricing process, a signal-watching process, and an order placement/execution process, and they will use inter-process communication to send messages between them to facilitate trading. The master process may be watching over them, making sure things are operating smoothly, and kill processes if, for example, the market watcher sees prices going crazy.

Multi-threading: This allows a single process to branch off into multiple execution threads. It is considered lightweight because each thread shares the memory and address space of the main process and only needs to load memory local to that thread. So threads are much faster to initiate and use fewer system resources. They do, however, suffer from race conditions, where different threads alter the main process's memory unexpectedly to other threads, causing bugs that are notoriously difficult to track down - a consequence of the threads sharing memory. This form of parallel or concurrent processing is used for sub-tasks of a program; for example, your browser uses threads, because it is downloading at the same time as loading a page on Tab 2 while you are reading this post. Due to their lightweight nature, threads are also used where one would normally use multi-processing but performance is required. Going back to electronic trading: I can have a thread for each of the items listed above, which will run faster, and as the memory is shared, communication will be much faster than inter-process communication. So if I am building a low-latency (microsecond actions) high-frequency trading platform, I would go to the bother of using threads and writing extra code to protect against race conditions.

GPU programming: This basically combines multi-processing and multi-threading in one. The overheads of setting up a GPU process are minimal, because GPUs have thousands of cores and their own address space, execution space, etc. You effectively set up the memory areas and attach a thread to a core. Sort of - I am going through it slowly.
However, it is not simple programming in higher-level languages like Java, C#, or Python, because you need to understand the chipset language or follow a standard framework such as CUDA and embed that in your programming. The cores in a GPU have a very limited instruction set and are designed only to manipulate bit streams (streams of 1s and 0s) into a signal for the monitor, but for simple arithmetic operations they are perfect, as you can run massively parallel simple computations extremely quickly and chain them to apply more advanced algorithms, still a lot faster than on the CPU. It's great fun, and a challenge - that's for sure.
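To make the fork/join model above concrete, here is a minimal sketch using Python's standard multiprocessing module. The pricing worker and its price values are invented purely for illustration; a real trading master would supervise several such workers.

```python
# Minimal fork/join sketch: a master process forks a worker with its own
# memory space and talks to it only via inter-process communication.
from multiprocessing import Process, Queue

def pricer(out_q: Queue) -> None:
    # Child process: illustrative prices only; a real worker would
    # stream quotes from a market feed.
    for price in (101.2, 101.4, 101.1):
        out_q.put(price)
    out_q.put(None)  # sentinel so the master knows we are done

if __name__ == "__main__":
    q = Queue()
    child = Process(target=pricer, args=(q,))
    child.start()                       # "fork": spawn the worker
    while (msg := q.get()) is not None:
        print("master received price:", msg)
    child.join()                        # "join": wait for the worker to exit
```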
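And a sketch of the race condition mentioned under multi-threading, with the usual fix of guarding shared memory with a lock (the counter and the thread counts are arbitrary choices for the demonstration):

```python
# Four threads share one counter. The increment is a read-modify-write,
# so without the lock the final total would typically come out short.
import threading

counter = 0
lock = threading.Lock()

def add_many(n: int) -> None:
    global counter
    for _ in range(n):
        with lock:       # remove this guard to see the race condition
            counter += 1

threads = [threading.Thread(target=add_many, args=(100_000,)) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(counter)  # 400000 with the lock; usually less without it
```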
newsaroundme Posted September 22
On 19/09/2024 at 5:27 PM, Jerry_Atrick said: Was in London the last couple of days, and my fingers aren't designed for phones... To expand the above post...

Thank you for expanding on your reply. That book looks interesting. I am still playing with Python and am looking at the field of data science. I'm interested in parallelism for solving data problems. What you wrote reminded me of parallel programming with C.
newsaroundme Posted September 22
For some reason I recall it working differently in Ada. I'm going to look that up now.
Jerry_Atrick Posted September 22
I have never written in Ada, so I'm not sure how it works. (Note, I only touched the tip of the iceberg above.) One of the things to look at is the use of GPUs: effectively a bitstream in, calculations, and a bitstream out... but for simple computations they can massively scale the power available. Here is an article for Python developers: https://medium.com/@geminae.stellae/introduction-to-gpu-programming-with-python-cuda-577bfdaa47f3 CUDA is one of the two main frameworks for GPU programming; the other is OpenCL. Both are used extensively for AI and ML. Also, native Python is slow - really slow - so libraries like numpy, pandas, and cupy (for CUDA) are often written in C++.
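A minimal sketch of what that looks like from Python, using the cupy library mentioned above (this assumes an NVIDIA GPU with CUDA drivers and the cupy package installed; the array sizes are arbitrary):

```python
# CuPy mirrors the NumPy API but runs the work on the GPU: one line
# dispatches millions of simple arithmetic operations across thousands
# of GPU cores at once.
import cupy as cp

x = cp.arange(10_000_000, dtype=cp.float32)
y = cp.arange(10_000_000, dtype=cp.float32)

z = 2.0 * x + y            # computed on the GPU, element by element
print(float(z.sum()))      # copy the scalar result back to the host
```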
spacesailor Posted October 1
Please, not the pacemaker. Nor that "CPAP" machine. Oops, so sorry G G. We will miss you. spacesailor
red750 (Author) Posted October 10
Just read this on the internet: First noticed by MacRumors, the products added to the obsolete list are its last-ever iPod Nano and iPod Shuffle models. Essentially it means that if a product stops working, Apple is not obliged to fix it - rendering it barely more than a relic from the past. Apple will no longer issue iOS updates for these models. Apple has also added the standard iPhone 6 to the list, having already made the larger iPhone 6 Plus obsolete back in April.

My daughter and I both have iPhone 6's. They won't stop working, but if something goes wrong we will have to find an independent phone repair centre that will fix it. The iPhone 7, iPhone 8, iPhone X, iPhone 11, iPhone 12 and iPhone 13 have all been discontinued in recent years. According to Apple, it considers a product obsolete when it stopped distributing it for sale more than seven years ago.
Jerry_Atrick Posted October 12
@newsaroundme - how is your Python coming along? There have been changes at my work. Although I run a risk technology product team (meaning, effectively, business analysts delivering risk management systems), with new senior management we are going to change a lot over the next couple of years - not that that will impact me, as I won't be there too much longer. But one of the things my new manager has hinted at is that she wants us to be both a product and a development team, and with AI looming, she wants us to pick up Python and its various libraries. As I am sick of management (I have already told my manager I am not interested in a more senior role, as the current one is not what I signed up for), I am looking forward to using company time to learn advanced Python and put together an application framework along the lines of numpy, pandas, matplotlib, and scipy. But I'm wondering: what AI libraries are you using in your new quest?
newsaroundme Posted October 13
12 hours ago, Jerry_Atrick said: @newsaroundme - how is your Python coming along?...

I'm having fun. I successfully used the API to Gemini and have played a little with plotting data using matplotlib, for example. It's exciting that there's so much you can do with Python. I've gone back to basics, though: I've been revising algorithms (and needed to). I've also been playing with Django a little, but I don't want to get too distracted playing with websites until I've mastered Python itself. Thanks for asking. I've got a bit more to do today because I took a break yesterday. I've written a card-playing application and have been having fun with classes and inheritance. I was really impressed playing with AI and the API; I'd not had a chance to do it before.
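As a rough idea of the kind of classes-and-inheritance structure a card-playing application might use (these class names are invented for illustration, not taken from the actual program):

```python
# A toy Card/Deck hierarchy: PinochleDeck inherits from Deck and
# overrides only how the deck builds itself.
import random

class Card:
    SUITS = ("hearts", "diamonds", "clubs", "spades")
    RANKS = ("2", "3", "4", "5", "6", "7", "8", "9", "10", "J", "Q", "K", "A")

    def __init__(self, rank, suit):
        self.rank = rank
        self.suit = suit

    def __repr__(self):
        return f"{self.rank} of {self.suit}"

class Deck:
    def __init__(self):
        self.cards = [Card(r, s) for s in Card.SUITS for r in Card.RANKS]

    def shuffle(self):
        random.shuffle(self.cards)

    def deal(self):
        return self.cards.pop()

class PinochleDeck(Deck):
    # A pinochle deck is two copies of 9-through-ace; everything else
    # (shuffle, deal) is inherited unchanged from Deck.
    def __init__(self):
        ranks = ("9", "10", "J", "Q", "K", "A")
        self.cards = [Card(r, s) for s in Card.SUITS for r in ranks] * 2

deck = Deck()
deck.shuffle()
print(deck.deal(), deck.deal())
```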
willedoo Posted October 29
My trial of phone data vs the NBN is going OK so far. The phone is set up as a mobile hotspot, with wireless from there to the laptop. I've paid for another month on the NBN plan to keep it active until I decide whether or not to ditch it for good. The idea is to do a whole month using only the phone, to get an idea of reliability and speed. I've been doing it for a week now and it has passed those tests. The economics and value-for-money side of it is already well in the phone's favour. So far the occasional times the phone slows a bit, or drops out for a few seconds, have not been a problem.
spacesailor Posted October 29
My NBN has never gone a whole month without an "outage". I also pay an extra $10 for the house phone, plus an Aldi mobile card for me and the same for the wife. So communication is not cheap. I should rig up a CB two-way radio station. spacesailor
willedoo Posted October 29
With the NBN I get 100GB of data per month, with no monthly rollover of unused data, for $50 per month. With Aldi, for the same price, I get 130GB of data, unlimited phone calls and SMS in Australia and 30 other countries, and any unused data rolls over and accumulates. For a single user like myself, the phone data plan so far seems cheaper, faster and more reliable than the NBN. I would think that in a multiple-user household, the NBN plans would be more suitable.
Jerry_Atrick Posted October 29
1 minute ago, willedoo said: I would think that in a multiple-user household, the NBN plans would be more suitable.

I am not so sure about that. If you are hotspotting a device to your phone and it is faster than the NBN, chances are it will handle multiple devices faster (depending on the phone), because the connection is faster, and phones have reasonably powerful chips that would rival most domestic Wi-Fi routers for multiplexing. Of course, you can't run an Ethernet cable to a phone, but how many people use Ethernet cable these days? My son and I were streaming a movie over 4G in London, and it was perfectly fine - no buffering or anything like that. It was no worse than our home line. If we got reliable mobile signal where we live, we probably wouldn't have a landline, because unlike a landline, there is redundancy in a mobile network.
spacesailor Posted October 29
BUT. We end up with the NBN, plus the house phone, and two Aldi SIM cards. A grandson puts us on his Netflix plan, even though he never gets to watch TV. spacesailor
willedoo Posted October 29
28 minutes ago, Jerry_Atrick said: If we got reliable mobile signal where we live, we probably wouldn't have a landline, because unlike a landline, there is redundancy in a mobile network.

Jerry, what does redundancy in the mobile network mean?
willedoo Posted October 29
Another thing I'm getting used to is WhatsApp. So far I only use it for calls to friends who don't have a very good signal for normal mobile phone calls. They get a lot of dropouts, so they use WhatsApp for calls whenever they can.
Jerry_Atrick Posted October 29
Multiple cellular towers, usually with overlapping range, so if one goes down you will automatically switch to another as an end point... Of course, the landline network has redundancy in the broader sense, but if the local line to your house experiences a problem, you are SOL (ship outta luck).