Actually, my involvement with computers goes back a lot farther than that. I learned to write programs in ALGOL using punch cards on a Burroughs B5500 back in 1967. I thought I was on the bleeding edge when the university installed a time-sharing system around 1969 with Teletype ASR-33 terminals and BASIC, since you could actually interact with the computer personally in real time (with the punch card system, you had to submit your "deck" to the computer nerds in white coats to run your program).
Around 1974 I was writing programs in FORTRAN on another time-sharing system, this one running on an IBM 360 at Carnegie Mellon, and then got my first computer - an Apple ][ - in 1978. Here's a scan of the faded receipt. You can't even read the name of the computer anymore, but that 4546 was the serial number, and $1225 was a LOT of money back then!
Got my first Mac (a 512k "fat Mac") in 1985 when I started working at SUNY. Things have come a long way since then, but I'm not so sure it's all good. Back in the old days we called it the "personal computer revolution". The "revolt" was against what was often called the "data center priesthood", on whom ordinary people were completely dependent for their computing tasks. Today, we seem to have come full circle with all this talk about "the cloud". Is this the "revenge of the priesthood"? We seem to be coming back around to being completely dependent on data centers for most of what we do on our computers...