@drawohara

they say i am a big fat 🤓 – and they are right!

for some strange reason, this makes me happy

as you can read on my about page, i have written too much software

i got my start researching at NOAA

for C.I.R.E.S

while studying at CU’s College of Engineering & Applied Science

wut?

well, basically, the university has a program that donates young scientists to other research institutes, to help do science.

the goal is getting the university’s name on papers which, if you know anything about science, is gold. publish or perish… etc. publishing == funding.

my first big project, which i completed with my friends:

it was a cutting-edge bit of software. essentially, we wrote a J2EE (java. yep) app that was deployed on linux field computers. the purpose was to analyze weather data, in the field, during large forest fire suppression projects, to make go or no-go calls. by this i mean the kind of local weather report needed to answer “will i kill this team by sending them up this canyon, right now?”

this was pre-ruby, pre-common-js, and pre-iphone. we wrote a crazy thing that worked on bizarre linux field computers using satellite internet which, at that time, was way, way out there.

i also wrote a ‘golf app’ for blackberry but… ok. another story!

anyhow, we worked very, very hard on this - i didn’t miss a single day at the engineering center in a year - and it turned into a job with C.I.R.E.S for me.

subsequently, i went to work at FSL (Forecast Systems Lab) doing hyper-high-availability (5 9s, i.e. 99.999% uptime, which allows for roughly five minutes of downtime per year) for operational satellite ingest systems.

we designed cutting edge systems and novel… brutal… methods of ensuring consistency of data, such as STONITH, which stands for “Shoot The Other Node In The Head”. this was how we made sure there was exactly one lead system, and that all the subordinate systems (we used to call these ‘master’ and ‘slave’ but that has, thankfully, gone out of fashion….) knew it.

anyhow, basically, when taking over ownership of an important system, we couldn’t rely on simple measures like consensus, or some other polite arrangement. when you take over control, you literally toggle the power of the system you are taking over: shoot it in the head! in this way, we could be damn sure that our code worked, and achieve 5 9s of uptime. if you are wondering: yes, we actually built hardware to support this and, at that time, were running some of the first HA-postgres systems in the world.
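a minimal sketch of the idea, in python for brevity (the real thing was custom hardware and low-level code); the PDU address, the `power_cycle_peer` command, and `promote_to_lead` are all hypothetical stand-ins:

```python
import socket
import time

PEER_PDU = ("10.0.0.99", 9100)  # hypothetical networked power switch for the peer


def peer_is_alive(host="10.0.0.2", port=22, timeout=2.0):
    """crude liveness probe: can we open a tcp connection to the peer?"""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False


def power_cycle_peer():
    """hypothetical: tell the power distribution unit to hard-cycle the
    peer's outlet. in the real systems this was purpose-built hardware."""
    with socket.create_connection(PEER_PDU, timeout=2.0) as s:
        s.sendall(b"CYCLE OUTLET 1\n")


def promote_to_lead():
    """hypothetical: start ingest, grab the virtual ip, etc."""
    print("i am the lead system now")


def take_over():
    """become the lead system. note the order: fence FIRST, then promote.
    consensus is not consulted; the peer is simply, verifiably, off."""
    power_cycle_peer()           # stonith: shoot the other node in the head
    time.sleep(5)                # give the outlet time to drop
    assert not peer_is_alive()   # if it still answers, do NOT promote
    promote_to_lead()


if __name__ == "__main__":
    if not peer_is_alive():
        take_over()
```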

i also did a lot of work in verification.

geophysical models take hundreds, or thousands, or even hundreds of thousands, of configuration values to run. people talk about how neat 12-factor is now, and i just shake my head… what if you had to manage millions of configuration values? the next trick is to version them, so we know how they change over time, because, as scientists, if we make a change to, say, a cloud physics model, we need to ‘test it’.
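the versioning bit is less exotic than it sounds. a minimal sketch, assuming nothing fancier than content-addressing, the way git does it; the model and parameter names here are made up:

```python
import hashlib
import json

# a (tiny) slice of a model configuration; real runs had thousands of these
config = {
    "model": "cloud_physics",      # hypothetical model name
    "timestep_s": 30,
    "microphysics_scheme": 2,
    "cloud_droplet_count": 1.0e8,
}


def config_version(cfg: dict) -> str:
    """content-address a configuration: identical settings always hash to the
    same version id, so 'did the config change?' becomes a string comparison."""
    canonical = json.dumps(cfg, sort_keys=True)
    return hashlib.sha256(canonical.encode()).hexdigest()[:12]


before = config_version(config)
config["microphysics_scheme"] = 3   # the one deliberate change under test
after = config_version(config)

print(before, "->", after)  # any diff, in millions of values, shows up here
```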

but how do you test software, when you don’t know the ‘right answer’?

ain’t that relevant now?!

the approach is actually theoretically simple:

some people know it as the scientific method (air-quotes). basically, you make your change, hold all other variables constant, and look for changes. in the case of storms, you might re-run 10 years of weather predictions, each at the 0hr, 1hr, 8hr, 24hr, etc. forecast horizons, and then analyze all that data,

to see if

“it got better”

what do we mean by “better” you ask?

good question!
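there is no single answer, but one standard move in forecast verification is a skill score against what actually happened. a toy sketch, with made-up numbers, comparing an old and a new model version:

```python
import math

# hypothetical 24hr temperature forecasts from the old and new model versions,
# plus what actually happened, over the same historical cases
observed  = [12.1, 14.0, 9.5, 11.2, 13.3]
old_model = [13.0, 15.2, 8.1, 12.9, 14.8]
new_model = [12.5, 14.4, 9.0, 11.9, 13.9]


def rmse(forecast, truth):
    """root mean square error: one crude, standard notion of
    'distance from reality'."""
    return math.sqrt(sum((f - t) ** 2 for f, t in zip(forecast, truth)) / len(truth))


# hold everything else constant; the only variable is the model version.
# if the error drops across years of re-runs, "it got better".
print("old:", rmse(old_model, observed))
print("new:", rmse(new_model, observed))
```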

this reminds me, very much, of the current 🌎 situation, in which hundreds of thousands of well-meaning and progressive technologists, who have all been literally screaming to the world about how hard their job is, how powerful their stuff is, and how badly you need to buy it so they can have an IPO, have suddenly, despite linters, test-suites, and cults like xp and agile.bs, thrown out any need to test anything at all, or to understand the outputs, the ramifications, or anything. we just ask the oracle! (this is AI, for those of you non-nerds)…

software development is presently one of the biggest existential threats to democracy, the economy, and the environment, let alone human beings

^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

because of this trend. i have heard PhD-level programmers refer to AI as a universal API, claiming they can replace ‘anyone’ via this ability to distill an API from thin air, and then program a human (we call these “agents”) to use the silly thing. so, let me say this back to you slowly, and in english:

software developers, ranging from those who work in social media, self-driving cars, economics, space-weather, video games, etc. have decided the following statement is valid:

“We can pattern match ANYTHING (rawr!), if only we pump enough data into it!”

ergo…

GIVE US ALL THE DATA AND FEAR ME!

now, here is the hitch: who says what all this data is? where does it come from? well, i will tell you one thing: they aren’t telling us. instead, they are selling us. and by this i literally mean they are selling me to you and you to me. the thoughts and communications we share: these are what they capture and sell. good time to re-watch the matrix, btw….

consider just two companies.

1. google

the ‘deep research’ feature one can use in gemini 1.5 is amazing. ask it to research, and it downloads a tiny, relevant subset of the interwebs, compiles a little RAG database for you, and informs your questions to the oracle with that. neato! (a toy sketch of the pattern follows this list.)

here are but a few small problems with this:

2. openai

… more on the real evils of this company soon but, i wouldn’t send them my data. personal, or business. just sayin.

3. so, who should i use?

https://mistral.ai/ … more on this soon too. sooooo much to write!
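to demystify the ‘deep research’ pattern a little before moving on: retrieval-augmented generation is, at heart, “find the relevant chunks, paste them into the prompt”. a toy sketch, using tf-idf instead of a real embedding model, over made-up documents:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# stand-ins for the "tiny, relevant subset of the interwebs"
documents = [
    "stonith power fencing guarantees a single lead node in an ha cluster",
    "cloud physics parameterizations change storm forecasts at 24hr horizons",
    "bitemporal databases record both valid time and transaction time",
]

vectorizer = TfidfVectorizer()
doc_vectors = vectorizer.fit_transform(documents)


def retrieve(question: str, k: int = 2):
    """rank documents by similarity to the question; return the top k."""
    q = vectorizer.transform([question])
    scores = cosine_similarity(q, doc_vectors)[0]
    ranked = sorted(zip(scores, documents), reverse=True)
    return [doc for _, doc in ranked[:k]]


context = retrieve("how do ha clusters pick a leader?")
prompt = "answer using only this context:\n" + "\n".join(context)
print(prompt)  # this, plus your question, is what actually goes to the oracle
```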

but, we are way off topic.

back at FSL, i continued working in the area of complex notions of ‘correctness’, and ended up developing what was, at the time, a novel database platform called ‘bitemporal postgresql’. some remains of it live on the interwebs. think version control on steroids: every fact carries both a valid time (when it was true in the world) and a transaction time (when the database learned it). a tiny sketch of the core idea below, and then enough about that.
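the sketch, in python rather than sql so it stays short; the schema and dates are made up, a simplification of what the real platform did:

```python
from dataclasses import dataclass
from datetime import date

INFINITY = date.max


@dataclass
class Fact:
    value: str
    valid_from: date      # valid time: when the fact started being true in the world
    valid_to: date        # valid time: when it stopped
    recorded_at: date     # transaction time: when the database learned it
    superseded_at: date   # transaction time: when the database unlearned it


history = [
    # we recorded on jan 5 2000 that the station moved to site A on jan 1...
    Fact("station at site A", date(2000, 1, 1), date(2003, 1, 1), date(2000, 1, 5), INFINITY),
    Fact("station at site B", date(2003, 1, 1), INFINITY,         date(2003, 1, 2), INFINITY),
]


def as_of(facts, world_time: date, db_time: date):
    """what did the database, as it existed at db_time, believe about world_time?"""
    return [
        f for f in facts
        if f.valid_from <= world_time < f.valid_to
        and f.recorded_at <= db_time < f.superseded_at
    ]


# "what did we believe, back in 2001, about where the station was in mid-2002?"
print(as_of(history, world_time=date(2002, 6, 1), db_time=date(2001, 1, 1)))
```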

my next stint was at The National Geophysical Data Center, where i helped write a bunch of papers:

what do i mean by ‘help’?

mostly, i built very, very large super-compute: essentially big fat map-reduce style computing but, at the time, neither of those terms existed. we had to invent novel ways of moving our code off of big-endian (not spelled wrong) cray (also not spelled wrong) machines and onto tons of commodity hardware, namely hundreds of linux boxen.
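for the non-greybeards: big-endian machines store the most significant byte first, so binary data written on a cray can’t be naively read back on little-endian commodity hardware. a tiny illustration of the classic porting bug:

```python
import struct

value = 1234.5678

# the same float, as a big-endian machine writes it vs. as x86 expects it
big    = struct.pack(">d", value)  # big-endian: most significant byte first
little = struct.pack("<d", value)  # little-endian

print(big.hex())     # byte order reversed relative to...
print(little.hex())  # ...this

# reading big-endian data on a little-endian box is fine IF you say so;
# the bug is reading it with the native (wrong) byte order:
print(struct.unpack(">d", big)[0])  # 1234.5678  (correct)
print(struct.unpack("<d", big)[0])  # garbage    (the classic porting bug)
```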

i also did a ton of work around clustering… very low-level c/c++ code, using ideas from signal processing and computer vision to detect the edges of cities via a process similar to the watershed algorithm but… at scale.
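a rough sketch of the idea using modern python libraries that postdate that c/c++ work; pretend the smoothed random field below is a grid of nighttime-light intensities:

```python
import numpy as np
from scipy import ndimage as ndi
from skimage.segmentation import watershed

# pretend this is a satellite grid: bright blobs are cities
rng = np.random.default_rng(0)
intensity = ndi.gaussian_filter(rng.random((256, 256)), sigma=8)
lights = intensity > intensity.mean()   # thresholded "lit" pixels

# distance from each lit pixel to the dark background; city cores are peaks
distance = ndi.distance_transform_edt(lights)

# seed one marker per bright core, then flood "downhill" from the seeds;
# where two floods meet is the boundary, i.e. the edge between two cities
markers, _ = ndi.label(distance > 0.7 * distance.max())
labels = watershed(-distance, markers, mask=lights)

print(f"found {labels.max()} city-like regions")
```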

i also got to release piles of open source software at NGDC and, for this alone, i am very grateful.


__… breathe …__

next, this crazy mofo hired me to compile the GNU scientific library on… wait for it…

windows!

wowza i am old!

anyhow, Greg worked for Don Springer at a company called Collective Intellect which, at the time, was backed by the “Mobius Group” (which would eventually become The Foundry Group) and… #BOOM… start-ups in Boulder, Colorado, were a thing.

it was a fun time.

it was after this that i started dojo4, which was the crown jewel in my life as a geek, for many reasons i hope to write about soon, including close to ten years mentoring techstars companies, where i made some super duper great friends.

until then, i will say, as i always do, that:

(if you do find an errer (heh), then ping me or submit a pr as this is all in gh)

and … fucking ads!, my new company, aims to remedy some, but not all, of the symptoms i am watching the ‘AI movement’ go through…

with more than a little bit of horror.