feoNotes

Notes and Reflections by feoMike

cryptic flesh

while i was in high school i hung out w/ a group of friends who were very smart, and somewhat subversive. these were people from town (i went far away to high school), and a couple of them noticed how fast certain folks grabbed on to a new fad, cool/hip thing or music. so in a small experiment, these folks made up a fake band, fake band tour t-shirts and perhaps even bumper stickers. the band name was ‘cryptic flesh’ and the tour dates on the t-shirt included places like ‘toledo oh’ and ‘rochester ny’, with ‘sold out’ printed in red diagonal letters over the cities. on the front were icons like zombies and arms reaching up out of the ground. there was this whole theme; it was punk, it worked and it was totally fake. we wore the shirts all the time, and we swear people came up to us and said, ‘cryptic flesh; sweet. i saw them last week in providence, they rocked’. we laughed, it was fun, and it became way bigger in my mind than perhaps it was. i think there is an interesting lesson here.

people collectively often get hung up on what’s popular; what everyone is doing, what is mainstream, and they try to repeat it under the premise, well, it worked for them, it should work for me. this misguided premise happens consistently in IT. how long have we heard that everything needs to be enterprise? how long have we heard that the only scalable solution is the cloud? how long have we heard that the only viable way to run IT is to make sure you have a small set of tools, so your portfolio is manageable? in principle many of these adages are true; the fault is that they don’t always work.

recently i was sitting in a meeting where we were discussing some paths forward for some enterprise tools. in this case, yes, the cloud and a smaller set of larger software tools were discussed. i got slightly frustrated b/c i think we get so tunnel focused on this concept (go to the cloud and implement xEnterpriseSoftware - problem solved) that we miss the opportunity to develop perpendicular, innovative ideas.

let me be specific. in one use case at our (and many federal) offices, there is a group or team of users (let’s call them researchers). these folks are super smart and need to crunch big numbers, write complicated models, run them and develop conclusions from these quantitative analyses. the basic business flow is: a) have a hypothesis and design a model which helps test it, b) go find the enterprise database, c) see what variables support the potential analysis, d) extract the data and subsequently make huge copies of tables of data, e) hammer some poor server with number crunching, f) interpret results, prepare conclusions and argue for a long time about what they mean, g) keep the huge results files from this process forever, h) repeat. let me stop here and say these people are super smart, what they do is critical to policy, and they are highly valued. i am not making light of anything they do; it is totally awesome.
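to make the flow concrete, here is a minimal sketch of steps b through e as a researcher might run them from a laptop. it assumes postgres as the enterprise database and python as the analysis tool; the connection string, table and column names (enterprise_db, obs, outcome, x1, x2) are all made up for illustration, not taken from any real system.

```python
# minimal sketch of the researcher flow, steps b-e, under assumed names.
import pandas as pd
import statsmodels.formula.api as smf
from sqlalchemy import create_engine

# b) connect to the enterprise database (hypothetical host and credentials)
engine = create_engine("postgresql://researcher:secret@dbhost:5432/enterprise_db")

# c/d) pull only the variables that support the analysis into a local copy
df = pd.read_sql("SELECT outcome, x1, x2 FROM obs WHERE year >= 2010", engine)

# e) crunch the numbers on the laptop instead of hammering a shared server
model = smf.ols("outcome ~ x1 + x2", data=df).fit()

# f) interpret results, prepare conclusions
print(model.summary())

# g) keep the results (forever, apparently)
df.to_csv("obs_extract.csv", index=False)
```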

here is the rub. the classic response is that IT should support this through centralized tool sets, because we know everyone is going to the cloud, it is scalable, and enterprise tools work. what we fail to apply in this approach is common sense.

the four factors that commonly affect IT are cost, time, flexibility, and security. if we map each of these factors to this problem set, i come to the opposite conclusion: the solution to this researcher-type problem (and to many others) isn’t more servers. IT’S BETTER COMPUTERS + the internet.

regarding cost, let’s evaluate. the cost to deploy new servers to keep up w/ the ever expanding requirements of the research community far exceeds the cost of the normal workflow (e.g. active directory + ms exchange + mounted drives, which accounts for the lion’s share of most IT back ends). the research guys need honking disk, on super fast drives, geometrically increasing space etc etc. not only that, in most shops there are only a handful of these guys (let’s say 4 - ok, lying here, but …). many times these people are so fed up with IT that they take on becoming their own sys admins and server experts, buy their own stuff and become silos. the total cost of ownership for this approach, in time alone, is crazy huge (this diatribe is not a cost benefit analysis; for that we would need a team of researchers … never mind). the cost of my 16 gig, 8 core, 1 terabyte mac is about ~$2500. it’s a super computer that i carry in one hand. on this machine i can easily crunch hundreds of millions of rows/models any which way i like. because it is connected to the internet at no extra cost, i have the full suite of collaboration tools and can see how cutting edge people are using modern toolsets effectively. most importantly, my ability to perform analysis is unencumbered by anything, at a cost far far far less than server infrastructure, given the number of people it takes to support that effort.

time. i am working procurement into this one, so really two factors (time to get tools and time to perform analysis). time to get tools is dependent on procurement, authority to operate, user constraints on software installation, architecture review boards for enterprise software, and the list goes on and on. in our shop, we tend to use postgres and maybe stata. in others it might be a different database and a different analytical processor (oracle and r, sql server and sas - take your pick, then perhaps some BI on top of it). the amount of time spent debating these things is crazy huge. who cares? they are rows and columns, and a researcher knows how to manipulate them however they grew up (sidebar: how they are growing up in college now is ‘i install what i need on my mac and run’). second is time to process. think back to any conversation you ever had with some type A geek about the size of their computer. again, who cares? if we are doing analysis on 100s of millions of rows on a 16 gig, multi core machine, it takes time. the peace of mind, however, of the researcher having full control over the machine means the analysis will be far faster on a laptop super computer (e.g. a mac) than on the layers and layers of enterprise crap we typically install.
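as a rough illustration of what crunching it locally looks like, here is a small python sketch that builds a synthetic table of ten million rows and runs a group-by summary plus a quick linear fit on the laptop. the row count, column names and the fit itself are invented for illustration; the point is only that this class of work fits comfortably in memory on a 16 gig machine.

```python
# rough sketch of local number crunching on a laptop, assuming python with
# numpy and pandas installed; the data here is synthetic, for illustration only.
import time

import numpy as np
import pandas as pd

rng = np.random.default_rng(42)
n = 10_000_000  # ten million rows fits comfortably in 16 gigs of memory

df = pd.DataFrame({
    "group": rng.integers(0, 50, size=n),  # hypothetical categorical variable
    "x": rng.normal(size=n),               # hypothetical predictor
    "y": rng.normal(size=n),               # hypothetical outcome
})

start = time.time()
summary = df.groupby("group")[["x", "y"]].mean()    # per-group means
slope, intercept = np.polyfit(df["x"], df["y"], 1)  # quick linear fit of y on x
elapsed = time.time() - start

print(summary.head())
print(f"crunched {n:,} rows in {elapsed:.1f} seconds")
```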

flexibility is my favorite. since primary school days, i have known that my best ideas never come when i am sitting at the desk during the specific times i am supposed to be performing the work. name one person whose do. when we start with the assumption that everything will be in the enterprise (which loosely translates to on the network, with these credentials, with these specific tools, with this very specific set of roles etc etc), we by definition limit flexibility. what if i wake up at 4 am with a new idea and want to test it on a couple of million rows? on my mac, with my tools, no problem. on some enterprise function: vpn, log in, wait, etc. - problem, fail.

security. in federal government we all want the same thing. really we do. we want to make sure the public is safe. when public safety became equated with an environment so controlled that no one can access it is beyond me. there is no need to dwell on this one; suffice it to say that not everyone has the same level of threat, and not every problem has the same risk, yet we have a control structure that assumes they do (again, i am being hyperbolic, but…). now imagine the people overhead, in IT terms, of managing and implementing every NIST security protocol on all servers for every condition, versus evaluating one super computer laptop. you do the math.

so back to my high school friends. if we aren’t deeply knowledgeable about our own content vertical (whether it’s 80s punk or software, computing, big data etc), we should stop pretending that we are. i wasn’t at that cryptic flesh show, because there never was one, because there isn’t even a band with that name. IT requires some hefty comprehension that is remarkably different from simply making a web page. the enterprise of yesterday isn’t the enterprise of tomorrow, and we should all be very wary of any bandwagon.