Early Career Compendium

On my Foundation for building teams in operations and analytics for Media, Ecommerce, IoT & Healthcare

Early Years with Moog (1991-1997)
L-R: Houston, Bob Moog, and my first hire Dave Perkins (1994)

As noted in another entry, I started as a student of Bob Moog at the University of North Carolina at Asheville. After my time as a student and research assistant, the early work at the shop ranged from sweeping floors to engineering production prototypes. My first software project outside of college was building an in-house warehouse and inventory management system using Microsoft Access and Visual Basic. As it happened, Bob's son worked at Microsoft, which gave us the chance to beta the product before it went to market. Since then I have occasionally worked with various Microsoft teams on pre-release products and technologies, which will show up in anecdotes throughout the site.

Memorymoog internal view - digital boards (left) and analog voice cards (right)

While Moog is known for analog synthesizers, digital technology was always part of the equation. Aside from building theremin instruments and refurbishing Moog modular systems, there was also the Lintronics modification for the Memorymoog - based on a piggy-backed arrangement of Zilog Z-80 processors. That extensive re-engineering effort was my entrée into the world of embedded systems, low-level programming and working extensively at the intersection of hardware and firmware. This experience would lay the groundwork for my foray into automotive telematics and IoT.

Series 91c Theremin
During the first few years "the shop" was just two people, Bob and myself. But as the products and services we offered grew, the staff expanded along with my role. I started by hiring in several classmates, and by the time my tenure ended more than a dozen employees from a variety of backgrounds staffed the front office, engineering and operations. My second hire remained at Moog Music for more than fifteen years - eventually becoming their head of engineering as Moog Music re-emerged as a global musical instrument powerhouse.

With a principal of Bob Moog's stature, many prototype and exploratory projects were part of the picture. Among them were several capacitive touchscreen products, which influenced my later work for SmartAV and various IoT and embedded projects. It was a one-of-a-kind formative experience to work with such a broad swath of technologies and build a team with a diverse skill set. But beyond that, Bob's insistence on remaining a hands-on engineer and conscientious mentor is my strongest memory from that time.

Drake Software (1997-1999)

Also nestled in the mountains of western North Carolina, Drake Software started as an old-school accounting practice that became a platform of choice for certified accountants - the first company to enable transmission of electronic tax returns in the US. Based on my background with Visual Basic, I joined to work on the Federal Fiduciary and California modules of the filing application.

I also worked on other projects at Drake, including the first SFTP application for e-filing 1040 tax forms directly over the internet. Up to that time, electronic filings were made over dial-up connections. The application went through security review and pen-testing, followed by formal certification with the IRS. That applet not only transformed the way Drake worked; the secure filing application was also provided to other servicers to use for transmitting returns to the IRS on behalf of their clients. I had worked on shared projects at university, but this was the first time I had seen application sharing "in the wild". It was a progenitor of the open source projects we all take advantage of today.

During that time the company was also awarded a contract to stand up a backup and disaster recovery site for several IRS filing centers. I was raised in a family-owned general contracting business, so I had been around commercial construction all of my life - and I saw an opportunity to help with this project. That shifted my role from software development to project management, and not just for high-speed data connections and the racking and stacking of servers. It also meant laying independent power lines through separate walls of the building and installing climate control and fire suppression systems. That experience informed several of my later roles - and of course gives me a keen appreciation for the public cloud infrastructure available today.

IBM Global Services (1999-2000)

With the rising spectre of "Y2K", I shifted to the Research Triangle Park of North Carolina for a contract assignment with IBM Global Services. The role was to lead teams managing database code and data migrations for the large DB2 systems underneath IBM's partner e-commerce platform, which managed inventory and sales around the world for everything from work laptops to central institutional compute platforms. We were doing database DevOps long before it was cool.

It was also my first experience with globally scaled solutions: deployments of the partner e-commerce systems around the globe, all managed from that one office.

By the end of the project I was coordinating the activities of multiple teams in four regions around the world. And while everyone else was celebrating the New Millennium, we were sweating it out from midnight in Sydney all the way through to fireworks on the west coast of the US - and well into the next day. There was a 70+ hour shift over that New Year that I'd rather forget. Beyond the immediate objective, our library of bash scripts and makeshift repositories became a model that Global Services would template out to other working groups.

We rebuilt many database elements from replayed log files - essentially projections and aggregates of database process logs. It was an early form of event sourcing, again well before the technology would "catch up" to the concept as it has in the past few years. After the project, IBM sent two of my leads and me around to various IBM offices throughout the US to share how our cobbled-together system worked. We also supported those teams as they implemented our pattern within their own infrastructure. It was an early distributed systems experience that, while successful, also begged for the more cohesive tool chains and the improved scale in compute and storage we all enjoy today.
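The pattern is easier to see in modern terms. Below is a minimal sketch of the event-sourcing idea - rebuilding current state by replaying an ordered log - using made-up inventory events rather than the original DB2 tooling:

```python
# Minimal event-sourcing sketch: rebuild current state (a "projection")
# by replaying an ordered log of events. The event shapes here are
# hypothetical, illustrating the pattern rather than the original system.

def apply_event(state: dict, event: dict) -> dict:
    """Fold one inventory event into the running projection."""
    sku = event["sku"]
    if event["type"] == "received":
        state[sku] = state.get(sku, 0) + event["qty"]
    elif event["type"] == "shipped":
        state[sku] = state.get(sku, 0) - event["qty"]
    return state

def replay(log: list[dict]) -> dict:
    """Rebuild the projection from scratch by replaying the full log."""
    state: dict = {}
    for event in log:
        state = apply_event(state, event)
    return state

log = [
    {"type": "received", "sku": "T480", "qty": 10},
    {"type": "shipped", "sku": "T480", "qty": 3},
]
print(replay(log))  # {'T480': 7}
```

Because the log is the source of truth, any projection can be dropped and rebuilt at will - which is exactly what made recovery from replayed logs feasible.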

Startup Incubation (2000)

Like so many others, I had been bitten by "the Internet bug" and went to work at a shop that helped new businesses manage their presence online. While these were still project-based assignments, there was the added incentive of stock option grants from each company where you were "embedded" as a contributor. The structure of those early deals was different than it is now, but the value proposition is the same: if you delivered in the early stages, you were rewarded in the longer term. It also gave the individual contributor a chance to experience a wide variety of technologies and observe many businesses as they tried to make it past seed funding and ramp toward profitability. It was a crucible of business development.

As happens with small companies, the role usually starts with a simple mandate. But then business demands impose reality on those plans - and then comes the big pivot. It's after that pivot that an early-stage company's fate is determined. One such project was an aftermarket remote car starter. The core tech was a repurposed garage door transponder, designed to be field-installed by certified mechanics. With my hardware experience at Moog, and having worked on a fleet of vehicles for my dad's business, I simply leapt in and got to work. A similar situation occurred in the early days at 1-800-Flowers, where we were expanding the database while also fielding calls from local florists to route orders. Fun times.

The most memorable project was a portable media player with an online music preference curation system. When the player was reloaded, its full user interaction history was retrieved - tracking behaviors at the playlist and track level. There was overt signaling on the device - a hardware version of the user like/dislike button. But there were also inferred metrics that were processed server-side. For example, if the user rewound to replay a track, or fast-forwarded through a selection in the first few seconds, that was recorded as a "like" or "dislike" respectively. There were other subtleties, but those simple scenarios describing implied preferential behaviors were the doorway to deeper understanding.

The end result was an adaptive, intelligent system that grew more responsive as users cumulatively interacted with songs and playlists as a cohort. It was a chance to develop my college research in reinforcement learning while also leveraging my experience with audio hardware. While the dedicated player didn't last, the server-side playlist curation processing was acquired by Yahoo and became a key component of their "Launch!" music platform.
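The cumulative adaptation can be approximated as a running preference score per track. This is a deliberate simplification of the reinforcement-learning approach, with illustrative signal weights and learning rate:

```python
# Simplified sketch of cumulative preference scoring: each inferred
# signal nudges a per-track score toward that signal's weight, so the
# system sharpens as interaction history accumulates. The weights and
# learning rate are illustrative assumptions, not the original model.

SIGNAL_WEIGHTS = {"like": 1.0, "dislike": -1.0}
LEARNING_RATE = 0.2  # how strongly each new signal moves the score

def update_score(score: float, signal: str) -> float:
    """Move the running score a fraction of the way toward the signal."""
    target = SIGNAL_WEIGHTS[signal]
    return score + LEARNING_RATE * (target - score)

score = 0.0
for signal in ["like", "like", "dislike"]:
    score = update_score(score, signal)
print(round(score, 3))  # 0.088
```

Two likes followed by a dislike leave the track mildly positive - recent signals matter, but they don't erase the accumulated history.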

Siebel Systems (2000-2001)

While Siebel is thought of as a California company, I joined their acquired office in North Carolina - an auction software company named OpenSite. The project was to re-brand and integrate the OpenSite offering into the Siebel product line as the "Interactive Selling Suite". This meant a considerable amount of code refactoring, as well as developing a body of work to help current Siebel customers with integration. For that purpose my team was tasked with standing up a sizable local data center in order to validate the various combinations of application operating systems and database types that Siebel supported.

The events of 9/11 shook the foundations of the economy and of American identity, and suddenly we found that we couldn't afford to make "the best" the enemy of "the good". One of the short-term software projects my team delivered was a framework I dubbed "mimicked objects", which helped existing customers streamline data and systems integration between extant Siebel systems and the Interactive Selling Suite. This hardware- and software-agnostic platform allowed nearly any Siebel OS/DB combination to be up and running in a matter of days, as opposed to the previously established weeks-to-months-long delivery windows.

But the most important outcome of my time at Siebel was finding my next position with TherapyEdge - just down the street - one of the most significant experiences of my career.

TherapyEdge (2001-2002)

When I was working in the Research Triangle Park there was a regular daily occurrence that was as consistent as it was mildly entertaining. A collection of tech types would be traipsing along behind a quick-footed gent talking and gesturing just slightly ahead of them - bespectacled and always in sharp business attire, except for his well-worn pair of running shoes. That was Robert Cymbalski. He led walking meetings with his managers as they worked out the finer points of a medical software device designed to make the "shifting sands" of AIDS treatment more palatable for patients and more manageable for the professionals who cared for them. A few months after my first sidewalk sighting of their meetings on the move, I joined TherapyEdge as Manager of Validation and Verification.

The name "Cymbalski" is not as widely recognized as "Moog", but the body of work is no less ubiquitous. If you have a mobile device in your pocket, or a microprocessor-driven device on your desk, it contains more than one patented technology developed by "Bob" (to his friends) Cymbalski. But at that time Bob was the VP of Software Development at TherapyEdge, having just come from a shop across the street - Red Hat - after guiding their platform through certification for use on IBM systems. And while TherapyEdge would only last another year or so as a company, Bob's approach to agile software patterns and practices as well as team recruiting and development continues to inform my approach to technology management. I was able to leverage my experience with reinforcement learning - and for the first time, sharpen skills around mathematical optimization that up to that point I had only exercised in a college course - to help harden the system and provide data and work product to regulatory authorities.

TherapyEdge leveraged a diagnostic unique to its platform: a patented virtual phenotype test. The objective was to check a given patient for strains of the virus showing resistance to treatment. Related elements of the regimen included checking medication tolerance, ARV to non-ARV interactions, therapy adherence and co-morbid conditions. As Manager of V&V I had both technical and compliance mandates. My group was responsible for the drug interaction engine, which covered the active ingredients of tens of thousands of prescription and over-the-counter medications. It was big data with a sharp point at each vertex, because a missed or miscalculated value might harm a patient. So the level of effort in ensuring the system properly managed data and outcomes was the most significant of my career up to that point.
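In spirit, the engine's core check was a pairwise sweep over active ingredients across a patient's regimen. This sketch uses made-up ingredient names and a tiny interaction table; the real engine covered tens of thousands of medications:

```python
# Illustrative pairwise interaction check over active ingredients.
# Ingredient names and the interaction table are hypothetical; the
# real engine covered tens of thousands of medications.
from itertools import combinations

INTERACTIONS = {
    frozenset({"ingredient_a", "ingredient_b"}): "severe",
    frozenset({"ingredient_a", "ingredient_c"}): "moderate",
}

def check_regimen(meds: dict[str, list[str]]) -> list[tuple[str, str, str]]:
    """Flag every interacting pair of active ingredients in a regimen."""
    flags = []
    # Compare each medication against every other medication once.
    for (med1, ings1), (med2, ings2) in combinations(meds.items(), 2):
        for a in ings1:
            for b in ings2:
                severity = INTERACTIONS.get(frozenset({a, b}))
                if severity:
                    flags.append((med1, med2, severity))
    return flags

regimen = {
    "drug_x": ["ingredient_a"],
    "drug_y": ["ingredient_b", "ingredient_c"],
}
print(check_regimen(regimen))
# [('drug_x', 'drug_y', 'severe'), ('drug_x', 'drug_y', 'moderate')]
```

The "sharp point at each vertex" lives in that table: one missing or wrong entry silently fails to flag a dangerous combination, which is why the V&V effort centered on exhaustively validating the data as much as the code.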

Bob was adept with eXtreme Programming, with a keen sense for measuring value and risk against delivery velocity. That experience tempered the more doctrinaire elements of lean software development and cascaded through how I and the other managers directed the various working groups. There were many moving parts to this solution, and coordination and context were key. Aside from technical responsibilities there were also compliance communications with the FDA toward gaining certification for use as a medical software device. There too Bob supported my extended negotiations with the FDA. Through that process our company earned the FDA's trust, and we eventually won approval as an unregulated medical software device - the first of its kind.

My time at TherapyEdge was an intense experience that still holds a constellation of lessons which I continue to rely on as part of my current team management practice.

Digex (2002-2003)

Upon leaving TherapyEdge I relocated to the Washington D.C. area and took a management position at Digex. At the time, private cloud companies of this sort referred to themselves as "application service providers" and Digex had some big customers, including the US Department of Defense.

I joined a working group that had built secure deployment systems for the rack-and-stack array of "pizza-boxes" today referred to as "blades". What was unique about the solution was the use of front-loaded secure cards to rapidly re-image servers in order to redeploy them into other parts of the data center. This could include "bare metal" level code as well as security profiling information to configure the machines for deployments "higher in the stack".

This system was unique for its time, and enabled reuse of hardware. My team was responsible for provisioning and deployment using the extensive Siebel system tailored for data center workflows. We built middleware to coordinate asset and personnel tracking between those systems, including permissions and authentication that tied the "smart cards" to individual personnel. Moving beyond initial deployment, we also worked on how to expand a client's footprint for high-utilization periods, and worked with peer working groups on business continuity to build "dark" deployments at backup sites for faster recovery. This fused my inventory management experience at Moog, data center experience at Drake and migration work at IBM with my compliance and certification experience with the IRS and FDA. In this case the security domain demanded a FIPS security profile and CIS benchmarking. It also meant an active management posture that went above and beyond the standard client e-commerce deployment.

This bare-metal posture may seem quaint or antiquated to cloud engineers today. However, a deep understanding of these systems can significantly inform the approach to high-security solutions regardless of their top-level architecture. Choices lower in the stack can kneecap ephemeral compute and "serverless" workloads, often with significant impacts on performance and security. That experience is among the reasons I proffer my early experience here. Tempering those lessons with current technology informs many of the cloud computing solutions I architect today.