One of the things that’s very much “in vogue” right now is machine learning and anything dealing with artificial intelligence and statistical algorithms in software.
In fact, if you don’t have at least one of those things in your startup pitch deck you may be deemed “out of touch” or irrelevant (this is, of course, a joke…).
What is most fun (and curious) to observe is how these perspectives have become much more commonplace in our everyday conversations when, in fact, I’ve encountered them for years (if not decades) of working in software development proper.
I suppose we can just chalk it up to one of those “technical” things that eventually hits critical mass and is adopted into our everyday lingua franca – it’s like when I hear someone say “Let’s double-click into that…” and other such professionally-colloquial phrases.
But the reality is that analytics, and especially predictive analytics for software and software engineering teams, has become a much more important part of our conversations, and building solutions in and around this particular area is slowly climbing the priority list for many businesses.
Simply put, your company, my company, all of our companies need faster and more effective software delivery systems without sacrificing quality.
And, as you and I both know, the tools that we use today are numerous (and growing) but the infrastructure to support the burgeoning industry of software tooling hasn’t quite kept up.
We all know what this is like based on the simple fact that we have to cycle through many different systems and tools every day to help us make decisions, both large and small. The number of browser tabs that we have open at any given moment is a testament to that reality.
And engineering is no longer just the responsibility of the “engineering department” – cross-functional teams and shared resources up and down the product stack make product building a true team sport. In other words, it’s everyone’s job to ship quality product on time and on budget.
In short, software delivery is more complex than it’s ever been, and it’s only getting more complex with the growing number of disparate systems that we use every day.
This is why our thoughts around engineering operations still ring true, although broadening the scope and definition of how we think about product engineering to include more than just engineering folks probably makes sense moving forward (we’re obviously continuing to iterate on our own verbiage as we harden our product and test our own hypotheses).
In addition, predictive analytics is baked right into the solution set that we’re building, so that we can identify patterns in the data we have and produce real forecasts for accurate decision making.
It starts with collecting the right signal data (and wisely adding additional sources) and then testing that data against the assumptions that the individual, the team, and the business believe they need to know and act on.
The net result is, as we had hoped, meaningful insights that inform the team how close (or how far off) they are to delivering software products on time, along with any risks that may or may not be imminent.
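To make the idea concrete, here is one common way this kind of delivery forecasting can work: resample a team’s historical weekly throughput with a Monte Carlo simulation to estimate how many weeks the remaining backlog will take, and express the risk as percentiles. This is only an illustrative sketch of the general technique, not our actual product’s model; the function name and the sample numbers are invented for the example.

```python
import random

def forecast_completion(remaining_items, weekly_throughput_history,
                        trials=10_000, seed=42):
    """Monte Carlo delivery forecast (illustrative sketch).

    Simulates many possible futures by randomly resampling observed
    weekly throughput until the backlog is empty, then reports how
    many weeks each simulated future took.
    """
    rng = random.Random(seed)
    outcomes = []
    for _ in range(trials):
        remaining = remaining_items
        weeks = 0
        while remaining > 0:
            # Assume next week's throughput looks like some past week's.
            remaining -= rng.choice(weekly_throughput_history)
            weeks += 1
        outcomes.append(weeks)
    outcomes.sort()
    # Percentiles express delivery risk: p50 is a coin-flip estimate,
    # p85 is a more conservative "likely done by" estimate.
    return {
        "p50": outcomes[int(trials * 0.50)],
        "p85": outcomes[int(trials * 0.85)],
    }

# Hypothetical example: 60 items left, throughput observed over 7 weeks.
print(forecast_completion(60, [5, 8, 3, 7, 6, 9, 4]))
```

The useful property of this approach is that it needs no model of *why* throughput varies; it simply lets the team’s own history speak, and the gap between the p50 and p85 estimates is itself a signal of how risky the plan is.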
We may not call it “machine learning” or “artificial intelligence” quite yet, but we can definitely call it analytics with prediction baked in. And although not every product needs ML or AI, we most certainly believe that all software products and solutions need predictive analytics, especially for product engineering and product delivery.
Consequently, we’re working hard to not only create useful views of the SDLC but also adding the right context to it so that we can arrive at predictive delivery for software. Our next step is to get a few early customers into active trials and spend time with them “on the ground” to learn what their real needs are and how to solve them with our product.
Onward and upward!
Also published on Medium.