With each external cloud service you deploy, you fold that service’s unreliability into your own product’s reliability (even if it’s incredibly small). Because most vendors refuse to provide visibility into their platforms, we’re left scrambling and asking ourselves, “Is it me or them?”
There’s a lot of hype around AI, and in particular, Large Language Models (LLMs). To be blunt, a lot of that hype is just some demo bullshit that would fall over the instant anyone tried to use it for a real task that their job depends on. The reality is far less glamorous: it’s hard to build a real product backed by an LLM.
Observability isn’t novel, but we’re still grasping the full range of its impact on digital resilience: faster fixes, fewer outages, greater ROI, and greater confidence in apps’ reliability.
A lot has happened since 2022, from the rise of Generative AI to the economic slowdown and job losses that hit data practitioners in 2023. In fact, the GenAI hype hasn’t been enough to counterbalance the struggling economy: demand for data practitioners is now declining, after rising for the last 15 years.
As a founder, you are heads-down in the details of your business. You know every piece of the product and can adapt on the fly as you talk to potential customers. Your VC partner will never be as deep in the details as you are, but they have much broader context, and the ongoing way your VC builds that context can also be useful to you.
In this special Heavybit Speaker Series, Russell Smith, previously co-founder and CTO at Rainforest QA, will explore some of the core engineering management concepts that are essential for success in today's fast-paced and highly competitive engineering environment.