OpenAI Dev Day: 3 winners and 3 losers

With a slew of announcements at OpenAI’s first Dev Day, nearly a year after the launch of ChatGPT, it’s hard to say which is the bigger deal. Take your pick among these:

  • Now trained on a knowledge base through April 2023

  • Expanded multi-modal capabilities, like speech to text (and previously rolled out image and vision capabilities)

  • Massively expanded context window

  • Improved accuracy

  • Cheaper token cost for developers

  • No-code prompt-based GPT development – engineering for everyone

  • More integrations with other tech tools

  • The end of prompting: Custom versions of ChatGPT that take on certain roles and functions – need a negotiator? Yup. A creative writing coach? That, too. Just pick from the 16 pre-developed GPTs – when the GPT Store eventually launches, it will feature hundreds of custom GPTs developed by OpenAI as well as developers everywhere (who will be paid for their creations).

It was non-stop launches; this is just a partial list.
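
For developers, the most tangible of these launches show up directly in the API: a new GPT-4 Turbo preview model with the April 2023 knowledge cutoff, the much larger 128k-token context window, and lower per-token pricing. Below is a minimal sketch of what calling it looks like, assuming the OpenAI Python SDK (v1) and the gpt-4-1106-preview model identifier announced at Dev Day; the negotiation-coach prompt is purely illustrative.

```python
# Minimal sketch (not from the article): calling the GPT-4 Turbo preview model
# announced at Dev Day, assuming the OpenAI Python SDK v1 and the
# "gpt-4-1106-preview" identifier (April 2023 cutoff, 128k context window).
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

response = client.chat.completions.create(
    model="gpt-4-1106-preview",  # GPT-4 Turbo preview
    messages=[
        {"role": "system", "content": "You are a negotiation coach."},
        {"role": "user", "content": "Help me prepare to negotiate a vendor contract renewal."},
    ],
    max_tokens=500,  # caps the output; the expanded context window applies to the input
)

print(response.choices[0].message.content)
```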

There are a lot of wins here. But with any development, and particularly with the black box of AI, there are some cautionary areas and, frankly, some losers in the announcement. Here are a few of the winners and losers, in no particular order.

Some of the generative AI winners

  1. Well, everyone, really, and more specifically, the everyday everyone. Custom GPTs are a huge game-changer, making ChatGPT more accessible and reducing barriers to usage. No more needing to hone your prompting skills – these custom GPTs have done that for you.

  2. End-user enterprises of all sizes that need custom GPTs – especially now that OpenAI has developed a “copyright shield” to protect customers against copyright-infringement claims.

  3. Entrepreneurs, creatives, and anyone building something who needs low-cost, high-speed assistant support to build it out.

Some of the generative AI losers

  1. Startups built around prompt engineering. While minimal venture capital poured into companies focused exclusively on prompt engineering, plenty of businesses popped up to offer prompt libraries, prompting instruction, and custom generative AI creations – Character.ai, for example. Those companies may be having a very bad day today as their valuations and value propositions get seriously reconsidered.

  2. Similarly, anyone who was developing expertise and a career as a “prompt engineer.” While ChatGPT will still require prompting, the announcements made clear it is getting exponentially more intuitive and less in need of being talked to like a 5th grader.

  3. LLMs that don’t have a defensible moat. This was true before November 6th, but it’s even more true today. These models have value for end users because of the data they are trained on. By that measure, even OpenAI is at risk if Microsoft ever pulls the plug on access to its knowledge base.

And that's not all.

There’s a lot of breathless coverage and commentary out there about this release, but as OpenAI co-founder Sam Altman put it in his closing remarks, "What we launched today is going to look very quaint relative to what we’re busy creating for you now." OpenAI is rolling out new technologies in a “gradual, iterative deployment” – for safety reasons, he said. So what’s hiding behind the curtain? I guess we’ll find out next November.
