October 13, 2023
In this Episode:
Marcel Brown: October 13th, 1983. Ameritech Mobile Communications executive Bob Barnett makes a phone call from a car parked near Soldier Field in Chicago, officially launching the first cellular network in the United States.
Edwin Kwan: Patches have been released for two security vulnerabilities affecting the Curl data transfer library, one of which could potentially result in code execution.
Katy Craig: OpenAI, a leading AI startup, is considering venturing into the development of its own AI chips. The reverse integration move aims to reduce dependency on GPU-based hardware, which has been strained by the generative AI boom.
Shannon Lietz: This essentially means that we’re going to see AI be the beginning of the reunification of hardware and software. And ultimately, where I see cybersecurity getting built in is going to be in these mega players.
Olimpiu Pop: An analysis estimates that they would need $48 billion worth of GPU chips and another $16 billion per year in maintenance costs. That’s quite a pile of money, even for a company with a sack of gold. For this reason, and also because of the GPU chip shortage, OpenAI is considering building its own.
The Stories Behind the Cybersecurity Headlines
Edwin Kwan
Curl Patches Worst Security Flaw in Ages
Patches have been released for two security vulnerabilities affecting the Curl data transfer library, one of which could potentially result in code execution.
This is Edwin Kwan from Sydney, Australia.
Earlier this week, the maintainers of Curl announced that details of two vulnerabilities would be disclosed later in the week.
Curl is a popular open-source data transfer tool that is used widely by developers and system administrators. It also provides foundational support for many network protocols, including SSL, TLS, HTTP, and FTP. The two vulnerabilities are a high-severity, heap-based buffer overflow and a less severe cookie injection flaw.
The project founder and lead developer has described the high-risk vulnerability as probably the worst Curl security flaw in a long time, and it could potentially result in code execution. The security flaws have been patched in curl and libcurl version 8.4.0.
In the past, we have seen proof-of-concept exploits become available not long after vulnerabilities are announced, which are shortly followed by mass exploit attempts.
So make sure you keep your system safe by patching immediately.
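If you want a quick way to confirm what you are running, here is a minimal sketch in Python, assuming the pycurl binding is installed; it simply reports which libcurl version the system links against and compares it with the patched 8.4.0 release.

```python
# Minimal sketch, assuming the pycurl binding is installed: report the libcurl
# version this system links against and whether it predates the patched 8.4.0
# release. Assumes a plain numeric version string such as "8.1.2".
import pycurl

PATCHED = (8, 4, 0)  # release containing the heap-based buffer overflow and cookie injection fixes

def to_tuple(version: str) -> tuple:
    return tuple(int(part) for part in version.split("."))

libcurl_version = pycurl.version_info()[1]  # e.g. "8.1.2"
if to_tuple(libcurl_version) < PATCHED:
    print(f"libcurl {libcurl_version} predates 8.4.0 -- patch as soon as possible")
else:
    print(f"libcurl {libcurl_version} includes the 8.4.0 fixes")
```

For the command-line tool itself, running `curl --version` in a terminal gives the same answer.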
Resources
– The Hacker News: https://thehackernews.com/2023/10/two-high-risk-security-flaws-discovered.html
– The Register: https://www.theregister.com/2023/10/11/vulnerabilities_in_curl_receive_patches/
Marcel Brown
This Day, October 13-14, in Tech History
This is Marcel Brown serving up some technology history for October 13th and 14th.
October 13th, 1983. Ameritech Mobile Communications executive Bob Barnett makes a phone call from a car parked near Soldier Field in Chicago, officially launching the first cellular network in the United States.
October 14th, 1884. George Eastman receives a patent on his paper-strip photographic film. Prior to the invention of photographic film, photography was primarily done using wet plates, a process that was cumbersome, expensive, and difficult to transport. The invention of photographic film made photography much simpler and more portable.
Interestingly, adoption among the professional photographers of the time was slow, likely because they didn’t see an advantage in moving away from wet plates and studio work. Only when Eastman started to market his film and cameras to the general public did photographic film become popular. Besides the invention of photographic film, Eastman was also instrumental in advancing photographic technology in the late 1800s and making it accessible to anyone who wanted to take pictures. He would later found the Eastman Kodak Company based on the success of his photographic film and camera technology.
Before the days of digital photography, film was inseparable from the art and skillset of photography for nearly 130 years.
And that’s your technology history for this week. For more, tune in next week and visit my website, thisdayintechhistory.com.
Resources
– https://thisdayintechhistory.com/10/13
– https://thisdayintechhistory.com/10/14
Katy Craig
OpenAI Considers Making Its Own Chips
OpenAI, a leading AI startup, is considering venturing into the development of its own AI chips to combat the ongoing chip shortage for training AI models.
This is Katy Craig in San Diego, California.
The reverse integration move aims to reduce dependency on GPU-based hardware, which has been strained by the generative AI boom.
CEO Sam Altman has made acquiring more AI chips a priority for the company, exploring options such as acquiring an AI chip manufacturer or designing chips in-house. Currently, OpenAI relies on GPUs for developing models such as ChatGPT and GPT-4, but the GPU supply chain is under immense pressure. The company’s reliance on GPUs in the cloud also comes at a high cost.
Creating custom AI chips is not unprecedented, with tech giants like Google, Amazon, and Microsoft already pursuing similar endeavors.
OpenAI, backed by significant funding and nearing a billion dollars in annual revenue, has the resources for such a venture. However, the hardware business can be challenging and expensive.
It remains to be seen if the company’s investors are willing to take on the risks associated with developing custom AI chips in a highly competitive market.
This is Katy Craig. Stay safe out there.
Resources
– TechCrunch: https://techcrunch.com/2023/10/06/openai-said-to-be-considering-developing-its-own-ai-chips/
– Engadget: https://www.engadget.com/the-morning-after-chatgpt-creator-openai-might-start-making-its-own-ai-chips-111521672.html
– Reuters: https://www.reuters.com/technology/chatgpt-owner-openai-is-exploring-making-its-own-ai-chips-sources-2023-10-06/
Shannon Lietz
Hold on tight. ChatGPT’s maker OpenAI is thinking about making AI chips
All right, time to hold on tight. ChatGPT’s maker, OpenAI, is thinking about getting into the AI chip race.
Hi, this is Shannon Lietz reporting from San Diego, California.
So while it might surprise many of you to see OpenAI talking about hardware, it really doesn’t surprise me at all. And here’s why. During the public cloud boom, we’ve seen many providers actually talking about having to do hardware manufacturing to keep up with their demand for these new software capabilities.
And while we thought the world was going to be eaten by software, I actually think we’re going to see AI eat the world, at the mercy of the full stack. Ultimately, it’s the full-stack providers that are going to have their way with the world, and I think our future bets are going to be made on companies that unify software and hardware manufacturing.
You know, when the public cloud boom took hold, it was really interesting to see. At the time, we had lots of chip manufacturers, lots of server manufacturers, and lots of separation between businesses, but consolidation seems to be the key to the future.
With such shortfalls in AI chips and significant demand, the market is ready to generate new mega players. In fact, I predict that AI is actually going to be one of the competition makers.
OpenAI and its competitors all face the same choice: either bring hardware manufacturing in-house, or accept that their capabilities will be that much more expensive relative to the competition.
Hardware and chip manufacturing are extremely challenging, and we’ve seen businesses come and go in this space. But what’s really interesting is that for new capabilities like OpenAI’s, and other yet-to-be-seen technologies, to move to the forefront, we’ll need to see these stacks come together to deliver full benefit and achieve resilience.
And probably more importantly, think about the existing AI chip manufacturers that have had layoffs recently and have had a hard time competing for relevance against companies such as NVIDIA. Those smaller players, even while they’re laying people off, may actually become acquisition targets for the software houses that need hardware capabilities to keep up with their software capabilities and their futures.
That’s going to create a true challenge for the AI supply chain.
One of the other things to think about, as we’re all here talking about cybersecurity, is that this situation demonstrates to me the challenges that AI brings for adversaries and what they’ll face.
We all talk about the future of mounting AI-based attacks and using AI, but I think it’s going to come at an extreme cost. Even if somebody can buy cheap AI capabilities, are they really cheap enough for those attacks to deliver the impact necessary to make them worthwhile, or to bear the benefit for adversaries?
This essentially means that we’re going to see AI be the beginning of the reunification of hardware and software. And ultimately, where I see cybersecurity getting built in is going to be in these mega players, where cybersecurity capabilities will be part of the features provided by these platforms, and the companies using those capabilities will need to keep track of the features and how they get configured for true resilience.
It’s something for us all to consider as we continue to think about how we trust software for our future.
Resources
– Reuters: https://www.reuters.com/technology/chatgpt-owner-openai-is-exploring-making-its-own-ai-chips-sources-2023-10-06/
Olimpiu Pop
Is OpenAI the Next Google?
OpenAI’s ChatGPT has been a reality for almost a year now, and with its launch, the AI hysteria started. Everything ending in “.ai” is being adopted, glorified, embraced.
Another reality is the huge amount of energy that it consumes. On a monetary scale, each query costs approximately 4 cents. On a green scale, a bottle of fresh water is used for every 20 to 50 queries executed. Some consider it to be the next Google. It’ll not be a search engine, but a finder. It’ll do the work for you. But what does that actually mean? How should their infrastructure scale?
An analysis estimates that they would need $48 billion worth of GPU chips and another $16 billion per year in maintenance costs. That’s quite a pile of money, even for a company with a sack of gold. For this reason, and also because of the GPU chip shortage, OpenAI is considering building its own.
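To put those figures in perspective, here is a rough back-of-envelope sketch in Python. The 4-cent cost per query comes from the analysis mentioned above; the daily query volume is a hypothetical placeholder for illustration, not a number OpenAI has reported.

```python
# Back-of-envelope sketch of annual inference spend.
# COST_PER_QUERY is the rough figure cited above; QUERIES_PER_DAY is a
# hypothetical illustration, not reported data.
COST_PER_QUERY = 0.04           # US dollars per query, approximate
QUERIES_PER_DAY = 100_000_000   # hypothetical volume for illustration

annual_cost = COST_PER_QUERY * QUERIES_PER_DAY * 365
print(f"Roughly ${annual_cost / 1e9:.1f} billion per year in inference costs alone")
```

Even at that modest, assumed volume, the running costs land in the billions per year, which is why the hardware question matters so much.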
Some already build their own chips successfully, like Google with its Tensor Processing Units or Amazon with its Trainium and Inferentia chips. Others, like Meta, have tried and failed. What will OpenAI do? Especially as they are dependent on Microsoft’s infrastructure, a company that is also their main investor? Does it mean that they are getting a cold shoulder?
For now, there are only questions. Time will provide the answers. But one thing is clear: the world we live in is the craziest ever.
505updates.com has more points of view related to cybersecurity and open-source.
Olimpiu Pop, reporting from Transylvania, Romania.
Resources
– Euronews: https://www.euronews.com/green/2023/04/20/chatgpt-drinks-a-bottle-of-fresh-water-for-every-20-to-50-questions-we-ask-study-warns
– TechCrunch: https://techcrunch.com/2023/10/06/openai-said-to-be-considering-developing-its-own-ai-chips/
– Reuters: https://www.reuters.com/technology/chatgpt-owner-openai-is-exploring-making-its-own-ai-chips-sources-2023-10-06/