Newsletter

open source and cybersecurity news

November 3, 2023

It's 5:05, November 3, 2023. Time for your cybersecurity and open source headlines.

In this Episode:

Marcel Brown: November 3rd, 1957. The Soviet Union launches Sputnik 2, the second spacecraft launched into Earth orbit and the first to carry a living creature into orbit. Laika, the Siberian Husky dog, unfortunately survived only a few hours into the flight and died from stress and overheating.

Edwin Kwan: Who should bear the cost of an invoice scam? The victim, the company the money was meant to be sent to, or the bank? A couple tried to purchase a Mercedes-Benz from a dealership but transferred the money to hackers due to an invoice scam. Mercedes-Benz claims the scam succeeded because the customer's email had been compromised.

Olimpiu Pop: DORA metrics have become part of the software industry's arsenal of silver bullets. Follow the key metrics and all is well, right? Follow deployment frequency, time to restore service, lead time for changes, and change failure rate and you're all set. Not really. It's much more than that.

Shannon Lietz: This year, what I saw that was most remarkable in the report was the AI section. There are some interesting insights to glean from that section of the report, in particular what folks are thinking about in terms of AI contributions. The top three were quite insightful, if you ask me: analyzing data, writing code blocks or data functions, and analyzing security.

Nathen Harvey: Back in January of 2023, AI was certainly hot, but how do we assess its impact on things like software delivery performance and organizational performance? This was a thing that we as researchers really struggled with. So we asked this question: "For the primary application or service that you work on, how important is the role of AI in contributing to each of the following tasks?"

 

The Stories Behind the Cybersecurity Headlines

 

Edwin Kwan
Who Should Bear the Cost of Invoice Scam?

Edwin Kwan, Contributing Journalist, It's 5:05 Podcast

Who should bear the cost of an invoice scam? The victim, the company the money was meant to be sent to, or the bank?

This is Edwin Kwan from Sydney, Australia.

Invoice scamming is on the rise in Australia. Between January and September this year, over 28,000 false billing scams were reported to the Australian Competition and Consumer Commission's Scamwatch, with losses of approximately $23 million. This is a 52% increase compared to the same period last year.

A couple tried to purchase a Mercedes-Benz from a dealership but transferred the money to hackers due to an invoice scam. The attackers had intercepted the invoice from Mercedes-Benz, altered the attachment to change the bank details, and sent it from a disguised email address.

The couple then made four bank transfers over a few days to the attackers' account and emailed receipts of the transfers to Mercedes-Benz. It was only five days after the last transfer that a Mercedes employee alerted the couple that the money had not been received.

The couple is now suing the dealership over the missing funds. They are arguing that if they had been told the first transfer was to an incorrect account, they would have taken steps to recover the payment. By the time they contacted the bank into which the funds were wrongfully transferred, the funds had already been withdrawn by the hackers.

Mercedes-Benz is claiming that the invoice scam was possible because the customer's email had been compromised, that the customers did not take steps to verify the bank account details were correct before making the transfer, and that they also failed to take reasonable steps to confirm the money had been received.

The company wants the couple to pay again for their new car. The case is still ongoing.

Resources
– The Age: https://www.theage.com.au/national/victoria/they-spent-139-000-on-a-new-car-but-got-scammed-mercedes-wants-them-to-pay-again-20231026-p5efbe.html
– The Age: https://www.theage.com.au/national/victoria/they-thought-they-were-sending-139-000-for-a-new-luxury-suv-instead-it-went-to-hackers-20230908-p5e341.html

 

Marcel Brown
This Day, November 3, in Tech History

Marcel Brown, Contributing Journalist, It's 5:05 Podcast

This is Marcel Brown bringing you some technology history for November 3rd and 4th.

November 3rd, 1957. The Soviet Union launches Sputnik 2, the second spacecraft launched into Earth orbit and the first to carry a living creature into orbit. Laika, the Siberian Husky dog, unfortunately survived only a few hours into the flight and died from stress and overheating. The Soviets had planned to euthanize Laika regardless, as Sputnik 2 did not have de-orbiting or re-entry technology designed into the spacecraft.

November 4th, 1952. As part of a publicity stunt to help boost sales, Remington Rand collaborates with CBS to have its UNIVAC computer predict the results of the 1952 US presidential election between Dwight Eisenhower and Adlai Stevenson live on air. Because pre-election polls had the race very close, the publicity surrounding a computer predicting the winner generated a lot of popular interest. UNIVAC correctly predicted a landslide victory for Eisenhower early in the evening, after only 3 million votes had been returned and entered into the system. However, because the prediction was so different from the expected result, the decision was made to hold it back. It appears that both Remington Rand and CBS feared the computer was incorrect and didn't want to take the risk. It was only late in the broadcast, when continuing returns seemed to indicate that UNIVAC was in fact correct, that CBS announced the landslide prediction from the computer, and the fact that it had made that prediction hours earlier.

In the end, UNIVAC had come within 3.5 percent of the popular vote and within four electoral votes, having predicted 100-to-1 odds of an Eisenhower victory. The publicity stunt worked, as UNIVAC became relatively famous. For a time, people started calling all computers UNIVACs. It was featured on the cover of a Superman comic book and in a Looney Tunes cartoon. By the next presidential election, four years later, all three major networks were using computers to predict the results.

November 4th, 1982. Compaq announces the Compaq Portable PC, one of the early portable computer designs and, more significantly, the first successful IBM-compatible PC clone. Compaq eventually succeeded where other similar companies failed because they took considerable care in creating their product on two fronts.

First, they created the first 100 percent IBM-compatible BIOS, the only proprietary component of the IBM PC, spending $1 million to reverse engineer the IBM BIOS using clean room techniques, which also allowed them to avoid copyright infringement charges. Second, they were legally and financially prepared for the inevitable lawsuit IBM would bring against them, which was dismissed as expected.

By proving that a clean room, reverse-engineered BIOS could create 100 percent IBM-compatible computers and withstand legal challenges from IBM, Compaq paved the way for the flood of IBM-compatible clones that would begin in the mid-1980s. This was the opening of the Pandora's box that led to IBM losing control of the platform and the emergence of Microsoft and Intel as the dominant technology companies of the PC era.

Even though IBM lost control of the platform they created, the weight of the IBM name, combined with the eventual low cost of the IBM-compatible platform, crushed nearly all other competing personal computer platforms of the era.

That’s your technology history for this week. For more, tune in next week and visit my website, ThisDayInTechHistory.com.

Resources
https://thisdayintechhistory.com/11/03

 

Introduction to Point of View Friday

This is Mark Miller, Executive Producer of 5:05. Today is Point of View Friday, where we end each week with multiple perspectives on one of the issues of the day. This week, Olimpiu Pop and Shannon Lietz give us their opinions on the recent DORA report, released by the team at Google.

We'll start our coverage with one of the report's authors, Nathen Harvey, explaining how and why AI has significant exposure in this year's report: "One of the things that we do each year, towards the beginning of the year, January-ish, is come up with questions, things that we know, or we suspect, are important to the industry. So of course, AI is one of those things that came up."

 

Nathen Harvey
Insights on AI in the DORA Report

Nathen Harvey, DORA Report Contributor, It's 5:05

Nathen Harvey: Back in January of 2023, AI was certainly hot, but how do we assess its impact on things like software delivery performance and organizational performance? This was a thing that we as researchers really struggled with. So we asked this question: "For the primary application or service that you work on, how important is the role of AI in contributing to each of the following tasks?" And as you can see in the report there, we list about 20 different tasks, one of which is analyzing security.

What we see is that slightly more than 60 percent of our respondents say it’s at least somewhat important, with just over 30 percent saying that it’s extremely important.

What does this really tell us? Well, another thing we have to remember is that we closed data collection in July. And Mark, you and I both know that from July until, I don't know, what is it, it's the beginning of November already. How does that happen? Things have changed a lot. AI is so fast-paced right now.

It's interesting to understand that bit of it, like how we collect the data and how we report on it. This is truly a snapshot.

The other thing that's important to know is, what did we actually find? We see some number of people saying it's important. But we also find that for the teams that are using it, there is some moderate improvement in the well-being of the engineers on those teams.

This makes a lot of sense to me, just thinking about it. I'm an engineer, I want to play with the new stuff. AI is the new hotness. If I can get my hands on that, I'm going to be more excited about the work that I'm doing, maybe more engaged with it.

Only 1 percent of our survey respondents identified themselves as being in information security. So I think that we're only seeing a brief glimmer of how this is impacting security professionals in particular, and sort of where they're using it, where they're headed.

Where should you use it? Where can you start experimenting? The truth is, what we also see in our data is that the teams using AI aren't actually seeing any real positive impact on software delivery performance or organizational performance. It's just too early. Maybe that's a good place for us to start experimenting and build that trust and confidence in the tooling.

Mark Miller: Some people internally start playing with a little project, obvious things that this new technology can do. Is the ability to analyze for vulnerabilities, analyze systems, analyze infrastructures, is that the first game we’re going to play with this?

Nathen Harvey: That’s a good question. I think it really is context specific.

One of the other things that we found in our data was that teams with faster code reviews have a 50 percent better software delivery performance. If you’re working on a team where code reviews are taking some time, maybe that’s your bottleneck. And that’s a better place to start with some AI experiments.

The approach of DORA is to help you identify where your bottleneck is in achieving the outcomes that you want and to run an experiment at that bottleneck to try to alleviate it. And of course, once you alleviate that bottleneck, don't worry, there's another bottleneck behind it.

Get into that mindset and that practice, truly, of continuous improvement. That's what DORA recommends. That's how we approach improving.

Mark Miller: Thank you, Nathen. As always, wonderful report. Thanks for all the stuff you guys are doing.

Nathen Harvey: Absolutely. Thanks so much, Mark. It’s good to see you.

Resources
– Google: https://cloud.google.com/blog/products/devops-sre/announcing-the-2023-state-of-devops-report
– DORA Report: https://cloud.google.com/devops/state-of-devops/

 

Shannon Lietz
Security in the DORA Report

Shannon Lietz, Contributing Journalist

Mark Miller: Shannon, we're looking now at the State of DevOps report, the old DORA-style report. What is one thing that stood out for you from this report?

Shannon Lietz: This year, what I saw that was most remarkable in the report was the AI section. There’s some interesting insights to glean from that section of the report. In particular, what folks are thinking about in terms of AI contributions.

The top three were quite insightful, if you ask me: analyzing data, writing code blocks or data functions, and analyzing security. Coming from that security practitioner side, I'm glad to see it there. That's awesome. But it also suggests to me that more of the security guidance is being moved, if you will, into AI. And the question is, how ready is it to be adversarial-aware and able to actually give the right guidance to somebody who's looking for that?

There's opportunity there for sure. The fact that the respondents are really looking for that capability tells me it's going to help with knocking down the language barrier between dev and security, so that could be interesting as well.

This is a key part of this report this year.

Mark Miller: It was interesting to me that the general outcomes have been consistent year over year. Not that much has changed in the foundations of the report. But as you say, the AI aspect of this is brand new to the report, as it should be.

As we're looking at the chart itself here, is there any one of these, I'm seeing 20 data points that they're looking at, that stood out for you, that said this is something people should be concentrating on?

Shannon Lietz: Yeah, honestly, I was excited to see many of these. I always am curious about artificial intelligence and what people might think about doing with it. In particular, I know it does do very well for analyzing logs and monitoring, and that’s square within the report.

Another one to pick up on, Mark, is writing documentation. I think that's a fabulous use of AI, particularly GenAI. How do you actually get your code to be well documented, finally, and make it accessible? This could be an absolutely perfect way to use it.

One of the things that's a little disappointing about this report is the fact that there are only three mentions of security within it. I feel like that's still a little bit of a lag when it comes to building safer software sooner. We need to really focus on how we can make security easier, faster, and more delightful.

Mark Miller: I'm going to take a screen capture of this and put it into the transcript of the recording so that people can look at it and form their own opinions. There will be a direct link to the report itself. As always, thanks, Shannon, and we'll catch up with you next week.

Shannon Lietz: Thanks so much.

Resources
– Google: https://cloud.google.com/blog/products/devops-sre/announcing-the-2023-state-of-devops-report
– DORA Report: https://cloud.google.com/devops/state-of-devops/

 

Olimpiu Pop
DORA Metrics: State of DevOps Report

Olimpiu Pop, Contributing Journalist

Ever since I can remember, I have been a productivity addict of some sort, always trying to optimize, aiming to get to the best possible flow. Since stepping into the technology management space, this translated into fluid flows, optimal work for the teams, and semi-automated processes. But always, the focus was on the people.

I can't even imagine hiding behind processes or blaming the process for a missed delivery or any other miss. Or just imposing some tools on somebody, just because I say so.

DORA metrics have become part of the software industry's arsenal of silver bullets. Follow the key metrics and all is well, right? Follow deployment frequency, time to restore service, lead time for changes, and change failure rate, and you're all set.

Not really. It’s much more than that. People will optimize for the metrics you follow. Is velocity a mark of a successful team? We will estimate more. Done. Better product, better team!
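
For readers who want to see what "following the key metrics" looks like concretely, here is a minimal sketch, in Python, of how a team might compute the four DORA key metrics from its own deployment records. The data structures, field names, and sample values are hypothetical illustrations, not something taken from the report itself.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta
from typing import Optional

@dataclass
class Deployment:
    committed_at: datetime                   # when the change was committed
    deployed_at: datetime                    # when it reached production
    caused_failure: bool = False             # did it degrade service in production?
    restored_at: Optional[datetime] = None   # when service was restored, if it failed

def dora_metrics(deployments: list[Deployment], period_days: int) -> dict:
    """Compute the four DORA key metrics over a reporting period."""
    n = len(deployments)
    failures = [d for d in deployments if d.caused_failure]
    lead_times = [d.deployed_at - d.committed_at for d in deployments]
    restore_times = [d.restored_at - d.deployed_at for d in failures if d.restored_at]
    return {
        "deployment_frequency_per_day": n / period_days,
        "lead_time_for_changes": sum(lead_times, timedelta()) / n if n else None,
        "change_failure_rate": len(failures) / n if n else None,
        "time_to_restore_service": (
            sum(restore_times, timedelta()) / len(restore_times) if restore_times else None
        ),
    }

# Example usage with two hypothetical deployments over a 30-day period.
now = datetime(2023, 11, 3)
history = [
    Deployment(committed_at=now - timedelta(days=10, hours=4),
               deployed_at=now - timedelta(days=10)),
    Deployment(committed_at=now - timedelta(days=2, hours=1),
               deployed_at=now - timedelta(days=2),
               caused_failure=True,
               restored_at=now - timedelta(days=2) + timedelta(hours=3)),
]
print(dora_metrics(history, period_days=30))
```

Numbers like these are only a starting point; as the segment argues, they matter when the team itself uses them to improve, not when they are turned into targets.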

The key takeaways from the DORA report underline what I already knew and felt.

– Agility is the way. Plan. Execute. Measure, retrospect, adapt, and repeat. Always evolve.

– A safe culture will empower individuals to open their wings and fly.

– Adapt the process to the need, not the need to the process.

In plain terms, use something for as long as it works. When it stops delivering the expected value, adapt it or remove it. Metrics? Yes, they are useful, but make them count for the teams themselves; don't use them to compare teams against each other, you'll get nowhere.

There is a whole world of wisdom hidden in the report. Listen to the different takes on it at 505updates.com.

Olimpiu Pop, reporting from Transylvania, Romania.

Resources
– Google: https://cloud.google.com/blog/products/devops-sre/announcing-the-2023-state-of-devops-report
– DORA Report: https://cloud.google.com/devops/state-of-devops/
