Cyber criminals continue to innovate

Most people who work in an office, with a work email account, probably consider themselves able to spot a phishing email. In fact, most of us are either too smart, or too suspicious, to fall for an email asking us to verify our bank details or ‘unlock’ an account to fix a problem. It is perhaps because of this that cyber criminals have stepped up their game.

Leading experts now believe that organisations are not innovating as quickly as cyber criminals, which leaves them open to attack. Continuous investment in cyber security is essential if businesses hope to defend themselves against criminals who are constantly refining their tactics.

This isn’t to say that there’s no great work being done on the side of good. There is some groundbreaking innovation happening, with cyber security companies putting in time and effort and working with customers to identify real cases and solutions. Cyber criminals, however, have huge amounts of time and money at their disposal, as well as impressive technical ability.

What new tricks are cyber criminals using?

Some of you might be familiar with the term ‘deepfakes’. For those who aren’t, deepfakes are videos created using ‘deep-learning’ AI to make it look as though a person did or said something they never actually did. In theory, if you had a video of one person in a fight, for example, you could replace the aggressor’s face with that of a public figure or celebrity to make it appear as though it was them. If you’d like to learn a little more about deepfakes, there’s a YouTube video here that explains it pretty well.

Although there are no high-profile cases of this being used yet, these kinds of videos have the potential to damage someone’s reputation or even be used for extortion. Cyber criminals have, however, already used similar AI to scam a British energy firm out of hundreds of thousands of pounds.

The con artists used commercially available voice-generating AI to mimic the CEO’s voice and request an urgent bank transfer to a Hungarian supplier. The money, however, never arrived in Hungary; it was instead moved on to Mexico and various other locations. If you are ever called about a bank transfer, ask to call back, and do so using the number in your address book rather than hitting redial.

It is predicted that we will see a huge rise in these kinds of scams in the coming years. While a convincing deepfake video can currently require up to 17 hours of footage of the target, mimicking a voice can be done with far less.
