Researchers at application security company Jscrambler have just published a cautionary tale about supply chain attacks…
…that is also a powerful reminder of just how long attack chains can be.
Sadly, that’s long merely in terms of time, not long in terms of technical complexity or the number of links in the chain itself.
Eight years ago…
The high-level version of the story published by the researchers is simply told, and it goes like this:
You can see where this story is going.
Hapless former Cockpit users who had apparently not checked their logs properly (or perhaps even at all) since late 2014 therefore failed to notice that they were still trying to load code that wasn’t working.
We’re guessing that those businesses did notice they weren’t getting any more analytics data from Cockpit, but that because they were expecting the data feed to stop working, they assumed that the end of the data was the end of their cybersecurity concerns relating to the service and its domain name.
Injection and surveillance
According to Jscrambler, the crooks who took over the defunct domain, and who thus acquired a direct route to insert malware into any web pages that still trusted and used that now-revived domain…
This enabled two major types of attack:
- Skim data from existing web forms. Anything typed into textarea fields (such as you would expect in a typical web form) was extracted, encoded and exfiltrated to a range of “call home” servers operated by the attackers.
- Insert additional fields into web forms on selected web pages. This trick, known as HTML injection, means that crooks can subvert pages that users already trust. Users can believably be lured into entering personal data that those pages wouldn’t normally ask for, such as passwords, birthdays, phone numbers or payment card details.
With this pair of attack vectors at their disposal, the crooks could not only siphon off whatever you typed into a web form on a compromised web page, but also go after additional personally identifiable information (PII) that they wouldn’t normally be able to steal.
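One defensive takeaway from the HTML injection trick is that the set of form fields in a page you serve is something you can verify. The sketch below is a hypothetical illustration (the page markup and expected-field list are invented for the example): it parses a page with Python’s standard-library HTML parser and flags any form fields that aren’t on an allowlist, which is roughly how a sudden, unwanted “card number” box on a login page might be caught.

```python
# Hypothetical sketch: detect HTML injection by comparing the form fields
# actually present in a served page against the fields you expect to be there.
from html.parser import HTMLParser


class FormFieldCollector(HTMLParser):
    """Collects the name attributes of <input> and <textarea> elements."""

    def __init__(self):
        super().__init__()
        self.fields = []

    def handle_starttag(self, tag, attrs):
        if tag in ("input", "textarea"):
            name = dict(attrs).get("name")
            if name:
                self.fields.append(name)


def unexpected_fields(page_html, expected):
    """Return field names present in the page but not in the allowlist."""
    collector = FormFieldCollector()
    collector.feed(page_html)
    return [f for f in collector.fields if f not in expected]


# A login form that has silently grown a card-number field is a red flag:
page = '<form><input name="user"><input name="pass"><input name="cardnum"></form>'
print(unexpected_fields(page, {"user", "pass"}))  # ['cardnum']
```

In practice you would run a check like this against the page as actually delivered to a browser (injected fields only exist after the third-party script runs), but the comparison logic is the same.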
This sort of tailored response, which is easy to implement by looking at the Referer: header sent in the HTTP requests generated by your browser, also makes it hard for cybersecurity researchers to determine the full range of attack “payloads” that the criminals have up their sleeves.
After all, unless you know in advance the precise list of servers and URLs that the crooks are looking out for on their servers, you won’t be able to generate HTTP requests that shake loose all likely variants of the attack that the criminals have programmed into the system.
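To make the probing problem concrete, here is a minimal sketch of how a researcher might hunt for Referer-based payload tailoring. All the URLs and domain names below are made up for illustration; the point is simply that each probe requests the same script while claiming to come from a different page, so the responses can be compared.

```python
# Hypothetical sketch: request the same third-party script repeatedly,
# varying only the Referer header, to look for per-site payload variants.
import urllib.request

# Pages an attacker's server might plausibly key on (invented examples):
CANDIDATE_REFERERS = [
    "https://shop.example/checkout",
    "https://shop.example/login",
    "https://news.example/article",
]


def build_probe(url, referer):
    """Build a request that claims to originate from the given page."""
    req = urllib.request.Request(url)
    req.add_header("Referer", referer)           # what the server may key on
    req.add_header("User-Agent", "Mozilla/5.0")  # look like a real browser
    return req


probes = [build_probe("https://web-cockpit.example/script.js", r)
          for r in CANDIDATE_REFERERS]
# Fetching each probe and diffing the returned JavaScript would reveal
# per-site variants -- but only for the Referer values you thought to guess.
```

The catch, as the article notes, is the guessing: a Referer value you never try is a payload variant you never see.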
In case you’re wondering, the Referer: header, a mis-spelling of the English word “referrer”, gets its name from a typographical mistake in the original internet standards document.
What to do?
- Review your web-based supply chain links. Anywhere that you rely on URLs provided by other people for data or code that you serve up as if it were your own, you need to check regularly and frequently that you can still trust them. Don’t wait for your own customers to complain that “something looks broken”. Firstly, that means you’re relying entirely on reactive cybersecurity measures. Secondly, there may not be anything obvious for customers themselves to notice and report.
- Check your logs. If your own website makes use of embedded HTTP links that are no longer working, then something is clearly wrong. Either you shouldn’t have been trusting that link before, because it was the wrong one, or you shouldn’t be trusting it any more, because it’s not behaving as it used to. If you aren’t going to check your logs, why bother collecting them in the first place?
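The log-checking advice above can be partly automated. The sketch below is a simple illustration, not a product: it scans lines (from a log file, or equally from your own HTML templates) for references to domains you once embedded but no longer trust. The sample log format is an assumption for the example; only the domain itself comes from the story.

```python
# A minimal sketch of the "check your logs" advice: flag any line that
# still references a domain you no longer trust. The log format here is
# invented for illustration.
import re

# Domains you once embedded but shouldn't trust any more:
SUSPECT_DOMAINS = {"web-cockpit.jp"}


def suspicious_lines(lines):
    """Return the lines that mention any suspect domain."""
    hits = []
    for line in lines:
        for domain in SUSPECT_DOMAINS:
            if re.search(re.escape(domain), line):
                hits.append(line)
                break
    return hits


sample_log = [
    '203.0.113.5 - - [21/Nov/2022] "GET /index.html HTTP/1.1" 200 "-"',
    '203.0.113.5 - - [21/Nov/2022] "GET /js/app.js HTTP/1.1" 200 '
    '"https://web-cockpit.jp/loader.js"',
]
for line in suspicious_lines(sample_log):
    print("CHECK:", line)
```

Even a crude scan like this would have turned up the dormant references that sat unnoticed for eight years in this story.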
…so, please, don’t be that person!
The offending domain (web-cockpit DOT jp, if you want to search your own logs) is blocked by Sophos as SEC_MALWARE_REPOSITORY. This denotes that the domain is known not only to be associated with malware-related cybercriminality, but also to be involved in actively serving up malware code.
2 comments on “Credit card skimming – the long and winding road of supply chain failure”
> Review your web-based supply chain links.
Wish Epic Software had done this before shipping the Meta tracking bug to all their customers.
I am convinced that there is a new generation of developers who think development is about finding code fragments anywhere on the internet and uncritically pasting them into their work-product.
Who says they are “uncritical”? For all you know they might be highly critical when copying-and-pasting…
X. “I saved days by finding this algorithm on GutHib!”
Y. “Is the code any good?”
X. “No, it’s terrible, and I had to stand backwards on a chair and sing ‘Bohemian Rhapsody’ in a squeaky voice to get it to compile.”
Y. “Oh, well, at least you won’t need to compile it again now it’s shipped.”
/s (for the avoidance of doubt).