Recently, there was a poorly written article in The New Straits Times that suggested the Malaysian Police would know if you were watching porn online. Let me cut to the chase: the article is shit. The software in question, aptly…
The domain cost me a whopping $18.00/yr on AWS, and involved a couple of hours of registration and migration.
So I felt that while migrating domains, I might as well implement proper security headers. Security headers are HTTP headers that instruct the browser to deny or allow certain things; the idea being that the more information the site tells the browser about itself, the less susceptible it is to attack.
I was shocked to find out that Gov-TLS-Audit had no security headers at all! I assumed AWS (specifically CloudFront) would take care of ‘some’ HTTP headers for me; I was mistaken. CloudFront takes care of the TLS implementation, but does not implement any security headers for you, not even strict-transport-security, which is TLS related.
So unsurprisingly, a newly created CloudFront distribution, using the reference AWS implementation, fails miserably when it comes to security headers.
I guess the reason is that HTTP headers are very site-dependent. Had CloudFront done it automatically, it might have broken a majority of sites. And implementing headers is one thing, but fixing the underlying problems they expose is another, much bigger, problem.
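One common way to bolt these headers on yourself is a small Lambda@Edge function on the CloudFront viewer-response trigger. This is a sketch of that pattern, not the exact setup Gov-TLS-Audit uses, and the header values are illustrative defaults you'd tune per site:

```python
# Sketch of a Lambda@Edge "viewer response" handler that injects
# common security headers into every CloudFront response.
# Header values here are illustrative; tune them per site.

SECURITY_HEADERS = {
    "Strict-Transport-Security": "max-age=31536000; includeSubDomains",
    "X-Content-Type-Options": "nosniff",
    "X-Frame-Options": "DENY",
    "Referrer-Policy": "no-referrer-when-downgrade",
    "Content-Security-Policy": "default-src 'self'",
}

def handler(event, context):
    # CloudFront hands us the response inside the event record;
    # headers are keyed by lowercase name, each a list of key/value pairs.
    response = event["Records"][0]["cf"]["response"]
    headers = response["headers"]
    for name, value in SECURITY_HEADERS.items():
        headers[name.lower()] = [{"key": name, "value": value}]
    return response
```

The nice thing about doing it at the edge is that the origin site doesn't need to change at all; the headers get stamped on every response on the way out.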
But what security headers to implement?
Last week, MyNic suffered a massive outage, taking out any website that had a .my domain, including local banks like maybank2u.com.my and even government websites hosted under .my.
Here’s a great report on what happened from IANIX. I’m no DNSSEC expert, but here’s my layman’s reading of what happened:
- Up to 11-Jun, .my used a DNSKEY with key tag:25992.
- For some reason, this key went missing on 15-Jun, and was replaced with DNSKEY key tag:63366, which is still a valid SEP for .my.
- Unfortunately, the DS record on root was still pointing to key tag:25992.
- So DNSSEC started failing.
- 15 hours later, instead of correcting the error, someone tried to switch off DNSSEC by removing all the signatures (RRSIG).
- But this didn’t work, as the parent zone still had a DS entry that pointed to key tag:25992, and hence was still expecting DNSSEC to be turned on.
- 5 hours after that, they added back the missing DNSKEY key tag:25992 (oh, we found it!), but added invalid signatures for all entries; still failing.
- Only 4 hours after that did they fix it, with the proper DS entry on root for DNSKEY key tag:63366 and valid signatures.
- That’s a 24-hour outage on all .my domains.
So basically: something broke, they sat on it for 15 hours, then tried a fix that didn’t work. They tried something else 5 hours after that, which didn’t work either! And finally, after presumably a lot of praying to the Gods of the Internet and a couple of animal sacrifices, they managed to fix it after a 24-hour downtime.
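The root cause boils down to the parent’s DS record pointing at a key tag that no published DNSKEY matched. The key tag itself is just a checksum over the DNSKEY RDATA, defined in RFC 4034 Appendix B. Here’s a sketch of that calculation and the resolver-style check (the key bytes below are made up, not MyNic’s actual keys):

```python
def dnskey_key_tag(flags: int, protocol: int, algorithm: int, public_key: bytes) -> int:
    """Compute the key tag over DNSKEY RDATA, per RFC 4034 Appendix B."""
    rdata = flags.to_bytes(2, "big") + bytes([protocol, algorithm]) + public_key
    acc = 0
    for i, byte in enumerate(rdata):
        # Sum the RDATA as 16-bit words: even-indexed bytes are the high octet.
        acc += byte << 8 if i % 2 == 0 else byte
    acc += (acc >> 16) & 0xFFFF  # fold the carry back in
    return acc & 0xFFFF

def ds_matches_dnskey(ds_key_tag: int, dnskeys: list) -> bool:
    """Does the parent's DS record point at any DNSKEY actually published in the zone?"""
    return any(dnskey_key_tag(*key) == ds_key_tag for key in dnskeys)
```

If `ds_matches_dnskey` comes back False, validating resolvers treat the whole zone as bogus, which is exactly the failure every .my domain experienced.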
I defend my fellow IT practitioners a lot on this blog, but this is a difficult one. Clearly this was the work of someone who didn’t know what they were doing, who refused to ask for help and instead tried one failed fix after another, making things worse. As my good friend Mark Twain would say: it’s like a mouse trying to fix a pumpkin.
I don’t fully understand DNSSEC (it’s complicated), but then I’m not in charge of a TLD. It’s unacceptable that someone could screw up this badly, and for that screw up to impact so many people, and all we got was a lousy press release.
The point is, it shouldn’t take 24 hours to resolve a DNSSEC issue, especially when it’s such a critical piece of infrastructure. I’ve gone through reports of similar DNSSEC failures, and in most cases recovery takes 1-5 hours. The nasa.gov zone had a similar issue that was resolved in an hour; very rarely do we see a 24-hour outage, so what gives?
I look forward to an official report from MyNIC to our spanking new communications ministry, and for that to be shared with the public.
At around 11pm last Friday, I got a query from Zurairi at The Malay Mail, asking for a second opinion on a strange email the newsdesk received from an ‘anonymous source’. The email was a regular vulnerability disclosure, but one that was full of details, with an enormous amount of data attached.
This wasn’t a two-liner tweet, this was a detailed email with outlined sub-sections. It covered why they were sending the email, what the vulnerable system was, how to exploit the vulnerability and finally (and most importantly!) a link to a Google Drive folder containing Gigabytes of data.
The email pointed to a Ministry of Education site called SAPSNKRA, used by parents to check on their children’s exam results. Quick Google searches revealed the site had security issues in the past, including one blog site advising parents to proceed past the invalid certificate warning in Firefox. But let’s get back to the breach.
My first reaction was to test the vulnerability, and sure enough, the site was vulnerable to SQL injection, in exactly the manner specified by the email. So far the email looked legitimate.
Next, I verified the data in the Google Drive folder, by downloading the gigabytes of text files, and checking the IC Numbers of children I knew.
I further cross-checked a few parents’ IC numbers against the electoral roll. Most children have some indicator of their father’s name embedded in their own, either through a surname or the full name of the father after the bin, binti, a/l or a/p. By keying in the father’s IC number, and cross-referencing the father’s name against what was in the breach, it was easy to see that the data was the real deal.
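The cross-referencing logic is simple enough to sketch. Assuming hypothetical names (none of these are from the breach), the check is roughly: split the child’s name at the patronymic marker and see whether what follows appears in the father’s name:

```python
import re

# Malay/Indian patronymic markers: bin/binti (son/daughter of),
# a/l (anak lelaki), a/p (anak perempuan).
PATRONYM = re.compile(r"\b(?:bin|binti|a/l|a/p)\b", re.IGNORECASE)

def likely_child_of(child_name: str, father_name: str) -> bool:
    """Rough heuristic: does the part of the child's name after the
    patronymic marker appear in the father's name?"""
    parts = PATRONYM.split(child_name, maxsplit=1)
    if len(parts) < 2:
        return False  # no patronymic marker in the child's name
    fathers_name_part = parts[1].strip().lower()
    return bool(fathers_name_part) and fathers_name_part in father_name.lower()
```

It’s a heuristic, not proof, but run across a handful of known families it was more than enough to confirm the data was genuine.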
So I called back Zurairi and confirmed to him that the data was real, and that the site should be taken offline. I also contacted a buddy of mine over at MKN, to see if he could help, and Zurairi had independently raised a ticket with MyCert (a ticket??!!) and tried to contact the Education Minister via his aide.
Obviously neither Zurairi nor myself, nor any of the other journalists I kept in touch with, could report on the story. The site was still vulnerable, and we didn’t want someone else breaching it.
The next morning, I emailed the anonymous source and asked them to take down the Google Drive, explaining that the breach was confirmed, and people were working to take down the site. Hence there was no reason to continue exposing all of that personal information on the internet.
They agreed and wiped the drive clean, and shortly after I got confirmation that the SAPSNKRA website had been taken down. So with the site down and the Google Drive wiped clean, it seemed the worst was behind us.
Danger averted…at least for now.
A couple of months back I started GovTLSAudit, a simple service that would scan .gov.my domains and report on their implementation of TLS. But the service seems to have benefits above and beyond that, specifically around having a list of government sites that we can cross-check against other intel sources like Shodan (which we already do daily) and VirusTotal.
So here are 3 times GovTLSAudit helped secure government websites.
That time Yayasan Islam Terengganu was used as a phishing website
I used VirusTotal’s search engine to see if they had extra .gov.my domains to scan, and found a few rather suspicious-looking URLs, including:
This was an obvious phishing campaign being run out of a .gov.my domain. Digging further, I found that the IP address the malicious URLs resolved to was local, and belonged to Exabytes. And while the root page was a bare Apache directory, buried deep within the site’s sub-directories was a redirect that pointed to a Russian IP.
I took to Twitter to report my findings (I kinda like Twitter for this), and the very next day Exabytes came back with a follow-up that they were fixing it. That’s good, because having a phishing campaign run on .gov.my infrastructure isn’t exactly what you’d like.
There’s a lot more detail in the tweet about how I investigated this; click here to follow the thread. A warning though: I regularly delete my old tweets, so get it while it’s there :).
If you’ve come here from a link on Twitter, you’d see that the address bar still says login.astro.com.my, but the site is rendering this page from my blog. If not, click this link to see what I mean. You’ll…
In 2015, I was invited to a variety program on Astro to talk about cybersecurity. This was just after Malaysian Airlines (MAS) had their DNS hijacked, but I was specifically told by the producer that I could NOT talk about…
Last month, I embarked on a new project called GovTLS Audit, a simple(ish) program that would scan 1000+ government websites to check their TLS implementation. The code goes through a list of hostnames and scans each host for TLS implementation details like redirection properties, certificate details, and HTTP headers, even stitching together Shodan results into a single comprehensive data record. That record would be inserted into DynamoDB, and exposed via a REST endpoint.
Initially I ran the scans manually on Sunday nights, then uploaded the output files to S3 buckets, and ran the scripts to insert them into the DB.
But 2 weeks ago, I decided to automate the process, and the architecture of this simple project is now complete(ish!). Nothing is ever complete, but this is a good checkpoint for me to begin documenting the architecture of GovTLS Audit (sometimes called siteaudit), and to share it.
What is GovTLS Audit
First let’s talk about what GovTLS Audit is: a Python script that scans a list of sites on the internet, and stores the results in 3 different files: a CSV file (for human consumption), a JSONL file (for insertion into DynamoDB) and a JSON file (for other programmatic access).
A different script then reads the JSONL file and loads each row into the database (DynamoDB), then uploads the 3 files as one zip to an S3 bucket.
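To make the three file formats concrete, here’s a minimal sketch of how one list of scan records could be fanned out into the outputs described above. The field names are illustrative, not the actual GovTLS Audit schema:

```python
import csv
import json

def write_outputs(records, prefix):
    """Write scan records as CSV (human), JSONL (DB load) and JSON (programmatic)."""
    fieldnames = sorted(records[0].keys())

    # CSV: one row per host, for eyeballing in a spreadsheet.
    with open(f"{prefix}.csv", "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=fieldnames)
        writer.writeheader()
        writer.writerows(records)

    # JSONL: one JSON object per line, convenient for row-by-row DB inserts.
    with open(f"{prefix}.jsonl", "w") as f:
        for record in records:
            f.write(json.dumps(record) + "\n")

    # JSON: the whole scan as a single array, for programmatic access.
    with open(f"{prefix}.json", "w") as f:
        json.dump(records, f, indent=2)
```

The JSONL file earns its keep in the load step: you can stream it line by line and insert each row independently, rather than parsing one giant JSON array into memory first.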
On the ‘server-side’ there are 3 Lambda functions, all connected to an API Gateway endpoint resource:
- One that queries the latest details for a site [/siteDetails]
- One that queries the historical summaries for a site [/siteHistory]
- One that lists all scans (zip files) in the S3 bucket [/listScans]
Last week I launched a draft of the Gov.my Audit, and this week we have version 2.0.
Here’s what changed:
- Added More Sites. We now scan a total of 1324 government websites, up from just 1180.
- Added Shodan Results. Results now include both the open ports and the time of the Shodan scan (scary shit!)
- Added Site Title. Results now include the HTML title to give a better description of the site (hopefully!).
- Added Form Fields. If the page at the root directory has an input form, the names of the fields will appear in the results. This allows for a quick glance at which sites have forms, and (roughly!) what the forms ask for (search vs. IC numbers).
- Added Domain in the CSV. The CSV is sorted by hostname, to allow for grouping by domain names (e.g. view all sites from selangor.gov.my or perlis.gov.my)
- Added an API. Now you can query the API to get more info on a site, including the cert info and HTTP headers.
- Released the Serverless.yml files for you to build the API yourself as well 🙂
All in all, it’s a pretty bad-ass project (if I do say so myself). So let’s take all that one at a time.
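The form-field scraping from the list above can be sketched with the standard library’s HTML parser. This is a simplified version of the idea, not the project’s actual code:

```python
from html.parser import HTMLParser

class FormFieldCollector(HTMLParser):
    """Collect the name attribute of every <input> tag on a page."""
    def __init__(self):
        super().__init__()
        self.fields = []

    def handle_starttag(self, tag, attrs):
        # attrs arrives as a list of (name, value) tuples
        if tag == "input":
            attrs = dict(attrs)
            if "name" in attrs:
                self.fields.append(attrs["name"])

def form_fields(html: str) -> list:
    """Return the input field names found in a page's HTML."""
    collector = FormFieldCollector()
    collector.feed(html)
    return collector.fields
```

Field names like `ic_no` or `search` give you that quick at-a-glance read on whether a page is a harmless search box or something collecting IC numbers.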
I keep this blog to help me think, and over the past week, the only thing I’ve been thinking about, was sayakenahack.
I’ve declined a dozen interviews, partly because I was afraid to talk about it, and partly because my thoughts weren’t in the right place. I needed time to re-group, re-think, and ponder.
This blog post is the outcome of that ‘reflective’ period.
The PR folks tell me to strike while the iron is hot, but you know — biar lambat asal selamat.
Why I started sayakenahack?
I’m one part geek and one part engineer. I see a problem and my mind races to build a solution. Building sayakenahack, while difficult, and sometimes frustrating, was super-duper fun. I don’t regret it for a moment, regardless of the sleepless nights it has caused me.
But that’s not the only reason.
I also built it to give Malaysians a chance to check whether they’ve been breached. I believe this is your right, and no one should withhold it from you. I also know that most Malaysians have no chance of ever checking the breach data themselves because they lack the necessary skills.
I know this, because 400,000 users have visited my post on “How to change your Unifi Password”.
If they need my help to change a Wifi password, they’ve got no chance of finding the hacker forums, downloading the data, fixing the corrupted zip, and then searching for their details in a file that is 10 million rows long (and no, Excel won’t fit 10 million rows).
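For the technically inclined, the search itself is trivial once you stop thinking in spreadsheets: stream the file a line at a time instead of loading 10 million rows into memory. A rough sketch, with a made-up file layout and made-up IC numbers:

```python
def find_ic(path: str, ic_number: str):
    """Stream a large dump line by line, yielding lines containing an IC number.
    Uses constant memory regardless of file size, unlike opening it in Excel."""
    with open(path, encoding="utf-8", errors="replace") as f:
        for line in f:
            if ic_number in line:
                yield line.rstrip("\n")
```

But that one-liner is exactly the gap: trivial for a programmer, impossible for the 400,000 people who needed a guide to change a Wifi password.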
So for at least 400,000 Malaysians, most of whom would have had their data leaked, there would have been zero chance of them ever finding out. ZERO!
The ‘normal’ world is highly tech-illiterate (I’ve even talked about it on BFM). Sayakenahack was my attempt to make this accessible to common folks. To deny them this right of checking their data is just wrong.
But why tell them at all if there’s nothing they can do about it? You can’t put the genie back in the lamp.