Dec 28, 2017

These are the noteworthy stories, in no particular order, that piqued my interest this week.

Privacy group accuses Hotspot Shield of snooping on web traffic


Setting aside the question of who is right here, the VPN provider or the privacy group (I must admit I am more inclined to believe the latter), the true take-away of this article is that VPN services will not entirely protect your browsing habits and certainly may not make you anonymous either.
The only thing a VPN will do is move the burden of trust from your Internet service provider to the VPN provider. Even more so because most of them have their own dedicated apps that you need to install to use their service, something your ISP does not require. This means that VPN providers have even more opportunity to spy on you than your ISP does.
So yes, the question of whether you actually trust your VPN provider is extremely relevant. Even more so for those who use VPN services because they really need to protect themselves from governments and censorship, rather than as a mere convenience or for protection on public WiFi networks.
source: ZDNet (external link)

Mozilla’s new file-transfer service isn’t perfect, but it’s drop-dead easy


Besides all the security issues with doing these kinds of operations in the browser, the generated link contains the decryption key. This simply means that anyone who knows the link will be able to download and decrypt the file.
The question is: who is sending that link to the intended recipient of the data? If the user of the service is doing that themselves, then probably insecure e-mail is used; if Mozilla is doing it, it will probably use e-mail as well, besides the fact that Mozilla then has the download link and therefore the decryption key too.
Another way for Mozilla to obtain that key is from the download request to their website. The link and associated information may end up in the web server logs, which would also reveal the decryption key to the organisation. If the data is then not really deleted, security and privacy are breached this way and nobody will know it (until it leaks, obviously).
I am not saying that Mozilla is after the data, or that it is not taking steps to make sure it cannot obtain the decryption keys either way; I only state that it is a possibility. The system is weak, but it may have some benefits over direct e-mail or other services like WeTransfer.
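To make the key-in-the-link point concrete, here is a minimal sketch of the general pattern, assuming client-side symmetric encryption with the key appended to the share link. The service URL, file identifier and helper names are purely illustrative; this is not Mozilla's actual implementation.

```python
# Illustrative sketch of the "decryption key travels inside the share link" pattern.
# NOT Mozilla Send's real code or URL scheme; it only shows why anyone who holds
# the full link can decrypt the file.
from cryptography.fernet import Fernet


def make_share_link(plaintext: bytes, file_id: str) -> tuple[str, bytes]:
    key = Fernet.generate_key()                  # symmetric key generated client-side
    ciphertext = Fernet(key).encrypt(plaintext)  # only the ciphertext would be uploaded
    # The key is carried inside the link itself (here in the URL fragment),
    # so whoever receives the complete link also receives the key.
    link = f"https://send.example/download/{file_id}#{key.decode()}"
    return link, ciphertext


def decrypt_from_link(link: str, ciphertext: bytes) -> bytes:
    key = link.split("#", 1)[1].encode()         # anyone with the link has the key
    return Fernet(key).decrypt(ciphertext)


if __name__ == "__main__":
    link, blob = make_share_link(b"confidential report", "abc123")
    print(link)
    print(decrypt_from_link(link, blob))
```

The weak point is therefore not the cryptography itself but the channel over which that link is passed around, exactly as argued above.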
source: Ars Technica (external link)

Top Security Firm May Be Leaking ‘Terabytes’ of Confidential Data From Fortune 100 Companies


So apparently, despite the warnings, a lot of their clients enabled the sharing of data with cloud-based multiscanners, probably because they were not aware enough of the risks associated with that function.
My question for my readers: wouldn't it be morally and ethically correct for such a sharing function not to be present in such security software at all? And if you want to offer it as a service, make it a paid add-on, so companies really need to think long and hard before enabling such a sharing function.
And what does this say about those large companies enabling that function? At the very least that a lot of very sensitive data, including access keys, credentials and more, is available in a relatively easily readable format, some of which should probably have been better secured inside the company itself in the first place.
source: Gizmodo (external link)

Google Prepares For Europe’s New Privacy Rules


First of all, this article mixes up the roles Google actually has: that of a data controller, where it offers services itself, and that of a data processor, where its services are used by other companies who are controllers themselves.
For that last category, which probably covers most of the Google services that are listed, Google has no right under any European law (neither the new one nor the current one) to do anything with the data for its own benefit whatsoever, not even when given permission to do so.
As for Google as a data controller, the author states that the new law is there to make companies protect the data better. That is a gross over-simplification of data protection laws in Europe and therefore incorrect, or at least massively incomplete.
Yes, Google may not sell your data, but its own usage of it, and the sheer amount of personal data it keeps for itself, are reason enough to believe that it will never be able to comply with the GDPR. It already does not comply with the current EU data protection regime, although that is covered under the blanket called the EU/US Privacy Shield.
source: mediaport.com (external link)

NotBeingPetya: UK critical infrastructure firms face huge fines for lax security


So much for Brexit: the Directive on security of network and information systems (NIS Directive) comes into force in May 2018, almost together with the GDPR, so the UK actually had to implement it. They did, and set the proposed fines for non-adherence at the same level as those in the GDPR, which means a maximum of 20 million or 4 percent of annual turnover, whichever is higher.
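For a sense of what that cap means in practice, a quick back-of-the-envelope calculation (the turnover figures below are made up purely for illustration):

```python
# Back-of-the-envelope: maximum fine as described above, i.e. the higher of
# a 20 million flat cap or 4% of annual turnover. Turnover figures are invented.
def max_fine(annual_turnover: float) -> float:
    return max(20_000_000, 0.04 * annual_turnover)


for turnover in (100_000_000, 500_000_000, 2_000_000_000):
    print(f"turnover {turnover:,} -> maximum fine {max_fine(turnover):,.0f}")
```

In other words, the flat 20 million cap dominates until annual turnover exceeds 500 million, after which the 4 percent figure takes over.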
Actually, to be entirely correct, this directive should have been called SNIS, and since it talks about information systems it is broader than cyber security. The EU itself claims that this is the first piece of EU-wide cyber security legislation, although that is only partially true, as all the member states need to implement it into their own national laws as well. This as opposed to EU regulations, which are indeed EU-wide legislation, bar some exceptions made in the GDPR.
Whilst everybody is focussing all their attention on the GDPR, the NIS Directive is sneaking into all the national legislation as well and may have a far-reaching impact on a lot more than just data protection.
source: The Register (external link)

Researchers report >4,000 apps that secretly record audio and steal logs


It is no wonder that the Finnish security company F-Secure has reported over and over again that 99 percent of mobile malware targets the Android operating system.
This is another prime example of such malware, this time an entire family of apps that in the end have no other function than spying on and misusing information and devices.
Whilst Google did remove the app from its Play Store after it was reported as malicious, events like this are certainly not scarce or few and far between. Isn't it about time that Google starts following the example Apple has set and starts screening apps in its store beforehand?
Although that is no full guarantee that no malware will end up in the store at all (proof-of-concept malware for iOS has slipped past Apple's screening processes a couple of times too), it will certainly help mitigate some of the risk. Google is now fully relying on third-party security researchers and the watchful eyes of Android users not to download suspicious apps to their devices, something we all know is not enough.
source: Ars Technica (external link)

“Pretty egregious” security flaw raises questions about Pacer


If a CSRF (cross-site request forgery) vulnerability was present in the PACER websites for 22 years, everything could have been possible, including illegal data access, changing data or forging documents without permission, and any other action a legitimate user had the rights to take.
The comment by the organisation (see the update at the end of the article) clearly reinforces the notion that they are completely clueless about web security.
As for the claimed security tests by professionals: if I were PACER I would ask for my money back. Unless the vulnerability was part of the reports and they themselves elected to ignore it, which wouldn't surprise me either.
CSRF vulnerabilities have been a long-standing, high-scoring item in the OWASP Top 10, not because they are hard to fix but because the people responsible for the web application don't have the knowledge to find and fix them. The same holds for XSS (cross-site scripting) as well as SQL injection vulnerabilities.
The only way to fix this in the long term is to provide secure coding training for developers as well as website administrators and testers.
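For readers wondering what the fix actually looks like: the classic mitigation is the synchronizer-token pattern, where the server ties a random secret to the user's session and refuses any state-changing request that does not echo it back. Below is a minimal, framework-agnostic sketch in Python; the function names and the in-memory store are purely illustrative and have nothing to do with PACER's code base, and a real application would normally use its web framework's built-in CSRF protection instead.

```python
# Minimal sketch of the synchronizer-token pattern against CSRF (illustrative only).
import hmac
import secrets

_session_tokens: dict[str, str] = {}  # session_id -> CSRF token (server-side store)


def issue_csrf_token(session_id: str) -> str:
    """Generate a per-session token; embed it in the form as a hidden field."""
    token = secrets.token_urlsafe(32)
    _session_tokens[session_id] = token
    return token


def verify_csrf_token(session_id: str, submitted_token: str) -> bool:
    """Reject any state-changing request whose token doesn't match the session's."""
    expected = _session_tokens.get(session_id)
    if expected is None:
        return False
    # constant-time comparison to avoid leaking the token via timing
    return hmac.compare_digest(expected, submitted_token)


# Usage sketch: on GET, render the form with issue_csrf_token(session_id);
# on POST, call verify_csrf_token(session_id, submitted_form["csrf_token"])
# and refuse the request when it returns False.
```

Because the attacker's forged cross-site request cannot read the victim's page and thus cannot know the token, the forged request fails even though the victim's session cookie is sent along automatically.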
source: Ars Technica (external link)

Biometrics changing the way governments address data security, privacy


Biometrics increase security and privacy? Well, maybe, but if implemented incorrectly they may decrease security, and certainly privacy, a whole lot faster than the benefits attributed to the technology can materialise.
Another thing that is wrong with this "research" is that the company that wrote the report probably only asked people in organisations it is selling the technology to, in cooperation with Microsoft. I call this report nothing short of benchmarketing.
With the upcoming GDPR, in which biometric data becomes a special category, the mass adoption of the technology may turn out rather differently than portrayed in this article.
source: BiometricUpdate (external link)

Radio navigation set to make global return as GPS backup, because cyber


This is quite an interesting story, specifically because GPS is easily spoofable, jammable, or can simply be switched off by the US military.
The antenna that is being proposed, though, is a compromise at best: at the frequencies discussed in the article (90-110 kHz) the wavelengths are around 3 kilometres, which means a postage-stamp antenna needs to compensate for this in a different manner.
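That rough 3-kilometre figure follows directly from the standard relation λ = c / f; a quick sanity check:

```python
# Quick check of the wavelengths at the frequencies mentioned above (90-110 kHz).
C = 299_792_458  # speed of light in m/s

for freq_khz in (90, 100, 110):
    wavelength_m = C / (freq_khz * 1_000)
    print(f"{freq_khz} kHz -> wavelength ~ {wavelength_m / 1_000:.1f} km")
```

That works out to roughly 3.3 km at 90 kHz down to about 2.7 km at 110 kHz, so any antenna orders of magnitude smaller than that has to trade efficiency for size.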
Besides all this, most airlines still rely on radio beacons for guidance alongside GPS reception, which is part of modern cockpit equipment but not the primary system.
Besides this, GPS receivers have a correction signal available as well. If I remember correctly it is called WAAS, which uses terrestrial reference stations to provide corrections to GPS receivers.
source: Ars Technica (external link)
