Apple will let you port Google Chrome extensions to Safari

Apple unveiled macOS 11 Big Sur earlier this week and talked about some of the improvements for Safari. In addition to native extensions, Apple is adding support for web extensions. It’s going to make it much easier to port an existing extension from Chrome, Firefox or Edge.

The company shared more details about how it’s going to work in a WWDC session. Safari already supports extensions, but if you’re using Safari, you know that there aren’t a ton of extensions out there.

On iOS and macOS, you can install content blockers and apps that feature a share extension. Content blockers let you provide a list of content to block when you load web pages, such as trackers and ads.
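
The list a content blocker provides is a JSON array of trigger/action rules. A minimal sketch, expressed here as a TypeScript constant with placeholder domains, might look like this:

```typescript
// Safari content-blocker rules are a JSON array of { trigger, action } pairs.
// The domain and selector below are placeholders for illustration.
const rules = [
  {
    // Block any request whose URL matches this regular expression.
    trigger: { "url-filter": "https?://tracker\\.example\\.com/.*" },
    action: { type: "block" },
  },
  {
    // Hide ad containers on any page instead of blocking a request outright.
    trigger: { "url-filter": ".*" },
    action: { type: "css-display-none", selector: ".ad-banner" },
  },
];

// The extension ships this JSON; Safari compiles it into an efficient matcher
// and never tells the extension which pages you actually visited.
const json = JSON.stringify(rules, null, 2);
```

Because Safari applies the compiled rules itself, the content blocker never sees your browsing activity, which is the privacy advantage of this model.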

Share extensions let you add features in the share menu in Safari. For instance, Pocket or Instapaper take advantage of share extensions to run JavaScript on a web page and return the result to the app.

On macOS, developers can also take advantage of app extensions. 1Password uses that to integrate its password manager with Safari.

“These are great if you’re a native app developer already familiar with Swift or Objective-C,” Safari engineer Ellie Epskamp-Hunt said.

Other browsers have taken a different approach. They leverage web technologies, such as JavaScript, HTML and CSS. That’s why Apple is adding another type of extension with Safari Web Extensions.
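
A cross-browser web extension is described by a manifest and implemented in JavaScript. A minimal sketch of the WebExtensions manifest format, with invented names and expressed here as a TypeScript object, looks like:

```typescript
// A minimal WebExtensions-style manifest, expressed as an object for clarity.
// In a real extension this lives in a manifest.json file; the name and script
// below are invented for illustration.
const manifest = {
  manifest_version: 2,
  name: "Example Highlighter",
  version: "1.0",
  permissions: ["activeTab"], // only the tab the user invokes it on
  content_scripts: [
    {
      matches: ["https://*/*"], // pages the script is injected into
      js: ["content.js"],       // plain JavaScript that runs in the page
    },
  ],
};

// content.js would then manipulate the page, e.g.:
//   document.body.style.outline = "2px solid gold";
```

It is this shared manifest-plus-JavaScript structure that makes an extension portable between Chrome, Firefox, Edge and, now, Safari.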

Like other Safari extensions, web extensions designed for Safari are packaged with native apps. This means developers will submit extensions to the App Store, and users will download an app that comes with an extension. The app doesn’t have to do anything; it can just be a placeholder.

Apple is shipping an extension converter to let you port your extension quickly. When you run it, it’ll tell you if everything is going to work as expected. You can then package it in an Xcode project, sign it and submit it to the App Store.

Some extensions require a ton of permissions. They can essentially view all web pages you visit. That’s why Apple lets you restrict extensions to some websites, or just the active tab. You can also choose to activate an extension for a day so that it doesn’t remain active forever.

Users will see a warning the first time an extension tries to access a site, and a prominent warning banner appears in Safari settings before you activate an extension that can access all your browsing data.

This change could potentially mean that there will be a lot more extensions for Safari in the future. Many Chrome users don’t want to leave Chrome because they can’t find the same extensions. If developers choose to port their extensions to Safari, Apple could convince more users to switch to Safari.

Browsers are interesting again

A few years ago, covering browsers got boring.

Chrome had clearly won the desktop, the great JavaScript speed wars were over and Mozilla seemed more interested in building a mobile operating system than its browser. Microsoft tried its best to rescue Internet Explorer/Edge from being the punchline of nerdy jokes, but its efforts essentially failed.

Meanwhile, Opera had shuttered the development of its own rendering engine and redesigned its browser with less functionality, alienating many of its biggest fans. On mobile, plenty of niche players tried to break the Chrome/Safari duopoly, but while they did have some innovative ideas, nothing ever stuck.

But over the course of the last year or so, things changed. The main catalyst for this, I would argue, is that the major browser vendors — and we can argue about Google’s role here — realized that their products were at the forefront of a new online privacy movement. It’s the browser, after all, that allows marketers to set cookies and fingerprint your machine to track you across the web.

Add to that Microsoft’s move to the Chromium engine, which is finally giving Microsoft a seat at the browser table again, plus the success of upstarts like Brave and Vivaldi, and you’ve got the right mix of competitive pressure and customer interest for innovation to come back into what was a stagnant field only a few years ago.

Let’s talk about privacy first. With browsers being the first line of defense, it’s maybe surprising that we didn’t see Mozilla and others push for more built-in tracking protections before.

In 2019, the Chrome team introduced changes to how the browser handles cookies and, a few months ago, it launched a broader initiative to completely rethink cookies and online privacy for its users — and by extension, Google’s advertising ecosystem. This move centers around differential privacy and a ‘privacy budget’ that would allow advertisers to get enough information about you to group you into a larger cohort without providing so much information that you would lose your anonymity.
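
Google hasn’t published a concrete design, but the budget idea can be sketched as a per-site counter of identifying bits: each fingerprintable query reveals some entropy, and once a site’s total would exceed the budget, the browser answers generically. This is purely an illustrative sketch, not Google’s actual mechanism:

```typescript
// Illustrative sketch of a per-site "privacy budget" (not Google's design).
// Each API query reveals some bits of identifying entropy; once a site's
// running total would exceed the budget, the browser returns a generic,
// non-identifying answer instead of the real value.

const BUDGET_BITS = 8;
const spent = new Map<string, number>();

function query(site: string, bits: number, realValue: string): string {
  const used = spent.get(site) ?? 0;
  if (used + bits > BUDGET_BITS) {
    return "generic"; // budget exhausted: this answer reveals nothing new
  }
  spent.set(site, used + bits);
  return realValue;
}

// A site can spend its budget on a few detailed answers...
const ua = query("ads.example", 5, "Mac/Safari 14");
const tz = query("ads.example", 2, "UTC+2");
// ...but a further high-entropy query comes back generic (5 + 2 + 6 > 8).
const fonts = query("ads.example", 6, "312 installed fonts");
```

The effect is that any one site can learn enough to place you in a broad cohort, but not enough to single you out.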

At the time, Google said this was a multi-year effort that was meant to help publishers retain their advertising revenue (vs their users completely blocking cookies).

iOS 13: Here are the new security and privacy features you need to know

It’s finally here.

Apple’s new iOS 13, the thirteenth major iteration of its popular iPhone software, is out to download. We took iOS 13 for a spin with a focus on the new security and privacy features to see what’s new and how it all works.

Here’s what you need to know.

You’ll start to see reminders about apps that track your location

Ever wonder which apps track your location? Wonder no more. iOS 13 periodically reminds you about apps that are tracking your location in the background. Every so often it will tell you how many times an app has tracked where you’ve been recently, along with a small map of the location points. From this screen you can “always allow” the app to track your location or choose to limit the tracking.

You can grant an app your location just once

To give you more control over what data apps have access to, iOS 13 now lets you give apps access to your location just once. Previously there was “always,” “never” or “while using,” meaning an app could be collecting your real-time location as you’re using it. Now you can grant an app access on a per-use basis — particularly helpful for privacy-minded folks.

And apps wanting access to Bluetooth can be declined access

Apps wanting to access Bluetooth will also ask for your consent. Although apps can use Bluetooth to connect to gadgets, like fitness bands and watches, Bluetooth-enabled tracking devices known as beacons can be used to monitor your whereabouts. These beacons are found everywhere — from stores to shopping malls. They can grab your device’s unique Bluetooth identifier and track your physical location between places, building up a picture of where you go and what you do — often for targeting you with ads. Blocking Bluetooth connections from apps that clearly don’t need it will help protect your privacy.

Find My gets a new name — and offline tracking

Find My, the new app name for locating your friends and lost devices, now comes with offline tracking. Previously, if you lost your laptop, you had to rely on its last Wi-Fi-connected location. Now your device broadcasts its location over Bluetooth, and that signal is securely relayed to Apple’s servers by nearby cellular-connected iPhones and other Apple devices. The location data is cryptographically scrambled and anonymized to prevent anyone other than the device owner — including Apple — from tracking your lost devices.

Your apps will no longer be able to snoop on your contacts’ notes

Another area that Apple is trying to button down is your contacts. Apps have to ask for your permission before they can access your contacts. But in doing so, they were also able to read the personal notes you wrote on each contact, like a home alarm code or a PIN for phone banking. Now, apps will no longer be able to see what’s in the “notes” field of a user’s contacts.

Sign In With Apple lets you use a fake relay email address

This is one of the cooler features coming soon — Apple’s new sign-in option allows users to sign in to apps and services with one tap, and without having to turn over any sensitive or private information. Any app that requires a sign-in option must use Sign In With Apple as an option. In doing so users can choose to share their email with the app maker, or choose a private “relay” email, which hides a user’s real email address so the app only sees a unique Apple-generated email instead. Apple says it doesn’t collect users’ data, making it a more privacy-minded solution. It works across all devices, including Android devices and websites.
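
Conceptually, the relay works like an alias table: the app only ever sees a generated address, and mail sent to that alias is forwarded to the real one. A toy sketch, with all addresses and naming invented for illustration:

```typescript
// Toy sketch of an email relay: the app maker sees only a generated alias,
// and mail addressed to the alias is forwarded to the user's real address.
// All addresses and naming here are invented for illustration.

const aliasToReal = new Map<string, string>();
let counter = 0;

function createAlias(realEmail: string): string {
  const alias = `user${++counter}@relay.example.com`;
  aliasToReal.set(alias, realEmail); // only the relay knows this mapping
  return alias;
}

function deliver(alias: string, message: string): string | undefined {
  const real = aliasToReal.get(alias);
  return real === undefined ? undefined : `forwarded to ${real}: ${message}`;
}

// The app stores only the alias; the user's real address never leaves the relay.
const alias = createAlias("jane@example.com");
```

Because each app gets its own alias, the user can also cut off a single app’s email simply by disabling that alias.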

You can silence unknown callers

Here’s one way you can cut down on disruptive spam calls: iOS 13 will let you send unknown callers straight to voicemail. Anyone who’s not in your contacts list will be treated as an unknown caller.

You can strip location metadata from your photos

Every time you take a photo, your iPhone stores the precise location where the photo was taken as metadata in the photo file. That can reveal sensitive or private locations — such as your home or office — if you share those photos on social media or other platforms, many of which don’t strip the data on upload. Now you can do it yourself: with a few taps, you can remove the location data from a photo before sharing it.

And Safari gets better anti-tracking features

Apple continues to advance its new anti-tracking technologies in its native Safari browser, like preventing cross-site tracking and browser fingerprinting. These features make it far more difficult for ads to track users across the web. iOS 13 has its cross-site tracking technology enabled by default so users are protected from the very beginning.

First published on July 19 and updated with iOS 13’s launch. 

Read more:

Web feature developers told to dial up attention on privacy and security

Web feature developers are being warned to step up attention to privacy and security as they design contributions.

Writing in a blog post about “evolving threats” to Internet users’ privacy and security, the W3C standards body’s technical architecture group (TAG) and Privacy Interest Group (PING) set out a series of revisions to the W3C’s Security and Privacy Questionnaire for web feature developers.

The questionnaire itself is not new. But the latest updates place greater emphasis on the need for contributors to assess and mitigate privacy impacts, with developers warned that “features may not be implemented if risks are found impossible or unsatisfactorily mitigated”.

In the blog post, independent researcher Lukasz Olejnik, currently serving as an invited expert at the W3C TAG; and Apple’s Jason Novak, representing the PING, write that the intent with the update is to make it “clear that feature developers should consider security and privacy early in the feature’s lifecycle” [emphasis theirs].

“The TAG will be carefully considering the security and privacy of a feature in their design reviews,” they further warn, adding: “A security and privacy considerations section of a specification is more than answers to the questionnaire.”

The revisions to the questionnaire include updates to the threat model and specific threats a specification author should consider — including a new high level type of threat dubbed “legitimate misuse”, where the document stipulates that: “When designing a specification with security and privacy in mind, both use and misuse cases should be in scope.”

“Including this threat into the Security and Privacy Questionnaire is meant to highlight that just because a feature is possible does not mean that the feature should necessarily be developed, particularly if the benefitting audience is outnumbered by the adversely impacted audience, especially in the long term,” they write. “As a result, one mitigation for the privacy impact of a feature is for a user agent to drop the feature (or not implement it).”

“Features should be secure and private by default and issues mitigated in their design,” they further emphasize. “User agents should not be afraid of undermining their users’ privacy by implementing new web standards or need to resort to breaking specifications in implementation to preserve user privacy.”

The pair also urge specification authors to avoid blanket treatment of first and third parties, suggesting: “Specification authors may want to consider first and third parties separately in their feature to protect user security and privacy.”

The revisions to the questionnaire come at a time when browser makers are dialling up their response to privacy threats — encouraged by rising public awareness of the risks posed by data leaks, as well as increased regulatory action on data protection.

Last month the open source WebKit browser engine (which underpins Apple’s Safari browser) announced a new tracking prevention policy that takes the strictest line yet on background and cross-site tracking, saying it would treat attempts to circumvent the policy as akin to hacking — essentially putting privacy protection on a par with security.

Earlier this month Mozilla also pushed out an update to its Firefox browser that enables an anti-tracking cookie feature across the board, for existing users too — demoting third party cookies to default junk.

Even Google’s Chrome browser has made some tentative steps towards enhancing privacy — announcing changes to how it handles cookies earlier this year. Though the adtech giant has studiously avoided flipping on privacy by default in Chrome where third party tracking cookies are concerned, leading to accusations that the move is mostly privacy-washing.

More recently Google announced a long term plan to involve its Chromium browser engine in developing a new open standard for privacy — sparking concerns it’s trying to both kick the can on privacy protection and muddy the waters by shaping and pushing self-interested definitions which align with its core data-mining business interests.

There’s more activity to consider too. Earlier this year another data-mining adtech giant, Facebook, made its first major API contribution to Google’s Chrome browser — which it also brought to the W3C Performance Working Group.

Facebook does not have its own browser, of course. Which means that authoring contributions to web technologies offers the company an alternative conduit to try to influence Internet architecture in its favor.

The W3C TAG’s latest move to focus minds on privacy and security by default is timely.

It chimes with a wider industry shift towards pro-actively defending user data, and should rule out any rubberstamping of tech giants’ contributions to Internet architecture, which is obviously a good thing. Scrutiny remains the best defence against self-interest.

Mozilla flips the default switch on Firefox tracker cookie blocking

From today Firefox users who update to the latest version of the browser will find a pro-privacy setting flipped for them on desktop and Android smartphones, assuming they didn’t already have the anti-tracking cookie feature enabled.

Mozilla launched the Enhanced Tracking Protection (ETP) feature in June as a default setting for new users — but leaving existing Firefox users’ settings unchanged at that point.

It’s now finishing what it started by flipping the default switch across the board in v69.0 of the browser.

The feature takes clear aim at third party cookies that are used to track Internet users for creepy purposes such as ad profiling. (Firefox relies on the Disconnect list to identify creepy cookies to block.)
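
Mechanically, list-based blocking is a domain lookup: if a third-party request’s host, or any parent domain of it, appears on the blocklist, its cookies are blocked. A simplified sketch using made-up domains (this is not Disconnect’s actual list or Firefox’s implementation):

```typescript
// Simplified sketch of list-based tracker blocking. The domains are made up;
// this is not the Disconnect list itself or Firefox's implementation.
// A request is blocked if its host, or any parent domain of it, is listed.
const blocklist = new Set(["tracker.example", "ads.example"]);

function isBlocked(host: string): boolean {
  const parts = host.split(".");
  // Check the host and each parent domain: "cdn.tracker.example" also
  // matches a blocklist entry for "tracker.example".
  for (let i = 0; i < parts.length - 1; i++) {
    if (blocklist.has(parts.slice(i).join("."))) return true;
  }
  return false;
}
```

Matching parent domains is what stops trackers from evading the list by serving from arbitrary subdomains.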

The anti-tracking feature also takes aim at cryptomining: a background practice that can drain CPU and battery power, negatively impacting the user experience. Again, Firefox will now block cryptomining by default, not just when a user turns it on.

In a blog post about the latest release Mozilla says it represents a “milestone” that marks “a major step in our multi-year effort to bring stronger, usable privacy protections to everyone using Firefox”.

“Currently over 20% of Firefox users have Enhanced Tracking Protection on. With today’s release, we expect to provide protection for 100% of our users by default,” it predicts, underlining the defining power of default settings.

Firefox users with ETP enabled will see a shield icon in the URL bar to denote the tracker blocking is working. Clicking on this icon takes users to a menu where they can view a list of all the tracking cookies that are being blocked. Users are also able to switch off tracking cookie blocking on a per site basis, via this Content Blocking menu.

While blocking tracking cookies reduces some tracking of internet users it does not offer complete protection for privacy. Mozilla notes that ETP does not yet block browser fingerprinting scripts from running by default, for example.

Browser fingerprinting is another prevalent privacy-hostile technique that’s used to track and profile web users without knowledge or consent by linking online activity to a computer’s configuration and thereby tying multiple browser sessions back to the same device-user.

It’s an especially pernicious technique because it can erode privacy across browser sessions and even different browsers — which an Internet user might be deliberately deploying to try to prevent profiling.

A ‘Strict Mode’ in the Firefox setting can be enabled by Firefox users in the latest release to block fingerprinting. But it’s not on by default.

Mozilla says a future release of the browser will flip fingerprinting blocking on by default too.

The latest changes in Firefox continue Mozilla’s strategy — announced a year ago — of pro-actively defending its browser users’ privacy by squeezing the operational range of tracking technologies.

In the absence of a robust regulatory framework to rein in the outgrowth of the adtech ‘industrial data complex’ that’s addicted to harvesting Internet users’ data for ad targeting, browser makers have found themselves at the coal face of the fight against privacy-hostile tracking technologies.

And some are now playing an increasingly central — even defining role — as they flip privacy and anti-tracking defaults.

Notably, earlier this month, the open source WebKit browser engine, which underpins Apple’s Safari browser, announced a new tracking prevention policy that puts privacy on the same footing as security, saying it would treat attempts to circumvent this as akin to hacking.

Even Google has responded to growing pressure around privacy — announcing changes to how its Chrome browser handles cookies this May. Though it’s not doing that by default yet.

It has also said it’s working on technology to reduce fingerprinting. And recently announced a long term proposal to involve its Chromium browser engine in developing a new open standard for privacy.

Though cynics might suggest the adtech giant is responding to competitive pressure on privacy by trying to frame and steer the debate in a way that elides its own role in data mining Internet users at scale for (huge) profit.

Thus its tardy privacy pronouncements and long term proposals look rather more like an attempt to kick the issue into the long grass and buy time for Chrome to keep being used to undermine web users’ privacy — instead of Google being forced to act now and close down privacy-hostile practices that benefit its business.

Malicious websites were used to secretly hack into iPhones for years, says Google

Security researchers at Google say they’ve found a number of malicious websites which, when visited, could quietly hack into a victim’s iPhone by exploiting a set of previously undisclosed software flaws.

Google’s Project Zero said in a deep-dive blog post published late on Thursday that the websites were visited thousands of times per week by unsuspecting victims, in what they described as an “indiscriminate” attack.

“Simply visiting the hacked site was enough for the exploit server to attack your device, and if it was successful, install a monitoring implant,” said Ian Beer, a security researcher at Project Zero.

He said the websites had been hacking iPhones over a “period of at least two years.”

The researchers found five distinct exploit chains involving 12 separate security flaws, including seven involving Safari, the in-built web browser on iPhones. The five separate attack chains allowed an attacker to gain “root” access to the device — the highest level of access and privilege on an iPhone. In doing so, an attacker could gain access to the device’s full range of features normally off-limits to the user. That means an attacker could quietly install malicious apps to spy on an iPhone owner without their knowledge or consent.

Google said that, based on its analysis, the vulnerabilities were used to steal a user’s photos and messages, as well as track their location in near-realtime. The “implant” could also access the user’s on-device bank of saved passwords.

The vulnerabilities affect iOS 10 through to the current iOS 12 software version.

Google privately disclosed the vulnerabilities in February, giving Apple only a week to fix the flaws and roll out updates to its users. That’s a fraction of the 90 days typically given to software developers, giving an indication of the severity of the vulnerabilities.

Apple issued a fix six days later with iOS 12.1.4 for iPhone 5s and iPad Air and later.

Beer said it’s possible other hacking campaigns are currently in action.

The iPhone and iPad maker in general has a good rap on security and privacy matters. Recently the company increased its maximum bug bounty payout to $1 million for security researchers who find flaws that can silently target an iPhone and gain root-level privileges without any user interaction. Under Apple’s new bounty rules — set to go into effect later this year — Google would’ve been eligible for several million dollars in bounties.

A spokesperson for Apple did not immediately comment.

WebKit’s new anti-tracking policy puts privacy on a par with security

WebKit, the open source engine that underpins Internet browsers including Apple’s Safari browser, has announced a new tracking prevention policy that takes the strictest line yet on the background and cross-site tracking practices and technologies which are used to creep on Internet users as they go about their business online.

Trackers are technologies that are invisible to the average web user, yet which are designed to keep tabs on where they go and what they look at online — typically for ad targeting but web user profiling can have much broader implications than just creepy ads, potentially impacting the services people can access or the prices they see, and so on. Trackers can also be a conduit for hackers to inject actual malware, not just adtech.

This translates to stuff like tracking pixels; browser and device fingerprinting; and navigational tracking to name just a few of the myriad methods that have sprouted like weeds from an unregulated digital adtech industry that’s poured vast resource into ‘innovations’ intended to strip web users of their privacy.

WebKit’s new policy is essentially saying enough: Stop the creeping.

But — and here’s the shift — it’s also saying it’s going to treat attempts to circumvent its policy as akin to malicious hack attacks to be responded to in kind; i.e. with privacy patches and fresh technical measures to prevent tracking.

“WebKit will do its best to prevent all covert tracking, and all cross-site tracking (even when it’s not covert),” the organization writes (emphasis its), adding that these goals will apply to all types of tracking listed in the policy — as well as “tracking techniques currently unknown to us”.

“If we discover additional tracking techniques, we may expand this policy to include the new techniques and we may implement technical measures to prevent those techniques,” it adds.

“We will review WebKit patches in accordance with this policy. We will review new and existing web standards in light of this policy. And we will create new web technologies to re-enable specific non-harmful practices without reintroducing tracking capabilities.”

Spelling out its approach to circumvention, it states in no uncertain terms: “We treat circumvention of shipping anti-tracking measures with the same seriousness as exploitation of security vulnerabilities,” adding: “If a party attempts to circumvent our tracking prevention methods, we may add additional restrictions without prior notice. These restrictions may apply universally; to algorithmically classified targets; or to specific parties engaging in circumvention.”

It also says that if a certain tracking technique cannot be completely prevented without causing knock-on effects with webpage functions the user does intend to interact with, it will “limit the capability” of using the technique — giving examples such as “limiting the time window for tracking” and “reducing the available bits of entropy” (i.e. limiting how many unique data points are available to be used to identify a user or their behavior).
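
The “bits of entropy” framing is simply log2 of the number of distinguishable values a signal can take: independent signals add their bits together, so coarsening each one shrinks how finely users can be told apart. The signal counts below are invented round numbers for illustration:

```typescript
// Bits of entropy for a signal = log2(number of distinguishable values).
// Independent signals add, so coarsening each value shrinks the total.
// The value counts below are invented round numbers for illustration.

const bits = (distinctValues: number): number => Math.log2(distinctValues);

// A precise fingerprint combining screen size (1000 possible values),
// timezone (38) and installed-font list (10000):
const precise = bits(1000) + bits(38) + bits(10000); // ~28.5 bits

// The same signals coarsened into buckets (8 sizes, 4 regions, 16 font sets):
const coarsened = bits(8) + bits(4) + bits(16); // exactly 9 bits
```

Roughly 33 bits suffice to single out one person among the world’s population, so cutting a fingerprinting surface from ~28 bits to 9 is the difference between near-unique identification and an anonymous crowd.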

If even that’s not possible “without undue user harm” it says it will “ask for the user’s informed consent to potential tracking”.

“We consider certain user actions, such as logging in to multiple first party websites or apps using the same account, to be implied consent to identifying the user as having the same identity in these multiple places. However, such logins should require a user action and be noticeable by the user, not be invisible or hidden,” it further warns.

WebKit credits Mozilla’s anti-tracking policy as inspiring and underpinning its new approach.

Commenting on the new policy, Dr Lukasz Olejnik, an independent cybersecurity advisor and research associate at the Center for Technology and Global Affairs at Oxford University, says it marks a milestone in the evolution of how user privacy is treated in the browser — setting it on the same footing as security.

“Treating privacy protection circumventions on par with security exploitation is a first of its kind and unprecedented move,” he tells TechCrunch. “This sends a clear warning to the potential abusers but also to the users… This is much more valuable than the still typical approach of ‘we treat the privacy of our users very seriously’ that some still think is enough when it comes to user expectation.”

Asked how he sees the policy impacting pervasive tracking, Olejnik does not predict an instant, overnight purge of unethical tracking of users of WebKit-based browsers but argues there will be less room for consent-less data-grabbers to manoeuvre.

“Some level of tracking, including with unethical technologies, will probably remain in use for the time being. But covert tracking is less and less tolerated,” he says. “It’s also interesting if any decisions will follow, such as for example the expansion of bug bounties to reported privacy vulnerabilities.”

“How this policy will be enforced in practice will be carefully observed,” he adds.

As you’d expect, he credits not just regulation but the role played by active privacy researchers in helping to draw attention and change attitudes towards privacy protection — and thus to drive change in the industry.

There’s certainly no doubt that privacy research is a vital ingredient for regulation to function in such a complex area — feeding complaints that trigger scrutiny that can in turn unlock enforcement and force a change of practice.

Although that’s also a process that takes time.

“The quality of cybersecurity and privacy technology policy, including its communication, still leaves much to be desired, at least at most organisations. This will not change fast,” says Olejnik. “Even if privacy is treated at the ‘C-level’, this still tends to be purely about the risk of compliance. Fortunately, some important industry players with a good understanding of both technology policy and the actual technology, even the emerging ones still under active research, treat it increasingly seriously.

“We owe it to the natural flow of the privacy research output, the talent inflows, and the slowly moving strategic shifts as well to a minor degree to the regulatory pressure and public heat. This process is naturally slow and we are far from the end.”

For its part, WebKit has been taking aim at trackers for several years now, adding features intended to reduce pervasive tracking — such as, back in 2017, Intelligent Tracking Prevention (ITP), which uses machine learning to squeeze cross-site tracking by putting more limits on cookies and other website data.

Apple immediately applied ITP to its desktop Safari browser — drawing predictable fast-fire from the Internet Advertising Bureau whose membership is comprised of every type of tracker deploying entity on the Internet.

But it’s the creepy trackers that are looking increasingly out of step with public opinion. And, indeed, with the direction of travel of the industry.

In Europe, regulation can be credited with actively steering developments too — following last year’s application of a major update to the region’s comprehensive privacy framework (which finally brought the threat of enforcement that actually bites). The General Data Protection Regulation (GDPR) has also increased transparency around security breaches and data practices. And, as always, sunlight disinfects.

Although there remains the issue of abuse of consent for EU regulators to tackle — with research suggesting many regional cookie consent pop-ups currently offer users no meaningful privacy choices despite GDPR requiring consent to be specific, informed and freely given.

It also remains to be seen how the adtech industry will respond to background tracking being squeezed at the browser level. Continued aggressive lobbying to try to water down privacy protections seems inevitable — if ultimately futile. And perhaps, in Europe in the short term, there will be attempts by the adtech industry to funnel more tracking via cookie ‘consent’ notices that nudge or force users to accept.

As the security space underlines, humans are always the weakest link. So privacy-hostile social engineering might be the easiest way for adtech interests to keep overriding user agency and grabbing their data anyway. Stopping that will likely need regulators to step in and intervene.

Another question thrown up by WebKit’s new policy is which way Chromium will jump, aka the browser engine that underpins Google’s hugely popular Chrome browser.

Of course Google is an ad giant, and parent company Alphabet still makes the vast majority of its revenue from digital advertising — so it maintains a massive interest in tracking Internet users to serve targeted ads.

Yet Chromium developers did pay early attention to the problem of unethical tracking. Here, for example, are two discussing potential future work to combat tracking techniques designed to override privacy settings in a blog post from nearly five years ago.

There have also been much more recent signs of Google paying attention to Chrome users’ privacy, such as the changes to how it handles cookies that it announced earlier this year.

But with WebKit now raising the stakes — by treating privacy as seriously as security — that puts pressure on Google to respond in kind. Or risk being seen as using its grip on browser marketshare to foot-drag on baked in privacy standards, rather than proactively working to prevent Internet users from being creeped on.

Netflix is testing a pop-out floating video player on desktop

Netflix is testing out a new feature that could mean you never have to stop watching, not even while you work: a pop-out video player, similar to the picture-in-picture player you may know from iOS and macOS, which works with any website or app that supports Safari’s native video player. Basically, you can choose to ‘pop out’ the video and then reposition it anywhere on your screen for a picture-in-picture effect that remains visible over any other apps you might be using.

The streaming company told Engadget, which found this experimental feature, that it’s only a test, but you can see why it might be useful. Netflix could already offer this to iOS and Mac users through built-in system tools, but because it uses its own player (likely in part for copyright protection), it has to build the feature itself. The benefit is that it should come to both Windows PCs and Macs, should it graduate from an experiment to a full-fledged product.

India’s largest video streaming service, owned by Disney, breaks Safari compatibility to fix security flaw

Hotstar, India’s largest video streaming service with more than 300 million users, disabled support for Apple’s Safari web browser on Friday to mitigate a security flaw that allowed unauthorized usage of its platform, two sources familiar with the matter told TechCrunch.

The incident comes at a time when the streaming service, operated by Star India (part of the 21st Century Fox business that Disney acquired), enjoys peak attention as millions of people watch the ongoing ICC World Cup cricket tournament on its platform.

As users began to complain about not being able to use Hotstar on Safari, the company’s official support account asserted that “technical limitations” on Apple’s part were the bottleneck. “These limitations have been from Safari; there is very little we can do on this,” the account tweeted Friday evening.

Sources at Hotstar told TechCrunch that this was not an accurate description of the event. Instead, the company’s engineers had identified a security hole that was being exploited by unauthorized users to access Hotstar’s content, they said.

Hotstar intends to work on patching the flaw soon and then reinstate support for Safari, the sources said.

The security flaw can only be exploited through Safari on desktop and mobile. On its website, the company recommends that users try Chrome and Firefox, or its mobile apps, to access the service. Hotstar did not respond to requests for comment.

Hotstar, which rivals Netflix and Amazon Prime Video in India, maintains a strong lead in the local video streaming market (based on number of users and engagement). Last month, it claimed to set a new global record by drawing more than 18 million viewers to a live cricket match.

What Chrome’s browser changes mean for your privacy and security

At the risk of sounding too optimistic, 2019 might be the year of the private web browser.

In the beginning, browsers were a cobbled-together mess that put a premium on making the contents within look good. Security was an afterthought (there is no better example than Internet Explorer), and user privacy was seldom considered, as newer browsers like Google Chrome and Mozilla Firefox focused on speed and reliability.

Ads kept the internet free for a long time, but with invasive ad tracking at its peak and mounting concerns about online privacy (or the lack of it), privacy is finally getting its day in the sun.

Chrome, which claims close to two-thirds of global browser market share, is the latest to double down on new security and privacy features. Firefox announced new anti-tracking blockers last month, Microsoft’s Chromium-based Edge promised more granular control over your data, and Apple’s Safari began preventing advertisers from tracking you from site to site.

At its annual developer conference Tuesday, Google revealed two new privacy-focused additions: better cookie controls that limit advertisers from tracking your activity across websites, and a new anti-fingerprinting feature.

In case you didn’t know: cookies are tiny bits of information left on your computer or device to help websites or apps remember who you are. Cookies can keep you logged into a website, but they can also be used to track what you do on a site. Some work across different websites, tracking you from one site to another and letting advertisers build up a profile of where you go and what you visit. Cookie management has long been an all-or-nothing option: switching cookies off means advertisers will find it harder to track you across sites, but it also means websites won’t remember your login information, which can be an inconvenience.

Soon, Chrome will prevent cross-site cookies from working across domains without obtaining explicit consent from the user. In other words, that means advertisers won’t be able to see what you do on the various sites you visit without asking to track you.

Cookies that work only on a single domain aren’t affected, so you won’t suddenly get logged out.

There’s an added benefit: blocking cross-site cookies makes it more difficult for hackers to exploit cross-site vulnerabilities. Through a cross-site request forgery attack, a malicious website can, in some cases, run commands on a legitimate site that you’re logged into without you knowing. That can be used to steal your data or take over your accounts.
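Cross-site request forgery works because, historically, the browser attached your session cookie to any request aimed at a site you were logged into, wherever that request came from. Under the new cookie rules, that decision depends on the cookie’s declared cross-site behavior. The sketch below is a simplified, hypothetical model of the browser’s decision, not a real browser API:

```javascript
// willSendCookie models whether a browser attaches a cookie to a request,
// given the cookie's SameSite label. This is an illustrative simplification.
function willSendCookie(sameSite, { crossSite, topLevelNavigation = false }) {
  if (!crossSite) return true; // same-site requests always carry the cookie
  switch (sameSite) {
    case "None":
      return true; // explicitly opted in to cross-site use
    case "Lax":
      return topLevelNavigation; // only when you actually navigate to the site
    case "Strict":
      return false; // never sent cross-site
    default:
      return false; // unlabeled: treated as same-site only
  }
}

// A hidden cross-site POST (the classic CSRF setup) no longer carries a
// session cookie labeled Lax, so the forged request arrives unauthenticated:
willSendCookie("Lax", { crossSite: true }); // => false

// But clicking a link to the site still logs you in as expected:
willSendCookie("Lax", { crossSite: true, topLevelNavigation: true }); // => true
```

The practical effect is that a forged request from an attacker’s page reaches the legitimate site without your credentials attached, so it can’t act on your behalf.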

Going forward, Google said it will only let cross-site cookies travel over HTTPS connections, meaning they cannot be intercepted, modified or stolen by hackers when they’re on their way to your computer.
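In practice, both changes surface in the Set-Cookie header a server sends, via the SameSite and Secure attributes. Here is a minimal sketch of how a server might label its cookies under the new rules; buildSetCookie is a hypothetical helper, not part of any library:

```javascript
// Sketch: labeling cookies so they keep working under Chrome's new rules.
function buildSetCookie(name, value, { crossSite = false } = {}) {
  const parts = [`${name}=${value}`];
  if (crossSite) {
    // Cross-site use must be declared explicitly, and since Chrome will only
    // deliver such cookies over HTTPS, Secure is effectively mandatory.
    parts.push("SameSite=None", "Secure");
  } else {
    // Single-domain cookies keep working without any opt-in.
    parts.push("SameSite=Lax");
  }
  return parts.join("; ");
}

// A login session cookie, scoped to one domain:
buildSetCookie("session", "abc123");
// => "session=abc123; SameSite=Lax"

// An ad-tracking cookie that wants to work across sites:
buildSetCookie("tracker", "xyz789", { crossSite: true });
// => "tracker=xyz789; SameSite=None; Secure"
```

The design choice here is visibility: a cookie that wants to follow you across sites now has to say so in its own header, which is what gives the browser something concrete to ask the user about.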

Cookies are only a small part of how users are tracked across the web. These days it’s just as easy to take a unique fingerprint of your browser and use it to see which sites you’re visiting.

Fingerprinting is a way for websites and advertisers to collect as much information as possible about your browser, including its plugins and extensions, and about your device, such as its make, model and screen resolution. Together, these details create a “fingerprint” that’s unique to your device. Because fingerprinting doesn’t rely on cookies, websites can read your browser fingerprint even when you’re in incognito mode or private browsing.
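To make the idea concrete, here is a rough sketch of the kind of signal a fingerprinting script combines. The trait names mirror real browser APIs such as navigator and screen, but fingerprintOf and its tiny hash are purely illustrative, not a real tracking library:

```javascript
// Sketch: combining browser/device traits into a stable identifier.
function fingerprintOf(traits) {
  // Serialize the traits in a stable order so the same browser always
  // produces the same string, regardless of property order.
  const canonical = Object.keys(traits)
    .sort()
    .map((k) => `${k}=${traits[k]}`)
    .join("|");
  // Tiny non-cryptographic hash (FNV-1a); real trackers use stronger ones.
  let hash = 0x811c9dc5;
  for (let i = 0; i < canonical.length; i++) {
    hash = Math.imul(hash ^ canonical.charCodeAt(i), 0x01000193) >>> 0;
  }
  return hash.toString(16);
}

// In a browser these values would come from navigator and screen; nothing
// is stored on your machine, so private browsing doesn't hide them.
const traits = {
  userAgent: "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_14)",
  language: "en-US",
  screen: "2560x1440",
  timezoneOffset: -120,
  plugins: "PDF Viewer,Chrome PDF Plugin",
};
fingerprintOf(traits); // same traits => same id, across sessions and sites
```

Because the identifier is recomputed from your browser’s characteristics on every visit, there is no cookie to block or delete, which is why countering it requires the browser itself to intervene.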

Google said it “plans” to work aggressively against fingerprinting, without giving much away as to how, and didn’t offer a timeline for when the feature will roll out.

Make no mistake, Google is stepping up to the privacy plate, following in the footsteps of Apple, Mozilla and Microsoft. Now that Google’s on board, that’s two-thirds of the internet set to soon benefit.