On October 4, 2018, Bloomberg Businessweek published an article detailing how China included a tiny microchip on server motherboards in an attempt to bypass corporate security at some major companies, including Amazon and Apple. It described an intricate plot involving manufacturing plants in China that produced motherboards for Supermicro server hardware. The article claims that Amazon noticed the chip and reported it to US authorities, who have spent over three years investigating, and it cites information from insiders at Amazon, Apple, and the Federal government. Businessweek felt this investigative piece, which covers incidents dating back as far as 2015, was important enough to make it the cover story for October 8, 2018.
The story surprised almost nobody in the technology industry. The idea that a Chinese company could be purposely inserting spy technology into products it manufactures is not a far-fetched one. In fact, two Chinese-owned smartphone brands have previously been banned from import into the US over fears that they contained technology designed to spy on US citizens and, potentially, intercept calls containing sensitive data. Extending the threat from smartphones to servers was a fairly mundane and, frankly, expected step.
There are a few in the industry who take particular exception to the story, however; namely, the companies mentioned by name.
Amazon claims that it never knew anything of compromised server hardware and has never been in contact with Federal law enforcement, either in reporting or in questioning, regarding the topic. The company says that the only issue it has found regarding Supermicro servers was in a web-based application designed for server management, which was addressed prior to implementing the hardware, and that it has no record of any such hardware issues ever being reported.
Apple had a similar response to the article, claiming that it also never had any hardware incidents with Supermicro and that the first it was aware of the concept was when Bloomberg itself started contacting the company asking questions. Apple also claims that the cancellation of its contract to purchase over 30,000 Supermicro servers, which came immediately after the point in Bloomberg's timeline when the hacks would have been disclosed, is unrelated.
This week, Apple CEO Tim Cook, who has taken this story very personally, has upped the denial rhetoric. In fact, he has gone so far as to demand Bloomberg retract the entire story. He told Buzzfeed, "I personally talked to the Bloomberg reporters along with Bruce Sewell, who was then our general counsel. We were very clear with them that this did not happen, and answered all their questions. Each time they brought this up to us, the story changed, and each time we investigated we found nothing...
We turned the company upside down. Email searches, data center records, financial records, shipment records. We really forensically whipped through the company to dig very deep and each time we came back to the same conclusion: This did not happen. There's no truth to this."
For Cook, this seems to be some sort of personal attack, either on his credibility or his intelligence, maybe both. To have Apple employees named as part of the investigation, and four Federal agents claiming that Apple both reported the issue and participated in the investigation, when he believes it never happened, does not seem to be something he can let go. It could also have to do with the relationship Supermicro has with Foxconn, which manufactures most of Apple's products. A stain on that manufacturing process could leave a stain on all of Apple's hardware and security, which has already been in question following a couple of security incidents at the company. If Cook had ignored the report, no one would even remember it today, but he keeps picking at it, keeping it at the top of everyone's minds.
Considering its commitment to the story, it is unlikely that Bloomberg will issue a retraction, no matter how much noise Cook makes, though anything is possible at this point.
After AT&T won its bid to purchase Time Warner and got government approval to do so, the company quickly began making changes around its acquisition, starting at HBO, even while the Department of Justice was renewing its fight against the merger. It has also been offering bundles of its combined services, like DirecTV and wireless service, along with HBO streaming.
The newest move for the company is to create yet another streaming service. This move is likely the public confirmation of a streaming service mentioned in court in April. The service will feature the company's own content, including CNN and HBO, as well as content from partners, such as DC Comics. According to John Stankey, CEO of WarnerMedia (formerly Time Warner), "Our service will start with HBO and the genre defining programming that viewers crave. On top of that we will package content from Turner and Warner Bros. with their deep brand connections that touch both diverse interests and mass audiences."
That certainly makes this new service sound similar to the business model of Amazon Prime Video or Hulu (which WarnerMedia holds a 10% stake in), where there is a base product with add-ons, though that is merely speculation. Details, such as pricing and availability, have not yet been announced; just that the service will launch at some point in the fourth quarter of 2019. Having a brand-owned service like this, which brings content directly to consumers, is becoming a popular model. Stankey said, "While going direct-to-consumer gives us an additional opportunity to reach audiences that aren't part of a traditional subscription service, our wholesale relationships will continue to be an important distribution channel. So, it will be a priority to work with our partners to deliver a compelling and competitive product that will complement our wholesale distribution, allowing us to reach the largest number of viewers."
Of course, this new streaming service will also compete with AT&T's DirecTV Now, which also brings content directly to consumers. It does, however, give the company two different styles of distribution: appointment-style viewing and on-demand. It is a double-edged sword, though, as more services mean more piracy.
Since its launch in 2011, Google+ has been the butt of nearly every joke in the tech industry. A favorite is the idea that the only people who use Google+ are Google employees, being forced to do so to try and show some sort of usage. Of course, there are more users than employees, but not by a huge margin. Over the years Google has done a lot to try and force people to interact with Google+. The most famous effort was forced YouTube integration, which required users to have a Google+ account to comment on videos. None of these tricks worked.
Google has been having an internal debate for several years over whether to continue the fight or walk away. It has walked away from other massive mistakes, such as Google Buzz, a product that not even Google seemed to know the purpose of. This week, Google's decision was finally made: walk away. The company has decided that the trouble of Google+ is finally more than it cares to bear and will begin the process of shuttering the consumer side of the product. If you are one of the very few people who used the platform and have content there that will be lost, Google has created an export process.
The stated reason for the shutdown, however, doesn't have anything to do with the incredibly low usage. Instead, the company claims that the shutdown is caused by a security bug that was patched earlier in the year. The bug is similar in nature to Facebook's recent issue, in that the company claims it has no evidence that any data was actually exposed incorrectly, only that it could have been. In Google's case, however, more than API keys were available: the exposed data included name, email, gender, job, and age; enough to do some real-world damage.
The bug was introduced into the software in 2015 and discovered early in the year, being patched in March. Unfortunately for users, Google decided that it didn't need to disclose the issue publicly. This is contrary to the way it treats other companies, however. If Google discovers a bug like this in someone else's software, it gives 90 days to fix it before disclosing the bug itself. That policy apparently doesn't apply to its own software, though.
The issues were disclosed by the Wall Street Journal, followed by Google's announcement about the future of the product. As one would expect, users are not happy about the lack of disclosure and have filed a proposed class action against the company. The suit claims negligence, invasion of privacy, and more. Attorney Joshua Watson wrote, "Worse, after discovery of this vulnerability in the Google+ platform, Defendants kept silent for at least seven months, making a calculated decision not to inform users that their Personal Information was compromised, further compromising the privacy of consumers' information and exposing them to risk of identity theft or worse."
Even the government is unhappy about the scenario. Three US Senators sent a letter to Google, asking in part, "Please describe in detail when and how Google became aware of this vulnerability and what actions Google took to remedy it.
Why did Google choose not to disclose the vulnerability, including to the Committee or to the public, until many months after it was discovered?
Are there similar incidents which have not been publicly disclosed?"
This is similar in nature to what happened after the Cambridge Analytica breach at Facebook. It is likely that, in addition to the class action suit, someone from Google is going to be speaking in front of Congress soon.
It has long been known that Microsoft was working on an Azure-powered Xbox streaming platform. At E3 this year, the project was confirmed, with a promise of further details to come. Following Google's Project Stream going into beta last week, Microsoft has made good on that promise, with new details released this week.
The project is currently known as Project xCloud, likely a combination of Xbox and Azure Cloud, as those are the platforms being combined to make it possible. The project represents the next generation of Microsoft's Play Anywhere initiative, which brought Xbox games to PC, and vice versa. That feature required a game either to be built using Microsoft's Universal Windows Platform (UWP), which is designed to allow software to run on a variety of devices, including PC, Xbox, HoloLens, embedded systems (like an arcade cabinet), and more, or to have a special relationship with Microsoft. UWP has its limitations, though, and has not gotten complete buy-in, which has meant that the catalog is not huge.
The thing that makes Project xCloud a step forward is the fact that developers need to do absolutely nothing special to make their games compatible with the platform. Any game that can run on the Xbox One can be streamed to devices. That means that the catalog at launch could be massive: far larger than any other game streaming platform.
Another thing that sets Project xCloud apart is the platforms that it will support. In essence, if a device can pair a Bluetooth Xbox controller, it is likely going to be able to stream games. Microsoft is also testing touch input, so that could expand the lineup of devices. This means that nearly any phone or tablet, provided it has enough resources, will be able to play any of the games. This is because of the way game streaming works: the game does not actually run and render on the device but instead runs and renders on a server, and only the video is streamed to the device. This has been a technological concern for other companies that have tried it, but Microsoft has ideas for overcoming the pitfalls.
According to Microsoft, "Developers and researchers at Microsoft Research are creating ways to combat latency through advances in networking topology, and video encoding and decoding. Project xCloud will have the capability to make game streaming possible on 4G networks and will dynamically scale to push against the outer limits of what's possible on 5G networks as they roll out globally. Currently, the test experience is running at 10 megabits per second. Our goal is to deliver high-quality experiences at the lowest possible bitrate that work across the widest possible networks, taking into consideration the uniqueness of every device and network."
Because Microsoft has Azure datacenters across the globe, it might be the first company that can actually overcome the biggest issue: latency. Pressing a button on your controller and having to wait an extended period of time to see a result prevents quick-action gaming, which is required for most big titles. The fact that Microsoft is already working to eliminate the problem on only a 10Mbps connection (just 1.25 MB per second) is impressive.
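For a rough sense of those numbers, here is a quick back-of-the-envelope sketch. The helper names are purely illustrative, not anything from Microsoft's announcement:

```python
def mbps_to_megabytes_per_sec(mbps: float) -> float:
    """Convert a bitrate in megabits/s to megabytes/s (8 bits per byte)."""
    return mbps / 8.0

def frame_budget_ms(fps: int) -> float:
    """Time available per frame, in milliseconds, at a given frame rate.

    For streaming to feel responsive, encoding, network transit, and
    decoding all have to fit inside (or close to) this budget.
    """
    return 1000.0 / fps

print(mbps_to_megabytes_per_sec(10))  # xCloud's 10 Mbps test stream -> 1.25 MB/s
print(round(frame_budget_ms(60), 1))  # ~16.7 ms per frame at 60 fps
```

That 16.7 ms per-frame window at 60 fps is what makes latency, not raw bandwidth, the hard part of game streaming.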
Microsoft will begin public trials of the technology in 2019, testing capabilities, stress, and more. As the tests get closer, we will make sure to keep you updated on the development.