Since Nintendo announced their intention to
enter the mobile market with its games, the company has launched three Nintendo-branded titles and one partner title. The most recent mobile game to come out of Nintendo is Animal Crossing: Pocket Camp, a game that, in theory, follows the path set by the other games in its family.
The big difference, of course, is in the game's revenue model. While other
Animal Crossing games are sold at a fixed price and played to your heart's content, this game is free to download. And download people have: in its first six days of availability, the game received over 15 million downloads worldwide, placing it right behind Super Mario Run.
But downloads don't always translate into revenue, as Nintendo is finding out the hard way. While a lot of people seem to be playing, few seem to be paying. In fact, of the three Nintendo-proper titles, Pocket Camp has generated the least revenue in its first nine days - by a lot.
Fire Emblem Heroes, the least downloaded of the three, had generated the most revenue at this point in its life, with $33 million. Second was Super Mario Run, with $24 million. Lastly, we have Pocket Camp, with only $10 million. In addition, the global distribution of spending is far from even, with Japan representing over 40% of all revenue and the United States representing just over 1%, at about $120,000 total.
The problem for Nintendo comes from the fact that there is just no real urgency in the game. The in-game purchase is Leaf Tickets, a premium currency like those found in other mobile games. These tickets are used to speed up processes or to compensate for missing crafting resources. In this type of game, though, there is no real benefit to spending money to speed things up. You're not trying to defend territory, you're not trying to amass armies, you're not even trying to fight Pokémon - you simply have no reason to purchase Leaf Tickets.
The company seems to have recognized this issue, either before launching the game or immediately after. Shortly after launch, a Christmas-themed timed event began, featuring limited-time items. This creates urgency around new items, which could generate revenue as people rush to craft items that are only available before Christmas itself.
The long-lasting image of a company is built not on how it handles itself when things are going well, but on how it handles a crisis. For example, a year from now, people will have only one image of Equifax: the victim of a preventable data breach. The image of Yahoo was so damaged by the revelation of several breaches that Verizon almost canceled its planned purchase of the company.
This week, a new company has been added to the list of massive failures during a crisis: BLU Products. The Miami, Florida-based company designs and builds low-cost Android phones, and previously built Windows Phones as well. A firmware update released this week for its Life One X2 handset had an unexpected side effect: complete handset failure. Customers have been posting complaints on the company's Facebook page, as well as on the BLU subreddit.
Unfortunately, the company has done nothing to address the problem. Instead, BLU support has been posting a canned response in reply to every comment on every Facebook post about the issue. It reads,
"Hi **person's name**, we are aware of the issue with the Life One X2, and are currently working on it. We apologize for the inconvenience."
This is as close to a public statement as we have received from BLU. The latest post under the
News section of their website is from Halloween, announcing their first Sprint phone. On Facebook, there has been no statement aside from the canned response, though there have been a number of new posts since the incident began: several about the soccer team they sponsor, a few advertisements for their rewards program, and one general advertisement for BLU phones.
What's worse than the company's complete lack of a public response is its private response. Apparently, the company has no contingency plan in place for a flawed firmware release. At companies like Microsoft or Samsung, if a firmware update goes sideways, there is a process to revive the phone without losing data, or at least to retrieve data before resetting the phone. With BLU, there is nothing; if you trusted your data to the company's phone and received the flawed firmware update, your data is gone. To get the phone working again, your only choice is a full reset.
This is one of the sacrifices you make when you decide to go with a low-cost provider for any product or service. No one goes to Wal-Mart expecting good customer service; they go for the price. The same is true here - the choice was made for price, not because the company is a powerhouse.
YouTube has once again come under scrutiny over its automation. Just a few months ago, the company's
advertising automation caused a massive advertiser boycott, including large companies such as AT&T, Enterprise Rent-A-Car, Johnson & Johnson, and Verizon. Now the company is in trouble with parents who have noticed some incredibly inappropriate content appearing on the YouTube Kids platform.
The company's algorithms began allowing adult-oriented content to appear in the family-friendly Kids platform: for example, Mickey Mouse in a pool of blood, or a Claymation short of Spider-Man urinating on
Frozen's Elsa. There has also been an influx of sexual comments on videos of children, particularly on newer "challenge" videos, such as the yoga challenge or the ice pants challenge. BuzzFeed recently revealed a number of videos on the platform showing children being abused or in vulnerable situations.
Because of these issues, YouTube has put new policies in place to try to eliminate, or at least slow, the problem. The highlights of the new policies are:

- Tougher application of community guidelines and faster enforcement through technology
- Removing ads from inappropriate videos targeting families
- Blocking inappropriate comments on videos featuring minors
- Providing guidance for creators who make family-friendly content
- Engaging and learning from experts
On the surface, these measures sound reasonable, but there are a few issues. Chief among them: the definitions of "inappropriate" and "targeting families" are incredibly vague. The company's only example is five-month-old guidance on using family-oriented characters in violent or sexual situations. However, the terminology could be applied to content, such as video game videos, in which a player swears. For example, if you have ever watched a
Super Mario Maker speedrunner, you know that they can get verbally abusive toward the game. Even though such videos are not aimed at children or families, this definition could demonetize them for no real reason.
If YouTube really wants to deal with the content on YouTube Kids, it should implement an opt-in at video upload. The algorithms could then scan opted-in content to ensure it meets the regulations, rather than assuming that every video is a potential match for the service. For example, we know that not all of our shows would be entirely appropriate for kids, so we would not opt in those episodes. This would, of course, limit the number of videos in the service, so conversely, YouTube could allow an opt-out for videos that might get caught in the filter.
Since the '90s, there has been a struggle between the tech industry and the government over the implementation of encryption. The tech industry has always argued that people deserve and need the ability to protect their data. Corporations might want to protect their financial data or information about new product development, and individuals have the right to protect themselves from theft. Encrypting photos, for example, could have helped protect
Apple users during the iCloud hack.
On the other side of the argument is the government, which believes that encryption should be breachable by law enforcement. For over two decades, law enforcement has claimed that strong encryption makes data warrant-proof and that it needs a way to deal with that inevitability. The Department of Justice has petitioned time and again for what Deputy Attorney General Rod Rosenstein is currently calling "responsible encryption," which essentially means that they expect companies to provide a backdoor through which they can sneak into your data.
Tech companies have refused to comply with these requests, sometimes taking the issue to court. Apple famously fought an order to
unlock an iPhone owned by one of the San Bernardino attackers, arguing that once a backdoor was created, it could never be un-created and would pose an ever-present threat. This is the same argument made against backdoors in standard encryption processes.
In a discussion at Ars Live, Riana Pfefferkorn, a legal fellow at the Stanford Center for Internet and Society, made the point that the word "responsible" is misleading: "I think what Rosenstein is getting at is that he believes that companies in their deployment of encryption should be responsible to law enforcement above all and public safety rather than being responsible to their users or the broader security ecosystem."
She believes that the topic has come up again now because the DOJ smells "blood in the water" after the perceived mishandling of data regarding Russia during the presidential election, despite the fact that the two topics are unrelated. But the DOJ's arguments and tactics have not really changed in over two decades. In 2015, a
research paper was released showing that nothing has changed on either side: "The complexity of today's Internet environment, with millions of apps and globally connected services, means that new law enforcement requirements are likely to introduce unanticipated, hard-to-detect security flaws. Beyond these and other technical vulnerabilities, the prospect of globally deployed exceptional access systems raises difficult problems about how such an environment would be governed and how to ensure that such systems would respect human rights and the rule of law."
From a software standpoint, the important argument here is that a backdoor is a backdoor; once it exists, it exists. There is no way to secure a backdoor, because people are people. A key will leak, because there are a lot of people involved: developers, law enforcement, government officials. And once the hole exists, it will be exploited, whether by someone involved, a hacker, or law enforcement acting without a warrant. No good can possibly come from making encryption insecure.
When Microsoft bought Twitch competitor Beam in 2016, it brought with it the promise of more social streaming features for its biggest game franchises. Arguably its biggest franchise, Minecraft, has just received those capabilities on the rebranded service, Microsoft Mixer. While the alliteration might be enough for some, the new features are even more exciting.
As with other games that integrate Mixer directly, players can stream from within the game without any additional software. This allows the game to implement the feature that sets Mixer apart from its competitors: social control. Viewers have the ability to control the player's environment, spawning obstacles and even enemies.
In
Minecraft, viewers could potentially change night to day, add elements to the player's environment, or even spawn monsters to challenge the player. This automates a process that is entirely manual on Twitch or YouTube Gaming, where viewers can only flood the chatroom with comments, hoping that enough of them will convince the player to trigger the event themselves.
For the player, however, there are ways to limit what viewers can do. If you don't want to allow viewers to add items to your landscape, you can turn that ability off. If you want to limit the ability to add enemies, you can. This gives the streamer and viewers shared control.
These new features are available now for Windows 10, Xbox One and Android version 1.2.5.