Intel's slow trickle of information on its Tiger Lake processors recently turned into a veritable flood as the company shared details about its first salvo of 10nm SuperFin chips, but one detail was missing: any official disclosure of chips with more than four cores. That changed in a decidedly low-key way, as a blog post on Medium from Intel fellow Boyd Phelps reveals that the company will introduce eight-core models soon, saying: "We also added a 3MB non-inclusive last-level-cache (LLC) per core slice. A single core workload has access to 12MB of LLC in the 4-core die or up to 24MB in the 8-core die configuration (more detail on 8-core products at a later date)."

Intel claims that its four-core Tiger Lake models, by virtue of their 10nm SuperFin process, Willow Cove cores, and Iris Xe graphics, can already beat AMD's eight-core Renoir chips in some performance benchmarks. If Intel's performance projections for its quad-core models are accurate, the eight-core Tiger Lake models could prove exceedingly competitive against AMD's existing Ryzen Mobile 'Renoir' lineup, possibly even wresting away the lead in threaded applications. We've yet to see independent third-party verification of the quad-core Tiger Lake chips in reviews, but AMD's upcoming Zen 3 "Cezanne" APUs are now extremely important as AMD looks to keep its performance advantage in the laptop market despite the looming eight-core Tiger Lake models.

The current dual- and quad-core Tiger Lake chips address only the 7 to 28W segment, while larger eight-core Tiger Lake-H processors would obviously tackle the upper echelons of the performance market, possibly stretching up to 45W models (~65W peak) for H-series Core i9 and i7 parts.

We won't rehash Tiger Lake's full technical details (we have all of those resources in one place here), but Intel's plans for eight-core Tiger Lake models aren't entirely surprising. Intel's current 10th-gen lineup includes 10nm Ice Lake processors that address the iGPU gaming market with up to four cores, while 14nm Comet Lake processors slot in for high-performance productivity workloads. However, Intel told us during its Tiger Lake briefings that all of its future laptop chips will come with the 10nm SuperFin (or better) process, meaning the company won't have a split product stack for its 11th-gen lineup.

Many of the limitations of Intel's previous Ice Lake models stemmed from low clock frequencies and poor yields, both of which conspired to limit performance and core counts; Intel's best 10nm efforts thus far have resulted in quad-core chips for laptops. Intel's new 10nm SuperFin process has corrected the clock speed issues (we see up to a 700 MHz increase in base and boost frequencies), and the emergence of eight-core models implies that defect rates are lower, and thus yields are up, allowing Intel to punch out 10nm laptop chips with up to eight cores.

Intel has no plans to bring Tiger Lake to its desktop lineup, but we have already seen the first Tiger Lake NUCs emerge from ASRock. Naturally, eight-core Tiger Lake models will also work their way into the NUC lineups, and given their pairing with the Xe graphics engine, they could prove to pack a decent performance punch for compact desktop PCs.
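To make the cache arithmetic from Phelps' quote concrete, here is a minimal sketch of the math it implies, assuming (as the post states) one 3MB non-inclusive LLC slice per core and that a single-core workload can allocate into every slice on the die:

```python
# Back-of-the-envelope check of the LLC figures in Phelps' post,
# assuming one 3 MB non-inclusive LLC slice per core and that a
# single-core workload can allocate into every slice on the die.
LLC_SLICE_MB = 3  # per-core slice size quoted in the blog post

def total_llc_mb(core_count: int) -> int:
    """Aggregate last-level cache visible to a single-core workload."""
    return core_count * LLC_SLICE_MB

for cores in (4, 8):
    print(f"{cores}-core die: {total_llc_mb(cores)} MB of LLC")
# 4-core die: 12 MB of LLC  -> matches Intel's quoted figure
# 8-core die: 24 MB of LLC  -> matches Intel's quoted figure
```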
Stay abreast of this and other news from Intel by visiting OUR FORUM.

Earlier this summer, marine specialists reeled up a shipping-container-size datacenter coated in algae, barnacles, and sea anemones from the seafloor off Scotland's Orkney Islands. The retrieval launched the final phase of a years-long effort that proved the concept of underwater datacenters is feasible, as well as logistically, environmentally, and economically practical.

Microsoft's Project Natick team deployed the Northern Isles datacenter 117 feet deep on the seafloor in spring 2018. For the next two years, team members tested and monitored the performance and reliability of the datacenter's servers. The team hypothesized that a sealed container on the ocean floor could improve the overall reliability of datacenters: on land, corrosion from oxygen and humidity, temperature fluctuations, and bumps and jostles from people replacing broken components are all variables that can contribute to equipment failure. The Northern Isles deployment confirmed the hypothesis, which could have implications for datacenters on land.

Lessons learned from Project Natick are also informing Microsoft's datacenter sustainability strategy around energy, waste, and water, said Ben Cutler, a project manager in Microsoft's Special Projects research group who leads Project Natick. What's more, he added, the proven reliability of underwater datacenters has prompted discussions with a Microsoft team in Azure that's looking to serve customers who need to deploy and operate tactical and critical datacenters anywhere in the world. "We are populating the globe with edge devices, large and small," said William Chappell, vice president of mission systems for Azure. "To learn how to make data centers reliable enough not to need human touch is a dream of ours."

The underwater datacenter concept splashed onto the scene at Microsoft in 2014 during ThinkWeek, an event that gathers employees to share out-of-the-box ideas. The concept was considered a potential way to provide lightning-quick cloud services to coastal populations and save energy. More than half the world's population lives within 120 miles of the coast. By putting datacenters underwater near coastal cities, data would have a short distance to travel, leading to fast and smooth web surfing, video streaming, and game playing. The consistently cool subsurface seas also allow for energy-efficient datacenter designs; for example, they can leverage heat-exchange plumbing such as that found on submarines.

Microsoft's Project Natick team proved the underwater datacenter concept was feasible during a 105-day deployment in the Pacific Ocean in 2015. Phase II of the project included contracting with marine specialists in logistics, shipbuilding, and renewable energy to show that the concept is also practical. "We are now at the point of trying to harness what we have done as opposed to feeling the need to go and prove out some more," Cutler said. "We have done what we need to do. Natick is a key building block for the company to use if it is appropriate."
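As a rough sanity check on the latency argument above, here is a minimal sketch of the propagation math; the two-thirds-of-c speed for light in optical fiber is a textbook approximation (our assumption, not a figure from the article), and real round trips add routing and last-mile overhead on top of it:

```python
# Rough one-way propagation delay over optical fiber. The ~2/3-of-c
# propagation speed is an assumed textbook approximation, and this
# ignores routing, switching, and last-mile latency.
SPEED_OF_LIGHT_KM_PER_S = 299_792
FIBER_VELOCITY_FACTOR = 2 / 3  # assumed speed of light in glass vs. vacuum

def one_way_delay_ms(distance_km: float) -> float:
    """Straight-line propagation delay in milliseconds."""
    return distance_km / (SPEED_OF_LIGHT_KM_PER_S * FIBER_VELOCITY_FACTOR) * 1_000

# 120 miles (the coastal radius cited above) is roughly 193 km.
print(f"{one_way_delay_ms(193):.2f} ms one way")  # ~0.97 ms
```

Under those assumptions, a datacenter parked within the cited 120-mile radius keeps raw propagation delay around a millisecond each way, which is why proximity matters for streaming and gaming.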
We have pictures, videos, and more posted on OUR Forum.

Apple revised its App Store guidelines on Friday ahead of the release of iOS 14, the latest version of the iPhone operating system, which is expected later this month. Apple's employees use these guidelines to approve or deny apps and updates on the App Store. Those rules have come under intense scrutiny in recent weeks from app makers who argue the iPhone maker has too much control over what software runs on iPhones and how Apple takes a cut of payments from those apps. In particular, Epic Games, the maker of Fortnite, is in a bitter legal battle with Apple over several of its guidelines, including the requirement to use in-app purchases for digital products. Apple removed Fortnite from its App Store last month.

One major update on Friday relates to game streaming services. Microsoft and Facebook have publicly said in recent months that Apple's rules restrict what their gaming apps can do on iPhones and iPads: Microsoft's xCloud service isn't available on iOS, and Facebook's gaming app lacks games on iPhones. Apple now says that game streaming services, such as Google Stadia and Microsoft xCloud, are explicitly permitted. But there are conditions: games offered in the service need to be downloaded directly from the App Store, not from an all-in-one app. App makers are permitted to release a so-called "catalog app" that links to the other games in the service, but each game will need to be an individual app.

Apple's rules mean that if a streaming game service has 100 games, then each of those games will need an individual App Store listing as well as a developer relationship with Apple. The individual games also have to offer some basic functionality when they're downloaded, and all the games and the stores need to offer in-app purchases using Apple's payment processing system, under which Apple usually takes 30% of revenue. "This remains a bad experience for customers. Gamers want to jump directly into a game from their curated catalog within one app just like they do with movies or songs, and not be forced to download over 100 apps to play individual games from the cloud," a Microsoft representative said in a statement. A Google representative declined to comment.

The rules underscore the tension between Apple's control of its platform, which it says exists for safety and security reasons, and emerging game streaming services considered by many to be the future of the gaming industry. Game streaming services want to act as platforms for game makers, approving individual games and deciding which games to offer, but Apple wants the streaming services to act more like a bundle of games and says it will need to review each individual game. Apple does not have a cloud gaming service of its own, but it does sell a subscription bundle of iOS games called Apple Arcade.

Another change relates to in-person classes purchased inside an iPhone app. This spring, amid the pandemic, several companies that previously let users book in-person services, like ClassPass, started offering virtual classes. Apple's rules previously said that these virtual classes were required to use Apple's in-app payment process. Apple's new guidelines say that one-on-one virtual classes, like fitness training, can bypass Apple for payment, but classes where one instructor teaches multiple people will still require apps to use Apple's in-app purchases. For more, turn to OUR FORUM.