
The job of amalgamating threat intelligence is difficult. There is nonetheless much progress being made to communicate risks, make them actionable or take automated action, all while authenticating sources of the threats in a useful way. Among several ecosystems, STIX, TAXII and CyBOX have become popular vehicles and methods for communicating and storing threat intelligence. Generally speaking, amalgamation is the future of threat intelligence, but processes and protections need to be put into place first so that the systems designed to protect companies don't instead put them at risk. 

STIX, TAXII and CyBOX combine to make an information ecosystem:

  • Structured Threat Information eXpression (STIX) is a language for describing the characteristics of security threats.
  • Trusted Automated eXchange of Indicator Information (TAXII) defines the services and message exchanges used to share that information.
  • Cyber Observable eXpression (CyBOX) is a schema for storing the states described in the STIX language and exchanged or processed via TAXII. CyBOX is due to be folded into the STIX description language shortly, as part of the STIX 2.0 specification.

When stitched together, STIX, TAXII and CyBOX provide the basis for an automated interchange of threat data--much like how Amazon processes an order through its information delivery processes. The importance of a threat intelligence ecosystem cannot be overstated.
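To make the ecosystem concrete, here is a minimal sketch of a STIX 2.0-style indicator built as plain JSON from Python--the kind of record a TAXII service would carry. The IP address, name and IDs are illustrative placeholders, not drawn from any real feed.

```python
# A minimal sketch of a STIX 2.0 indicator, expressed as plain JSON.
# All field values here are illustrative placeholders.
import json
import uuid
from datetime import datetime, timezone

now = datetime.now(timezone.utc).strftime("%Y-%m-%dT%H:%M:%S.000Z")

indicator = {
    "type": "indicator",
    "id": f"indicator--{uuid.uuid4()}",
    "created": now,
    "modified": now,
    "name": "Suspected brute-force source",
    "labels": ["malicious-activity"],
    # The STIX pattern describes the observable (CybOX-style) state to match.
    "pattern": "[ipv4-addr:value = '198.51.100.7']",
    "valid_from": now,
}

print(json.dumps(indicator, indent=2))
```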

My anecdotal experience with amalgamated threat intelligence has been positive. Every day, the websites I administer receive threat intelligence through an app called WordFence. In turn, WordFence uses its presence on thousands of sites to aggregate the threats and dubious hits that each WordFence-protected site is seeing. This data becomes a set of distributed blocks against specific IP addresses. Although this model is sophisticated in many ways, it’s also primitive. Why? Because there is no amalgamation across vendors like WordFence that use this technique.
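To illustrate the aggregate-and-block idea, here is a minimal sketch assuming a hypothetical feed of (site, source IP) reports. The threshold and data format are illustrative assumptions, not WordFence’s actual algorithm.

```python
# A minimal sketch of distributed blocking: aggregate suspicious hits
# reported by many sites and block any source IP that enough distinct
# sites have flagged. Threshold and feed format are hypothetical.
from collections import Counter

BLOCK_THRESHOLD = 25  # hypothetical: reports from this many distinct sites

def build_blocklist(reports):
    """reports: iterable of (site_id, source_ip) pairs."""
    seen = {(site, ip) for site, ip in reports}   # one vote per site/IP pair
    votes = Counter(ip for _, ip in seen)
    return {ip for ip, count in votes.items() if count >= BLOCK_THRESHOLD}

# Example: a feed of reports from many protected sites
reports = [("site-a", "198.51.100.7"), ("site-b", "198.51.100.7")]
print(build_blocklist(reports))  # empty set until the threshold is crossed
```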

What if we--meaning all systems security entities--shared data about the bad guys? Can we be sure they’re bad? WordFence has a process to determine this, based on distributed behavioral intelligence. It’s my belief that we create an inner-Internet by doing so. This bifurcation has its strengths, but also its potential weaknesses.

 

There is a real risk here: misidentification. I could see a Max Headroom-like episode occur where, suddenly, your organization’s IP and/or DNS addresses become blacklisted by a cabal of threat intelligence sources. This recently happened to Zoho, which was shut down by its hosting company--ostensibly for highly suspect traffic. Imagine going off the radar because your organization’s IP addresses have been branded hostile on the strength of fraudulent activity or doxxing. Your DNS is fine, but an aggregation of sources believes your organization to be hostile. There would be a crater on the Internet where your turf once stood. Amalgamated threat intelligence is powerful, but procedural mechanisms need to evolve before widespread implementation can be both cautious and effective.

Fortunately, there are protections in place to authenticate sources of data so that fraud or incorrect information can be uprooted and corrections propagated--all while spreading info about the bad guys in a way that enables organizations to take action on the data. But the edge cases matter: if a site I control is somehow hijacked, blacklisting the site also means I can’t go in and fix it. Rules need to be in place, protocols designed and matched.
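As a sketch of what source authentication can look like, the snippet below signs each feed record with a shared secret and verifies it before any action is taken. This is a simplified assumption for illustration; real threat intelligence ecosystems typically lean on TLS and certificate-based trust rather than a bare HMAC scheme.

```python
# A minimal sketch of authenticating a feed record's origin: the
# publisher signs each record with a shared secret provisioned out of
# band, and consumers verify before acting on it.
import hashlib
import hmac

SHARED_SECRET = b"provisioned-out-of-band"  # placeholder secret

def sign(record: bytes) -> str:
    return hmac.new(SHARED_SECRET, record, hashlib.sha256).hexdigest()

def verify(record: bytes, signature: str) -> bool:
    # compare_digest avoids timing side channels during comparison
    return hmac.compare_digest(sign(record), signature)

record = b'{"block_ip": "198.51.100.7"}'
sig = sign(record)
assert verify(record, sig)           # authentic record is accepted
assert not verify(b"tampered", sig)  # altered record is rejected
```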

The STIX/TAXII approach to threat intelligence transfer is sanctioned by the U.S. Computer Emergency Readiness Team, also known as US-CERT. It’s put into practice by a growing number of products that are designed to enhance each other, although the actions taken on that information are still up to the administrative controls of the software.

Many threat intelligence vendors are putting amalgamation models based on the STIX/TAXII specification to work. A good example is the endpoint protection scheme offered by Carbon Black. Without going into strenuous detail, Carbon Black is an endpoint wrapping ecosystem that both records activity and mitigates threats. In turn, Carbon Black admins can take Anomali-fed amalgamated data inputs to augment the inputs received by the wrappers on its endpoint devices, potentially shortening the time it takes to recognize new or novel behaviors. The two companies are independent of each other, and the Anomali feeds likely augment the attack surfaces covered by Carbon Black’s own endpoint protection.
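As a sketch of what consuming such a feed looks like, the open source taxii2-client library can poll a TAXII 2.0 collection. The endpoint URL and credentials below are placeholders; any given vendor’s feed has its own endpoints and authentication scheme.

```python
# A hedged sketch of polling a TAXII 2.0 collection with the open source
# taxii2-client library (pip install taxii2-client). The URL and
# credentials are placeholders, not a real feed.
from taxii2client.v20 import Collection

collection = Collection(
    "https://taxii.example.com/api/collections/abc123/",  # placeholder
    user="feed-user",
    password="feed-password",
)

bundle = collection.get_objects()  # a STIX bundle, returned as a dict
for obj in bundle.get("objects", []):
    if obj.get("type") == "indicator":
        print(obj["pattern"])  # e.g. "[ipv4-addr:value = '198.51.100.7']"
```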

 

There are also formalized industrial networks that perform amalgamated information reporting, such as the Financial Services-Information Sharing and Analysis Center (FS-ISAC). This organization, protecting worldwide financial assets, holds exercises and training designed to address an industrial security need. It’s my prediction that we’ll see more industrial efforts that amalgamate threat data into what were once mutual protection societies against threats from the open Internet. Hanseatic Leagues of threat protection? It’s in our future.

 

Microsoft's Azure Blob Storage was on the wrong end of a recent hack, illustrating storage systems' vulnerability. Indeed, cybercriminals are at it again, and they are using Microsoft’s Azure Blob Storage service to carry out attacks. Azure Blob Storage is typically used to store and access unstructured data via both HTTP and HTTPS. When it connects through HTTPS, it shows Microsoft’s SSL certificate.

Here’s why that’s important: The cybercriminals are impersonating Microsoft. As a result, some users have encountered a well-known Microsoft Office 365 login field from a decoy website seemingly encrypted with a legitimate Microsoft SSL certificate--and ready to capture the login information entered by unsuspecting users.

“Even seasoned users who have been conditioned to look for the lock icon in the address bar can be tricked into entering their credentials,” said John Whetstone, a research architect for Cloud & Data Center Security at the independent NSS Labs.

They should have looked more closely, argues Tom Coughlin, chairman of the Storage Vision and Creative Storage Conferences. “By looking at the addresses you could see that this wasn't a valid Office 365 page, even though it appeared to have a valid SSL,” he said.

When it comes to the repercussions of such an attack, Whetstone says they can vary significantly.

“Depending on the account that was compromised, damage could range from negligible to catastrophic,” he said. “The compromise could lead to theft of intellectual property, financial records or personally identifiable information [PII]. Once a privileged account has been compromised, the sky is the limit.”

 

Whetstone added that unsecured blob storage repositories present real opportunities for bad actors looking to host malware or launch phishing campaigns. In this particular case, the Microsoft Azure storage service was chosen by the threat actor as the distribution mechanism, and the certificate was created to conceal the transaction.

Clearly, Coughlin says, black hat hackers are coming up with more sophisticated forms of attacks all the time. With that in mind, it’s more important than ever to scan content carefully.

Preventing these types of attacks also requires more user education and awareness about how to recognize illegitimate URLs. In addition, security teams should arm themselves with security technologies such as cloud access security brokers (CASB), secure web gateways (SWG), multi-factor authentication (MFA) and other forms of cyber threat protection.

Netskope, a cloud security company that has discovered similar schemes, recommends that companies always check the domain of links, be able to identify common object store domains and those used by Azure blob storage, use a real-time visibility and control solution along with multi-layered threat detection and remediation, and keep systems and antivirus up to date.
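Netskope’s first recommendation--checking whether a link actually points at a general-purpose object store rather than the brand it claims to represent--can be automated. Here is a minimal sketch; the domain list is a small illustrative sample, not an exhaustive one.

```python
# A minimal sketch of flagging links that resolve to generic object
# store domains. The suffix list is a small illustrative sample.
from urllib.parse import urlparse

OBJECT_STORE_SUFFIXES = (
    ".blob.core.windows.net",   # Azure Blob Storage
    ".s3.amazonaws.com",        # Amazon S3
    ".storage.googleapis.com",  # Google Cloud Storage
)

def looks_like_object_store(url: str) -> bool:
    host = (urlparse(url).hostname or "").lower()
    return any(host.endswith(suffix) for suffix in OBJECT_STORE_SUFFIXES)

# A "Microsoft" login page served from generic blob storage is suspect:
print(looks_like_object_store(
    "https://evil.blob.core.windows.net/o365/login.html"))  # True
```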

 

Cybercriminals moved in soon after Hurricane Michael slammed into the coast of the Florida panhandle in October with phishing exploits aimed at stealing email credentials of people who wanted to help in the aftermath of the storm, according to researchers with cybersecurity vendor Proofpoint. The fact that bad actors saw an opportunity in the chaos wrought by the monstrous hurricane--the storm made landfall with 155 mph winds--was not surprising, the researchers said. High-profile events that attract a lot of people and money tend to bring the attention of attackers seeing a chance to steal a lot of money and personal information in a short amount of time. Hurricane Michael fit that bill.

“This is extremely common,” Chris Dawson, threat intelligence lead at Proofpoint, told ITPro Today. “Everything from the Olympics to tax season to the holidays, as well as natural disasters, always turn up in spam, phishing and malware lures. Natural disasters, political campaigns and other events that inspire people to donate money are all prime targets for stealing funds directly or phishing payment information. However, any event that has wide recognition and can create a sense of urgency is a likely candidate for a lure.”

Cybersecurity solution vendors, federal and state government agencies, and organizations from AARP to banks continue to warn people about the threat of online scams in the wake of natural disasters. After Hurricane Florence ravaged the Carolinas in September, the U.S. Computer Emergency Readiness Team (US-CERT) issued a warning about such scams, urging people to “remain vigilant for malicious cyber activity seeking to exploit interest in Hurricane Florence. Fraudulent emails commonly appear after major natural disasters and often contain links or attachments that direct users to malicious websites. Users should exercise caution in handling any email with a subject line, attachments, or hyperlinks related to the hurricane, even if it appears to originate from a trusted source. … Users should also be wary of fraudulent social media pleas, calls, texts, donation websites, and door-to-door solicitations relating to the hurricane.”

 

AARP similarly has a website for members outlining the different ways bad actors try to take advantage of people eager to donate or help in other ways following a natural disaster, whether it’s a hurricane, tornado, flood or other event.

“Some of the bogus websites seek your credit card number to collect supposed donations, possibly also using that information later for identity theft,” the organization warned. “Others infect your computer with malware that can ferret out sensitive information, such as your account numbers or passwords.”

There are myriad examples of such scams, including those perpetrated during high-profile sporting events. The Olympic Games earlier this year in South Korea were a target for multiple attackers, including some nation-state groups suspected of being linked to the governments of North Korea and Russia. And organizations routinely put out warnings about fraudulent ticket websites that crop up around such events as the Super Bowl and World Series. This summer, for example, security firms including Kaspersky Lab, Radware and Check Point issued warnings about phishing and other scams surrounding the 2018 FIFA World Cup in Russia. The threats ranged from ticket schemes and data theft to nation-state sponsored cyberattacks.

Politics also is an easy target for scammers. A group of anti-malware researchers known as the Malware Hunter Team in September said it had detected a campaign named for former President Barack Obama that included both ransomware and malware used to mine the Monero cryptocurrency. The group also talked about similar campaigns leveraging the names of President Donald Trump and German Chancellor Angela Merkel.

This summer, researchers with Cisco Talos said bad actors who were likely part of the group of North Korean hackers called Group123 launched a spear-phishing campaign that took advantage of the summit between Trump and North Korean leader Kim Jong Un.

So, given all of this, it should not have been surprising that phishing campaigns cropped up during Hurricane Michael. What was unusual, according to Proofpoint analysts, was that, rather than trying to steal credit card numbers through fake donation websites or money through fraudulent donation requests, many of these campaigns sought to steal credentials--in some cases leveraging the Microsoft Azure cloud to host phishing templates.

“The phishing schemes stand out because the threat actors are directing recipients to credential theft pages for both corporate and personal email rather than credit card or financial theft,” the researchers wrote in a blog post. “This is consistent with dramatic increases we have observed recently in corporate credential phishing. However, this should also serve as a warning for recipients who are accustomed to entering email credentials to log into multiple services. Threat actors are capitalizing on both this desensitization and our desire to do good. While none of these are new tactics on their own, the combination is of interest to defenders and potential victims.”

Proofpoint’s Dawson said anyone looking to donate to a charitable organization or seeking help “should go directly to websites associated with known disaster relief organizations and should never enter webmail or social media credentials to enable donations.”

In addition, people should be wary of unsolicited emails about major events and avoid clicking on links or opening attachments from unknown senders even if they appear to relate to events of interest, he said. Organizations should use layered defenses at the email gateway, network edge and endpoint to protect against malicious content, links and other threats. 

 

Several self-encrypting solid-state drives from two major manufacturers have security flaws that could allow hackers to access data without knowing users’ passwords.

Researchers at Radboud University in the Netherlands found the flaws in Samsung T3 and T5 USB external disks, Samsung 840 EVO and 850 EVO internal hard disks, and Crucial (Micron)’s MX100, MX200 and MX300 internal hard disks. Affected drives protected only by the built-in encryption should be treated as unprotected for the time being, said researcher Bernard van Gastel. 

With self-encrypting drives, users expect data to remain fully protected unless someone has both the password and the drive itself. But with this flaw, it turns out that having the drive alone is enough to access all the data in readable form.

“In a technical sense, the encryption key used to make the contents unreadable isn’t related in any way to the password set by the user,” van Gastel explained. “If a malicious person has access to the drive, he can derive the encryption key from internal parts of the drive. Having to set a password gives users a false sense of security, which can be very dangerous.”
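For contrast, here is a minimal sketch of what a password-bound design looks like: the encryption key is derived from the password with a key-derivation function, so possession of the drive alone yields nothing. The parameters are illustrative assumptions; real disk encryption involves vendor-specific formats and hardware key wrapping.

```python
# A minimal sketch of binding the disk-encryption key to the password
# via a KDF. Parameters are illustrative, not a real drive's scheme.
import hashlib
import os

def derive_key(password: str, salt: bytes) -> bytes:
    # PBKDF2-HMAC-SHA256 with a high iteration count slows brute force
    return hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 200_000)

salt = os.urandom(16)  # stored on the drive; it need not be secret
key = derive_key("correct horse battery staple", salt)
# Without the password, the key cannot be recovered from the drive alone.
```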

To mitigate SSD security problems, the researchers recommend using software encryption in addition to the SSDs' built-in hardware encryption. At the same time, they warn that BitLocker, the encryption built into Microsoft Windows, can complicate the issue. Normally, BitLocker encrypts data in software, but depending on the configuration of the drive and Windows settings, it could disable the encryption in software and enable the built-in encryption in the SSDs. It’s important for storage professionals to know if such switches are taking place because they can severely weaken the protection of the data, depending on the drive used, van Gastel said.
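On Windows, one way to spot that switch is to ask BitLocker which encryption method each volume is using: “Hardware Encryption” in the output means BitLocker has deferred to the drive’s built-in self-encryption. The sketch below simply wraps the manage-bde command-line tool; it must run from an elevated prompt, and the exact output format can vary by Windows version.

```python
# A hedged sketch: check BitLocker's per-volume encryption method by
# parsing manage-bde output. "Hardware Encryption" means BitLocker has
# handed encryption off to the drive itself.
import subprocess

status = subprocess.run(
    ["manage-bde", "-status"], capture_output=True, text=True, check=True
).stdout

for line in status.splitlines():
    if "Encryption Method" in line:
        # e.g. "Encryption Method: XTS-AES 128" or "Hardware Encryption"
        print(line.strip())
```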

 

While it’s important for both manufacturers to update their firmware, researchers warn that it won’t completely fix the problem. In theory, van Gastel said, firmware can patch the problem, but issuing the right firmware fixes to actually solve this problem is expensive and difficult. What’s more, users often don’t update their firmware. Instead, van Gastel says, it’s safer not to rely solely on hardware encryption by adding software encryption to the mix. 

“If flaws are discovered in that protection mechanism, the data is basically unprotected, so it’s almost essential to have multiple protection mechanisms in place,” he said.

In addition, van Gastel recommends that organizations expand protection beyond technical solutions. It’s just as important, he said, to be aware of the specific data sets each employee can access and where that data is stored. In light of recent events, he also believes that organizations should think about limiting the type of data allowed on mobile devices. 

Now that the SSD security flaws are out in the open, van Gastel expects hackers to begin exploiting them until Micron and Samsung issue fixes. 

“We only used about $100 worth of equipment and public information to get our results, and a lot can be automated. We won't release such scripts, but past experience with other security issues have [taught us] it won't take too long for such scripts to appear on the Internet,” he said.

The researchers notified both Micron and Samsung about these SSD security flaws back in April and agreed to wait until Nov. 5 to disclose the information to the public. As of today, Samsung has issued an announcement recommending that users install encryption software for non-portable SSDs and update the firmware on portable SSDs. Micron is expected to release a firmware update for the MX300 on Nov. 13.

 

  • Intuit has sold its Quincy, Washington, data center to H5 Data Centers
  • The software maker behind TurboTax and QuickBooks has moved most of its core applications to Amazon Web Services
  • Intuit, which will keep a lot of computing infrastructure in the facility, expects to lose up to $80 million as a result of the sale
  • H5 is already marketing the massive data center campus to other wholesale tenants

H5 Data Centers has expanded its footprint in the Pacific Northwest with the acquisition of a large data center from Intuit, the Silicon Valley company known for its popular business and financial software products, such as QuickBooks and TurboTax.

Recently, Intuit has been moving much of its infrastructure to Amazon Web Services, and the sale of the 240,000-square-foot facility in Quincy, Washington, is one of the final steps in that process.

While getting the massive real estate asset off its books, Intuit is leaving a significant amount of computing capacity in the facility. Under a freshly signed agreement, the software company is “leasing back several megawatts” at the site from H5, Jenna Baker, the data center provider’s director of marketing, told Data Center Knowledge.

When enterprises migrate to the cloud, they often leave some computing infrastructure in their own or leased data centers, either because they have legacy applications that can’t be refactored for cloud, or because regulations require that they keep some customer data under their control.

The two companies’ announcements did not say how much H5 agreed to pay for the property, and Baker declined to disclose the number. But Intuit made the deal at a loss.

It expects the sale to result in an operating loss of $75 to $80 million, Intuit said in a statement. Real estate market analyst and DCK contributor Bill Stoller said the most likely explanation for the loss would be that Intuit had carried the asset on its books at up to $80 million over the sale price.

 

“This is one reason that more legacy data center sales have not occurred,” Stoller said. “Management doesn’t want to book a large loss on a huge asset.”

While enterprise data center sale or sale-leaseback deals have picked up in recent years, as more companies move infrastructure to the cloud or into colocation facilities, some of them continue holding on to the physical assets, whose value continues depreciating. “Of course, the longer companies wait, the worse it gets,” Stoller added.

Two Intuit spokespeople did not respond to a request to elaborate on the expected loss.

As is often the case, companies consider many factors besides the immediate cost of their physical data center assets when they decide to move to the cloud. In Intuit’s case, the big benefits AWS provides are elasticity of computing capacity and developer-friendliness.

“We chose to move to Amazon Web Services (AWS) to accelerate developer productivity and innovation for our customers, and to accommodate spikes in customer usage through the tax season,” H. Tayloe Stansbury, Intuit’s executive VP and CTO, said in a statement. “Our TurboTax Online customers were served entirely from AWS during the latter part of this tax season, and we expect to finish transitioning QuickBooks Online this year.”

Most of Intuit’s core applications are now in AWS, and “the time is right to transition the ownership and operation of this data center to a team who will expertly manage the infrastructure through the remainder of this transition.”

H5 said the data center was “move-in ready” for tenants other than Intuit.

Not all data center capacity the site is able to accommodate has been built out. The campus can provide more than 40MW at full build-out, H5 said.

This is H5’s second location in the region. The wholesale data center provider also has a 300,000-square-foot facility in downtown Seattle. Including Quincy, the company now has 12 locations around the US.

Quincy is primarily a wholesale data center market, home to a massive cluster of Microsoft data centers, as well as computing facilities run by Yahoo (which now operates under Verizon’s media brand Oath) and Dell, among others.

Other data center providers in Quincy are Silicon Valley-based Vantage Data Centers and Seattle-based Sabey Data Centers. Dallas-based CyrusOne acquired a large parcel of land in Quincy in 2016, but has yet to build anything on it, waiting until it closes a tenant for the site before it invests in construction.

“We have like 50 acres in Quincy,” CyrusOne CEO Gary Wojtaszek said on an earnings call this May. The company’s been talking to customers about taking the Quincy site and another site in Atlanta down, “but we’re not looking at standing anything up until we get a hard commitment from those customers.”

Quincy is attractive to data center customers because of cheap energy ($0.03 per kWh on average, according to H5) generated by hydroelectric plants and because of Washington State’s 100 percent sales and use tax abatement on data center equipment.

 

Microsoft Corp.’s earnings report and forecast cheered investors, providing further evidence the company can increase cloud sales and squeeze more profit from the area while cutting into Amazon.com Inc.’s massive industry lead. The shares rose to a record.

Profit and revenue in the period ended June 30 exceeded analysts’ estimates, as did Microsoft’s projection for cloud sales in the current quarter. Chief Financial Officer Amy Hood pledged that commercial cloud margins would improve overall and for each of the products that make up the area -- Azure, Office 365 and cloud-based customer software.

“The expectation was that margins were going down and that growth would decelerate -- you didn’t hear any of that,” said Mark Moerdler, an analyst at Sanford C. Bernstein & Co., who said he rates the shares “screaming outperform.”

Microsoft’s shares rose as much as 3.6 percent to a record $108.20 in New York Friday. The rally came after Hood unveiled a forecast that envisioned fiscal first-quarter Intelligent Cloud sales of as much as $8.35 billion, compared with an average analyst estimate of $7.95 billion. Even the company’s projection for higher operating expenses and capital spending to build more data centers couldn’t dampen enthusiasm as it was seen as a sign of customer demand for cloud products.

Chief Executive Officer Satya Nadella has been overseeing steady growth in the company’s Azure and Office 365 cloud businesses. Surveys of customer chief information officers by both Morgan Stanley and Sanford C. Bernstein published in the past month show an increase in companies signing up for or planning to use Microsoft’s cloud products. Revenue from cloud-computing platform Azure rose 89 percent in the quarter, while sales of web-based Office 365 software to businesses climbed 38 percent. Microsoft also saw a bump from relative improvements in the corporate personal-computer market, which has been stagnant for years.

“Azure has been hot and Office 365 too,” said Dan Morgan, a senior portfolio manager at Synovus Trust, which owns Microsoft shares. “Microsoft has made huge strides and done wonderful things to turn the company around. They were on a death track with hanging everything on the personal computer.”

Stock in the Redmond, Washington-based company rose 8 percent during the quarter, exceeding the 2.9 percent increase in the Standard and Poor’s 500 Index. Shares reached records throughout the period, and have continued to move higher since the quarter’s close.

Profit rose to $8.87 billion, or $1.14 a share in the fiscal fourth quarter, topping the $1.08 average estimate of analysts polled by Bloomberg. Sales climbed 17 percent to $30.1 billion, Microsoft said Thursday in a statement, higher than predictions for $29.2 billion. Annual sales also topped $100 billion for the first time in company history.

Commercial cloud sales rose 53 percent to $6.9 billion, the company said in slides posted on its website. Gross margin for that business widened by 6 percentage points to 58 percent. Microsoft has been posting improved profitability as it adds customers, enabling it to run services more efficiently and spread costs across more clients. With cloud demand rising, Microsoft has also said it will continue to invest. Hood said capital expenditures in the coming fiscal year would increase but at a slowing pace. The company will also boost operating expenses by 7 percent in the fiscal year that started July 1.

During the fourth quarter, the company also agreed to acquire code-sharing website GitHub Inc. for $7.5 billion in stock, aimed at accelerating moves into the cloud and artificial intelligence.

Microsoft’s tally of multimillion-dollar cloud deals was the highest ever in the recent period, Michael Spencer, general manager of investor relations, said in an interview, without providing specifics. Many of those deals included more than one cloud product, he said.

In a Morgan Stanley poll of 100 U.S. and European CIOs, 34 percent of respondents said they planned to buy a more expensive tier of Office 365 software in the next one to two years. Those using or planning to use Azure rose to more than 70 percent. Bernstein found 62 percent of CIOs said they used Azure as of June, up from 50 percent a little more than a year prior. That compares with 60 percent for market leader Amazon Web Services and 23 percent for Google cloud. The most recent survey from Synergy Research Group reported Microsoft gaining share more quickly than Amazon.

Still, the latest quarterly jump in Azure revenue decelerated from the 93 percent growth Microsoft posted in the prior period. Market-share surveys generally show Azure lagging far behind Amazon, which is at least three times bigger by that measure. The two companies are adding new cloud services and duking it out for customers as No. 3 U.S. player Google tries to catch up. Earlier this week, Microsoft said Walmart Inc., an Amazon retail rival, signed a five-year cloud deal involving Azure and Office 365.

Sales of Intelligent Cloud products -- Azure and server software -- rose 23 percent to $9.61 billion, above the $9.07 billion average estimate of four analysts polled by Bloomberg. Productivity software, mainly Office sales, rose 13 percent to $9.67 billion. That compares with the $9.64 billion average estimate.

While Microsoft has reorganized its structure and de-emphasized its Windows PC operating system efforts -- once the company’s flagship business -- corporate sales of the software still generate considerable revenue. That means the company benefited as PC shipments rose last quarter for the first time in six years, owing to strength in the business segment, which helped make up for continued declines among consumers, according to Gartner Inc. In the fourth quarter, revenue in the More Personal Computing unit rose 17 percent to $10.8 billion, compared with a $10.5 billion average estimate.

Surface hardware sales rose 25 percent from a year ago, and gaming revenue increased 39 percent, fueled by demand for third-party titles for the Xbox console. Gaming revenue for the full year topped $10 billion for the first time.

As the U.S. places tariffs on some foreign goods and countries retaliate with taxes of their own, Microsoft is keeping an eye on the issue, Spencer said.

“We operate globally so any time there’s trade wars, so to speak, we don’t like to see them,” he said. “In China we have good partnerships with companies there and are always looking for ways to expand. There’s nothing that’s of concern right now, but we are watching it closely.”

 

Microsoft has been certifying IT pros who use its products for the past few decades. Certification is a great way to force oneself to bear down and master IT products and platforms, with many companies hiring and promoting based on Microsoft certification credentials.

The technology world has changed dramatically in the past several years, and the continuing trend of migrating workloads to the cloud and away from on-premises operations has certainly increased the need for every IT pro to become cloud-proficient. While many Azure operations and services mirror those of other computing platforms, there's still quite a bevy of tasks that are unique to Azure. So, how should you approach learning about these issues? It’s simple: training and certification.

Microsoft has redeveloped its Azure certification program and delivered a new Azure Administrator certification path. Becoming a certified Azure Administrator is yet another way to future-proof your career in IT.

Azure Administrator certification involves only two exams.

There is a myriad of preparatory options for obtaining the education necessary for passing these exams, including those offered by Microsoft. One of the best (and FREE!) options is the Microsoft Azure Administrator Learning Path, which includes over 22 hours of training with hands-on labs.

[Any cloud pro worth his or her salt will look to valuable conferences to help hone the skills needed to pass certification exams. Conferences such as IT/Dev Connections 2018 (Oct. 15-18, 2018, in Dallas) provide Azure training delivered by people who work with the product every day and who are willing to share their experiences of the platform in a real customer setting. In addition, IT/Dev Connections attendees are provided multiple opportunities per day to connect directly with one another and with the speakers to resolve open questions and solve existing problems. Sign up for IT/Dev Connections 2018 now.]

 

Hollywood can now compute faster – that is, if it’s computing in Google’s cloud.

The three Google data centers hosting the company’s latest cloud availability region in Los Angeles are now online, the company announced this week.

Marketing it primarily to the area’s media and entertainment industry, Google is promising much better latency if clients in California use the new us-west2 region instead of the previously closest region in The Dalles, Oregon, called us-west1.

As a cloud-user category, media and entertainment “has some workloads that lend themselves very well to cloud,” Dominic Preuss, a Google Cloud product management director, said in an interview with Data Center Knowledge. These users often “have a short-term need for a huge amount of compute.”

A movie, animation, or digital effects studio, for example, may work on a project for a long time before it finally needs the computing muscle to render the final product. Once the rendering job is done, that capacity is no longer needed for a while, making it very expensive to keep in-house.

Media and entertainment has long been one of Google cloud’s strongest verticals, Preuss said. That’s thanks to two acquisitions.

In 2014, Google acquired Zync Render, whose technology enables visual-effects processing in the cloud; in 2016, it acquired Anvato, which enables content companies to stream and monetize video.

Zync and Anvato “definitely have contributed significantly to the fact that we are in the media and entertainment industry,” Preuss said.

 

Local customers cited in the L.A. region launch announcement include visual effects and animation companies Sony Pictures Imageworks, Framestore, and The Mill, as well as the City of Los Angeles.

With the new region, Google is “doubling down on our relationship with all the studios, and creative people, and all the work being done out of the Los Angeles area,” Preuss said.

Los Angeles is Google cloud’s fifth availability region in the US and seventeenth worldwide. Hosting applications there instead of Oregon can improve latency by up to 80 percent for users “across Northern California and the Southwest,” Kirill Tropin, a Google product manager, wrote in a blog post announcing the launch.

As it did with most previous region launches, Google launched the L.A. region with three availability zones, or three data centers. The company hasn’t disclosed exactly where the data centers are or which data center providers host the infrastructure. Preuss declined to provide any additional details on the subject.

Private network connectivity to the cloud region is available at Equinix LA1 and CoreSite LA1 data centers.

Debuting along with the new region is Google Cloud Platform’s new managed file storage service. Designed for applications that don’t support object storage (the most widespread way to store and access data in the cloud), Cloud Filestore is Google’s answer to Amazon Web Services’ Elastic File System and Microsoft’s Azure Files.

Internally, Google has always used object storage, Preuss said, but as it tries to close more business with enterprise customers, it needs to provide features enterprises need, and file storage is one of those features. “There’s a number of these traditional workloads that require distributed file [storage],” he said.
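The appeal is easy to see in code: once a Filestore share is mounted over NFS, a legacy application keeps using ordinary POSIX file I/O, with no object-storage SDK and no refactoring. The mount point and file names below are assumed placeholders.

```python
# A minimal sketch of why managed file storage suits legacy workloads:
# with the share NFS-mounted at a path, apps just use POSIX file I/O.
import os

MOUNT_POINT = "/mnt/filestore"  # placeholder: assumed NFS mount of the share

render_dir = os.path.join(MOUNT_POINT, "render")
os.makedirs(render_dir, exist_ok=True)

# Legacy applications simply read and write files in place:
with open(os.path.join(render_dir, "frame_0001.exr"), "wb") as f:
    f.write(b"\x00" * 64)  # stand-in for real frame data

print(os.listdir(render_dir))
```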

Among those enterprises are many media and entertainment companies in Southern California. To date, Google has relied on partners like the storage giant NetApp to provide file storage capabilities to those customers.

 

But, Preuss explained, the cloud business has now reached a level of scale where customers are increasingly looking for cloud storage provided by Google directly, complete with Google SLAs and access to Google’s enterprise support services, such as Customer Reliability Engineers.

The company isn’t planning to end its relationships with partners like NetApp. Filestore, currently in beta, is simply another option available to customers, he said.

 

Shares of data networking companies including Cisco Systems Inc. tumbled after a report that Amazon.com Inc.’s cloud unit may begin selling its own switching devices.

Amazon Web Services’ plan to enter the $14 billion global market for the equipment that helps shepherd traffic around networks signals the company may become more entrenched in the enterprise computing marketplace, The Information reported, citing a person with knowledge of the discussions. It would pose a formidable challenge to existing players including Juniper Networks Inc. and Arista Networks Inc. because AWS would price the so-called white-box switches at 70 percent to 80 percent less than comparable ones from Cisco, the news site reported.

Cisco relies on hardware for more than half of its revenue, and switches and routers comprise its two biggest hardware products. Its shares fell as much as 6.1 percent to $40.94 Friday in New York. Juniper Networks fell as much as 3.9 percent to $27.40, and Arista slipped as much as 6.2 percent to $261.16.

Every time Amazon makes a move into a new industry, those companies’ stocks shudder as the market considers the consequences on market share and pricing power. Shares of drugstore companies including Walgreens Boots Alliance Inc. and CVS Health Corp. sank almost 9 percent on news in late June that Amazon was acquiring prescription-drug delivery company PillPack. Grocers like Kroger Co. plunged when Amazon acquired Whole Foods Market for more than $13 billion in June 2017.

 

Enterprises are prioritizing performance over cost when it comes to cloud migration, moving mission-critical applications to the public cloud as part of a broader multi-cloud strategy.

A survey of 727 cloud decision-makers released this week by Virtustream said that 60 percent of enterprises are now migrating or have already migrated mission-critical applications to the cloud. Eighty-six percent of respondents describe their current cloud approach as a multi-cloud strategy.

Roberto Mircoli, CTO of Virtustream EMEA, said that the findings of the survey, conducted by Forrester Consulting, are in line with what he hears from the enterprise customers he talks to every day in his role. He said this second wave of cloud migration represents a new level of cloud maturity.

“In the previous five to ten years in our industry, the IT industry has been learning about the cloud, learning about the opportunities, the limitations, the best fits for what works with the cloud, and so on,” he said. Now, enterprises are more sophisticated in terms of how they select cloud providers based on the different applications they have.

“That is why there is significant adoption of multi-cloud, considering a multitude of solutions for each business,” he said. “Each of them is best suited to address different workloads and applications you can expect in a typical enterprise environment.”

Virtustream, as part of Dell EMC, has been advocating for a multi-cloud strategy for years, which makes sense since it provides tools and managed services that help enterprises plan cloud migrations across cloud vendors.

Enterprises in the survey said that they prioritize performance, compliance and security over cost when matching workloads to cloud environments.

“Cloud is no longer considered just a technology discussion or just a selection of technology options, it is really an operating model for organizations,” he said. Due to that, IT pros and the C-suite work together to select vendors and determine a multi-cloud strategy, the survey said.

Nearly three in four enterprises said that they plan to re-evaluate their cloud strategy within the next two years as business objectives shift and factor into cloud technology plans.

Forty-two percent of respondents said that operational efficiency is their top business objective, followed by innovation and growth.

“We have conversations with customers who have a deliberate cloud strategy, a very coordinated, intentional one,” Mircoli said. “But we also have conversations with customers around cloud that have a more incremental, more ad hoc approach. They move from cloud initiative to the next, basically building on top of the outcomes of each. Neither one is good or bad, it’s just different approaches to [the] cloud.”

A multi-cloud strategy helps enterprises improve IT infrastructure management and flexibility (33 percent), improve IT cost management overall (33 percent), and achieve better security and compliance (30 percent).

Nearly half of enterprises with more than 1,000 employees said that they spent at least $50 million annually on cloud projects, a number set to rise over the next two years as the pace of cloud development speeds up.

This growth is reflected in other budget areas as well, according to separate research by McAfee, which found security budgets will shift more spending to securing cloud environments.
