Deloitte Is the Latest Target of a Cyber Attack With Confidential Client Data at Risk

Global accountancy firm Deloitte has been hit by a sophisticated hack that resulted in a breach of confidential information and plans from some of its biggest clients, Britain’s Guardian newspaper said on Monday.

Deloitte—one of the Big Four professional services firms—confirmed to the newspaper that it had been hit by a hack, but said only a small number of its clients had been impacted.

The firm discovered the hack in March, according to the Guardian, but the attackers could have breached its systems as long ago as October or November 2016.

The attack was believed to have been focused on the U.S. operations of the company, which provides auditing, tax advice, and consultancy to multinationals and governments worldwide.

“In response to a cyber incident, Deloitte implemented its comprehensive security protocol and began an intensive and thorough review including mobilizing a team of cybersecurity and confidentiality experts inside and outside of Deloitte,” a spokesman told the newspaper. “As part of the review, Deloitte has been in contact with the very few clients impacted and notified governmental authorities and regulators.”

A Deloitte spokeswoman declined immediate comment, saying that the firm would issue a statement shortly.

Tech

The SEC Hack Shows That Not Even Top Government Data Is Safe

A major computer hack at America’s top stock market regulator is the latest sign that data stored in the highest reaches of the U.S. government remains vulnerable to cyber attacks, despite efforts across multiple presidencies to limit breaches so frequent that many consider them routine.

In recent years, nation-state and criminal hackers, as well as rogue employees, have stolen data from the Internal Revenue Service, the State Department and intelligence agencies, including millions of government employee files allegedly exfiltrated by the Chinese military, U.S. officials say.

The Securities and Exchange Commission (SEC), America’s chief stock market regulator, said on Wednesday that cyber criminals may have used data stolen last year to make money in the stock market, making it the latest federal agency to grab headlines for losing control of its data.


At the same time, there is nothing special about being merely the latest major breach, said Dan Guido, chief executive of Trail of Bits, which does cybersecurity consulting for the U.S. government.

“It simply reflects the status quo of our digital security,” said Guido, who is a former member of the cybersecurity team at the Federal Reserve, America’s central bank.

Central bank officials have detected dozens of cases of cyber breaches, including several in 2012 that were described internally as “espionage.”

The U.S. federal government has sharply increased funding dedicated to protecting its own digital systems over the last several years, attempting to counter what is widely viewed as a worsening national security liability.

But as one of the world’s largest collectors of sensitive information, America’s federal government is a major target for hackers from both the private sector and foreign governments.

“When you have one central repository for all this information – man, that’s a target,” said Republican Representative Bill Huizenga, chairman of the House subcommittee on Capital Markets, Securities, and Investment, which oversees the SEC.

Last year, U.S. federal, state and local government agencies ranked last in cybersecurity when compared against 17 major private industries, including transportation, retail and healthcare, according to benchmarking firm SecurityScorecard.

An update of the rankings in August showed the U.S. government had improved to third worst, ahead of only telecommunications and education.

“We also must recognize – in both the public and private sectors, including the SEC – that there will be intrusions, and that a key component of cyber risk management is resilience and recovery,” said SEC Chairman Jay Clayton.

The federal government audits cybersecurity measures every year at top agencies, producing reports that routinely expose shortfalls and sometimes major breaches. The Federal Bureau of Investigation also looks for hacking attempts and helped spot an alleged intrusion by Chinese military-backed hackers into a major banking regulator between 2010 and 2013.

Weekly scans of government systems by the Department of Homeland Security showed in January that the SEC had critical cybersecurity weaknesses, but that vulnerabilities were worse at three other agencies: the Environmental Protection Agency, the Department of Health and Human Services and the General Services Administration.

Some agencies said they had improved their cybersecurity posture since that report.

A GSA spokeswoman said the agency has not had any critical vulnerabilities in the past six months, and that the ones identified in January were patched in under 10 days.

A Department of Labor spokesman said all identified vulnerabilities had been fixed and that its systems were not compromised by the identified flaws.

But, he added, “addressing vulnerabilities associated with legacy systems can be challenging.”

Tech

Data center construction increases thanks to the cloud

A new report from a real estate firm that specializes in data center construction and leasing says North American data center construction is up 43 percent over the same period in 2016, and that industry consolidation has driven $10 billion in mergers and acquisitions (M&A) so far.

Jones Lang LaSalle just published its report on the North America data center market, highlighting trends such as consolidation, enterprise hybrid cloud, security, and high-performance computing.

While construction continues at a record clip, the report also found that absorption of data center space available for lease has returned to normal levels after record leasing in 2016: many of the cloud providers are still digesting the capacity they picked up last year.

Network World Cloud Computing

Enterprises can put Oracle’s entire public cloud in the data center

While Amazon is raking in the lion’s share of money spent by public-cloud users, Oracle is doubling down on its hybrid-cloud strategy, appealing to enterprises that want to put data and applications behind their firewall while taking advantage of cloud pricing models and technology.

Oracle has greatly expanded the services available through its on-premises Cloud at Customer offering so that they are essentially at parity with what the company has on its public cloud. The company announced Tuesday that a broad portfolio of SaaS (software as a service) applications as well as PaaS (platform as a service) and Oracle Big Data Machine services are now available via Cloud at Customer.

Network World Cloud Computing

IDG Contributor Network: Accelerating Organizational Velocity through a Data Center Autopilot

Understanding the impact of the data center autopilot

Current state of the art and my disappointment with traditional databases aside, I mentioned in my comments last week that the data center autopilot will have big consequences. It seems to me that there is not enough recognition of its likely impact. The tactical observations are that automation will reduce people costs, at least on a per-workload basis, and will:

  • Minimize over-provisioning,
  • Help reduce downtime,
  • Help to manage SLAs, and
  • Improve transparency, governance, auditing and accounting.

That is all true, but it’s not the big story: The overall strategic impact is to significantly accelerate organizational velocity. The acceleration is partly as a result of the above efficiencies, but much more importantly as a consequence of automated decisions being made and implemented orders-of-magnitude faster than manual decisions can be. Aviation autopilots do things that human pilots are not fast enough to do. They are used to stabilize deliberately unstable aircraft such as the Lockheed F-117 Nighthawk at millisecond timescales, and deliver shorter flight times by constantly monitoring hundreds of sensors in real time and optimally exploiting jetstreams.

CIO Cloud Computing

IDG Contributor Network: How to avoid downtime and disruption when moving data

Business Continuity Awareness Week 2017 is here, and hopefully it will present a fresh opportunity to review some of the cloud’s limitations in this area.

Some 60 percent of all enterprise IT workloads will be run in some form of public or private cloud by as soon as next year, according to 451 Research’s latest estimate. It projects particularly strong growth in critical categories, including data analytics and core business applications. Findings from IDC, Gartner and Forrester present broadly the same picture—that the cloud is rapidly becoming central rather than peripheral to general IT provision.

Network World Cloud

Data centers decline as users turn to rented servers

Data centers are declining worldwide both in numbers and square footage, according to IDC — a remarkable change for an industry that has seen booming growth for many years.

Users are consolidating data centers and increasingly renting server power. These two trends are having a major impact on data center space.

The number of data centers worldwide peaked at 8.55 million in 2015, according to IDC. That figure began declining last year and is expected to drop to 8.4 million this year. By 2021, the research firm expects there to be 7.2 million data centers globally, more than 15 percent fewer than in 2015.
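
As a sanity check, IDC's stated decline can be reproduced with a couple of lines of arithmetic, using the 8.55 million and 7.2 million figures quoted above:

```python
peak_2015 = 8.55e6       # data centers worldwide at the 2015 peak, per IDC
projected_2021 = 7.2e6   # IDC's projection for 2021

decline = (peak_2015 - projected_2021) / peak_2015
print(f"{decline:.1%}")  # → 15.8%, i.e. "more than 15 percent fewer"
```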

InfoWorld Cloud Computing

Why Google BigQuery excels at BI on big data concurrency

Network World Cloud Computing

How much data can a fiber carry? Facebook and Nokia are pushing it

Facebook and Nokia have found a way to push a lot more data through a submarine cable across the Atlantic, which could help the social network keep up with the growth of video and virtual reality.

On a 5,500-kilometer (3,400-mile) cable between Ireland and New York, the companies tested a new technique developed at Nokia Bell Labs for increasing the efficiency of fiber-optic cables. They say it comes close to the absolute limit for sending bits over a fiber.

Facebook CEO Mark Zuckerberg has said VR is the future of social media. If it is, the networks that link consumers and data centers will have more data than ever to carry. Higher-resolution video is also increasing the burden on networks. For example, Netflix recommends subscribers have at least a 5Mbps broadband connection to stream HD video and 25Mbps for Ultra HD (4K) streams.
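
Those bitrate recommendations translate directly into traffic volume. A back-of-the-envelope sketch (not from the article; the function name is invented for illustration) of what one hour of streaming means for a network:

```python
def stream_gigabytes(mbps: float, hours: float = 1.0) -> float:
    """Data volume for a stream at the given bitrate, in decimal gigabytes."""
    # megabits/s -> bits -> bytes -> gigabytes
    return mbps * 1e6 * hours * 3600 / 8 / 1e9

print(stream_gigabytes(5))   # HD at 5 Mbps  → 2.25 GB per hour
print(stream_gigabytes(25))  # 4K at 25 Mbps → 11.25 GB per hour
```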

Computerworld Cloud Computing

How IBM wants to bring blockchain from Bitcoin to your data center

At its InterConnect conference in Las Vegas this week, IBM is announcing new features for its open source cloud-hosted blockchain service in an attempt to bring this distributed database technology from its initial use of powering Bitcoin to a broader market, including the financial services industry.

Blockchain is a distributed database that maintains a continually growing list of records that can be verified using hashing techniques. Vendors such as IBM and Microsoft are attempting to commercialize it by offering customers a platform for hosting their own implementations. Analysts say the market to do so is just emerging.
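
The "continually growing list of records that can be verified using hashing techniques" can be illustrated with a toy hash chain. This is a minimal sketch of the idea only, not IBM's Hyperledger Fabric; all function names here are invented for illustration:

```python
import hashlib
import json

def block_hash(body: dict) -> str:
    # Hash a canonical JSON encoding so the digest is deterministic
    return hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()

def append_block(chain: list, record: str) -> None:
    # Each new block commits to the hash of its predecessor
    prev = chain[-1]["hash"] if chain else "0" * 64
    body = {"index": len(chain), "record": record, "prev_hash": prev}
    chain.append({**body, "hash": block_hash(body)})

def verify(chain: list) -> bool:
    # Every block must hash correctly and point at its predecessor's hash
    for i, block in enumerate(chain):
        body = {k: v for k, v in block.items() if k != "hash"}
        prev = chain[i - 1]["hash"] if i else "0" * 64
        if block["hash"] != block_hash(body) or block["prev_hash"] != prev:
            return False
    return True

chain: list = []
for rec in ["pay Alice 5", "pay Bob 3"]:
    append_block(chain, rec)
print(verify(chain))              # True: untampered chain verifies
chain[0]["record"] = "pay Eve 99"
print(verify(chain))              # False: tampering invalidates the chain
```

Editing any historical record breaks its block's hash, which is exactly the property that makes the list verifiable.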

IBM has supported blockchain implementations for more than a year, but this week the company is announcing a beta of version 1.0 of its service, which is based on the open source Hyperledger Fabric, a Linux Foundation project. It’s available in IBM’s Bluemix Cloud. IBM says Hyperledger can process up to 1,000 transactions per second.

Computerworld Cloud Computing

IDG Contributor Network: Cloud war collateral: What the rise of AWS, Azure has meant for data centers

When Henry Ford introduced the Model T in the fall of 1908, he likely didn’t comprehend the full scope of the events he would set in motion. By 1914, Ford’s production line had reduced assembly times from 12 hours to less than two and a half, slashed the going price of an automobile, and redefined the working wage of factory employees, ultimately putting more than 15 million Model Ts on the road and igniting the entire automotive industry in the years to come.

Competition often leads to innovation and progress for other industry players. One modern equivalent of this can be seen in the rise of public and private cloud providers like Amazon and Microsoft. AWS’ sales numbers recently topped $12 billion, up nearly 55 percent from the same period last year. Meanwhile, Microsoft continues to push ahead and is projected to reach $20 billion in annual cloud revenue by June 2018. As these powerhouses and others like Oracle and Google continue to see widespread adoption across industries, other players have stepped in to claim their piece of the $204 billion cloud infrastructure pie, leading to an ecosystem of cloud and data center partners that continue to push the technology envelope and expand the capabilities of these offerings.

Network World Cloud Computing

Google Cloud Search helps enterprise users find data quickly

Google is wooing enterprise customers with the forthcoming launch of a service that will let employees find information they need from multiple sources.

Cloud Search is a new service that will allow users to find content from their company email, cloud storage and directory. Directory lookup provides users not only with their colleagues’ contact details, but also information about shared files and calendar events. More than that, Cloud Search is also built to proactively help users access information they need.

When users log into Cloud Search either on the web or on their Android device, they’ll be greeted by “assist cards” that are supposed to highlight key files. At launch, those cards are built to show users files that are relevant for their upcoming calendar events, as well as those that require attention based on recent edits.

Network World Cloud Computing

Microsoft’s standing to sue over secret US data requests in question

Microsoft’s lawsuit objecting to the indiscriminate use by U.S. law enforcement of orders that demand user data without the opportunity to inform the customer may run into questions about the software giant’s standing to raise the issue on behalf of its customers.

A government motion to dismiss Microsoft’s complaint comes up for oral argument on Monday, and significantly, the judge said on Thursday that the issue of whether Fourth Amendment rights are personal or can be “vicariously” asserted by third parties on behalf of their customers would have to be addressed by both sides. The Fourth Amendment to the U.S. Constitution prohibits unreasonable searches and seizures of property.

InfoWorld Cloud Computing

Businesses eye cloud for big data deployments

As 2016 draws to a close, a new study suggests big data is growing in maturity and surging in the cloud.

AtScale, which specializes in BI on Hadoop using OLAP-like cubes, recently conducted a survey of more than 2,550 big data professionals at 1,400 companies across 77 countries. The survey was conducted in conjunction with Cloudera, Hortonworks, MapR, Cognizant, Trifacta and Tableau.

AtScale’s 2016 Big Data Maturity Survey found that nearly 70 percent of respondents have been using big data for more than a year (compared with 59 percent last year). Seventy-six percent of respondents are using Hadoop today, and 73 percent say they are now using Hadoop in production (compared with 65 percent last year). Additionally, 74 percent have more than 10 Hadoop nodes, and 20 percent have more than 100 nodes.

Network World Cloud Computing

MariaDB adds support for big data analytics

MariaDB today moved to unite transactional and analytical processing in a single relational database with the announcement of the general availability of its open source MariaDB ColumnStore 1.0.

“What we’re offering is a single SQL interface for both OLTP and analytics,” says David Thompson, vice president of Engineering at MariaDB.

“MariaDB ColumnStore is the future of data warehousing,” Aziz Vahora, head of Data Management at Pinger, a specialist in mobile communication apps, added in a statement today. “ColumnStore allows us to store more data and analyze it faster. Every day, Pinger’s mobile applications process millions of text messages and phone calls. We also process more than 1.5 billion rows of logs per day. Analytic scalability and performance is critical to our business. MariaDB’s ColumnStore manages massive amounts of data and will scale with Pinger as we grow.”

CIO Cloud Computing

Skyhigh Networks adds threat protection and data loss prevention capabilities to the cloud  

This column is available in a weekly newsletter called IT Best Practices.

Every time I read the quarterly Cloud Adoption & Risk Report published by Skyhigh Networks, I come across some tidbit of information that truly surprises me. What is it in the Q4 2016 report that has me so astounded? Consider this: Fewer than half (42%) of cloud providers explicitly specify that customers own the data they upload to the service. The rest of the providers either claim ownership over all data uploaded, or don’t refer to data ownership at all in their terms and conditions, leaving the question open to dispute if service is discontinued.

Network World Cloud Computing

ZoneSavvy taps big data to help SMBs find best sites for businesses

Location, location, location: As the old joke goes, those are the three keys to business success. Now, with big data analysis, corporations can be smarter than ever before about where to open up new offices or businesses.

But what if you run a mom-and-pop shop, or you’re dreaming of quitting your corporate job and opening a boutique? Even medium-size businesses do not have the money to spend on the sort of systems and analysis teams that corporate behemoths use to locate new businesses.

This is where ZoneSavvy, a new website created by software engineer Mike Wertheim, could help. The site is straightforward: You enter a business type, the ZIP code of the general area where you want to locate the business, and the distance from that ZIP code you are willing to consider. ZoneSavvy then gives you suggestions for which nearby neighborhoods would be the best locations for your business.

Network World Cloud Computing

AeroVironment’s Quantix drone is all about the data

In the age of technology, businesses are all chasing efficiency. That’s exactly what AeroVironment promises to deliver with its new Quantix drone.

The technology, a combination of a drone and cloud-based analysis service, can be useful for farmers, says Steve Gitlin, vice president of corporate strategy at AeroVironment.

“In many cases, farmers rely on themselves or their people to walk the fields, and if they’re managing large fields in excess of 100 acres or so, then it’s very difficult to walk the entire field in any given unit of time. So they have to rely on their deep experience and sampling.”

InfoWorld Cloud Computing

IBM’s Watson to use genomic data to defeat drug-resistant cancers

IBM’s Watson artificial intelligence platform has joined forces with researchers at MIT and Harvard to study how thousands of cancers mutate to become resistant to drug treatments that initially worked to beat back the disease.

By discovering how cancers adapt to overcome drug therapies, researchers at MIT’s and Harvard’s Broad Institute genomics research center hope to develop a new generation of therapies that cancers cannot circumvent.

While a growing number of treatments can hold cancers in check for months or years, most cancers eventually recur, according to the Broad Institute researchers. This is in part because tumors acquire mutations that make them drug resistant.

Computerworld Cloud Computing