The market for containers – which are used by developers to package applications – is “exploding” with a slew of startups looking to compete with legacy vendors who are increasingly prioritizing the technology.
Despite more and more companies outsourcing workloads to the public cloud, legacy technology stalwarts Cisco and HPE remain the most popular enterprise infrastructure vendors, new estimates from Synergy Research suggest.
Synergy tracked enterprise infrastructure spending across seven categories for the 12 months ending Q3 2016: data center servers, switches & routers, network security, voice systems, WLAN, UC apps, and telepresence. In aggregate, it estimates revenues across these segments were $88 billion, down about 1% from the same period in 2015.
After more than half a decade in the making, ARM server chips should have struck gold by now, but they haven’t. ARM servers were projected to be approaching a double-digit share of the server market, but they remain virtually non-existent.
Keeping those market realities in mind, Qualcomm said earlier this year that it would take a wait-and-watch approach before making a splash with its ARM server chips. Despite poor ARM server adoption so far, Qualcomm has decided to go ahead and launch the chips.
In the making for two years, Qualcomm’s Centriq 2400 server chips have 48 cores and are now being sampled to customers. Volume shipments will start in the second half of next year.
Companies signed IT and business process outsourcing deals worth a record $9.5 billion in annual contract value, according to the quarterly outsourcing index produced by Information Services Group (ISG). Traditional outsourcing contracts were up 5 percent to $5.8 billion, while the fast-growing as-a-service segment leapt 20 percent to $3.7 billion, according to ISG.
“Most conversations we’re having with clients are cloud-led,” says John Keppel, ISG president for Europe, the Middle East and Africa. “Enterprise customers are first seeking to be more cost efficient so that they can redeploy funds into digital and cloud activities. But we’re also seeing them pivot quickly to broader digital transformation initiatives that are focused on creating a customer-first environment that is intelligent and mobile. This is a very long game and we are in the early stages at this point.”
The accounting software industry is, contrary to what you might think, a pretty interesting place. Characterized by three large vendors that hold the lion’s share of the market in their respective geographies, until recently it has been a fairly sedate place, with Intuit (U.S.), Sage (U.K.) and MYOB (Australasia) happy to organically grow their businesses. That all changed around 10 years ago when Xero, a New Zealand-based startup, came on the scene and started taking well-aimed kicks at the three sleeping bears.
Since then, Xero has gone on to take significant market share in its home market of Australasia, has won respectable market share in the U.K., and is trying its hardest against a newly invigorated competitor to gain a toehold in the all-important U.S. market. That isn’t proving quite as easy as in Australasia and the U.K., owing to some structural and competitive issues, and to the fact that Intuit is, at last, doing a fantastic job of innovating and doing what it needs to do to keep market share.
SAN DIEGO – Cisco this week is throwing its hat into the hyperconvergence and software-defined storage ring with a system co-developed with software company SpringPath.
Cisco is also rolling out at its Cisco Partner Summit here a new generation of Nexus 9000 data center switches featuring 25G/50G Ethernet based on custom ASICs. The new products dovetail with Cisco’s acquisition today of CliQr, a maker of “application-defined” hybrid cloud orchestration software for deploying and managing applications across bare metal, virtualized and container environments.
Less than a week after Hewlett-Packard unceremoniously withdrew from the IaaS public cloud market, another legacy IT stalwart has jumped in.
In a grandiose display at the company’s OpenWorld conference in San Francisco, Oracle executives, led by czar/Chairman Larry Ellison, made an ambitious, weeklong pitch for the company’s cloud. It stretches across all types of cloud options – from private on-premises IaaS systems to an elastically scalable public cloud, plus a suite of SaaS and application development PaaS components.
Sinclair Schuller is the CEO and cofounder of Apprenda, a leader in enterprise Platform as a Service.
When the phrase “hybrid cloud” is mentioned, some technologists will tell you it is the eventual end state of cloud computing, while other technologists chuckle. Those that chuckle typically view hybrid as a phrase used by vendors and customers who have no cloud strategy at all. But hybrid is real and here to stay. Not only is it here to stay, but the hybrid cloud will also reshape cloud computing forever.
People today imagine the public cloud to be an “amorphous, infinitely scalable computing ether.” They think moving to the cloud rids them of the need to deal with computing specifics, and that the cloud makes them location-, risk- and model-independent. They think enterprises that move to the cloud no longer need to depend on pesky IT departments or deal with the risks associated with centralized computing. This perception of computing independence and scale couldn’t be further from the truth.
The promise of cloud is one where anyone who needs compute and storage can get it in an available, as-needed, and robust manner. Cloud computing providers have perfected availability to the point where, even with occasional mass outages, they outperform the service-level agreements (SLAs) of internal IT departments. This does come at a cost, however.
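The availability claim above is easy to make concrete with back-of-the-envelope SLA arithmetic. The sketch below uses illustrative figures (the 99.5% and 99.95% values are assumptions for the example, not any specific provider’s published SLA) to convert an availability percentage into the downtime it permits per year:

```python
# Back-of-the-envelope SLA arithmetic. The SLA percentages below are
# illustrative assumptions, not any specific provider's published terms.
MINUTES_PER_YEAR = 365 * 24 * 60  # 525,600

def max_downtime_minutes(availability_pct: float) -> float:
    """Maximum minutes of downtime per year a given SLA permits."""
    return MINUTES_PER_YEAR * (1 - availability_pct / 100)

for label, sla in [("illustrative internal IT target", 99.5),
                   ("illustrative public-cloud SLA", 99.95)]:
    print(f"{label}: {sla}% -> {max_downtime_minutes(sla):,.0f} min/yr")
```

At these example numbers, a 99.5% target tolerates roughly 2,600 minutes of downtime per year, while a 99.95% SLA tolerates roughly 260 – an order-of-magnitude difference, which is why even occasional mass outages can still leave a provider ahead of internal targets.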
Cloud computing is arguably the largest centralization of technology the world has ever seen or will see. Many people don’t immediately realize that the cloud is centralized – something that should be heavily scrutinized – possibly because the marketing behind cloud is vague and lacks any description of a tangible “place.” Don’t be fooled.
When an enterprise selects a cloud vendor, it is committing to that provider in a meaningful way. As applications are built for or migrated to a cloud, switching costs get very high. The nature of this market is driven by a network effect: assuming all else is equal, each prospective customer of a cloud provider (AWS, Microsoft, etc.) benefits by consuming a cloud that has many customers over one that has fewer, since a large customer base indicates lower risk and helps drive the economies that make a given cloud attractive.
If we play this future out, we’ll likely see the cloud infrastructure market collapse to just a few massive, global providers. This will be driven partly by the success of individual providers and partly by the consolidation of smaller players who have great technology but simply can’t compete at that scale. For a recent example, look at EMC’s acquisition of Virtustream just prior to Dell’s acquisition of EMC.
A look at recent market share estimates shows exactly that, with Amazon, Microsoft, IBM, and Google accounting for 50 percent of the global cloud infrastructure market. One day, these vendors will likely account for 80 percent of the market. Compare that to the often-criticized banking world: despite the massive size of today’s banks, the list of banks that hold 50 percent of global deposits is much longer than four names. If we applied the same standard to cloud computing, we’d certainly be infuriated and demanding that these “too big to fail” computing providers be broken up.
To be clear, I’m not suggesting that what’s happening is bad or that public cloud is bad; rather, I’m pointing out the realistic state of cloud computing and the risk created by centralizing control in just a few providers. Cloud would likely never have succeeded without a few key companies making massive bets. The idea of a truly decentralized, global cloud would likely have been the wrong starting point.
Let’s explore the idea that a global decentralized cloud, or something more decentralized than what we have now, is the likely end state. Breaking up cloud providers isn’t necessary or optimal. Unlike banking, technology is capable of layers of abstraction to mitigate these sorts of centralized risks.
Most large enterprises looking to adopt cloud are making two large considerations in their decision process:
- They can’t shut down their entire IT department and replace it with cloud. There are many practical reasons why this is unlikely.
- Many are keenly aware of the risks associated with depending on a single vendor for all their cloud computing needs.
The first consideration makes it difficult to adopt a public cloud without at least considering how to reconcile the differences with on-premises computing, and the second makes it difficult to commit to one provider at a level that is incompatible with another. Centralization among public cloud providers, combined with the search for symmetry between off- and on-premises computing strategies, is driving enterprises to explore (and in some cases demand) hybrid capabilities in layers that abstract away infrastructure. In fact, hybrid has become synonymous with multi-cloud.
Technical layers like enterprise PaaS software platforms and cloud management platforms have evolved to offer multi-cloud capabilities, catering to a model where resources are abstract. Over the coming years, the multi-cloud features in these layers will likely lead to a much more decentralized computing model, where something like a PaaS layer fuses resources from public clouds, on-premises infrastructure, and regional infrastructure providers into logical clouds.
At least in the enterprise space, “private clouds” will really be an amalgam of resources and will behave as the single “amorphous ether” we tend to imagine the cloud to be in the first place. The cloud market will not be one where five vendors control all the compute and customers are at their mercy. Instead, cloud will be consumed through multi-cloud layers that protect customers from the inherent centralization risk. The end state is a decentralized model with control points owned by the customer through software – a drastic reshaping, to say the least.
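The “logical cloud” idea can be sketched in a few lines. The interfaces below are purely hypothetical – they stand in for whatever driver a real PaaS or cloud management platform would use, and none of the names correspond to an actual product – but they show the shape of the argument: a thin layer fuses heterogeneous providers behind one API, so the control point belongs to the customer rather than any single vendor.

```python
# Hypothetical sketch of a multi-cloud abstraction layer. The Provider
# and LogicalCloud names are illustrative only, not a real PaaS/CMP API.
from dataclasses import dataclass

@dataclass
class Provider:
    name: str
    region: str

    def deploy(self, app: str) -> str:
        # A real driver would call this provider's own API here.
        return f"{app} deployed to {self.name} ({self.region})"

class LogicalCloud:
    """Fuses public, on-premises, and regional resources into one pool."""
    def __init__(self, providers):
        self.providers = list(providers)

    def deploy_everywhere(self, app: str):
        # Placement policy stays with the customer: here, simply fan out.
        return [p.deploy(app) for p in self.providers]

cloud = LogicalCloud([
    Provider("public-cloud-a", "us-east"),
    Provider("on-prem-dc", "hq"),
    Provider("regional-provider", "eu-west"),
])
results = cloud.deploy_everywhere("billing-service")
```

Because the application only ever talks to `LogicalCloud`, swapping or adding an underlying provider changes the constructor list, not the application – which is exactly the switching-cost protection the multi-cloud layer is meant to provide.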
Cisco’s Intercloud initiative – a worldwide network of interconnected clouds that the company is building along with its partners – has now grown. The networking giant announced significant developments in the initiative, which aims to make hybrid clouds part of a large network that is available and accessible from anywhere.
At this year’s Cisco Live!, the Intercloud initiative was arguably overshadowed, at least at the media level, by the concept of the Internet of Everything. For the manufacturer, however, it is a vital part of the technology that will enable connecting all things and collecting and processing their data.
Cisco also announced the addition of 35 new members to accelerate the creation of innovative cloud-based services across three fundamental areas: platform development, next-generation analytics and big data, and cloud services for the Internet of Everything. The company has also enhanced its Cisco Intercloud Fabric solution with new security features, management support for additional clouds, and additional hypervisor support. These innovations further reduce the complexity of hybrid cloud, providing flexible movement of workloads while maintaining security policies and network environments across public and private clouds.
Cloud services for the Internet of Things
Cisco and its partners offer organizations cloud services and next-generation applications through the Cisco Intercloud Marketplace, a global, partner-focused marketplace that Cisco plans to open this fall. Developers will be able to rely on the cloud for development/test environments to create and distribute applications into production. Cisco announced collaborations with companies developing and delivering business applications, such as Apprenda, ActiveState and Docker, for innovative development environments.
Cisco is also expanding its participation in major open source development communities such as Cloud Foundry, OpenShift, and Kubernetes, and is now building an integrated suite of tools to help developers design container-based microservices.
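The unit of work that platforms like Kubernetes and OpenShift schedule is typically a small, stateless service exposing an HTTP endpoint. The sketch below is a generic, illustrative example of such a container-ready microservice (it does not show Cisco’s actual tooling); the `/healthz` path is the conventional liveness endpoint an orchestrator polls:

```python
# Illustrative container-ready microservice using only the standard
# library -- not Cisco's suite; a generic example of the pattern.
import json
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

class HealthHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # /healthz is the conventional liveness probe an orchestrator hits
        if self.path == "/healthz":
            body = json.dumps({"status": "ok"}).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_response(404)
            self.end_headers()

    def log_message(self, *args):
        pass  # silence per-request logging for the demo

# For demonstration, serve on an ephemeral port in a background thread
# and probe it once; in a container you would bind 0.0.0.0:8080 and
# call serve_forever() in the foreground.
server = HTTPServer(("127.0.0.1", 0), HealthHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
port = server.server_address[1]
with urllib.request.urlopen(f"http://127.0.0.1:{port}/healthz") as resp:
    reply = json.loads(resp.read())
server.shutdown()
```

Because the service is stateless and speaks plain HTTP, the same image can be replicated across any of the communities’ platforms named above without code changes.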
Organizations are demanding new ways to manage the exponential growth of data and the ability to obtain real-time analysis. To meet this need, Cisco is collaborating with leading big data vendors such as MapR, Hortonworks and Cloudera, and with the Apache Hadoop community. Working with these partners, Cisco securely extends on-premises Hadoop solutions to the cloud, providing a true hybrid deployment. It also enables end customers to maintain the same policies, control and security in their big data implementations, along with greater flexibility and virtually unlimited scalability.
In addition to developing the platforms and powerful big data and analytics features necessary for the IoE, Cisco has started providing APIs to the development community to ensure control over functionality, performance and security from the data center to the device.
As part of this framework, Cisco will expose APIs for application developers that allow network monitoring, performance and security to be delivered from the data center to the device. It will also offer vital services such as data virtualization, EnergyWise, and Cisco Exchange Platform Services through Intercloud.
Cisco says that by 2020 there will be over 50 billion devices connected to the Internet. The company is working on a number of fronts to turn IoT’s many possibilities into reality; its strategy is to invest in hybrid data center solutions, including Intercloud and fog computing, to create an optimized IoT infrastructure.
Washington DC (PRWEB) June 19, 2015
It is an open secret that the economic prosperity and quality of life of communities, towns and cities depend on mitigating crime, terror, and man-made and natural disasters. Maturing technologies and changing public opinion will lead to major shifts in the global Safe City market during the forecast period.
According to the Global Safe City: Industry, Technologies & Market – 2015-2020 report, the market growth is boosted by the following drivers:
The drive of city, town and community populations for quality of life and economic prosperity
Post-2008-meltdown governmental policies funding modern infrastructure
Advancements in the cost-performance of surveillance sensors and ICT technologies
Urbanization in Asia Pacific and Latin America
Worsening radical Islamist terror threats
The growing rate and damage of natural disasters
The growing understanding that global warming entails growth in natural disasters
Growing after-sale revenues
Voting citizens’ expectations of safety from their local elected politicians
The report examines each dollar spent in the market via two orthogonal money trails: regional/national markets and technology markets.
This “Global Safe City: Industry, Technologies & Market – 2015-2020” report is a resource for executives with interests in the industry. It has been explicitly customized for industry and urban decision makers to identify business opportunities, developing technologies, market trends and risks, as well as to benchmark business plans.
Questions answered in this 2-volume 650-page report include:
1. What will the Safe City market size be in 2015-2020?
2. What are the main Safe City technology trends?
3. Where and what are the Safe City market opportunities?
4. What are the Safe City market drivers and inhibitors?
5. Who are the key Safe City vendors?
6. What are the challenges to the Safe City market penetration?
The “Global Safe City: Industry, Technologies & Market – 2015-2020” report presents, across 650 pages with 97 tables and 145 figures, an analysis of dozens of current and pipeline technologies and 78 leading vendors. The report is granulated into 150 vertical and horizontal submarkets; for each submarket it presents 2013-2014 data and analyses, and projects the 2015-2020 market and technologies from several perspectives, including:
Business opportunities and challenges
Market analysis (e.g., market dynamics, market drivers and inhibitors)
Physical Security Information Management (PSIM)
Public-Safety Answering Point (PSAP)
Distributed Sensors Systems, Sensor and Data Fusion Algorithms, Wireless Sensor Networks
Safe City Software as a Service (SaaS)
Social Media Emergency Response Software
Geographic Information Systems (GIS)
Location Based Emergency Mass Notification Systems (EMNS), Safe City Cell Broadcast, Cell Broadcast Technologies
Managed Security Services (MSS), Safe City Consulting, Remote Management, Managed Security Monitoring
Safe City Communication, City-Wide Communication Interoperability
Video Surveillance, Analog Video Surveillance, Second-Generation Analog Video Surveillance, Third-Generation Video Surveillance, Digital Video Surveillance, IP Surveillance Cameras, IP-Based Video Surveillance Systems
Safe City Video Analytics Technologies, Cloud Platforms, Video Analytics Based Suspect Behavioral Analysis, Video Surveillance as a Service (VSaaS), Video Surveillance as a Service Solutions: Vendors, Real Time Automatic Alerts Software, Image Segmentation Software, Item Tracking Video Analytics Software, Object Sorting and ID, Item Identification and Recognition, Multi-Camera Intelligent Video Surveillance Systems, Video Content Analysis, Item Detection, Gaussian Mixture Based Software, Background Subtraction, Item Detection Using a Single-image Software, Item Tracking Software, Kalman Filtering Techniques, Region Segmentation, Kalman Filters Application to Track Moving Items, Partially Observable Markov Decision Process, Intelligent Video Surveillance Systems, “Splitting” Items Algorithms, Dimension Based Items Classifiers, Shape Based Item Classifiers, Event Detection Methods, Vision-based Human Action Recognition, Video Derived Egomotion, Path Reconstruction Software, Video Cameras Spatial Gap Mitigation Software, Networked Cameras Tag and Track Software, Visual Intelligence Technologies, Visual Processing, Fusion Engine, Video Analytics Challenges
Standoff Video Analytics Based Biometrics, Video Surveillance Based Behavioral Profiling, Video Based Biometric Recognition Technologies, Video Based Face Recognition, Remote Biometric Identification Technologies, Fused Intelligent Video Surveillance & Watch Lists, Crowd and Riot Surveillance, Wireless Video Analytics, Cloud Video Analytics, Online Video Analytics, Pulse Video Analytics, Smart Cameras
Physical Identity and Access Management (PIAM)
Safe City Natural Disasters Mitigation & Management, Emergency Management systems
Communication Interoperability, Perimeter Security, Public Events Emergency Services, WMD and Hazmat Detection
Cloud Computing, Data Mining & Analytics
Command & Control Systems
Gunshot Location Technologies, Optical Gunshot Location Technologies, Fused Optical and Acoustic Gunshot Detection, Detection of Gunshot Signature: Artificial Neural Networks
Emergency Transportation Management Systems, Intelligent Transport Technologies, License Plate Recognition (LPR), Inductive Loop Detection, Video Vehicle Detection, Smart Transportation Security, Emergency Vehicle Notification Systems
Companies operating in the market: 3I-MIND, 3VR, ABB, Accenture, ACTi Corporation, ADT Security Services, Agent Video Intelligence, AGT international, ALPHAOPEN, Anixter, Aralia System, AT&T Inc., Augusta Systems, Avigilon Corporation, Axis, AxxonSoft, BAE Systems, Bosch Security Systems, BT, Camero, Cassidian, CelPlan, China Security & Surveillance Inc., Cisco, Citilog, Computer Network Limited (CNL), Diebold, DVTel, Elsag Datamat, Emerson Electric, Ericsson, Firetide, GS, General Electric, Hexagon AB, Honeywell, IBM, IndigoVision, Intel Security, IntuVision Inc, iOmniscient, IPConfigure, IPS Intelligent Video Analytics, ISS, MACROSCOP, MDS, Mer group, Milestone Systems A/S, Mirasys, National Instruments, NICE Systems, Northrop Grumman Corporation, ObjectVideo, Orsus, Panasonic, Pelco, Pivot, Proximex, Raytheon Company, Salient Stills, Samsung Techwin, Schneider Electric, SeeTec, Siemens, Smart China (Holdings) Limited, Sony, Synectics Plc, Tandu Technologies & Security Systems Ltd, Thales Group, Total Recall, Unisys, Verint, Vialogy LLC, Vigilant Technology, xLOGIC, Zhejiang Dahua Technology
Homeland Security Research Corp. (HSRC) is an international market and technology research firm specializing in the Homeland Security (HLS) & Public Safety (PS) Industry. HSRC provides premium market reports on present and emerging technologies and industry expertise, enabling global clients to gain time-critical insight into business opportunities. HSRC’s clients include U.S. Congress, DHS, U.S. Army, U.S. Navy, NATO, DOD, DOT, GAO, and EU, among others; as well as HLS & PS government agencies in Japan, Korea, Taiwan, Israel, Canada, UK, Germany, Australia, Sweden, Finland, Singapore. With over 750 private sector clients (72% repeat customers), including major defense and security contractors, and Fortune 500 companies, HSRC earned the reputation as the industry’s Gold Standard for HLS & PS market reports.
Pepper sprays can now be found in hardware stores, firearms shops and even supermarkets, depending on which state you are in. Some can also be bought on the internet.
These products are made from a chemical called oleoresin capsicum (OC), a derivative of cayenne pepper. Many pepper sprays do not have a very high active OC concentration; in most cases it is up to 5%. Finding one with the right concentration is not easy, because it is often either too high or too low to work properly.
Now there is a pepper spray with just the right concentration: 10 percent. It is called Mace, and it is manufactured by a company that has been producing self-defense items for years. Mace is quite potent, and at 10 percent it will immobilize an attacker in seconds.
Get to know your state’s laws on this matter, since some states do not allow mace to be carried, either openly or concealed, while others require a permit for it. Some states restrict the type of pepper spray allowed, others require a permit, and still others insist on training classes. It is not unusual to find states that do not allow its use at all. To know exactly what the statutes are in your state, contact your local police station.
You will also note that most pepper sprays have safety features that prevent accidental discharge, as well as belt loops for easy carrying and use. The serial number allows for quality tracking.
These products usually sell for less than $20, a fair price for anyone seeking personal safety and peace of mind.