make
cable
internet
broadband
innovation
tech
global-competitiveness
us
uk
tv
streaming
entertainment
consumers
cord-cutters
telecom
pew-research
ncta
snl-kagan
bt
europe
virgin-media
comcast
wi-fi
france
bouygues-telecom
netflix
time-warner
hbo
richard-plepler
Creating Value at High Internet Speeds
10.17.2014

It has been 66 years since John Walson Sr. invented cable TV in America. Today, about 100 million U.S. households pay for TV, according to research firm SNL Kagan, or approximately 85 percent of all households in the country.

Despite such penetration, the pay TV market is actually in decline, experiencing its first full-year drop in subscribers last year amid competition from satellite and mobile. To capture new subscribers and stop losses from cord-cutters, cable and telecom companies should rethink their business models and focus on building frictionless user experiences that create value for customers.

The answer may be found online. Only 70 percent of American households have an active high-speed Internet connection at home, according to Pew Research. Meanwhile, 93 percent of all American households have access to broadband Internet, according to an NCTA analysis of SNL Kagan and Census Bureau estimates. That gap points to potential for growth.

In the early days of cable, value was created through providing hundreds of channels in the comfort of your home. Today, cable TV is not the value — the value is high-speed Internet access. While broadband speeds have increased for consumers, and revenue has increased for cable and telecom providers, the value of the service has not kept pace.

Value for subscribers can be created by boosting broadband speeds and offering value-added services on top of the network infrastructure — an emerging business opportunity that cable and telecom companies in the U.K. are starting to explore.

In the U.K., BT has strategically decided to offer complimentary access to BT Sport to all of its existing broadband customers in order to gain new subscribers and market share. BT holds about a 31 percent share of the U.K. broadband market, which totals more than 22 million subscribers, or 78 percent of all households.

Not to be outdone, Virgin Media, with about a 20 percent market share, has focused on complimentary international Wi-Fi access for subscribers. Through a Wi-Fi sharing agreement with Comcast, Virgin Media subscribers will be able to access Comcast’s Wi-Fi network when they travel to the U.S. Comcast is on track to have more than 8 million Wi-Fi hotspots live by January 2015.

Over in France, Bouygues Telecom — whose 2 million customers translate into about an 8 percent market share — recently announced a deal to give subscribers access to Netflix through their set-top TV boxes. Bouygues made the strategic and symbolic move to embrace Netflix while creating value for its subscribers. After the deal, Bouygues Chairman Olivier Roussat said, “We intend to continue enhancing our offer so that customers can enjoy the best innovative content.”

As Roussat and his counterparts in Europe explore new business opportunities, U.S. executives are starting to take notice. Time Warner has just announced that it will unbundle HBO and offer it as a stand-alone service. “It is time to remove all barriers to those who want HBO,” said Richard Plepler, chairman and chief executive of HBO, in announcing the move.

As the price of broadband Internet access continues to rise, cable and telecom executives in the U.S. would be smart to follow the lead of their European counterparts and create value for subscribers by offering complimentary value-added services before Apple CEO Tim Cook does it for them.

Cook recently described the current state of TV as “stuck back in the 70s.” Cable and telecom executives should consider Cook’s statement a shot across the bow. To anticipate a potential move by Apple into the TV business, they should double down on innovation that creates value for subscribers.

For inspiration on the future of TV, look to Europe. The TV/broadband relationship is changing to the benefit of consumers. Cable and telecom companies that put their subscribers first and create value through complimentary value-added services will likely be the ultimate winners, gaining revenue and market share in turn.

Grayson Brulte is the Co-Founder & President of Brulte & Company, an innovation advisory and consulting company that designs innovation and technology strategies for a global marketplace.

power
fixed-feature
energy
utilities
clean-energy
renewables
renewable-energy
big-data
data-analytics
industrial-internet
internet-of-things
iot
software
electricity
india
mexico
environment
climate-change
epa
environmental-protection-agency
federal-energy-regulatory-commission
ferc
world-bank
oklahoma-gas-electric
africa
austin-energy
brazil
amit-narayan
narayan
How Data Will Power the Future of Energy
10.16.2014

Throughout history, we’ve equated energy with the consumption of natural resources such as oil, natural gas or coal.

In the coming decades we will start to think of data and software as a source of energy.

What do I mean by that? Software won’t generate electrons, but it will let us leverage the electricity we are already generating in a more efficient and productive way.

Instead of building new power plants, we will take advantage of the search engine-like analytics of software-based controls to lower emissions and reduce costs. In other words, data will enable us to use power wisely and proactively.

The situation is dire. Over 1.3 billion people around the world who are still not connected to the grid are clamoring for cheap and reliable electricity. In India, Prime Minister Narendra Modi has promised to bring at least one light bulb to every home over the next five years. The demand for energy will continue to grow with the rise in living standards around the world. In 1995, only 10 percent of homes in Mexico had air conditioning. Now, more than 80 percent do.

Building new power plants to meet all of this new demand would be a financial nightmare and an environmental catastrophe. Instead, we will need to use data intelligently to get more out of our existing infrastructure, lowering costs and environmental impact, and to channel those savings into bringing electricity to more people around the world.

Look at the glittering skyline when you go home tonight. Who is working late inside those well-lit offices? Probably very few people. The EPA in fact estimates that more than 30 percent of the energy used inside buildings is wasted through activities like lighting and cooling empty rooms. Sensors and software linked to intelligent building management systems can unobtrusively control power consumption, saving businesses and individuals money without them even noticing.
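
To make the idea concrete, here is a minimal sketch of the kind of rule such a building management system might run; the zone fields, sensor semantics and thresholds are hypothetical assumptions, not any vendor's API:

    # Illustrative sketch: occupancy-based control of lighting and cooling.
    # Zone fields and thresholds are hypothetical assumptions.

    from dataclasses import dataclass

    @dataclass
    class Zone:
        name: str
        occupied: bool            # reported by an occupancy/motion sensor
        lights_on: bool
        cooling_setpoint_c: float

    def apply_energy_policy(zone: Zone) -> None:
        """Cut lighting and relax cooling in rooms reported empty."""
        if not zone.occupied:
            zone.lights_on = False
            # Let an empty room drift a few degrees warmer before cooling kicks in.
            zone.cooling_setpoint_c = max(zone.cooling_setpoint_c, 26.0)

    offices = [
        Zone("4th-floor-east", occupied=False, lights_on=True, cooling_setpoint_c=22.0),
        Zone("lobby", occupied=True, lights_on=True, cooling_setpoint_c=22.0),
    ]
    for zone in offices:
        apply_energy_policy(zone)
        print(zone)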

The extra energy is there: the U.S. wastes enough energy every year to power the U.K. for seven years. The U.S. Federal Energy Regulatory Commission (FERC) has estimated that dynamic peak power controls could let the U.S. avoid building 188 GW of power plants. Just as airlines use big data systems to optimize flight patterns and schedules, utilities and their customers can orchestrate a complex, multi-layered flow of electrons so that everyone benefits.

Analytics will also prevent power theft. Power theft sounds like an obscure crime, but it isn’t. The World Bank estimates that $88 billion worth of power gets pilfered annually. Electricity is actually the third most stolen commodity in the United States, outpaced only by jewelry and cars. In many parts of Brazil, India and Africa, energy theft regularly exceeds 30 percent, leading to safety hazards, blackouts and fires. Intelligent analytics can be used to cross-check consumption against monthly bills and similarly situated homes to pinpoint thieves. In Romania, one utility has already nearly tripled the number of power thieves it identifies using analytics.
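
As a rough illustration of that cross-checking idea (a sketch, not any utility's actual method), the code below flags homes whose metered consumption far outstrips both what they are billed for and what similarly situated homes use; the data layout and thresholds are invented assumptions:

    # Illustrative theft screen: compare each home's metered consumption with
    # its billed consumption and with the median of its peer group.
    # Data layout and thresholds are invented assumptions.

    from statistics import median

    # (home_id, kwh_metered, kwh_billed) for homes on one feeder
    homes = [
        ("A", 950, 940),
        ("B", 1010, 1000),
        ("C", 2400, 600),   # big gap between delivered and billed energy
        ("D", 980, 970),
    ]

    peer_median = median(billed for _, _, billed in homes)

    def looks_suspicious(metered, billed, factor=1.5):
        # Flag when the unbilled share is large AND usage dwarfs the peer group.
        unbilled_ratio = (metered - billed) / max(billed, 1)
        return unbilled_ratio > 0.5 and metered > factor * peer_median

    for home_id, metered, billed in homes:
        if looks_suspicious(metered, billed):
            print(f"Home {home_id}: flag for inspection "
                  f"(metered={metered} kWh, billed={billed} kWh)")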

This is not just a dream. Utilities such as Oklahoma Gas & Electric and Austin Energy have implemented programs that can curb power demand for air conditioning by 30 percent or more, without customers noticing any change except a lower monthly bill, and have avoided hundreds of megawatts of new power plant construction. Now imagine the impact if programs like these were rolled out broadly across the world. This simple idea of software controlling energy more intelligently can save more power than all the world’s nuclear plants and hydroelectric dams combined produce. And this is only one of the many ways in which software can help.

At first glance, data doesn’t fit squarely with other forms of energy, but we have the opportunity to make it an integral element of our power system. Apart from being clean, there is another big difference between data and the other resources.

It’s the only one that is growing.

This piece first appeared in the World Economic Forum blog.

Top GIF: Video courtesy of AutoGrid.

Dr. Amit Narayan is the Founder and CEO of AutoGrid, Inc. From 2010 to 2012, Dr. Narayan was Director of Smart Grid Research in Modeling & Simulation at Stanford University, where he continues to lead an interdisciplinary project related to modeling, optimization and control of the electricity grid and associated electricity markets. He has published over 25 papers in the area of design automation, holds seven U.S. patents and is an active advisor to startup companies.

compete
china
india
bali
indonesia
wto
world-trade-organization
us
chamber-of-commerce
trade
global-competitiveness
development
economy
trade-facilitation-agreement
tfa
corruption
peterson-institute-for-international-economics
jobs
trade-disputes
agriculture
narendra-modi
modi
obama
information-technology-agreement
ita
tech
innovation
electronics
exports
Are India and China Ready to Lead on Trade?
10.15.2014

India and China are giants on the world stage, with a combined population of 2.6 billion — more than one-third of humanity. Adjusted for purchasing power, their combined economic output tops $20 trillion, well ahead of America’s $17 trillion.

However, when it comes to international trade negotiations, it remains an open question whether India and China are ready to provide responsible leadership commensurate with their growing economic heft. With two global trade pacts in the balance, the weeks ahead should reveal whether Beijing and New Delhi will seize the opportunity to lead — or if they will allow their own parochial politics to push these agreements’ huge benefits to the side.

India stunned the world in July when it blocked adoption by the World Trade Organization (WTO) of the recently concluded Trade Facilitation Agreement (TFA). As the first multilateral trade agreement in nearly two decades, the TFA was unanimously endorsed by the WTO’s 160 members at a meeting in Bali, Indonesia, last December.

The TFA is a cost-cutting, anti-corruption agreement that would streamline the passage of goods across borders by cutting red tape and bureaucracy. The Peterson Institute for International Economics has estimated it could boost the world economy by as much as $1 trillion and generate as many as 21 million jobs globally.

New Delhi is explicitly seeking leverage to secure a permanent “peace clause” to ensure its farm subsidies are not challenged before a WTO dispute settlement panel. In recent years, India has greatly expanded these agricultural subsidies and has done so in ways that may violate WTO rules. However, India won a considerable degree of protection for its subsidies in Bali.

A Way Forward on Trade Facilitation?

India has thus snatched defeat from the jaws of victory. The TFA would promote economic growth and raise living standards around the world, with most gains going to the world’s poorest. Moreover, the concessions India won in Bali relating to its agricultural subsidies were joined with the TFA in the “Bali Package,” so India will obtain no relief on its demands without a deal.

In a recent interview, former U.S. Trade Representative Susan Schwab said: “It’s absolutely unconscionable that the new (Indian) government should ruin a deal that the previous government had reached after much agonizing by every WTO member.”

Already, many governments are considering how to secure the TFA’s benefits for the vast majority of WTO members that support its swift implementation. One option would be to implement it as a “plurilateral” agreement among all parties that are willing — and many are keen to move forward.

When India’s recently elected prime minister, Narendra Modi, visited the White House in late September, President Obama raised his concerns about the impasse and the threat it poses to the multilateral trading system. Modi expressed hopes of finding a compromise soon, saying: “India supports trade facilitation. However, I also expect that we are able to find a solution that takes care of our concern on food security.”

Modi has an opportunity to be a statesman. By taking yes for an answer — accepting the win India secured in the Bali Package — he would receive not only those direct benefits, but the opportunity to show leadership as a global power.

China Can Lead on IT Trade

For China, the question involves the Information Technology Agreement (ITA). Negotiated in 1996 under the umbrella of the WTO, the ITA has delivered a cornucopia of innovative technology products to the world over the past two decades.

Today, 70 countries are members of the ITA, accounting for 97 percent of world trade in IT products, which has reached about $4 trillion annually. This sum represents nearly one-fifth of global merchandise trade. Thanks in large measure to the ITA, China today is the largest exporter of IT products in the world.

However, the ITA is showing its age. Since it was concluded, innovative industries have developed a host of new products, such as GPS devices, Bluetooth technologies, and flat-panel displays. All remain outside the ITA’s reach. Expanding the ITA to cover these new products could add $190 billion to global GDP annually, according to the Information Technology Industry Council.

Despite progress, the ITA expansion talks have stumbled along in a stop-and-go mode over the past year. At issue is China’s push for tariffs on a wide range of goods to be dropped from consideration or subjected to long phase-out periods.

No other country has adopted such a cautious stance, and many WTO members have objected. The U.S. Chamber of Commerce was one of 82 top business groups from dozens of developed and developing countries that issued a statement in September calling for action. However, an interagency battle in Beijing has hamstrung China’s negotiators for months.

Not only does China have a trade surplus in many of the product categories it has sought to exclude, it would be a principal beneficiary of an expansion of the ITA’s coverage. An ambitious outcome could save China’s tech sector $8 billion in reduced tariffs on overseas sales each year, the Information Technology & Innovation Foundation estimates.

As it happens, China is hosting the Asia-Pacific Economic Cooperation (APEC) meetings this year, with President Obama, Chinese President Xi Jinping and the leaders of the other 19 APEC economies gathering in Beijing in November. The ITA actually traces its origins to APEC, and many in the business community sense that these upcoming meetings represent a key opportunity.

From the Chamber’s perspective, reaching a final agreement on ITA expansion will be the key indicator of the success of China’s APEC year. With just weeks remaining, now is the time for China to take a leadership role.

On trade, when you stand still, you fall behind. For both China and India, these global trade pacts represent a chance to show leadership on the world stage and secure real benefits for their citizens.

John Murphy is Senior Vice President for International Policy at the U.S. Chamber of Commerce.

compete
big-data
open-data
data
data-analytics
world-bank
government
competitiveness
global-competitiveness
development
accountability
uk
us
open-government
transparency
collaboration
innovation
fraud
pickup
wef
world-economic-forum
victoria-lemieux
lemieux
oleg-petrov
petrov
roger-burks
burks
university-of-british-columbia
canada
Why We’re Failing to Get the Most Out of Open Data
10.14.2014

An unprecedented number of individuals and organizations are finding ways to explore, interpret and use Open Data. Public agencies are hosting Open Data events such as meetups, hackathons and data dives.

The potential of these initiatives is great, including support for economic development (McKinsey, 2013), anti-corruption (European Public Sector Information Platform, 2014) and accountability (Open Government Partnership, 2012). But is Open Data’s full potential being realized?

A news item from Computer Weekly casts doubt. A recent report notes that, in the United Kingdom, poor data quality is hindering the government’s Open Data program. The report goes on to explain that, in an effort to make the public sector more transparent and accountable, UK public bodies have been publishing spending records every month since November 2010. The authors of the report, who analyzed 50 spending-related data releases by the Cabinet Office since May 2010, found that the data was of such poor quality that using it would require advanced computer skills.

Far from being a one-off problem, research suggests that this issue is ubiquitous and endemic. Some estimates indicate that as much as 80 percent of the time and cost of an analytics project is attributable to the need to clean up “dirty data” (Dasu and Johnson, 2003).

In addition to data quality issues, data provenance can be difficult to determine. Knowing where data originates and by what means it has been disclosed is key to being able to trust data. If end users do not trust data, they are unlikely to believe they can rely upon it for accountability purposes. Establishing data provenance does not “spring full blown from the head of Zeus.” It entails a good deal of effort: enriching data with metadata (data about data) such as the date of creation, the creator of the data and who has had access to the data over time, and ensuring that both data and metadata remain unalterable.
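
One minimal way to picture such provenance metadata is a record that travels with the data and carries a cryptographic digest, so that any silent alteration of the data or the metadata becomes detectable. The field names below are illustrative assumptions, not an archival standard:

    # Illustrative sketch: a provenance record kept alongside a dataset, with
    # digests that make tampering with data or metadata detectable.
    # Field names are illustrative, not an archival standard.

    import hashlib, json

    def provenance_record(data: bytes, creator: str, created: str, accesses: list) -> dict:
        record = {
            "creator": creator,
            "created": created,          # e.g. "2014-10-14"
            "access_history": accesses,  # who has had access over time
            "data_sha256": hashlib.sha256(data).hexdigest(),
        }
        # Digest over the metadata itself, so the record is also tamper-evident.
        record["record_sha256"] = hashlib.sha256(
            json.dumps(record, sort_keys=True).encode()
        ).hexdigest()
        return record

    data = b"department,supplier,amount_gbp\nCabinet Office,Acme Ltd,1200\n"
    record = provenance_record(data, creator="Cabinet Office",
                               created="2014-10-14",
                               accesses=["publisher", "auditor"])

    # Any later change to the data breaks the digest check:
    assert hashlib.sha256(data).hexdigest() == record["data_sha256"]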

Similarly, if people think that data could be tampered with, they are unlikely to place trust in it; full comprehension of data relies on the ability to trace its origins. Without knowledge of data provenance, it can be difficult to interpret the meaning of terms, acronyms and measures that data creators may have taken for granted, but are much more difficult to decipher over time.

A final issue is a lack of consideration for data preservation and stewardship. This can also prevent full realization of the benefits of Open Data, as evidenced by comments from a participant at one of the World Bank’s own data dives: “The big problem…is data curation…[organizations] think the structure of data is what gives it value. But it’s actually…what I call data curation….there is no archive…” (Sonuparlak, 2013).

Poor quality data, lack of information about data provenance and data stewardship issues present common barriers to implementation of Open Data initiatives. If this is an issue in the UK and other developed countries, how much more of an issue is it likely to be for the World Bank’s (often-developing) client countries?

Even though much public sector-related Open Data originates from official government records, in many countries even basic records management controls are missing, particularly in the digital environment. Without these controls, records are likely to be incomplete, difficult to locate and challenging to authenticate; where they exist, they can be easily manipulated, deleted, fragmented or lost. Poorly kept records, resulting in inaccurate or incomplete data, can lead to:

  • Misunderstanding and misuse of information;
  • Cover-up of fraud;
  • Skewed findings and statistics; and
  • Misguided policy recommendations and misplaced funding.

All of these problems lead to missed opportunities in maximizing the full value of Open Data.

So, what’s the solution? Fixing poor quality data after it is created or when it comes time to use it is costly and, in the case of historical data, often not even possible due to technological changes or lack of documentation. There is evidence to suggest that designing controls that will produce good-quality data in the first place is a far better strategy; that is, it is better to arrive at good Open Data by design.
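
As a tiny sketch of what that can mean in practice, the code below validates records at the moment of creation rather than cleaning them downstream; the schema and field names are hypothetical, loosely modeled on the spending releases discussed above:

    # Illustrative "quality by design" check: reject bad records at creation
    # instead of cleaning them up later. Schema and fields are hypothetical.

    from datetime import date

    REQUIRED = {"department", "supplier", "amount_gbp", "payment_date"}

    def validate(record: dict) -> list:
        """Return a list of problems; an empty list means the record may be saved."""
        problems = [f"missing field: {f}" for f in REQUIRED - record.keys()]
        if "amount_gbp" in record and not isinstance(record["amount_gbp"], (int, float)):
            problems.append("amount_gbp must be numeric")
        if "payment_date" in record:
            try:
                date.fromisoformat(record["payment_date"])
            except (TypeError, ValueError):
                problems.append("payment_date must be YYYY-MM-DD")
        return problems

    good = {"department": "Cabinet Office", "supplier": "Acme Ltd",
            "amount_gbp": 1200.0, "payment_date": "2014-10-14"}
    bad = {"department": "Cabinet Office", "amount_gbp": "1,200"}

    print(validate(good))  # []
    print(validate(bad))   # missing fields, non-numeric amount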

The World Bank’s Open Data Readiness Assessment Tool can help client countries identify whether they are ready for Open Data along several dimensions, but two in particular:

  1. Institutional structures, responsibilities and skills within government
  2. Data within government

Institutional structures, responsibilities and skills – or what may be described as Information Governance – are important because “Open Data requires agencies to manage their data assets with a transparent, organized process for data gathering, security, quality control and release. To effectively carry out these responsibilities, agencies need to have (or develop) clear business processes for data management as well as staff with adequate ICT skills and technical understanding of data (e.g., formats, metadata, APIs, databases)” (ODRA, 2014).

Data within government is similarly critical because it builds “on established digital data sources and information management procedures within government where they already exist… good existing information management practices within government can make it much easier to find data and associate metadata and documentation, identify business ownership, assess what needs to be done to release it as Open Data and put processes in place that make the release of data a sustainable, business-as-usual, downstream process as part of day-to-day information management” (ODRA, 2014).

Designing the institutional structures, responsibilities and skills within government to produce good data means designation of one entity with sufficient political weight to coordinate Open Data matters across government. It also requires ensuring that Open Data policies are implemented, and that there is one agency or department responsible for information management – regardless of the form of the information (i.e. paper or digital). In regard to data within government, good design includes a comprehensive inventory of data holdings; coherent information management policies and standards, consistently enforced across government; and a process for digitization of paper records with infrastructure and processes in place for sustaining long-term digital record repositories. International standards point the way to other good practices.

While all of these design features will help produce good Open Data, there is still a lack of knowledge about the antecedents and contributing factors of good Open Data, as well as challenges on how best to implement initiatives in the context of the World Bank’s client countries. The conditions that produce good data require more in-depth study as a foundation for a design framework that can, for example, be built into an expanded Open Data Readiness Assessment Tool. Only with this understanding will it be possible to achieve good Open Data by design, and then truly realize the benefits of Open Data for development.

This piece first appeared in the World Economic Forum blog.

Dr. Victoria Lemieux is a Senior Public Sector Specialist (Information Management) and an Associate Professor of Archival Studies at the University of British Columbia (on leave). Co-author Oleg Petrov works for the World Bank’s Transport & ICT Global Practice, where he coordinates the Open Data Program. Co-author Roger Burks is an Online Communications Officer for the World Bank.

make
minds
machines
accenture
ge
survey
industrial-internet
big-data
internet
data-analytics
innovation
tech
staff
qa
matt-reilly
reilly
industrial-internet-insights-report
industrial-internet-insights-report-for-2015
internet-of-things
iot
minds-machines
fixed-feature
Industrial Internet Insights — Q&A With Matt Reilly
10.13.2014

There is a growing urgency for organizations to embrace Big Data analytics to advance their Industrial Internet strategy, according to a new report from Accenture and GE. However, less than one-third of the 250 executives surveyed for the study, released at GE’s Minds + Machines conference last week, are using Big Data across their company for predictive analytics or to optimize their business.

The Industrial Internet Insights Report for 2015 — based on surveys of executives in eight industrial sectors across North America, Europe and Asia — looks in particular at the role that Big Data analytics plays in creating innovative solutions for the Industrial Internet. In an interview, Matt Reilly, senior managing director for Accenture Strategy, talks about the research and some of the insights it has generated:

Why focus on Big Data analytics in this study?

Reports suggest that by 2020 there will be 50 billion things connected to the Internet. With that kind of scale, and that much information available to companies, Big Data analytics is the primary way that we will actually derive value from all the data coming in from machines and devices. The Industrial Internet has the potential to drive trillions of dollars in savings, in addition to new services and overall growth. But to reap those rewards, industrial companies will need to first use insights about their physical assets and their customers to build new offerings, reduce costs, innovate and serve customers better.

What’s the most surprising thing that emerged from the research?

One of the most compelling things is that the vast majority of executives told us that they believe Big Data will change the competitive landscape in their industry, and fast. Eighty-nine percent said they worry that if they don’t develop an approach, they’ll lose market share. Another intriguing item was the level of urgency that executives feel about advancing their capabilities. A vast majority of executives — 84 percent, in fact — indicated that Big Data analytics could shift the competitive landscape for their industry just in the next 12 months.

How do the surveyed companies differ from one another in terms of their analytics capabilities?

A helpful way to understand this is to think of a maturity curve that begins with connecting machines and devices, then moves up toward monitoring, and then analyzing. As companies get more sophisticated, they move into more complex capabilities, like predicting and optimizing. A good example of this is a global energy company that is able to analyze tens of thousands of data points in its wind farms every second and then use the data to optimize the operation of its turbines.
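
The wind farm example sits in the middle of that curve. As a toy sketch of the "analyzing" stage, the code below runs a rolling window over turbine power readings and flags drift from expected output; all figures and names are invented for illustration:

    # Toy sketch of the "analyzing" stage: a rolling window over turbine
    # power readings that flags drift from expected output.
    # Figures and names are invented for illustration.

    from collections import deque

    class TurbineMonitor:
        def __init__(self, expected_kw: float, window: int = 60):
            self.expected_kw = expected_kw
            self.readings = deque(maxlen=window)  # most recent readings

        def add(self, kw: float) -> None:
            self.readings.append(kw)

        def drifting(self, tolerance: float = 0.10) -> bool:
            if len(self.readings) < self.readings.maxlen:
                return False  # not enough data yet
            avg = sum(self.readings) / len(self.readings)
            return abs(avg - self.expected_kw) / self.expected_kw > tolerance

    monitor = TurbineMonitor(expected_kw=1500.0, window=5)
    for reading in [1480, 1500, 1210, 1190, 1205]:
        monitor.add(reading)
    print("needs attention:", monitor.drifting())  # True: output has sagged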

What we found, however, is that most companies — about three-fourths of them — are still in the connecting-monitoring-analysis stages of development. About a third are already at analysis, and that’s a good sign. But clearly most companies have a ways to go when it comes to really getting the most out of their data and using it to advantage.

What are some of the biggest challenges companies face?

This is very interesting because one of the biggest concerns we saw in our research wasn’t technical but was instead about finding the necessary workforce talent to support their Big Data analytics strategies. As one of the executives we spoke to commented, it’s great to have the data but then you need the people to analyze it and generate insights. Better hiring is one obvious path to filling in talent gaps, but predictions point to a dramatic shortage of data scientists in the future. More than half of the companies we surveyed intend to combat that shortage by using skilled external talent who can quickly translate business needs into analytics solutions. It’s important that the machines are talking to each other, but more important that the humans know what to do with what the machines are saying.

What’s one recommendation you would make coming out of the report?

One of the biggest recommendations is to look at the Industrial Internet and Big Data not only as a means to improve operations and reduce costs, but also as a way to create growth through new offerings. For example, as companies generate insights from their data, they have the opportunity to add digital services to the mix of what they sell. This is a point Accenture makes strongly in another report we recently released — that the Industrial Internet can be a huge innovation generator.

What makes this report different from others on the topic of Big Data?

There are obviously a lot of reports on Big Data analytics and the Industrial Internet out there. Many are focused on describing macro trends. We think what makes this report unique is that it reflects the specific, current sentiments and viewpoints of over 250 executives across multiple sectors and geographies about where they stand right now in harnessing analytics — and then what that means for the value of the Industrial Internet. The report gives executives not only a sense of what their peers are thinking but also the realization that, as yet, companies are only tapping into a small percentage of the potential value of the Industrial Internet.

Matt Reilly is a Senior Managing Director in Accenture Strategy, which operates at the intersection of business and technology. Accenture is a global management consulting, technology services and outsourcing company, with more than 305,000 people serving clients in more than 120 countries.

make
minds
machines
industrial-internet
big-data
data-analytics
data
iot
internet-of-things
innovation
connectivity
verizon
wireless
mobile
telecom
machine-to-machine
m2m
security
ge
infrastructure
mark-bartolomeo
bartolomeo
staff
qa
university-of-michigan
minds-machines
fixed-feature
How Wireless Makes the Industrial Internet Work — Q&A With Mark Bartolomeo
10.10.2014

The Industrial Internet is revolutionizing how businesses operate across a range of industries, with companies from railways to healthcare providers starting to reap the benefits of machine-to-machine connectivity.

The need for secure connections at remote locations around the world — whether on an oil rig or inside a jet engine — is also creating closer collaboration between the providers of Industrial Internet platforms such as GE and technology companies such as Verizon. That’s why the companies have teamed up to expand the global Industrial Internet infrastructure, enabling the remote monitoring and maintenance of connected equipment no matter where it’s located and in a secure environment.

“The potential for transforming industries, including rail, aviation, energy and healthcare — as well as society as we know it — is tremendous, and yet the Internet of Things (IoT) is a nascent, complex and fragmented market,” Mark Bartolomeo, head of IoT Connected Solutions at Verizon, said in announcing the deal at GE’s Minds + Machines conference this week.

In an interview, Bartolomeo spoke about the benefits that early adopters of the Industrial Internet are enjoying, as well as some of the challenges ahead in rolling out smart technologies more broadly across industries:

How do you define Industrial Internet?

At its core, the Industrial Internet is any industrial equipment connected to a network so that the information it generates can be used to improve efficiency. The benefits can be vast, including improved service levels, new revenue streams and environmental benefits.

What are the broad benefits of Industrial Internet and Internet of Things (IoT) technologies for enterprise businesses?

I see three fundamental benefits. First, businesses can accelerate product development and deployment cycles by unifying information from diverse sources and applications. Second, the Industrial Internet introduces new revenue streams by allowing businesses to take advantage of the latest smart technologies before their competition. Finally, all these new connected devices produce a ton of data that can be disseminated and quantified for more reliable outcomes.

How have early machine-to-machine (M2M) and IoT adopters improved their businesses with connected technologies?

Early adopters have been driven by sustainability, safety and monitoring equipment in remote locations. Sustainability can provide a sizeable return on investment. If I manage the generation and consumption of energy, I’m also monitoring the theft of energy and helping lower my carbon footprint by reducing or eliminating the vehicle fleet required for manual reads. M2M monitoring and communication tools also have produced interesting case studies in the crude oil industry. The concept of scanning layers of earth for oil in North Dakota and sending all that information back to engineers and scientists in Houston, where they are doing analytics, is huge.

What markets are best suited for Industrial Internet solutions, and could you give some examples of the types of technologies?

There are plenty of opportunities across many different markets. In the energy and utilities industry, you see energy management tools; energy generation plants; meters; relay switches; load-balancing generation; and monitors for pipelines, wellheads, pressure, temperature and so on. At the municipality level, we are talking about parking, congestion management, firebox monitoring and video-as-a-service. The healthcare industry has seen an increase in virtual visits.

In the transportation market, Verizon is part of the Leadership Council for the University of Michigan’s Mobility Transformation Center to test next-generation technologies in a real-world environment. Michigan’s Mobility Transformation Center is a public-private R&D initiative to establish a connected vehicle infrastructure in which cars can communicate with each other. One of the goals is to decrease the number of vehicular accidents through vehicle-to-vehicle technology.

What challenges are inherent in rolling out smart technologies?

Fragmentation and a lack of standards are the two biggest challenges currently hindering Internet of Things adoption. Businesses struggle with having to buy a smart technology from one company and a network from another and perform their own backend integration.

We must reduce complexity to broaden adoption. Right now, machine-to-machine and IoT technologies have seen an adoption rate of less than 10 percent. However, adopters are benefiting from accelerated product cycles, new revenue streams and more reliable outcomes. The low adoption rate is due to a complex and costly development process. For example, if an energy company wants to deploy 3 million connected meters over seven years, what type of backend infrastructure should it build? Do you build to scale up to 3 million, or do you build to support what’s in the field? If you build to scale for 3 million and your rollout period is seven years, you have a lot of wasted backend capacity. If you don’t build to scale, then you’ve underbuilt and are not reaping the full benefits of this smart technology. That’s why the concept of platform- and software-as-a-service makes sense.
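
Bartolomeo's meter example is easy to put rough numbers on. The sketch below compares provisioning full-scale backend capacity on day one against capacity that grows with a linear seven-year rollout; the per-meter cost figure is an invented assumption, there only to show the shape of the trade-off:

    # Back-of-the-envelope sketch of build-to-scale vs. build-as-you-grow for
    # 3 million meters rolled out linearly over seven years.
    # The per-meter backend cost is an invented assumption.

    METERS_TARGET = 3_000_000
    YEARS = 7
    COST_PER_METER_YEAR = 2.0  # hypothetical $/meter of backend capacity per year

    # Option A: provision for all 3 million meters from day one.
    build_to_scale = METERS_TARGET * COST_PER_METER_YEAR * YEARS

    # Option B: capacity tracks the fleet, which grows linearly each year.
    grow_with_fleet = sum(
        METERS_TARGET * (year / YEARS) * COST_PER_METER_YEAR
        for year in range(1, YEARS + 1)
    )

    print(f"build to scale : ${build_to_scale:,.0f}")
    print(f"grow with fleet: ${grow_with_fleet:,.0f}")
    print(f"idle capacity paid for: ${build_to_scale - grow_with_fleet:,.0f}")

Under these assumptions, building to full scale on day one pays for $18 million of capacity that sits idle during the rollout, which is exactly the waste Bartolomeo describes; platform- and software-as-a-service models shift that risk to a provider.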

What type of infrastructure is required to enable Industrial Internet solutions?

I see it as pervasive connectivity. The wireless network makes the Industrial Internet viable. I think that cloud is to 4G as 4G is to the Industrial Internet — it just doesn’t work unless you are connected. So the concept of the Internet of Things assumes that something is connected all the time — not casually. It’s impossible to link smart devices without wireless connectivity. I can’t connect pallets. I can’t connect manhole covers. I can’t connect airplanes or cars. However, with wireless connectivity, I can. M2M and Internet of Things technology doesn’t work without reliable, secure, expansive coverage. It also takes managed networks and devices with cellular and non-cellular sensors. We have to be able to play at connectivity levels beyond cellular: sensor networks, 2.4 GHz, mesh and so on.

Where do you see these types of technologies heading in the near future?

For the Industrial Internet and IoT technologies to thrive, they will require lower-cost asset connectivity. It’s fine to connect a $25,000 vehicle or a $300 electric meter that’s tied to a $5 billion electric generation plant, but what about a $20 pallet or 40 million manhole covers? How do we connect those? The way we do that is by participating at the sensor level and getting more and more devices on the network by bringing down the cost of cellular.

Mark Bartolomeo is head of IoT Connected Solutions at Verizon.
