ReadWrite Sponsors, Author at ReadWrite https://readwrite.com/author/readwrite-sponsors/

Meet Some of the Women Building Comcast's Next-Generation IoT Experiences https://readwrite.com/meet-women-building-comcasts-next-generation-iot-experiences/ (Wed, 03 Jan 2018)


By Ashley Reed, Human Resources for the Digital Home Team at Comcast Cable

For Internet of Things engineers, there’s a lot to be excited about these days, but one thing that’s been particularly inspiring to me – as a woman who is continuously on the lookout for talented software engineers – is the emergence of so many brilliant, dynamic women who are helping create the next generation of connected home experiences.

While we still have plenty of work to do to create more diverse teams in software engineering, the progress we've made as a technology community is inspiring. If you visit one of our IoT centers of excellence in Philadelphia, Austin or Silicon Valley, you'll find any number of scrum meetings that are mostly made up of women, are led by women, or both.

We know this is critical because diverse teams are more innovative. A 2015 McKinsey & Company study found that more diverse companies perform better financially. All the evidence we've seen – both anecdotal and empirical – reinforces the real-world value of diverse teams.

We also know that any progress we make has an amplifying effect.

When women engineers come to interview with us, they often ask to speak with other women in the organization. Having so many talented and diverse people to refer them to at all levels of our organization helps us attract critical talent.

There are too many stories to share, but I asked three of my colleagues to tell their stories, and have collected those here.

Creating Seamless Connected Home Experiences

Jugnu Gupta is senior director of product management, leading the IoT products and partner ecosystem for Comcast. She focuses on building the technology behind the Xfinity Partner Program, which lets Xfinity customers control a large and growing array of IoT devices like Nest Thermostats, Philips Hue Lights, and August Smart Locks, all from their Xfinity Home hubs.

Jugnu is also a leader on the team that built connected home “scenes” for Xfinity Home.

This lets customers set simple scenes like “Good Morning” or “Leaving” that prompt Xfinity Home to seamlessly perform a number of actions like turning on specific lights, arming or disarming the security system, and adjusting the temperature.
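
To make the idea concrete, here is a minimal sketch in Python of how a scene might be modeled as a named bundle of device actions and executed in one call. The device functions and scene contents are illustrative assumptions, not Comcast's actual Xfinity Home implementation.

```python
from dataclasses import dataclass, field
from typing import Callable, Dict, List

# Hypothetical device actions; a real platform would call device APIs instead of printing.
def turn_on_light(name: str) -> None:
    print(f"Turning on light: {name}")

def set_thermostat(target_f: int) -> None:
    print(f"Setting thermostat to {target_f}F")

def arm_security(mode: str) -> None:
    print(f"Setting security system to '{mode}' mode")

@dataclass
class Scene:
    """A named bundle of actions that run together, e.g. 'Good Morning' or 'Leaving'."""
    name: str
    actions: List[Callable[[], None]] = field(default_factory=list)

    def run(self) -> None:
        for action in self.actions:
            action()

# Example scenes mirroring the "Good Morning" and "Leaving" use cases described above.
scenes: Dict[str, Scene] = {
    "Good Morning": Scene("Good Morning", [
        lambda: turn_on_light("kitchen"),
        lambda: set_thermostat(70),
        lambda: arm_security("disarmed"),
    ]),
    "Leaving": Scene("Leaving", [
        lambda: turn_on_light("porch"),
        lambda: set_thermostat(62),
        lambda: arm_security("away"),
    ]),
}

if __name__ == "__main__":
    scenes["Leaving"].run()
```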

For Jugnu, work is driven by a deep passion for the power of IoT to improve people's lives, and a firm belief that connected home experiences should be for everyone, not just techies and early adopters. "I work very hard every day to build experiences that are easy to understand, enable and use on an ongoing basis," Jugnu said. "With relevant recommendations that are personalized to each customer, simple first-time experiences, intuitive controls and automation, our customers can now use IoT products with minimum effort and even limited technical know-how to make their lives better."

Creating Magic Through Simplicity

Tina Kim, a senior product manager on the digital home team, did not start her career at Comcast working on IoT products and experiences. Instead, she managed the powerful Platforms Rules Engine that the digital home engineering team leverages to build connected home products and services. To Tina, who is based in Comcast’s Silicon Valley Office, the IoT group was just one of many tenants using her team’s platform.

As she started working closely with the Xfinity Home team, she became a personal convert to IoT as well, installing IoT devices at home and tinkering with them to find the perfect balance. Now she's a firm believer in both the potential of IoT and the importance of making it accessible to customers.

“I envision a future where automation lives across virtually all of our devices.  The real magic happens when devices just do what we want them to do with little or no demand on the user,” Tina said. “Imagine leaving your home and your lights turn off, Wi-Fi network locks down, garage opens, and your car starts to warm up based on the temperature outside. That future is now, and what’s coming down the road is even more exciting.”

Creating Peace of Mind with Smarter Cameras

For Sarju Mehta, Senior Manager of Software Development and Engineering on the Xfinity Home team, working in IoT was an opportunity she couldn't pass up. Sarju started her work at Comcast on the team that was building the company's e-commerce platform. While she thrived on that team, which adopted the pace and urgency of a startup, the appeal of taking on new engineering challenges was too strong to resist when an opportunity opened up on the digital home team.

Sarju works on the video platform team, which focuses on building greater intelligence and functionality into the recently redesigned Xfinity Home security cameras. For Xfinity Home customers, cameras play a critical role in providing peace of mind, and Sarju's team has been working to make them better and smarter. Improvements include AI-powered computer vision and more seamless integration with other Xfinity products, including apps and the X1 platform.

“People use our cameras to have peace of mind in terms of their security needs.  My job is to ensure that performance and reliability always stay in the forefront of our minds because they are fundamental to our products’ success,” Sarju said.

Looking to the Future

Working with so many brilliant engineers (both women and men), it feels greedy to say it, but we still need more. As we continue to offer dynamic new IoT experiences to customers, our need for talented engineers only continues to grow. If you want to work with these great women and more, check out our job listings for Digital Home.

[Interview] Industry Experts on Choosing Your IoT Network https://readwrite.com/nokia-interview-choosing-iot-network/ (Thu, 16 Nov 2017)


Jason Elliott, 5G market development manager

Samuele Machi, marketing manager, 4th industrial revolution

 

ReadWrite: When we talk about networking around IoT, there are a lot of smaller networking models, e.g. mesh networks and their protocols, all the way up to large networks and 5G deployments. So there are a lot of issues for large enterprises to plan around with IoT. For you two, what are the biggest issues an executive should consider when choosing the network for their own IoT deployment?

Jason: It depends on the business need.  You need to ask yourself questions such as:

  • Do you have an immediate business need such as a one plus year time frame?
  • Are you looking at a strategy type decision over the next five years?
  • Do you want to build expertise?
  • What’s your investment model?
  • When thinking about digital automation, connectivity and automating your enterprise, do you want to control the entire set of infrastructure?
  • Are you actually building your staffing resources and infrastructure to control and manage the network yourself?
  • In terms of spectrum, are you going to build it yourself or are you going to partner with a wireless provider to do that?

These are critical, strategic business decisions that you need to address and then you decide upon the underlying technology that will help you realize the individual use cases.

There are existing technologies that can be used today that could meet certain business requirements at a limited scale, which can then be expanded and extended to include more mission critical functions when you deploy 5G.

You definitely want to apply the right piece of technology to that particular business case. And then there are different types of investment cycles. For example, there are a lot of mature, mid- to lower-cost technologies out there today that you could invest in. That decision may yield the shorter-term operational savings you are looking for. However, for longer-term needs, such as providing new use cases that increase revenue, it may be necessary to invest in a bigger and better technology such as 5G.

Samuele: There are major things that executives should take into consideration before planning connectivity for an IoT project and in fact, it isn’t easy. The first consideration is the type of use case you want to enable because they are all different and require different levels of reliability and different types of latency.  For example, say you want to connect your parcel which is traveling the world and you want to know where it is, this is different from connecting an autonomous vehicle in a factory or maybe in a harbour.  They are both IoT use cases but completely different.

The next consideration is what connectivity networks are available in your area? The key questions to ask yourself are: Do you have a public IoT cellular network available (eg. NB-IoT or LTE-M), or will/can you purchase/lease spectrum so that you can build a licensed LTE private network, or do you have to use some unlicensed /shared spectrum LTE based technology (e.g. Multefire and CBRS)?

The final consideration is about the existing ecosystem. If you want to use a very new technology which has not been associated with that many devices yet, you have to have a plan. Maybe you need to galvanize your local ecosystem into action to speed things up. You have to be aware of how long it might take for you to get the pieces that you need to build your use case.

I would say these are the three key steps to think about (use case, connectivity and ecosystem), but of course, there are many sub-items and details associated with them.
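
As a rough illustration of the first two steps, the sketch below encodes a use case's requirements and filters the networks available in an area down to plausible candidates. The thresholds and technology groupings are illustrative assumptions, not Nokia guidance.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class UseCase:
    """Rough requirements for an IoT use case (illustrative thresholds only)."""
    max_latency_ms: float      # how much delay the application tolerates
    mobility: bool             # does the device move (e.g. a parcel vs. a fixed sensor)?
    battery_years: float       # required battery life on a single charge/cell
    mission_critical: bool     # does a dropped link stop operations?

def candidate_networks(uc: UseCase, available: List[str]) -> List[str]:
    """Filter the networks available in your area down to plausible candidates."""
    candidates = []
    for tech in available:
        if tech in ("NB-IoT", "LTE-M"):
            # Public cellular IoT: good for battery life and wide-area tracking,
            # but not for millisecond-latency or mission-critical control loops.
            if uc.max_latency_ms >= 100 and not uc.mission_critical:
                candidates.append(tech)
        elif tech in ("Private LTE", "MulteFire", "CBRS"):
            # Licensed/shared-spectrum private networks: suited to factories,
            # harbours and other sites needing reliability and low latency.
            if uc.max_latency_ms >= 10:
                candidates.append(tech)
    return candidates

# A parcel-tracking use case vs. an autonomous factory vehicle, as in the interview.
parcel = UseCase(max_latency_ms=5000, mobility=True, battery_years=2, mission_critical=False)
agv = UseCase(max_latency_ms=20, mobility=True, battery_years=0.1, mission_critical=True)

print(candidate_networks(parcel, ["NB-IoT", "LTE-M", "Private LTE"]))  # ['NB-IoT', 'LTE-M', 'Private LTE']
print(candidate_networks(agv, ["NB-IoT", "LTE-M", "Private LTE"]))     # ['Private LTE']
```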

 

ReadWrite: I think we’ve dovetailed into number 2. Can you elaborate on the types of use cases that you see and how those networking use cases might start out?

Jason: We’ve been talking a lot about the industry 4.0 because that’s where we see the potential for a lot of transformation. If you look at it, there’s different types of industries such as: manufacturing, construction, power generation, and distribution. . Let’s look at a new possible business model for the ‘process’ industry like chemical manufacturing. Instead of taking raw materials and just creating a final product that is sold, they could provide tighter integration into their customers operations. Allowing them to make tailored products or offer analysis services.  The ability for a business to interact with partners and customers at different parts of the value chain is important. Building flexibility into the infrastructure allows you to be able to do that. It is critical.

Today, proprietary systems are in place because a business has one specific part in the value chain. However, what should be done is take a big step back and ask yourself, 'OK, how could I sell the products or services that I create at any part of that value chain? What do I need to do with my infrastructure to enable that and become a much more flexible and agile business?' Once these are addressed, the conversation changes to things like building a flexible network architecture, using fundamental technologies like NFV and SDN, being able to automate all those processes using advanced analytics (AI), and ensuring security. Looking at the problem from a business perspective is the first step, and identifying the right set of technology tools comes next.

Samuele: You also need to take into account whether you are going to ask your network operator for a dedicated piece of their public network or build your own network. Think of when you do speed tests: your "score" does not really depend on you, does it? Basically, if your business model is that you want to use the mobile network so that you can easily deploy whatever use case or device whenever you like, you have two choices. You can either make the wireless network a part of your IT infrastructure so you have full control over it (e.g., you place your own access points, you provision the devices, and so on), or you can ask a communications service provider to do it all for you.

You need to start playing in IoT today and gain some experience with the technologies that are available now (eg LTE based technologies + edge computing) in order to be ready to capture the full potential of the 4th industrial revolution that will be powered by 5G.

 

ReadWrite: We talked about edge computing. We know that around IoT everything seems to be converging at the edge, and it's not just connectivity or compute capacity but energy as well. When you think about an IoT network, you think of it needing all of these utilities. I use the example of an autonomous vehicle because it happens to be the largest, sexiest appliance in an IoT network that everyone likes to talk about, but it's also one of the biggest consumers of all three of those utilities if you deploy a network.

How do you see energy and compute capacity factoring into a connectivity network of choice for an executive? 

Samuele: Regarding mobile edge computing, we see more and more IoT data being processed at the edge and this is estimated to reach around 40% of all data within the next couple of years.

 

ReadWrite: Do you know what the percentage is now?

Samuele: I do not have the latest figures, but it is negligible. Also, a number of cloud providers are now rolling out solutions that work at edge clouds. It's a big growth area, and the edge concept may mean something different depending upon who is talking about it. Nevertheless, we all agree it means we want to essentially minimize the distance from where the data is generated to where it is collected and processed. There are a few reasons why we do that:

  1. It’s related to the speed of light. Even if the speed of light is very fast, there are some applications where milliseconds delay, say a control system, might not be feasible. If you imagine something like a system control in a factory, you need a millisecond or less to make sure everything runs smoothly.
  2. Think of the amount of data that is created; for example, an airplane creates a lot of data during each of its trips. Most of it is raw data and not very meaningful on its own. The significant data may only be an outlier, such as a warning, and that is the part you have to pay attention to. Edge computing provides the capability for the data to be analysed locally so that only a small amount is actually transferred. In the end, there is a cost reduction in the transmission bill.
  3. Another aspect is related to the privacy of the data. Edge computing makes sure that data that is created locally stays local. There are many cases where regulators require data to stay in the country where it was generated. Also, certain companies may feel more comfortable when they know the data stays in the company and never goes out.

For us, the edge is a data center because you need a lot of processing power available. Not every IoT application needs this. Edge computing makes sense when one or more of the above three requirements exists. You hear a lot about Low Power Wide Area Networks (LPWAN) that are designed to minimize energy consumption of smart objects so that you won't need to change billions of batteries every year.

Not all IoT use cases aim at minimizing energy consumption. This might be a priority for a gas meter but not crucial at all for a remotely controlled vehicle.
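
As a minimal sketch of the second reason above (reducing transmitted data), the snippet below analyzes raw readings locally and forwards only the outliers, so only a tiny fraction of the stream crosses the network. The threshold logic and the send_to_cloud stand-in are assumptions for illustration.

```python
import random
import statistics
from typing import List

def send_to_cloud(reading: float) -> None:
    # Stand-in for a real uplink (cellular, LPWAN, etc.); here we just log it.
    print(f"Forwarding outlier to cloud: {reading:.1f}")

def filter_at_edge(readings: List[float], z_threshold: float = 3.0) -> int:
    """Keep raw data local; forward only readings far from the local mean."""
    mean = statistics.mean(readings)
    stdev = statistics.pstdev(readings) or 1.0
    forwarded = 0
    for r in readings:
        if abs(r - mean) / stdev > z_threshold:
            send_to_cloud(r)
            forwarded += 1
    return forwarded

# Simulated sensor stream: mostly normal readings plus a few anomalies.
stream = [random.gauss(20.0, 0.5) for _ in range(1000)] + [35.0, 36.5]
sent = filter_at_edge(stream)
print(f"Transmitted {sent} of {len(stream)} readings ({sent / len(stream):.1%})")
```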

Jason: Coming back to the flexibility point of view: with Multi-access Edge Computing (MEC), not only are you processing the data closer to the access network, it's also accessible to and controlled by the enterprise, so deploying an application on a MEC server becomes simpler and faster. Flexibility and local control are very powerful in the case of IoT when dealing with the type and quantity of data that gets collected and the applications that get hosted, rather than having to go back to a centralized cloud that might be third-party or hybrid owned.

Samuele: By the way, edge computing is the key component of what we call the Future X network. Once we get to 5G, the core edge cloud is a natural evolution of Nokia Edge Computing.

 

ReadWrite: Networks require constant upgrading to keep up, and that requires investment from whoever your partners are going to be, or for new participants to come in and disrupt the services or technology, but those all have time frames as well. Will today's IoT technology look dramatically different in 5-10 years?

Jason: From my perspective, we've had cellular IoT out for a while. What you're seeing now is some wireless providers turning off their 2G networks, and it is taking a long time to get there. Previously, networks were designed for different specific purposes. So, if you think about it, 2G was designed for voice, 3G for web and data, and 4G for video. When you make an investment in IoT, you're making an investment for a number of years, particularly when you're deploying large numbers of devices and they're embedded in the ground. In those individual use cases, you will use the existing technologies that you have today and they will serve their particular use case within their lifecycle.

In terms of 5G, we see the fundamental design criteria differently from before. We are going from just a few bits per second to gigabits per second. We see 5G as more of a unifying technology in the longer term. So you might deploy IoT using today’s technology and once that’s served its lifecycle you can swap those devices out using 5G.  By the time we get to a certain point, we might see maturity in a 5G environment where you would have that capability and you start to transition those devices piece by piece and that’s just purely from the radio access side.

Instead of having a separate networking technology environment, the goal would be to have this underlying access technology that could cope with all of them. Once we get to a critical mass to scale from a cost and coverage perspective, that's when it becomes very powerful. However, critical mass acceleration and adoption won't happen overnight. It takes a while.

 

ReadWrite: What should you expect from your provider and what are the main concerns they should address as a partner?

Samuele: A few things to think about are:

  • Spectrum coverage
  • The types of interference you might encounter in your IoT deployment
  • The capacity the operator is providing
  • The level of security that is guaranteed

These are the key parts I would want to ensure with an agreement with a connectivity provider.

Check coverage: check if the service is everywhere for everything you want to connect. For example, can you get every corner of your factory connected? Is there some kind of connectivity hole? You might need to go and check with the right tools in the field. Interference or poor connectivity will jeopardize your IoT applications. Also, you want to check if the bandwidth you need is available at any time you need it. If you don't have a (semi- or fully) private network, it means that anyone could be using some of the uplink capacity you need.

Then, finally, security is extremely important: the connectivity as well as every endpoint needs to be secured. You want to avoid data manipulation or loss.

Jason: I’d add the management of the device as well. Can you do diagnostics on it, firmware upgrade, getting information out of it. How that device is managed and how you extra that data from it is also very key.

Why LTE-M is a game changer for IoT https://readwrite.com/lte-m-game-changer-iot/ (Fri, 13 Oct 2017)


LTE-M promises to transform the Internet of Things (IoT) by connecting more “things” in more places than ever before.  We spoke with Cameron Coursey, vice president of IoT Solutions, AT&T, about what companies need to know.

What is LTE-M and why is it a game changer for IoT?

LTE-M connects IoT devices and applications directly to a 4G LTE network without a gateway. LTE-M technology is designed specifically for IoT devices and applications. It's ideal for alarm panels, smart cities, wearables, metering and more. And LTE-M provides carrier-grade security across our nationwide LTE network.

What are the benefits of LTE-M?

LTE-M is capable of providing:

  • Longer battery life (expected up to 10 years).
  • Better coverage for IoT devices underground and deep inside buildings.
  • Smaller module size – as small as a penny.
  • Lower costs – modules priced at well under $10.

Why is selecting the right IoT network so important?

Connectivity is the first step. It’s oxygen for the IoT.
Success in the IoT marketplace depends not only on your device or app, but on choosing the right network, at the right time, in a rapidly evolving market. You need to connect to the network that best fits the specific demands of your IoT devices.

Building a successful IoT solution is all about matching your connectivity needs to the right technology or mix of technologies. Whether you choose one network technology or take a multi-network approach, you want the best blend of coverage, performance, and value.

What should businesses consider when choosing an IoT network?

I tell businesses that they should consider a number of factors:

Coverage requirements can be very different for various IoT devices.  Some move frequently and need the breadth and scale of the cellular network. Others are so remote that only satellite services can effectively communicate with them.  So, there are device and use aspects to be considered.

Mobility is also important – knowing whether and how your IoT devices will be mobile is essential for technology selection. Each wireless network technology has a range or distance that limits a device’s ability to communicate to network access points. Some technologies work best for fixed locations, others can support limited or low speed movements, others support full mobility.

Battery life plays an important role in the design of IoT devices. If batteries can last for months or even years on a single charge, more things can be connected in more places for longer periods of time.

Cost is a major factor in virtually every project. These include one-time hardware and deployment costs as well as recurring costs, which include network services, management platforms, cloud data storage, maintenance and logistics, and security.

AT&T stood up its LTE-M network in the U.S. earlier this year. What can you tell me about that?

In May, AT&T deployed its nationwide LTE-M network through software upgrades at existing cell sites. The LTE-M network is now live on our nationwide 4G LTE network. We plan to deploy LTE-M across our LTE network in Mexico by the end of 2017 to create the first North American LTE-M footprint.

Our nationwide LTE-M deployment marks another step forward on our path to 5G and massive IoT. We can now reach new places and connect new things at a price that’s more affordable than ever before.

How do businesses connect and develop on the AT&T LTE-M network?

The AT&T LTE-M IoT Starter Kit gives designers the tools to develop and prototype IoT devices for AT&T’s LTE-M network. The kit includes all the elements required to collect sensor data, connect to the AT&T network, and utilize various cloud services for the management, storage, and analytics of a connected IoT device.
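
The starter kit's own SDK is not shown here, but as a generic sketch of the collect-and-publish flow it enables, the snippet below reads a (simulated) sensor value and publishes it over MQTT using the paho-mqtt library. The broker hostname, topic, and payload fields are placeholders, not AT&T endpoints or APIs.

```python
import json
import time

import paho.mqtt.client as mqtt  # pip install "paho-mqtt<2" (1.x API shown)

BROKER = "example-iot-broker.invalid"   # placeholder host, not an AT&T endpoint
TOPIC = "devices/meter-001/telemetry"   # placeholder topic

def read_sensor() -> dict:
    # Stand-in for a real sensor read (e.g. a metering or alarm-panel value).
    return {"device_id": "meter-001", "ts": int(time.time()), "flow_lpm": 12.4}

client = mqtt.Client(client_id="meter-001")
client.connect(BROKER, port=1883)       # a production link would use TLS on port 8883
client.loop_start()

# Publish one reading; a battery-powered LTE-M device would batch readings and sleep
# between sends to reach the multi-year battery life quoted above.
client.publish(TOPIC, json.dumps(read_sensor()), qos=1)

client.loop_stop()
client.disconnect()
```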

At AT&T, we help companies of every size develop IoT solutions to lower costs, gain efficiencies, and improve competitive advantages. For more information on our complete suite of IoT solutions and services, visit us at marketplace.att.com.

How to Get Started Building Your IoT Solution Today – Five Tips from Industry Leaders https://readwrite.com/iot-solution-five-tips-industry-leaders/ (Tue, 03 Oct 2017)

At Microsoft, we’re committed to helping businesses capitalize on the enormous opportunity that The Internet of Things presents, but since […]

The post How to Get Started Building Your IoT Solution Today – Five Tips from Industry Leaders appeared first on ReadWrite.

]]>

At Microsoft, we’re committed to helping businesses capitalize on the enormous opportunity that The Internet of Things presents, but since the term “IoT” covers so much terrain, we understand it can be hard to know where to get started.

There are often numerous questions about security, standards, and deployment, and the two big problems we get asked to solve are: where do you focus your attention when it comes to IoT, and where do you start?

At the IoT in Action event we held in March in San Jose, we spent time talking to IoT experts within Microsoft and at partner companies like Arrow, Avnet, Advantech and others, as well as system integrators, ISVs and OEMs, who are solving for these challenges today. We asked them for their thoughts on how to get started with an IoT solution.

Register for the new IoT in Action with Microsoft event in Boston on Oct. 30th to see what's possible with IoT, get educated, and find the right partners to start building your own IoT solutions.

Here’s what they had to say.

  1. Understand the platform first

It’s important to know what IoT can and can’t do for you.  Many companies don’t see the true value of investing in IoT because they don’t understand how to stand up a platform to build off and evolve their IoT application

So choosing a partner that can show you how their platform works and how it can be integrated into your everyday business processes is crucial. Do you need a consulting partner? What devices and machines need to be connected? And what happens to the data once it flows in?

Delve into our IoT site on Microsoft.com to find information, case studies, white papers, and educational videos that will help you understand IoT better and help you make informed decisions on what to do next.

Highlights from IoT in Action with Microsoft in March 2017

  2. Know your strength and where you're going

Every company has a core strength. Find yours and you’ll be able to generate actionable data and learn from it. How do you identify your core strength? Ask yourself what vertical you play in, what type of customers you serve, and what they need that an IoT solution could cater to.

A large number of IoT initiatives don’t have a plan that defines what it’s going to look like in the end, who your audience is, and how you’re going to monetize your IoT platform. You need to decide early if the purpose is for cost savings, new revenue models, or increased customer engagement.

Once again, ask yourself, “What’s my value proposition and what do I bring to the table?” There are a lot of different applications within IoT, so you should be able to articulate your value to your ecosystem partners and be able to find ways that you can turn those elements into a viable solution.

“What vertical do you play in? What experience have you been creating over the years? What types of enterprise or customers do you serve? What are your partner assets? Get started where you have a core strength, a core set of value props that you can deliver.”

Rodney Clark, VP IoT Sales, Microsoft

  3. Predict the intent of your customer

The IoT industry leaders we spoke to suggest that winners in IoT will always be those who gather, analyze, and understand customer intent and can predict what they might do next.

For instance, while you can get data and analytics on how many times someone walks into your store by installing an in-store sensor, it’s another thing to interpret that data to get inside the head of the customer to understand why they are there and what they might do next.

The best place to begin, before you start to figure out your customers, is deciding what your end goal is. It's not enough to have the latest piece of hardware or software; what matters is what you do with it.


  4. Be very selective in the area of IoT you pursue

There are lots of paths to get you where you want to go; you just have to figure out a final destination. These range from IoT operating systems that fit your device of choice to large, robust cloud services that give you the bandwidth and security your product or service requires.

Which one serves your needs? Once again, it’s important to partner with the right people, whether it be a gateway provider, a cloud partner, or an integration partner who understands the area you want to pursue.

Once you know your niche, you’ll start to gather the right team to help you excel in your business with the end customer in mind.

“Don’t try to boil the ocean. Be very selective about what you do, and stay focused in connecting with the right partners in the ecosystem.”

Manik Rane, Managing Director, Grail Research

  5. Make a commitment and find partners

The best way to commit is to start with a goal in mind: are you going to increase revenue, try to save costs, or engage customers? Then, commit! Once you commit on the enterprise level, you’re ready to find other partners that exist in your ecosystem that are ready to commit with you.

“Decide what you want to do, set up your goals, and then commit to it!”

Tom O’Reilly, GM, IoT Device Experience, Microsoft

Given the success of the IoT in Action event we ran in partnership with ReadWrite in March 2017, we are doing it all again on 10/30 in Boston.

Register for IoT in Action with Microsoft in Boston and learn where else in the world you can connect with IoT ecosystem partners to take action on IoT.

How Microsoft helps IoT pros take action to overcome challenges https://readwrite.com/how-microsoft-helps-iot-pros-take-action-to-overcome-challenges/ (Thu, 21 Sep 2017)


As IoT becomes more pervasive across industries like retail, manufacturing, energy, security, and healthcare, many businesses are beginning to get a handle on the latest tools and platforms that help streamline the implementation of IoT into their business models and processes.

Despite the excitement over IoT and what it can do, there are companies that still see a few challenges remaining.

At the Microsoft IoT in Action event earlier this year, we asked partners and customers to give us their thoughts on where they encountered bumps on the road to turning their IoT ideas into reality, and to share some suggestions on how to overcome them.

Here’s what they told us.

#1: How to get started

There are practical steps organizations can take to get started in IoT. Start with the devices that you already have and connect them, ensuring that you: 1) secure each component of your IoT infrastructure, 2) secure your data connection, and 3) use a secure cloud infrastructure that offers a great security posture from sensor to cloud. Be sure to focus on your niche and what you do well, connect with the right partners in your ecosystem, and provide value with the end customer in mind.
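
As one hedged illustration of securing the data connection (step 2), the sketch below builds a mutual-TLS client in Python so the device verifies the cloud endpoint and presents its own certificate. The hostname and certificate file paths are placeholders, not a specific Microsoft or Azure API.

```python
import socket
import ssl

CLOUD_HOST = "iot.example.invalid"     # placeholder endpoint
CLOUD_PORT = 8883

# Trust only the cloud platform's CA, and present a per-device certificate so the
# platform can authenticate this sensor (mutual TLS). Paths are placeholders.
context = ssl.create_default_context(ssl.Purpose.SERVER_AUTH, cafile="cloud_ca.pem")
context.load_cert_chain(certfile="device.crt", keyfile="device.key")
context.minimum_version = ssl.TLSVersion.TLSv1_2   # refuse older, weaker protocols

with socket.create_connection((CLOUD_HOST, CLOUD_PORT)) as raw_sock:
    with context.wrap_socket(raw_sock, server_hostname=CLOUD_HOST) as tls_sock:
        # At this point both ends are authenticated and the channel is encrypted;
        # application payloads (MQTT, HTTP, a custom protocol) go over tls_sock.
        tls_sock.sendall(b'{"sensor": "door-01", "state": "closed"}')
```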

#2: Connectivity between devices

According to IoT experts, the key things you need to keep in mind when you're looking into connectivity are: 1) the ability to connect devices to the platform you've chosen, so they can talk to one another, 2) harvesting and storing your data, and 3) using analytics to take action on all the data, so you can do something important with it.

Highlights from the IoT in Action San Jose Event in March 2017

#3: Have an ecosystem and collaboration

There’s power in partnership when it comes to IoT. The days of working in silos and trying to dominate the business landscape are dying. To start implementing and receiving the benefits, tap into an ecosystem. Without one, it can be tricky knowing where your product and services fit into the broad industry and use-case solutions. 

It’s important to determine where you are in the value chain and then figure out which partnerships make sense. Other things the experts think you should keep in mind? For starters, you could look for partnerships with other ecosystems partners that have expertise in common industries or use cases. 

“The biggest challenge I see now is the lack of recipes to solve business problems. All the companies that are part of the IoT ecosystem need to create known-good recipes that we can refer to clients and decision makers to help them go to market quicker.”

Shawn Jack, Director of Sales & Embedded Ecosystems, Advantech

#4: Think about security

With more access points to information comes the risk of data breaches. The question of cloud security still rears its head, and with cyber-attacks in the daily news, it's a valid concern. When you're tapping into your ecosystem, look for a partner that addresses your security concerns right away (for customer and consumer scenarios) and offers solutions. Also, make sure you're getting the most up-to-date, modern security patches and scenarios—from device to cloud.

“Security is where I start a lot of conversations with companies. They need to have a security solution that’s not only device-driven, but the way to manage devices through cloud scenarios.”

Rodney Clark, VP IoT Sales, Microsoft

#5: How do you make money?

When it comes to business models and how to generate revenue from your IoT investments, experts say that the key here is to find partners who provide flexibility and adapt to the environment of the customer—not the other way around. 

Most companies believe that a few tools and a cloud partner are all they really need, but they still may not be sure how to implement those offerings. The partners that make money will be the ones that provide packaged solutions with tools companies can quickly and easily use themselves, so they can see a proof of concept fast, get management approval for the project, and then go to production.

“The majority of IoT initiatives don’t really have a full plan, a plan that includes exactly what you are going to do, who you are going to address, what are the end deliverables, and how you are going to make money out of it.  Companies need to remember that it has to be commercially viable.”

Cameron Carr, Senior Marketing Channel Manager, Microsoft

#6: The software aspects & data analysis

So, once companies have collected all this data, what will they do with it?

One solution offered by experts is to make sure that as companies gather data and analytics back into their system, information is used to actually change the content dynamically. No one wants out-of-date data providing out-of-date results.

“The biggest challenge is the volume, the velocity, and the variety of the data we are collecting. It’s great to capture all this information, but what do you want to do with it?”

Joe Francica, Managing Director, Geospatial Industry Solutions at Pitney Bowes

#7: Deployment of the products after they are complete

The problem doesn’t seem to be the development of products. There are lots of cloud, gateway, and touch panel solutions that are IoT ready. The problem is getting them to the right partner.

That’s why connecting with like-minded partners will alleviate future IoT challenges, so customers, partners, and clients can reap the benefits.

Given the success of the IoT in Action event we ran in partnership with ReadWrite in March 2017, we are doing it all again on 10/30 in Boston.

Register for IoT in Action with Microsoft in Boston and learn where else in the world you can connect with IoT ecosystem partners to take action on IoT.

Who are the right partners to get you into more smart homes? https://readwrite.com/comcast-xfinity-smart-homes/ (Wed, 13 Sep 2017)


The smart home revolution is still in its early days. Even with the programmable devices that exist already — the thermostats that learn your patterns, the voice-controlled assistants and the lighting that can be customized according to your needs and daily routines — there is still a large unexplored vista ahead of us with the connected home.

Some validation of all these new market participants would be helpful to consumers during their buying decisions. So as an integral part of the communications fabric tying all this technology together, Comcast has relaunched its partner program, which curates and integrates best-in-class smart home devices into the Xfinity Home platform.

See also: Five rules about entrepreneurship on large enterprises

Their point of view is that smart home IoT devices don’t really make sense unless the devices are truly connected, so they can talk to each other and create the ultimate smart experience for consumers when they are at home or away.

To do that, they created a curated program with several best-in-class partners, including the recently added Philips Hue connected lighting system, Nest Learning thermostat, August door lock, Lutron Caséta wireless controller and dimmer, Chamberlain MyQ garage door controller, and Sengled and GE by Jasco light bulbs.

These partnerships allow Comcast’s Xfinity Home customers to manage and control all of these smart home devices from one platform – the Xfinity Home platform. Additionally, Xfinity Home customers have the ability to seamlessly troubleshoot issues with any of the partner products by simply calling the Xfinity customer service center without having to call individual partners for support or assistance.

As more companies integrate with Comcast’s partner platform, we sat down with two partners — Martin Heckmann, Director of Emerging Business at Chamberlain Group, Inc., and James McPhail, CEO of Zen Ecosystems — to find out what the certification process was like to join the Xfinity Home platform, the benefits of the partnership and what they have learned from it.

Readwrite: With all the channel opportunities out there, what led you to be a partner with Comcast’s Xfinity Home?

James McPhail: Our relationship with Comcast goes back many years. They were one of the original inspirations to build the Zen Thermostat. They shared our vision for a thermostat that was more attractive, simple and sleek – differentiating the Zen Ecosystems product from other connected devices on the market. Comcast actually encouraged us to start this adventure.

Martin Heckmann: We believed Comcast was doing great work to make the connected home a reality for consumers and we were proud to join their platform. It was a great opportunity for Chamberlain customers who have Xfinity Home to receive benefits such as alerts to their phones via the Xfinity Home mobile app every time their garage door opens or closes, which provides peace of mind.

RW: The selection criteria to be included in Comcast’s Xfinity home partner program is pretty rigorous — walk us through what you think of their expectations of you as a partner.

JM: The selection criteria are indeed rigorous, but absolutely make sense. Comcast tests every device with a dedicated engineering team. They have a platform with a vast configuration of various equipment that they continuously test, ensuring a more reliable device for their customers. The size and scale of their structured support is unmatched, allowing them to respond to and troubleshoot issues quickly and efficiently, making for happier customers. This helps us stay focused on improving our products.

MH: We were pleased to learn during the process that Comcast and Chamberlain are completely aligned in our expectations to provide the most advanced and secure connected devices and services to consumers.

RW: What did you have to do to prepare for this process? Was there anything unexpected?

JM: Comcast’s test team expressed their philosophy as “trust but verify.” While they wanted to review all the details of our product, they were also willing to collaborate on troubleshooting. It was never a pass/fail situation. They wanted us to be successful, but we needed to prove we could be! What surprised us initially was that many of their tests put the Zen Thermostat in very extreme conditions — well beyond those normally found in a customer’s home. While this initially felt excessive, through the process we learned about many of the atypical real-world situations that could potentially occur, and how our thermostat performed in those circumstances. As a result, we have a far better understanding of the ability of our product to perform in all kinds of conditions.

MH: Chamberlain had completed, or was in the process of completing, several other partner integrations during the integration with Comcast so there were really no surprises. That said, it was reassuring to see the rigor applied to the partner program at Comcast.

RW: Security issues are constantly a big factor in smart home deployments. What’s the relationship between your firm and Comcast on this critical issue to end customers?

JM: At Zen Ecosystems, the security of our devices is of the highest concern. We know from our engagement with Comcast that they are extremely diligent and rigorous on this issue. As a ZigBee device, connecting to the Xfinity Home hub, we are a member of the ZigBee Alliance, an organization whose members work to define a good balance between smart, usable devices and secure interactions between devices, and to certify that ZigBee devices are communicating according to protocol.

MH: Chamberlain and Comcast technical and customer support teams work very closely on an ongoing basis to monitor the technical integration and communication with customers.

RW: For other smart home technologies out there that are not part of the Xfinity Home program, what should they be thinking about if they’re considering it?

JM: There are a lot of benefits to engaging and working with Comcast. Daniel Herscovici, General Manager and Senior Vice President at Xfinity Home, shared recently that the founding team "are entrepreneurs at heart. We move quickly and pivot as needed to help us grow." At the same time, Xfinity Home's diligent testing process will absolutely help identify ways to improve your product that you may not have previously considered.

MH: As mentioned previously, the Comcast Xfinity Home partner program has high expectations of its partners. Any company considering joining the Xfinity Home partner program must be prepared to work collaboratively with Comcast to ensure the most secure technical integration and best in class consumer experience.

RW: How does this opportunity with Comcast’s Xfinity Home prepare you for the next channel launch? Did this process light a path towards “best practices” for this type of ecosystem building?

JM: Through working with Comcast, the Zen thermostat has evolved to be even better – while it was already simple, streamlined and attractively designed, we’ve been able to improve it even further. And, Comcast’s seal of approval absolutely carries weight and has already opened several doors for us. We look forward to expanding our network of world-class partners and adding our energy-saving thermostat to more home automation systems.

MH: This process reinforced Chamberlain's plan to strategically partner with best-in-class connected-home solutions providers to offer consumers the best experience and benefits that really matter in their everyday lives.

Many large companies seek partners, especially those that are smaller and nimble, as long as the quality of the product is intact. Partnering with a large company has many benefits, as we learned from Martin and James – it makes sense to keep your options open as you find ways to bring your product to the masses. If you have a techy "smart" home device, feel free to check out Xfinity Home's partner program to see if there are any opportunities to collaborate with Comcast.

This article was produced in partnership with Comcast.

When you’re thinking IoT expansion, think horizontal https://readwrite.com/when-youre-thinking-iot-expansion-think-horizontal/ Tue, 12 Sep 2017 23:53:18 +0000 https://readwrite.com/?p=99271


In the next interview in our series on the IoT ecosystem, we tackle what enterprise market participants need to consider when using IoT to help build out their business value beyond a single application — and for that, you need to think hard about your horizontal platform.

We spoke with Nokia’s Jason Collins, vice president of IoT marketing, and Frank Ploumen, IoT new product introduction and strategy, to get their take on how you might be limiting your IoT value with short-term thinking of you’re not thinking horizontal.

Readwrite: So, Jason, define how you see scalability for our enterprise clients that you guys may be talking to already; think in terms of product expansion or extension.

Jason Collins: When most people think about scalability in the IoT space they would initially start thinking about the size of a deployment. In thinking about size they would take into consideration the problem they are trying to solve. For example, I want to measure temperature across a geographic landscape and I’m going to have X number of devices to do this. And that’s certainly one aspect of scalability, but more importantly, I think the first thing they should think about are the operational impacts of that kind of a deployment. If you’re going to deploy 20,000 devices and there is a security problem with those devices how do you update them, for example?

See also: If data is the new oil, who is your refiner?

And so scalability is also about operations. And then scalability is also across time. Nothing is ever static. Possible changes that may occur are: you may need to change a vendor out because they go out of business or are no longer suitable for something you want to do in the future, or you may need new kinds of devices that support more features you might be considering to improve your applications. So you need to support scalability across time and across vendors, and you should think of scalability across efforts.

This last point needs a little more explanation. If you’re a company that has multiple IoT activities that you’re going to deploy over time, how do you ensure that by the time you get to the second, third and fourth business objective that requires a new and different deployment, that you’re actually leveraging what you did in that first deployment? When you take this into consideration, scalability is a big deal to ensure that your future deployments are less cumbersome and more efficient.

Frank Ploumen: I’ll have a few words. I want to particularly stress the point of multi-vendor interoperability. A lot of times when we talk to customers, the question that we get is, I can buy a solution from vendor XYZ that is turnkey and does everything from the application to the device, why would I want the hassle of this whole platform? And the discussion we usually get into is, ‘The last time I deployed a major platform or network, once it was finally deployed some things had changed causing higher hardware; higher integration costs; device OEM changes or all of the above.

So the cost of ownership over the whole lifetime heavily depends on the ability to mix and match hardware devices from many vendors to a single application, and we've become very used to this in other industries. If I'm talking about Wi-Fi, for example, the fact that the Wi-Fi on one side needs to talk to a specific router on the other side doesn't matter. Wi-Fi is Wi-Fi. Ethernet is Ethernet. Windows is Windows. So we've become very used to mixing and matching anything. We all want the same mix-and-match capability in IoT solutions, which requires a horizontal platform capability. However, in the IoT space, we're still stuck in very proprietary solutions where only certain permutations and combinations work.

RW: If that’s the case and obviously interoperability is the biggest challenge.  I mean I literally spent two hours yesterday, just talking to three different folks at this conference yesterday who were doing different mesh network related hardware deployments for a certain basket of industries. And we should talk about that, I can feel them pulling back and saying I can’t sort out everything about interoperability, I’m trying to cover this space, for me to be able to explain it to my client. So in your minds what does the ideal platform look like, given that…?

JC: That’s exactly why you have a layered model. As an example, look at something as simple and well-known as the OSI network model. The fact is that there are already standards around the Internet but the way it starts is that you actually have a lot of different connectivity technologies and different layers that hide the complexity and hide the lack of standards from the layer beneath.

So part of this is, yes we need to have standards out of the bottom end of this platform, but another consideration is, the platform needs to be able to adapt to the standards and proprietary protocols that are there, and also account for the standards that are coming up.

FP: Let’s review some of the key layers that come with an IoT solution.

At the bottom layer, you have devices or sensors, and they are the source of the data. Then you have networks, and network is a very broad term for local networks, long distance networks, wired, wireless, licensed or unlicensed and so on, or a combination of any of the above, but ultimately networks.

Then you have a layer where networks feed into a data mediation or brokerage layer.

Then at the top level, we have the stuff that we all know very well and that’s the applications, the dashboards, the very visible stuff.

A very important design goal when we talk about scalability and horizontal platforms is that we basically figured an application should be completely agnostic to what happens underneath. Data is data. Temperature is temperature. You, as an application developer, should not have to worry about where the temperature came from, what protocol was used, or what device generated it. Your application cares about temperature; that's all you need to know. Creating this level of segmentation will make interoperability easier.
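
A small sketch of that design goal: per-protocol adapters normalize raw payloads into a common reading, so the application only ever sees 'temperature' regardless of which network or device produced it. The adapter names and payload shapes are illustrative assumptions.

```python
from dataclasses import dataclass
from typing import Dict, Protocol

@dataclass
class Reading:
    """Normalized measurement handed to applications, regardless of source network."""
    device_id: str
    quantity: str      # e.g. "temperature"
    value: float
    unit: str          # e.g. "C"

class Adapter(Protocol):
    def to_reading(self, raw: Dict) -> Reading: ...

class ZigbeeAdapter:
    # Hypothetical home-automation payload: hundredths of a degree Celsius.
    def to_reading(self, raw: Dict) -> Reading:
        return Reading(raw["ieee_addr"], "temperature", raw["temp_x100"] / 100.0, "C")

class LpwanAdapter:
    # Hypothetical field-sensor payload: tenths of a degree, different field names.
    def to_reading(self, raw: Dict) -> Reading:
        return Reading(raw["dev_eui"], "temperature", raw["t_deci"] / 10.0, "C")

def application_logic(reading: Reading) -> None:
    # The application never sees protocols or device quirks, only normalized data.
    if reading.value > 30.0:
        print(f"ALERT: {reading.device_id} reports {reading.value}{reading.unit}")

adapters = {"zigbee": ZigbeeAdapter(), "lpwan": LpwanAdapter()}
incoming = [
    ("zigbee", {"ieee_addr": "00:15:8d:aa", "temp_x100": 3185}),
    ("lpwan", {"dev_eui": "A1B2C3", "t_deci": 224}),
]
for network, payload in incoming:
    application_logic(adapters[network].to_reading(payload))
```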

JC: And by the way, to that point, if you look at IoT versus a generic Internet application, the IoT application is different because of things like battery life and the size and kinds of data that you're getting from the devices. Most people in this industry are used to dealing with IP because it's been around forever, but IP doesn't remain a good way of accessing devices as you go lower in the stack. So there's a reinvigorated need for an adaptation layer, because everybody is going to be analyzing and transporting this data using the IP layer and above once it hits the core network. Meanwhile, below that, when you get to the devices, IP makes no sense because you're running out of battery in a week.

FP: That’s a very interesting observation. The implication here is that the number of use cases in IoT is so vast, that it’s unrealistic to expect that there’s a one size fits all network typology type of solution. There will be many answers to many different things.

What I do on a battery constrained device that runs in the field (e.g. in oil and mining) versus a powered device used in the connected home is different from the physical layer point of view. But the application should not have to be tailored because otherwise, you keep redesigning and re-architecting applications for hybrid environments.

RW: You’re turning a hardware problem into a software problem

FP: Exactly. Say I build, very simply, an asset tracking application. In one city, I might be tracking over a cellular network. Why would I have to redesign that if I am deploying the same application in mining on a custom proprietary network? It's the same application. Or maybe I have hybrids: some devices coming in over one network and then devices coming in over a new network, like when we roll out 5G and start rolling out new devices. You don't want to have to deal with applications that work in the old world and then different applications that work in the new world.

RW: A lot of what you just pointed out is a story that I see over and over again. Some company has to do an IoT strategy, so they ask me to come explain it to them. And when I sit down with them, it's often a brand that you wouldn't think of as being IoT right out of the box: not one of the pillars of IoT around communications, some aspect of big data, or a specific industrial hardware or smart home maker. But you can see why they'd want to provide better connections or create a network out of their core product.

You can see that they want to dig out the corner of the world that they can understand. If you were to say, 'how do you build a horizontal platform,' do they need to be told that, or do they just need to be made aware that they already have one? How do you picture that conversation going? Obviously, it's client specific, but do they really know what they have already?

What do you need as an enterprise client to build a horizontal platform? Do you just need to be made aware that you already have the beginnings of this and how to jump off from there?

JC: I would say the first thing is becoming convinced that you actually need a platform. You see a lot of people going out and building siloed applications, which I think is fine as long as all you're doing is an experiment. It's when you actually stop and think about how this will answer a real business problem and positively impact your business longer term that you should think about the need for a platform that can exist for the next 10 years or more and expand to meet your needs. When you start thinking about scalability issues, you quickly come to the conclusion that you shouldn't be just haphazardly bringing up siloed applications. It's the equivalent of a simplified analogy that goes like this: "I need these two computers to communicate with each other, so I'm going to string a wire down the wall and then down the hallway."

In the short term that could be fine, until you realize what you really wanted was to give everybody in the office a PC on their desktop and the ability to do email. Suddenly you think, “Well, maybe stringing a wire isn’t the best idea. It was fine for hooking up the first two executives to see whether they would use email at all, but it’s not such a good idea for actually deploying email across an enterprise.”

So convincing somebody that they need a platform is first, and then I would say that you probably don’t want to build your own platform, because there are a lot of platforms out there. I should be looking for what already exists that I can hook into and that could give me a leg up. And I should start that conversation by thinking about the things we’ve been talking about: standards, support for existing proprietary stuff, data mediation. What’s going to make it easier for me to build applications on top of that? What’s going to make it easy for me to hook up my particular devices now and as we move forward into an unknown future?

It’s something we haven’t talked about because we’ve been focused on the data mediation piece of the problem, but device management is also really key to think about, and it needs to be done in a standards-based way, so that as I increase the size, number, and kinds of deployments, I have a way of accessing these devices, doing updates, and monitoring battery life in a way that is hidden from the application layer.
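As an illustration of that separation, here is a small hypothetical sketch of a management plane that tracks firmware and battery for a fleet while the application layer only ever sees sensor data. Real deployments would use a standards-based protocol such as LwM2M; the DeviceManager interface below is invented for the example.

```python
from dataclasses import dataclass

@dataclass
class Device:
    device_id: str
    firmware: str
    battery_pct: float

class DeviceManager:
    """Hypothetical management plane: firmware and battery live here,
    never in the application that consumes the sensor data."""
    def __init__(self):
        self._devices = {}

    def register(self, device: Device) -> None:
        self._devices[device.device_id] = device

    def low_battery(self, threshold: float = 20.0) -> list:
        return [d.device_id for d in self._devices.values() if d.battery_pct < threshold]

    def schedule_firmware_update(self, device_id: str, version: str) -> None:
        # A real implementation would push this over a standards-based protocol.
        self._devices[device_id].firmware = version

mgr = DeviceManager()
mgr.register(Device("sensor-001", "1.0.3", 17.5))
mgr.register(Device("sensor-002", "1.0.3", 88.0))
print(mgr.low_battery())                       # ['sensor-001']
mgr.schedule_firmware_update("sensor-001", "1.1.0")
```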

FP: The examples Jason gives all have to do with operations and the cost over the lifecycle of the service. First, enterprises need to understand whether they want to pursue IoT at all; most people have already made that mental decision. Then they jump straight to the use case, like how awesome it would be if they could control or automate or analyze this, and operations become an afterthought, which is why the platform discussion doesn’t get a front-row seat. We get started, we can control a few things, we hook up a few things, and it looks really awesome. It’s like Jason’s analogy: we strung the two computers together, we could get email, so I’ll just roll it out to everybody.

If you don’t think about that operations problem up front, you’re really going to get burnt down the road. And here’s where you see two different types of customers. Some have been burnt before and very carefully delve into those operational questions; they’re easily convinced they need a platform. Others are a bit naive about operations, focus on the use case per se, and treat operations as something that will come later down the road. That’s a very difficult conversation to have.

The second observation is the build-versus-buy question, and this is not a new discussion either. Remember several years ago, when a lot of embedded devices had their own custom operating systems and there was a lot of fragmentation in that area. The mindset of developers was often, “Well, I want to avoid paying for licenses, and how hard could it be? I’ll download some open source software and build my own version of what I need.” We’ve all been there, but at some point you realize it can’t just be about downloading an open Linux distribution. The real value is in not having to maintain the lower layer yourself. If it is done well by someone else, you are more interoperable with the rest of the world, which adds significant value.

RW: On that operations gap: when I look at some of the industrial concepts, I see people struggling. They’re at a certain level of transparency, data awareness, and usefulness, and they want to 2X that, so it’s like the same song but twice as fast. They can string two computers together, and that is functional, but it is not ideal. And the pushback between the operations side and the data management executive side of the discussion is a chasm that’s challenging to bridge. Are these silos still too hard to knock down within an organization? Is it going to be a real challenge to have that internal discussion?

FP: Well, you realize that the history of IoT is quite long. People might have deployed proprietary systems that are basically connected and seem to be very functional. The longer a system is in place without problems, the more afraid people get of touching it. Say you’ve got an ancient building automation system from twelve years ago that talks protocols no one understands, but it works fantastically, even though it should have been retired a long time ago.

RW: So we don’t want to open it up because we don’t know what will happen.

FP: The problem comes if you ever want to do anything more than what the system was designed for, or, for example, connect new hardware to this platform. Well, good luck, right? It’s a closed island, and it is what it is. Now you’re stuck.

I’ve seen customers literally come to us with this. In one particular example a couple of years ago, a customer, I kid you not, told me that they had 37 different building automation systems gathering energy data from their real estate, and they could not come up with a way to aggregate and consolidate all of the data so that they could compare energy bills.

So you get into situations where nobody wants to touch it because it works. That’s one of the sad things about this old industrial stuff: it is really good at what it does, but you’ve lost all flexibility and the opportunity to improve it and make it interact with other things in the world. Compare that with the internet, where anything connects to anything and the whole thing keeps growing and picking up new data sources. I’m making a very extreme case here, right? But look at something like Amazon’s Alexa, which we all know. Alexa connects to anything and everything and can control anything. It adds a few hundred new interfaces per month, so it becomes richer and more powerful.

With Alexa, they tore off a clean sheet and started something new, and it has become the leading voice command platform.

What they’ve done is create a really good framework for integrating islands of data. It wouldn’t be very good if Alexa could do just one thing; it would be a one-trick pony.

The reason it’s so good is that they’ve made it really easy to integrate Alexa with Philips lights, with thermostats, with you name it; that’s the power of the platform you see there. If they hadn’t done that and had made it very narrowly defined, it wouldn’t have had that power. The platform gives you the ability to quickly and seamlessly integrate islands of data, including those old legacy items that nobody wants to deal with.

Published in partnership with Nokia.

The post When you’re thinking IoT expansion, think horizontal appeared first on ReadWrite.

Why data security is really everyone’s challenge today https://readwrite.com/nokia-security/ Thu, 27 Jul 2017 18:32:17 +0000 https://readwrite.com/?p=98966

The prevalence and proliferation of connected devices have undoubtedly improved efficiency in people’s lives, but the massive amounts of personal data required to operate such devices have raised numerous safety and security concerns. We spoke with Gerald Reddig, Nokia’s head of security marketing, and Daisy Su, Nokia’s connected device platform marketing manager, to gain a better understanding of what’s happening in the IoT security landscape, and what Nokia is doing to ensure that customers’ data stays safe.

ReadWrite: The Internet of Things provides new ways to use services that rely on data and on platforms in the cloud. So we know that end users are going to have concerns around data security. How do we overcome customers’ fears regarding security?

Gerald Reddig: One of the nice proof points for all of the initiatives that we started in Nokia has to do with the Mirai botnet attack — the biggest IoT attack ever.

This type of breach attacks internet or service providers; in the Mirai case, the service provider was hacked by IoT devices that were managed by neither the end user nor the manufacturer. This raised an important question in the IoT industry — should we secure the device itself or the data from the device, within the application server? The bottom line is that there is actually no single magic security bullet that can easily fix all the key IoT security issues. You need to attack the problem from different angles. 

There is a range of different issues to consider in IoT security. The first is IoT network security, which protects and secures the network (including DNS) that connects devices to backend systems on the Internet. The second is IoT authentication, which gives users the ability to authenticate to an IoT device and covers the management and oversight of the device. The third is encryption, protecting data in transit between IoT edge devices and backend systems. The fourth, IoT public key infrastructure (PKI), typically originates from service providers and ensures that the radio access network (RAN) system provides digital certificates and cryptographic lifecycle capabilities. The fifth and biggest industry topic right now is IoT security analytics, which is the process of collecting, aggregating, and monitoring all of the data.
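The third item, encrypting data in transit, can be sketched with Python’s standard ssl module. The certificate file names and backend host below are placeholders; the idea is simply that the device presents its own certificate (issued by the PKI mentioned above) and verifies the backend’s, so both ends are authenticated and the payload is encrypted.

```python
import socket
import ssl

# Placeholder file names; in practice these come from the IoT PKI.
CA_CERT = "ca.pem"
DEVICE_CERT = "device.pem"
DEVICE_KEY = "device.key"

def open_secure_channel(host: str, port: int) -> ssl.SSLSocket:
    """Encrypt data in transit and authenticate both ends (mutual TLS)."""
    context = ssl.create_default_context(ssl.Purpose.SERVER_AUTH, cafile=CA_CERT)
    context.load_cert_chain(certfile=DEVICE_CERT, keyfile=DEVICE_KEY)  # device identity
    raw = socket.create_connection((host, port))
    return context.wrap_socket(raw, server_hostname=host)

# Example usage (requires a backend configured to request client certificates):
# with open_secure_channel("backend.example.com", 8883) as conn:
#     conn.sendall(b'{"sensor": "t1", "temp_c": 4.2}')
```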

These top five IoT security pieces are on Nokia’s radar to help security become more proactive, rather than simply reactive. Nokia developed a security architecture for service providers and enterprises that helps deploy the right balance of proactive and reactive security.

RW: Where do devices fit into the security picture?

Daisy Su: When talking about security, we need to focus on end-to-end security, covering not only network connectivity and the applications in which the user data is being transported, but also the device itself. What we have learned is that many IoT devices behave similarly to mobile devices in terms of connecting to mobile networks, and we need to make sure that the device management lifecycle we traditionally apply to mobile devices is applied across IoT as well. Here are a few common security questions related to mobile devices that are also relevant to IoT:

  • How do we authenticate devices to make sure that they have the correct identities and credentials to be allowed into the system without compromising the network? 
  • How do we apply access control to make sure that the right users and the right devices do only what they are supposed to do?
  • How do we ensure that the data from the devices is transported through a secure channel onto mobile networks so that it cannot be compromised or tampered with?
  • How do we ensure data confidentiality, so that the intended receiver of the data is the only one who can read the data?
  • How do we ensure that we know the status and the availability of all the devices connecting to this network?

We also need to be able to generate secure passwords and to lock and wipe IoT devices if they are compromised. It is essential that we be able to apply security fixes remotely and neutralize an IoT security threat when a vulnerability is detected.
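A minimal sketch of that remote-remediation idea, assuming a hypothetical RemoteActionQueue on the management side: compromised devices are flagged, and lock, wipe, or patch commands wait for the device’s next check-in. This illustrates the workflow only, not any specific Nokia API.

```python
from collections import defaultdict

class RemoteActionQueue:
    """Hypothetical sketch: queue lock, wipe, or patch commands for devices
    flagged as compromised, to be delivered on their next check-in."""
    def __init__(self):
        self._pending = defaultdict(list)

    def flag_compromised(self, device_id, patch_version=None):
        self._pending[device_id].append({"action": "lock"})
        if patch_version:
            self._pending[device_id].append({"action": "patch", "version": patch_version})
        else:
            self._pending[device_id].append({"action": "wipe"})

    def on_checkin(self, device_id):
        """Device polls in; hand it everything queued and clear the backlog."""
        actions, self._pending[device_id] = self._pending[device_id], []
        return actions

queue = RemoteActionQueue()
queue.flag_compromised("cam-042", patch_version="2.4.1")
print(queue.on_checkin("cam-042"))
# [{'action': 'lock'}, {'action': 'patch', 'version': '2.4.1'}]
```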

Many IoT developers today have not focused strongly enough on how to secure the devices and their connectivity to the networks. They have a general understanding of how to secure devices from the Internet point of view, but securing them on a mobile network involves very different knowledge, experience, and learning. There are a lot of back doors in IoT that people just don’t know how to close. Nokia has solutions to help both IoT service providers and mobile network operators track down and actively secure vulnerable devices before, during, and after attacks. We also provide a way to access millions of network-connected devices, secure them, and apply software updates and security patches remotely.

RW: What are some of the best practices, as we add millions of devices, in terms of deploying IoT networks?

DS: Managing network-connected devices starts with making sure that devices are certified according to industry standards and network operators’ specifications. At Nokia, we are helping service providers certify their mobile and IoT devices before on-boarding them to their network. For example, with our largest North American operators, we provide self-verification for device vendors to test their devices against the device protocols required. We also provide verification services for both network operators and device vendors to test and verify the devices with the end-to-end network use cases, making sure that they don’t compromise the network once they connect.

Once the device is certified, connecting it to the network through the proper on-boarding procedure is really important. The on-boarding procedure has to make sure that these devices are authorized and authenticated to connect to the network in real time.

But the complete device lifecycle management goes beyond certification and on-boarding. With Nokia Connected Device Platform, we can qualify the devices and detect new devices as soon as they attempt to connect to the network, thus authenticating and authorizing proper devices for access to the network. We can automatically and remotely activate, deactivate, and configure features and functionalities for the devices based on triggered policies and mobile network requirements. We can also provide maintenance functions, and identify and manage the flaws with the devices. Additionally, we can efficiently apply the most recent software and firmware updates onto millions of network-connected devices remotely. 

When devices need security updates, these can be burdensome tasks, but we at Nokia can provide and support security updates for the mobile service provider. With IoT, there are multiple device models flooding the network, each supporting multiple OS versions, and every security update must be unique to a specific device model and OS version.

So with millions of IoT devices connected to multiple networks, you have to figure out a way to update devices in the least amount of time and effort possible. You need a dynamic system to enable you to organize, analyze, and apply that firmware. At Nokia, we have successfully updated the security of more than 300 million mobile devices.
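The bookkeeping Daisy describes, every security update being tied to a specific device model and OS version, can be illustrated with a short sketch. The fleet, patch names, and build_campaign helper are all hypothetical.

```python
# Hypothetical fleet inventory: each entry is (device_id, model, os_version).
fleet = [
    ("d1", "tracker-a", "os-3.1"),
    ("d2", "tracker-a", "os-3.2"),
    ("d3", "meter-b",   "os-1.0"),
    ("d4", "tracker-a", "os-3.1"),
]

# Every security fix is specific to one (model, os_version) pair.
patches = {
    ("tracker-a", "os-3.1"): "fw-3.1.9-sec",
    ("tracker-a", "os-3.2"): "fw-3.2.4-sec",
    ("meter-b",   "os-1.0"): "fw-1.0.7-sec",
}

def build_campaign(fleet, patches):
    """Group devices so each (model, OS) cohort gets exactly its own image."""
    campaign = {}
    for device_id, model, os_version in fleet:
        image = patches.get((model, os_version))
        if image:
            campaign.setdefault(image, []).append(device_id)
    return campaign

print(build_campaign(fleet, patches))
# {'fw-3.1.9-sec': ['d1', 'd4'], 'fw-3.2.4-sec': ['d2'], 'fw-1.0.7-sec': ['d3']}
```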

GR: What Daisy just described is incident prevention, incident detection, and incident mitigation. The second part, incident detection, is where the service providers play an important role with sophisticated machine learning analytics software. All of these big data techniques provide more predictive modeling for anomaly detection. 

RW: There are a lot of solutions out there, and Nokia has its own as well, but what’s unique about how you’re addressing attack prevention?

GR: Our end-to-end security portfolio, which is called Netguard Security, makes it simpler by cutting the security issue into three main blocks. Block one is endpoint security, which involves the encryption and authentication of end points and the detection of traffic anomalies. The second block is network security —  the most essential part and probably, from the market revenue perspective, the most relevant because it covers the perimeter protection against external attacks. Block three is security management, which helps reduce the response time of security teams and even automate parts of mitigation processes.  

Let’s use the Mirai botnet attack again as an example. Our threat intelligence center alerted our customer by providing guidance on how to react and implement new security policies, though in many of our networks, Mirai was not present at all. Still, we made sure that our customer was prepared in case they were attacked — that’s a critical part of security prevention. This kind of threat intelligence helps all customers implement preventative security, and with the even more sophisticated attacks we see on the cybersecurity horizon, you can’t be too prepared. 

RW: Is there a different approach for enterprise? How is Nokia dealing with this target?

GR: What comes to mind is my recent conversations with some enterprises at one of the trade shows in the critical communication world in Hong Kong — the question I always get is how I can make sure that the convergence that happens between information technology and operations technology does not create a disaster precipitated by a hacker attack. The typical nightmare scenario for all security people working in the utility industry is that someone could hack into the IT system and get across to the OT. We have also recently seen attacks involving advanced persistent threats, like in Ukraine, where hackers gained access to the power grid system and denied thousands of people electricity for a few days.

The critical question is not that there is a big difference between service providers and enterprises, but rather how to reduce the pain of the volume and velocity of security data alerts. More than 90 percent of enterprises receive more than 150,000 security alerts a year. With only a small team, there’s no way to look at all of the alerts; our research found that only 30 percent of security alerts are investigated.

This makes today’s technology landscape fertile ground for hackers. Target, for example, was hacked, and the attackers lurked inside the company’s network for months before they started exfiltrating the actual credit card data. Hackers are masters at waiting until the prime opportunity to strike presents itself; the average dwell time, the time a threat actor lingers in a victim’s environment until detected, is 146 days. Today, we know that hackers are compromising low-value assets to capture the big fish: the high-value assets. We must shorten the dwell time to make hacking itself harder. This requires new security management to reduce the alert noise and focus on the real threats.
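One way to picture “reducing the alert noise” is a simple triage score that ranks alerts by asset value, severity, and how long the activity has been running, so a small team investigates the riskiest items first. The fields and weights below are purely illustrative; they are not how NetGuard actually scores alerts.

```python
# Illustrative alert triage: rank alerts so a small team looks at the riskiest first.
ASSET_VALUE = {"cardholder-db": 10, "ot-gateway": 9, "hr-laptop": 3}

def score(alert):
    """Toy risk score: asset criticality x severity, boosted for long-dwelling activity."""
    base = ASSET_VALUE.get(alert["asset"], 1) * alert["severity"]
    if alert["days_active"] > 30:   # long dwell time is exactly what attackers rely on
        base *= 2
    return base

alerts = [
    {"id": 1, "asset": "hr-laptop",     "severity": 2, "days_active": 1},
    {"id": 2, "asset": "cardholder-db", "severity": 4, "days_active": 45},
    {"id": 3, "asset": "ot-gateway",    "severity": 3, "days_active": 5},
]

for alert in sorted(alerts, key=score, reverse=True):
    print(alert["id"], score(alert))
# 2 80
# 3 27
# 1 6
```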

Finally, we must shorten the time between detection and remediation, and that’s what Nokia developed. Our NetGuard security management centers are easy-to-use security operations, analytics, and reporting software solutions that enable operators to prevent, pinpoint, and address security threats before they result in breaches. They shrink detection time by 80 percent, accelerate recovery time by 75 percent, and cut investigation time by more than 50 percent.

DS: Securely on-boarding network-connected devices is essential, regardless of whether the IoT devices are provided by the service provider or enterprise. If the IoT devices provided by the enterprise need to connect to the mobile network, the same device lifecycle management procedures described earlier are applicable onto all those enterprise IoT devices as well.

RW: What is the killer app for security on the horizon?

GR: That question makes it seem like there is a one-size-fits-all solution, but such a solution probably doesn’t exist. The same applies to cloud security and to smartphone security. Whenever we talk about security, all of the products and interlocking interfaces should be integrated so that we have a cohesive end-to-end solution that provides all of the unique capabilities our customers need to address the evolving security threat. And that holds for mobile broadband, for IoT, for cloud, or for whatever technological disruptions are prevalent at the time.

I’ve never heard of a killer app, but I think the right structure and strategy is essential: professional security assessments to investigate where security holes exist, the right mix of security hardware and software deployments to prevent and detect security threats, and a mitigation system with rapid response automation. All three of those things help keep the balance between proactive and reactive security. Still, even that approach doesn’t work for everyone.

RW: I kind of asked that question knowing that the answer was going to be no, but I wanted to know anyway.

DS: Basically, security is the job of everyone — the users, the software, every single network element, every device on the network, everything.

This article was produced in partnership with Nokia.

The post Why data security is really everyone’s challenge today appeared first on ReadWrite.

5 key takeaways about entrepreneurship in large enterprises https://readwrite.com/five-rules-entrepreneurship-large-organizations-il1/ Fri, 14 Jul 2017 01:24:14 +0000 https://readwrite.com/?p=98838

The most successful large organizations never forget their startup roots. Instead, they find ways to preserve their entrepreneurial mindset, even in the face of growing scale and maturity. I’m honored to work for Comcast, which is such a company, surrounded by leaders who are driven by a collective passion to create innovative experiences that change customers’ relationship with technology.

As business leaders, we recognize that if we can create amazing customer experiences, then we will also build long-term value that grows the company. That growth is fueled by a healthy tension between optimizing existing, proven lines of business and creating exciting – but untested – new businesses.

While there are some common threads that connect entrepreneurs everywhere, being an effective entrepreneur within a mature organization requires a special set of skills and circumstances. Here are a few things I’ve learned from my experience:

#1: New business needs to be on the front burner

At its highest levels, a successful company needs to make new business creation a priority. It sounds obvious, but the stated goal of building new business must be followed with action and investment.

The first step is to identify an operating team whose full-time job is to develop new ideas and explore new market opportunities. That team should be held accountable for proposing a new portfolio of investments, and then be expected to lead any new businesses that are green-lit. They need to be focused on net new businesses rather than just line extensions and come up with solutions that leverage core corporate assets and provide material market advantage. In addition, they should be given the freedom to examine the data, assess technology trends, and gain user insights. In short, the operating team needs to be able to try and fail, and learn and then try again. Then, once one of those bets begins to pay off, it’s time to really lean in and invest.

#2: Balance grit with collaboration

Assembling the right operating team will always be a challenge. It’s not enough to identify talented individuals who are willing to zig when everybody else zags, so to speak. At the same time, an “us against the world” mentality may not work well in a large enterprise.

One should never understate the importance of grit, the will to make change happen, and the ability to forge partnerships both internally and externally. In founding a new stand-alone company, grit is critical, but it is especially needed to counteract the “this is the way it’s always been done here” mentality that creeps into established, more risk-averse enterprises.

Balancing sheer force of will with the ability to partner up is what separates the good from the great. The most successful teams that I have been lucky enough to be part of were not external hires but made up primarily of insiders. They had the courage to take risks and make change happen within a large organization.

#3: Freedom to fail (responsibly)

As this operating team executes on its new business charter, there are some basic protections and freedoms that need to be given. Independence from the core business priorities gives the team both flexibility and permission to challenge the status quo. The new team also must be liberated to explore creative solutions, challenge existing norms, fail responsibly (but quickly learn and adjust), and make independent decisions without the threat of penalty from the mother ship. This takes a lot of guts, trust, and leadership, but the payoff, in the end, can be huge.

#4: It takes time — and money

Rome wasn’t built in a day, and neither was Comcast. New business teams also need a sufficient amount of time, as well as financial independence, to turn their strategy into execution. As each quarter’s close approaches, it’s easy to prioritize core business lines at the expense of new business growth. But by giving your entrepreneurial teams a comfortable leash and financial independence, they’ll have the power and latitude to maintain a longer-term view of their new business regardless of other short-term factors. As long as they have full accountability for meeting their financial performance goals, then these new business groups should be given the resources and the time it can take for them to flourish.

#5: Full financial accountability

The biggest benefits of being an entrepreneur within a large enterprise are the assets you have access to. Large companies often rely on a shared services model. For example, some may have consolidated sales channels, marketing, customer care and support, and installation technicians. It’s easy to prop up a new business by allowing it to take advantage of these core services. However, in order to drive the proper operational behavior for this new business team, every expense, no matter how insignificant, needs to be accounted for. This helps create the “ownership mindset” that is aligned to the cultural norms of the already established business teams.

It may seem straightforward, but the day-to-day practice of the above, along with a willingness to invest for long-term growth, is surprisingly challenging. Nothing should ever get in the way of a good idea, no matter where it comes from. At Comcast, opportunities are always available to those who think differently about how to approach business. At the end of the day, we will stand behind those who see new opportunities, have the will to make change happen, and have the ability to partner internally and externally to bring our customers the best experiences.

Daniel Herscovici, the author of this article, is the SVP and GM of Xfinity Home. It was produced in partnership with Comcast.

The post 5 key takeaways about entrepreneurship in large enterprises appeared first on ReadWrite.

If data is the new oil, who is your refiner? https://readwrite.com/if-data-is-the-new-oil-who-is-your-refiner/ Tue, 13 Jun 2017 08:41:29 +0000 https://readwrite.com/?p=98533

For enterprise teams, data seems to be everywhere, waiting to be unlocked to drive your business goals forward. We sat down recently with two of Nokia’s leading IoT authorities — Marc Jadoul, IoT market development director, and Denny Lee, head of analytics strategy — to talk about how your firm’s data could be the oil that drives it forward.

ReadWrite: So this expression – “Data is the new oil” — is something I’ve heard bandied around at conferences and raised a few times. But the thing is, oil can be a fuel, and it can also be a lubricant. In your mind, and with your clients, what does that expression mean?

Marc Jadoul: The way I look at it, is from a value point of view. If you compare the price of a barrel of crude oil with the price of a barrel of jet fuel, there’s quite some difference. Data, like oil, can and must go through a similar refinement process.

The more it’s refined, the more value it can provide, because, like fuel, it will support more sophisticated applications. Another way to think about this is as a pyramid: at the bottom of the pyramid, you are basically collecting raw data at the sensor level. At the next stage, you start to monitor this data and discover what is in it. You’re probably going to uncover some anomalies or trends, and based on your analysis you may uncover critical information that helps you create value for the company by driving better decision making, so-called Data-Driven Decision Making (DDDM).

Then, if you do this decision making in a kind of learning phase based upon cognitive analytics, you’re not only going to help make decisions but also predict behavior. Once you can predict behavior, you have reached the most refined data, where the data is pure enough to be transformed into knowledge that helps your machines and applications make autonomous decisions.

What I have described is a value chain where data provides insight and knowledge to help companies make better decisions and ultimately automate some processes and decision making. I’m making the parallel with the oil industry, not as a metaphor for the lubricant function (laughs), but for the refinement process. The more you refine it, the more useful it becomes and the more value you retrieve.
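The refinement pyramid can be read as a pipeline: collect raw readings, monitor them, extract the anomalies, and turn those into a decision. A toy sketch, with made-up numbers, looks like this:

```python
import statistics

raw = [20.1, 20.3, 19.9, 20.2, 27.8, 20.0]   # "crude oil": raw sensor readings

# Stage 1: monitor - summarize what is in the data.
mean, stdev = statistics.mean(raw), statistics.stdev(raw)

# Stage 2: detect - anomalies are the refined product worth acting on.
anomalies = [x for x in raw if abs(x - mean) > 2 * stdev]

# Stage 3: decide - the most refined form is an action, not a number.
decision = "dispatch technician" if anomalies else "no action"

print(round(mean, 2), anomalies, decision)   # 21.38 [27.8] dispatch technician
```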

Denny Lee: When people use the new oil phrase I always think back to the 1970’s – when you control the oil, you control the economy. I think when one says “data is the new oil” it is rooted in this similarity. Data is the new oil also means that if you are able to take hold of that control, you can command that economy and your sector better.

When I hear that term, it also goes back to the idea that “data is the currency.” Data is quite raw in its form, and people often use this term quite loosely. Some might think that data, insight, and intelligence all refer to the same thing, but in fact we make quite a distinction between them. Ultimately, we advocate that data is the raw ingredient, and we want to process data into insights. Insights and intelligence are what the business needs. I’m sure we will talk later about how to utilize this intelligence for actionable business purposes.

RW: So when you sit down with a client to discuss how to get them to envision a data-driven innovation within their organization, what’s the first thing that they need to know, the first thing that they should ask?

MJ: I think the first thing they need to do is understand their own business and the challenges and problems they want to solve, rather than the contrary: trying to find a problem for their solution. Quoting Simon Sinek, one should start with the “why?” rather than the “how?” or the “what?” question.

DL: Business outcome is definitely one thing, but before that you have to ask whom you are speaking to in the organization. Each person will have a different organizational boundary or realm of responsibility, which will drive a different set of questions.

For example, if you are speaking to a CEO, his or her sandbox is huge. On the other hand, you could be talking to a siloed part of the organization whose universe is very narrowly defined. You need to understand their business context and their ultimate desired business outcome. You then work backwards and say, “OK, what kind of data do you really have?” and you try to connect the problem to a solution. Obviously, in the analytics context, it is about processing the data to the point where it can drive their business outcome.

Then eventually we should talk about crossing organization boundaries. This is a very important point that we should not miss. Sometimes the nuggets of intelligence come only by breaking down the barriers between organizations.

RW: You’ve said the CEO has a bigger sandbox to work in, but when I talk to other folks who are trying to implement a data-driven solution of some kind around IoT, the question of who the champion is within an organization often comes down to who really knows what the challenges are within that organization. Is there anything you can say about what a typical organizational champion looks like and how to orient those goals across the organization?

DL: Well, in the IoT context, the organization can often be divided into two realms. The Operations Technology (OT) side and the Information Technology (IT) side. On the OT side your solution could be targeted at the person that controls the infrastructure for his or her company. Depending on the person you are speaking to within that group, they will have different needs.

Let’s take the customer who is focused on predictive maintenance as an example. In this case, he or she may only have budget to focus on maintenance and use big data and machine learning to support the maintenance cycle and to minimize machine outages. This is a very narrow use case with a specific objective. But if you talk to their manager, the scope and context of the problem they are trying to solve are much broader and might cross organizational boundaries.

MJ: I really would like to complement this view with a look at a different part of the organization. Besides the leaders who need the analytics to make good decisions, I see the importance of the role of the data analyst emerging in a number of organizations. These experts know how to deal with the data — or, using the metaphor we used before, control the refinement process. We’re talking here about a different set of skills than the ones traditional IT people have. My educational background is computer science, and 20 years ago the basis of computer science education was mathematics. When I looked at the curriculum 5-10 years later, the emphasis had shifted towards algorithms and programming languages. Today, my son is doing his PhD in AI and, believe me, these students must have a very solid understanding of mathematics and statistics again. And let’s not forget that — as data scientists need to support enterprises’ business decisions — they also must have a good level of domain knowledge and business acumen.

RW: So it’s come full circle?

MJ: With most complex problems, you can’t just use raw computer data and number crunching to get something out of the data. You really need the domain knowledge to know what’s meaningful and what’s not. And these are the people who are making it happen in organizations, in a support role to the internal decision makers, as Denny described.

 


RW: We see a lot of IoT solutions pitched around the massive amount of data you have or could analyze. So, to a point, if you have that data knowledge in house, that’s great; but if you don’t, is there a risk of overwhelming a client by offering too many data options? Do they really need that talent in house?

MJ: It depends on the kind of solutions you want to build, of course, and on where you can do filtering and set thresholds on the data. For example, if you have a temperature sensor on a refrigeration installation, the only data you actually want to get hold of are the exceptions or anomalies, because if everything is normal there is no need to get overwhelmed by huge volumes of normal data. What is important is that you do intelligent data collection and try to filter, pre-analyze, and crunch the numbers as early as possible, starting the refinement process as close as possible to the device where the data is generated.
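A minimal sketch of that edge-side filtering for the refrigeration example: only readings outside the safe band ever leave the device. The threshold values are illustrative.

```python
SAFE_RANGE = (2.0, 8.0)   # degrees C for the hypothetical refrigeration unit

def filter_at_edge(readings, safe_range=SAFE_RANGE):
    """Send upstream only the exceptions; normal readings never leave the device."""
    low, high = safe_range
    return [(t, value) for t, value in readings if not (low <= value <= high)]

readings = [(0, 4.1), (1, 4.3), (2, 9.7), (3, 4.0), (4, 1.2)]
print(filter_at_edge(readings))   # [(2, 9.7), (4, 1.2)]
```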

DL: Let me share with you a view of our thinking; this is applicable to IoT as well. In short, the way we look at data intelligence is similar to a human brain. We are actually driving the notion of an intelligence stack. If you think about it in terms of your own brain, there are things that have a faster response time and are more autonomous. At this layer, you are processing the environment data, but with a narrow scope. Now let’s draw the similarity to IoT: things are happening on their own, and when something needs a feedback adjustment, it makes an autonomous, local decision.

In the next layer, there may be a moderate-response-time action, and it is somewhat autonomous. And then there’s the upper layer that we call augmented intelligence. It serves to help the human, because at the very top layer it’s still the human administrator, the human executive, making longer-term policy changes. That augmented layer is the top layer of the software, where it uncovers hidden insights so the human can make better, different, and longer-term adjustments.

So if you think of these different layers as part of a stack, even if you think about it in an IoT context, say at a factory level: the closer you are to the bottom we’re talking in terms of robotics where things are automatic. And as you go up, it’s more human; and software plays a greater role in terms of discovering insights in order for the human to make better judgements.

MJ: What’s interesting is that this is also reflected at the infrastructure level. You’ve probably heard of edge cloud or multi-access edge computing (MEC), where you actually do part of the data processing as close as possible to the source. And it is for two reasons: first, you want to reduce the latency in the network and the turn-around time for decision making; second, you don’t want to trombone all of these massive amounts of data through the core of your cloud. You only want your users and decision makers to deal with the really useful stuff. When I have to explain edge computing, I sometimes describe it as a reverse CDN (content delivery network).

Take a look at what we did years ago when video on demand and live streaming became popular. We were suddenly confronted with the problem that we might not have enough bandwidth to serve each user with an individual stream, and with possible latency issues. So we put caching servers closer to the end user, on which we would put the most popular content and could do some local content navigation and processing, such as fast-forwarding and rewinding, and content adaptation. That was downstream storage and compute resource optimization. And today we have a number of players on the internet, for example Akamai, who are making good money with such caching and optimization services.

Now, if you look at the Internet of Things, the problem is not the amount of downstream data, as in video; the challenge is the number of data sources and the volume of upstream data. You have a huge number of IoT devices generating a massive number of data records, so what you actually do is put in some kind of upstream caching service close to the source to collect the data, do some low-level analytics, and make sure you only send information that makes sense on to the cloud for further processing and further refinement, to use the oil industry metaphor once again. That is why I often call edge computing a kind of “reverse CDN”: it supplies the same kinds of functions but uses a different architecture and operates on flows in a different direction.
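The “reverse CDN” idea can be sketched the same way: instead of streaming every raw sample to the core cloud, an edge node collapses each window of samples into one summary record and sends only that upstream. The window size and fields below are illustrative.

```python
from statistics import mean

def summarize_window(samples, window=60):
    """Edge-side aggregation: collapse each minute of raw samples into one upstream record."""
    summaries = []
    for start in range(0, len(samples), window):
        chunk = samples[start:start + window]
        summaries.append({
            "t_start": start,
            "count": len(chunk),
            "min": min(chunk),
            "max": max(chunk),
            "avg": round(mean(chunk), 2),
        })
    return summaries

# 180 one-second samples become 3 upstream records instead of 180.
samples = [20 + (i % 7) * 0.1 for i in range(180)]
print(summarize_window(samples))
```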

RW: OK, so we’ve got someone who wants to invest in a project of whatever kind; typically there’s a cost saving or a new revenue stream in mind. But more often than not, the go/no-go decision seems to be driven by cost reduction or efficiency, which always has appeal in most organizations. Can you each give an example of a data-driven process that can unlock not only the cost savings but maybe the decision pathway as well?

MJ: I could start with what we are doing with our video analytics solution. This is an example of an application that uses massive volumes of data streamed by, for example, closed-circuit video surveillance cameras.

In cities you have hundreds or thousands of these cameras creating huge numbers of live video streams. Generally, there is not enough staff to look at all the screens simultaneously, because it would be extremely expensive and inefficient to have people watch all these video streams 24/7. So what Nokia’s solution does is analyze these videos and look for anomalies. There are plenty of use case examples: a car driving in the wrong direction, turmoil in an airport, people or objects making unusual movements. What we’re actually doing is collecting this video data and putting it through the refinement chain, processing it through a number of algorithms that recognize specific situations and detect anomalies. Adding AI capabilities, the system becomes self-learning and can identify, alert on, and predict any sort of “happening” that is out of the ordinary. This helps decision making, but at the same time it’s also an enormous cost saving, because cities and security firms need only a fraction of the people. Analytics technologies are actually making these kinds of video surveillance solutions possible and affordable.
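To show the flavor of “only surface the exceptions” for video, here is a deliberately tiny sketch: synthetic frames, a frame-difference measure of change, and a threshold. Production video analytics uses far more sophisticated models; this is only the filtering idea in miniature.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy stand-in for a camera feed: 100 tiny 8x8 grayscale frames of sensor noise,
# with a burst of motion injected around frame 60.
frames = rng.normal(0.0, 0.02, size=(100, 8, 8))
frames[60:63] += 0.8

def flag_unusual_frames(frames, threshold=0.1):
    """Flag frames whose mean absolute change from the previous frame is large."""
    diffs = np.abs(np.diff(frames, axis=0)).mean(axis=(1, 2))
    return [i + 1 for i, d in enumerate(diffs) if d > threshold]

print(flag_unusual_frames(frames))   # e.g. [60, 63]: only a couple of frames need attention
```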

RW: Right, human eyes are not very scalable.

MJ: Right, human eyes are not very scalable, and probably 99.99% of this CCTV video content doesn’t need attention. So you need to learn to filter the data as close as possible to the source and only continue working with what is relevant.

DL: So Trevor, I’ll also give you a few sets of examples. The first group is about accelerating resolution: predictive maintenance, “Next Best Action” under the realm of predictive care for recommending workflow actions to a care agent, and automated root cause analysis. These use cases were previously handled manually: you wait for faults to occur and then you look into them. With automation and prediction instead, a machine learning solution can predict a potential fault ahead of time, and you can minimize the expensive maintenance actions needed to fix the problem after the fact.
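For readers who want to see what a predictive-maintenance model can look like in code, here is a minimal sketch using scikit-learn on synthetic data. The feature names and failure rule are invented for the example; a real system would train on historical sensor readings and fault logs.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

# Features per machine-hour: [vibration, temperature, hours_since_service] (synthetic)
X = rng.normal(loc=[0.3, 60.0, 200.0], scale=[0.1, 5.0, 80.0], size=(500, 3))
# Toy label: machines shaking hard and overdue for service tend to fail.
y = ((X[:, 0] > 0.4) & (X[:, 2] > 250)).astype(int)

model = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)

# Probability of failure for a machine currently reporting these readings:
candidate = np.array([[0.45, 66.0, 300.0]])
print(model.predict_proba(candidate)[0, 1])   # schedule maintenance if this is high
```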

Another set of examples falls under the category of customer centricity with the use of artificial intelligence. Many customers are interested in this topic because, at the end of the day, they recognize that their competition is also trying to please end customers as best they can, and whoever does that best wins the day. So appreciating and understanding the customer experience, and being able to predict it and respond to customer needs, is an important aspect of a big data analytics solution. For example, in the context of networking solution providers and operators, knowing ahead of time that congestion is going to happen and reacting to it is important. In certain circumstances, having well-managed but degraded performance is better than having no service at all. Getting ahead of the problem with customer centricity is also a form of AI application: understanding the customer’s experience and then acting accordingly. The third set, I would say, is the augmented intelligence use cases, which appeal to the higher-level executives and policy owners of the IoT enterprise.

Another class of problems fits under the category of “optimization.” If you look at a set of business outcomes, you can set up the problem as an optimization problem: these are my sandboxes, here is my raw data and my KPIs, and these are the goals I want to optimize. The system can then be set up to optimize them. This is related to the point about breaking down organizational silos: you get the opportunity to optimize outcomes that were previously undiscoverable when the organizations were siloed. This type of intelligence appeals more to the executives and policy owners of the organizations.

This article was produced in partnership with Nokia. It is part of a series of articles in which the team from Nokia provides expert advice and delves further into data analytics, security, and IoT platforms.

The post If data is the new oil, who is your refiner? appeared first on ReadWrite.

What are the key drivers to successful enterprise IoT development? https://readwrite.com/what-are-the-key-drivers-to-successful-enterprise-iot-development/ Tue, 09 May 2017 02:52:40 +0000 https://readwrite.com/?p=98031

As IoT moves beyond pure data collection, how do companies stay on top of this market and pull the most value from it?

We sat down recently with three IoT authorities from Nokia: Khamis Abulgubein, IoT market development for automotive and transportation; Lee L’Esperance, IoT business modeling; and Jacques Vermeulen, Nokia global Smart City business development.

They outlined ten key considerations when looking to extract the maximum value from Enterprise IoT.

RW: We have discussed why companies aren’t getting value out of the IoT yet because we are in the “connecting things” phase. What are the additional drivers for unleashing the value of the IoT?

Lee L’Esperance:  Industrial IoT is really where things started first and we still seem to be in that phase of connecting siloed things and collecting data from them. However, we’re starting to see companies linking the silos horizontally, creating a horizontal ecosystem approach that brings partners, ideas, and new business opportunities together.

Combining information sets in this way will provide additional value.  Next, we should start seeing companies and government entities expand toward what we call Enterprise IoT.  In this phase, you will start to see new business models emerge through this horizontal approach to analyzing the data. Also, enterprises will look more towards collaboration to solve unique problems. Let’s take a city as an example. In order for IoT to take off in a city, a whole ecosystem is needed to help solve unique problems and explore new business models. 

At Nokia we’ve seen — through our ng Connect IoT community — some unique business models take off, such as connected bus shelters and 4K video streaming applications that would never have been uncovered without an ecosystem approach that includes multiple partners from various industries and disciplines. I haven’t observed other companies implementing such a broad ecosystem program.

Jacques Vermeulen: From a practical viewpoint, Nokia has seen success through a horizontal, secure, open, real-time and scalable solution for Smart Cities. You need to maximize value and functionality with a horizontal network approach.

These solutions are combined with vertical insertion points addressing specific use cases to support cities. For example, governments worldwide are starting to understand the need for horizontal approaches and are transitioning from brownfield, siloed Smart City deployments to horizontal network infrastructures built from scratch, more of a greenfield approach. To elaborate, every major city has a large shopping mall. These malls are loaded with features such as security and access control; building management systems to optimize heating, ventilation, and air conditioning resources; and location-based services that provide visitors with coupons and incentives based on their profiles from previous experiences in the mall.

And that is just a starting point.  Looking at the needs more broadly, you can discover very interesting use cases combining these elements.  Think about an emergency situation at a mall where there may be a need to deal with things like the people traffic within and coming to the mall.   

This would require warning solutions for people commuting, both by private and public means of transport.  These systems should then be interconnected with traffic control to prioritize the situation for the emergency coordinators coming to the site. They also could leverage security systems for real-time video and photographs of the situation at hand, and be proactively alerted to what is going on. A value added solution like this only comes to fruition if you look at a city in a holistic way and match the right IoT infrastructure to that. 

From my perspective, in an emergency situation, I would go even further.  I’d like for them to have the data to know my exact location and a portion of my medical records accessible in case I needed special care.  Once we get into the territory of e-health, this supports the argument for a horizontal approach even more because now we have a more meaningful solution combining data from different verticals versus taking a silo approach. 

Khamis Abulgubein: IoT suppliers need to work within a cohesive system, as Lee has described. The only way you will be successful is if you collaborate and build an ecosystem; you have to build for the whole customer experience. This approach lends itself more towards Enterprise IoT, where you will see business models offering a whole new customer experience and new services that would not have been possible in the past.

This is the notion of “servitization,” in which manufacturers provide a holistic experience to the customer by selling usage-based services rather than just engaging in a single transaction through the sale of a physical product. An example would be a washing machine sold bundled with usage monitoring and proactive maintenance.

Another example is the Connected Rental Car experience which is an interesting business model we developed with Hertz, SAP and other partners. On the surface, we are providing a premium service for business travelers. Along with that, we are also using the rental car as a passive payment platform for use cases such as parking garages, fuel pumps, and quick service restaurants.

This approach provides a transaction-fee revenue share into the service, which helps pay for the upgraded rental car personalization and connectivity service and platform.

RW: We are collecting a lot more data these days, but is that all that is needed for enterprises to become smarter?

LL: It’s not the data, it’s the analytics. Simply collecting data doesn’t make you smart. You can be collecting a lot of data, but if you don’t do the right things with it, there is no value to it. The data needs to become actionable information.

Analytics is the key to this, especially over time as analytics become better and artificial intelligence gets its entry into enterprises. For example, let’s look at an application where you are using sensors to collect data and measure weather conditions; having the weather data is only one part of the picture. 

Now, if you combine the weather data with other measured data and analytics, you can begin to predict things. That is how we get smarter. When you take that weather data even further and mix it with data from traffic sensors to predict how the weather will impact traffic, for example, the value of the information you can obtain gets you closer to the 36x the value of today’s internet quoted by Bell Labs in the article “Enterprise IoT – the best is yet to come.” This number becomes theoretically possible when you collect, analyze, and look across all of the data to unlock the value.

KA: I also went through this in my lab. It is all about getting smarter and thinking about different ways to use the data, but you need to read between the data lines for new opportunity. We installed temperature sensors in my data center to see if there were temperature fluctuations in my lab. After a week we didn’t really pay attention to it anymore — that is, until we had an air conditioner issue. The next thing I knew, I was getting proper alerts, and the system got smarter and sensed that there was going to be an air conditioner failure soon, in about an hour. I was able to call the technician prior to the failure actually happening and saved my servers from being impacted.
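The kind of early warning Khamis describes can come from something as simple as a trend check: fit a line to recent temperature readings and estimate when the critical threshold will be crossed. The numbers and threshold below are illustrative, not from his lab.

```python
import numpy as np

# Minutes elapsed and rack temperatures (deg C) -- illustrative numbers only.
minutes = np.array([0, 10, 20, 30, 40, 50])
temps   = np.array([22.0, 22.6, 23.1, 23.7, 24.2, 24.8])
ALERT_AT = 28.0   # temperature at which servers are considered at risk

slope, intercept = np.polyfit(minutes, temps, 1)   # simple linear trend
if slope > 0:
    minutes_to_alert = (ALERT_AT - temps[-1]) / slope
    print(f"Rising {slope:.3f} C/min; threshold crossed in ~{minutes_to_alert:.0f} minutes")
else:
    print("Temperature stable")
```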

Another example could be car tracking. Step 1: you can see where your car is. Step 2, towards a smart solution: I’m alerted when it isn’t at my house when it should be. Despite the endless possibilities, the challenge is the fragmentation in getting devices connected to a network.

JV: In fact, Nokia is tackling this. Our IoT platform can help make sense of different verticals and the fragmentation. In a smart city complex, there are not always engineers sitting at the buttons to make sense of the data on their dashboard. We are making complex data, data queries, and analytics algorithms as simple as possible, so that people do not have to have special knowledge in order to do something logical and useful with their data. The whole becomes more than the sum of the parts.

More and more natural language patterns are understood and translated into more complex technical queries, automating the process of making sense of the analytics and identifying opportunities.

RW: It sounds like this “servitization” would wring a lot of efficiency out of current product-customer relationships, and thus could be seen as a threat to manufacturers. How would you recommend they embrace this trend?

KA: I think there are a number of things contributing to that. As the price of components comes down, more is being put into products.  There are shorter product lifecycles and software is controlling a lot of things which requires continual updating.  Also, enterprises are looking for more flexibility. 

It’s better when we share; one major trend I’ve seen is that of not owning but sharing instead, or paying for use. An example of this is car sharing which is happening in many cities today — look at ZipCar, Turo, Enterprise Car Share, Hertz. 

Customers who only need a car from time to time can sign up for the service and pay for use on a daily/hourly basis – combine this with self-driving technology and this may significantly change the auto manufacturing and auto buying markets. 

LL: You need to think OpEx-ish. Enterprises seem to be heading towards OpEx versus CapEx models in many industries. Servitization enables that trend by providing end to end solutions and new business models like pay per use, goods sharing, or even risk sharing. This can represent an opportunity for manufacturers to expand into new services and develop consistent revenue streams.

RW: When considering data collection and analysis, AI integrations, bots, machine learning capabilities, are companies running into the challenge of whether their own end customers are ready for this?  If you are an enterprise, how do you best prepare clients for this?

KA: Use data to delight and not deluge — surprise customers and delight them at the same time. For example, take cold-temperature logistics, like transporting milk. Today the person delivering food might tell the grocery store that the milk made it on time and wasn’t spoiled. But if they can give more detailed information, such as that on Tuesday the temperature was a steady 40 degrees and on Wednesday it stayed very close to the requested range, ensuring the shelf life of the product, then that would be impressive.

Another example is from the connected rental car experience that I mentioned previously. Passive payment platforms that can communicate with gas pumps, parking garages, quick service restaurants etc. provides more convenience to customers, and rental car operators can actually attract loyal customers and bring in more revenue from these services over time. 

LL: You need to simplify and secure. Enterprises can show customers the value of their IoT services by keeping the process of using these services and the interaction with them simple and secure. Choosing the right platform that will allow a seamless experience for the end user is absolutely the key to this.

This article was produced in partnership with Nokia. It is part of a series of articles in which the team from Nokia provides expert advice and delves further into data analytics, security, and IoT platforms.

The post What are the key drivers to successful enterprise IoT development? appeared first on ReadWrite.

Does big data today keep the doctor away? https://readwrite.com/does-big-data-today-keep-doctor-away/ Tue, 02 May 2017 13:00:21 +0000 https://readwrite.com/?p=97860

There’s a small cadre of highly skilled big data professionals and doctors who are leveraging technology to help you live a longer, healthier life. Armed with mountains of government-funded genomic data sets along with mature and easily accessible analytics tools, these technicians and doctors are building apps, tools, and systems which can help you diagnose and treat illnesses ranging from common to catastrophic.

Leading that charge is Dexter Hadley, unique in that he is both an engineer and a doctor. Dexter runs the Hadley Lab – a big data laboratory at UCSF Health which develops tech to fight disease and promote health. The Hadley Lab has a mandate to derive value from the mountains of clinical data that UCSF continually generates. With a research background in genomics and clinical training in pathology, Dexter likes to quip that he uses big data to practice medicine.

We got a chance to ask Dexter about the innovations being born at the intersection of technology and medicine, and about how the democratization of technology is really impacting people’s lives.

So first off, people are probably wondering why and how you became both a doctor AND an engineer?

I have always wanted to be a doctor, but my trajectory changed dramatically when I taught myself to program computers at the age of 10. Since then, I have been obsessed with how to leverage computation to better facilitate medicine. That journey took me from an undergraduate education focused on computer programming to medical school at the University of Pennsylvania, where I earned a master’s degree in engineering, a Ph.D. in genomics, and an MD for good measure.

Through stints practicing medicine in an internship in general surgery at Penn, and then later residency in pathology at Stanford, I developed a passion as a physician/scientist to integrate medicine and software engineering in order to improve the delivery of healthcare for doctors and their patients.

So, what does the Hadley Lab do and how do you contribute?

The Hadley Laboratory leverages big data to improve the practice of medicine and the delivery of healthcare.  Our work generates, annotates, and ultimately reasons over large and diverse data stores to better characterize disease. We develop state-of-the-art data-driven models of clinical intelligence that drive clinical applications to more precisely screen, diagnose, and manage disease. We integrate multiple large data stores to identify novel biomarkers and potential therapeutics for disease.

The end point of our work is rapid proof-of-concept clinical trials in humans that translate into better patient outcomes and reduced morbidity and mortality across the disease spectrum. I’m an equal opportunity scientist. I care less about the best disease I can study and more about what disease I can study best – it’s all driven by the data.

And what would you say is the present, future, and ideal state of R&D in this area?

At present, I think we are experiencing a continued renaissance of medicine that started with the initial sequencing of the human genome well over a decade ago. Now, we are finally in a position to actually quantify human health and disease in “precision medicine,” a fundamentally different approach to healthcare research and its delivery where our focus is on identifying and correcting individual patient differences rather than making broader generalizations. 

While genomics allows us to quantify our molecular self, I think the future is in leveraging all the technology at our fingertips today to better quantify our physical self. As the power of genomics lies in its objective ability to correlate with physical manifestations in the patient, the ideal state of R&D must involve data collection and analysis at both the molecular genotypic level and the more clinical phenotypic level of the patient. 

For instance, in the context of a health system, my research integrates large clinical data stores with state-of-the-art big data algorithms, smartphones, web and mobile applications, etc. to first discover and then deliver precision medicine to patients.

Sounds like a big part of that future is genomics?

Genomics is indeed the future, except it’s clearly more complicated than we initially thought. Most doctors don’t sit around looking at their patients’ genomic data to develop treatment plans. However, some specialist doctors look at images all day long, such as radiologists and pathologists. We have technology and algorithms today that allow us to build ‘apps’ that can help these specialists.

For instance, we are working on a mobile medical app for doctors and their patients to use smartphones to better screen for skin cancer. However, while digital health apps on smartphones represent a convenient screen for skin cancer, the actual diagnosis and subsequent management of skin cancer remains within the genomics realm.
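
As a loose sketch of that screening idea (this is not the UCSF app), a smartphone photo could be scored by a small convolutional network. The two classes, the architecture, and the untrained weights below are placeholders; clinical-grade behavior comes from training on large, curated image sets.

```python
import torch
from torch import nn

# Placeholder two-class CNN (benign vs. suspicious); untrained weights, illustration only.
model = nn.Sequential(
    nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
    nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
    nn.Flatten(),
    nn.Linear(32 * 56 * 56, 2),
)
model.eval()

image = torch.rand(1, 3, 224, 224)  # stands in for a preprocessed phone photo
with torch.no_grad():
    probs = torch.softmax(model(image), dim=1)
print(f"suspicious probability: {probs[0, 1]:.2f}")
```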

So, diagnosis is where the need is right now?

The practice of medicine involves screening a general population and diagnosis of suspected cases before intervention on a specific patient. Much of precision medicine research has focused on diagnosis and intervention phases, with less focus on screening. My focus currently is using powerful big data algorithms for population screening of healthy individuals through digital apps. While “anybody” can build an app these days, not everybody has the knowledge, data, and access to the clinical infrastructure to develop clinical-grade algorithms for doctors and their patients.

How big of an impact is the “democratization of technology” having on this space?

About 6 years ago, Marc Andreessen penned a WSJ editorial that lays out the case for “Why Software Is Eating The World.” How does the average person shop today? Or bank? Or trade stocks? Or find a taxi? Mainly through innovative “apps” that we have come to depend on. I think that inevitably this phenomenon will percolate to our medical world, where we now have all the ingredients to do magical things with tech, meaning cheap computation, awesome algorithms, and tons of big data that we continue to generate at breakneck speed in clinical medicine.

For instance, at UCSF Health, we literally have billions of clinical records over almost a million patients that must hold the keys to practice better medicine. If you think about it, the average clinical trial to prove efficacy of an intervention is practically limited to the order of hundreds of patients because of time and monetary constraints. 

Therefore, our modern health systems allow for the largest clinical trials most appropriately powered for rapid discovery of novel medical interventions. I think that building clinical grade apps based on this big data allows us to immediately deliver the innovative discovery power of our health systems to the hands of physicians and their patients.

What would that involve, “building a clinical-grade app”?

Building the app is actually the least rigorous part of the process as the ‘clinical-grade’ performance comes from the algorithms that we develop that underlie the app interface. The magic of what we are doing lies in learning patterns from big data that we generate in healthcare. Deep learning is one such method that is a paradigm shift towards ‘cognitive computing’ where computers are essentially trained to think like humans. 

Deep learning on big data represents state-of-the-art machine learning today and repeatedly outperforms other more traditional methods. Data is the key piece of this process because these deep learning algorithms are incredibly complex. While much of statistics is based on linear models whose parameters can be accurately estimated with only a few data points, some of the most sophisticated deep learning algorithms have more parameters to estimate than there are atoms in the universe. 

Therefore, useful deep learning requires big data to accurately estimate parameters that are most predictive.
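
To make the scale argument concrete, here is a quick parameter count for a modest, arbitrarily sized fully connected network (PyTorch is used purely for illustration). Every one of those parameters has to be estimated from data, which is why small samples fall short.

```python
from torch import nn

# An arbitrary, modest fully connected network — nothing domain-specific about it.
model = nn.Sequential(
    nn.Linear(1024, 2048), nn.ReLU(),
    nn.Linear(2048, 2048), nn.ReLU(),
    nn.Linear(2048, 10),
)

n_params = sum(p.numel() for p in model.parameters())
print(f"trainable parameters: {n_params:,}")  # roughly 6.3 million for this small example
```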

Let’s say one of our readers is interested and wants to develop this app for you, what would you share with them to help get them started?

I would definitely encourage them to reach out directly to me through my website. I’m also a member of the Institute for Computational Health Sciences at UCSF, which is dedicated to advancing computational health sciences in research, practice, and education in support of Precision Medicine for all.

If any readers are interested in contributing to the project, you can reach Dexter at dexter.hadley@ucsf.edu.

So do you think big data today will keep the doctor away? If you liked this article, read more stories about data’s impact on the world at www.datamakespossible.com.

This article was produced in partnership with Western Digital.

The post Does big data today keep the doctor away? appeared first on ReadWrite.

Is inventory management clouding your judgment? https://readwrite.com/inventory-management-clouding-judgment/ Fri, 28 Apr 2017 16:54:37 +0000 https://readwrite.com/?p=97784

Imagine a world where running lean meant the digitization of goods while using smart systems to manufacture end product next to your consumers. Creating such a customer-centric model of production could change the way we view parts manufacturing in its entirety.

3D printing has crept into conversations regarding modern-day business and has become more cost effective. Technological advancements have increased both speed and quality control, and in turn have opened the doors to a game-changing IoT solution — one where businesses today, both small and large, can leverage the connected world to shift inventory control to the cloud.

“New age approaches to reducing overhead in an effort to free cash flow have accounted for several groundbreaking shifts in the corporate business world,” says Rosemary Turner, President at UPS. “Strategies like Digitizing Inventory enable corporations to serve every channel of commerce and handle Just-In-Time models at the same time.”

“Simply put, customer centricity drives business growth, and if you aren’t doing it today, you will soon,” she adds.

Inventory control is a capital sinkhole, one where your invested money sits idle when it otherwise could be spent on growing your business. The requisition of space and pick-and-pack operations require time and a team to run them, which again requires capital. So what would the world look like if you weren’t forced to keep a back stock of parts and products for returns and sales? It would look leaner and more refined, and allow for more spending on growth activities like digital marketing campaigns and more.

A great new opportunity

Sounds awesome right? That’s because it is, yet many businesses don’t even know this opportunity exists.

UPS, SAP, and Fast Radius have all partnered to identify slow-moving inventory, which typically requires injection molding with minimum-quantity runs in order to make production cost effective. Fast Radius, a 3D printing company out of the U.S., can assist in running low-quantity parts in proximity to your end user.

Think DiDi meets demand manufacturing… a systematic approach to connecting the closest printer, with spec-driven capabilities, to localized requests for goods. The obvious downside is quality control. However, 3D printing has evolved to the point where it can typically turn around product in 1-2 days, inclusive of QC requirements. With material limitations becoming a non-issue due to advancements in the printers themselves, there really isn’t a reason not to explore this model shift.

Rosemary Turner, President, UPS

“Advances in 3D printing have enabled all sorts of solutions to issues facing slow-moving inventory,” says Turner. “For example, printers today can actually print finished products with embedded circuitry and tech, so the printed part doesn’t just have to be the casing anymore.”

Large tech companies represent the early adopters of cloud-based inventory. Typically these companies are required to maintain service level agreements with their clients in order to handle repairs within a given timeframe. They leverage customer centric repair offerings and replacement parts as a method of selling more to high-end enterprise clientele.

“UPS today is worth over a petabyte of data. Imagine 400,000 file drawers, all accessible on demand,” Turner points out. “It is imperative that companies like UPS continue to embrace cloud-based solutions, as growth and market demand force you to do so. To not adopt is to risk becoming obsolete.”

The second subset of companies to embrace this change is startups and small businesses. The ability to produce end product at customer locations saves both time and money, which helps offset the slightly higher production costs. The adoption of this model upstream also provides potential investors with a clear cut execution model that is built around scalability and capital alleviation.

“No matter who you are, you have access to the connected world, and as a business owner it is your responsibility to leverage tools which enable efficiency and growth,” says Turner. “Yet it is more than that, the marketplace today demands customer centricity, and it demands our undivided attention to detail on our engagements with our consumers every day. Digitization strategies are only the beginning.”

“The future of Just in Time manufacturing with zero-quantity thresholds and no setup fees is fast approaching, and we here at UPS are poised to enable our customers to capitalize on these opportunities,” she adds.

Come join us

Ultimately, the connected world drives customer centricity. Competition drives all of us to think differently. Here at UPS we are looking for innovators to continue to move industries into the future. We welcome you to take on the challenge of “Simplifying Logistics” by participating in our upcoming Hackathon.

To kick things off, we are partnering with ReadWrite for an official launch party on May 16th. Join us for this exclusive networking opportunity with peers and industry experts to make connections, learn, network, and work together to increase the speed, efficiency and performance of the supply chain to improve the customer experience.  RSVP here.

This article was produced in partnership with UPS.

The post Is inventory management clouding your judgment? appeared first on ReadWrite.

For your big data, sometimes there’s no place like home https://readwrite.com/big-data-sometimes-theres-no-place-like-home/ Wed, 26 Apr 2017 03:13:21 +0000 https://readwrite.com/?p=94969

As the IoT becomes more widespread, companies are coming to the realization that although IoT stands for “Internet of Things,” the reality is that these solutions are less about the things and more about using the data generated from these things.

As the volume of data that these solutions deliver grows, it is challenging traditional ways of reporting on and investigating this data; those methods were never designed to show users what is in the data, but rather to enable self-guided exploration of it.

The problem is even more complex in industrial IoT (IIoT), where billions of data points can be generated every second from manufacturing systems and factory floors. This really presents an opportunity — not just to have data, but to generate actionable intelligence and deep insights from this dataflow.

This is where machine learning has started to make its presence known. But with all this data and all these new analytical tools, the new question for companies deploying IoT solutions becomes “where do these analytics now live?”

Here’s the truth: data no longer lives as a static object. The days of fixed data existing at rest have been replaced with time series-driven data streams that change state and are constantly in motion.

Think of the “Three Vs”

Think of data today across three attributes — Volume, Velocity, and Variety. With these in mind, we can start to look at where it is most appropriate for certain data to be ingested, processed, and delivered to take action on this data.

The first place where we see this new approach to data is at the edge with the emergence of edge analytics. For many applications, driving data all the way back to the cloud to be aggregated is neither timely, inexpensive, nor secure. Being able to turn your data around at the edge or on-premises allows for more efficient deployment of solutions to monitor data streams in real time for patterns and anomalies. These can then drive more intelligent business solutions that include such things as automated predictions, efficiency ratings, and time-to-failure analysis.
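
A rough sketch of what on-device stream monitoring can look like, assuming nothing fancier than a rolling window and a z-score threshold; the simulated vibration feed and the threshold value are illustrative.

```python
from collections import deque
import statistics

WINDOW = 50       # rolling window of recent readings kept on the device
THRESHOLD = 3.0   # flag readings more than 3 standard deviations from the rolling mean

window = deque(maxlen=WINDOW)

def check_reading(value: float) -> bool:
    """Return True if this reading looks anomalous relative to recent history."""
    anomalous = False
    if len(window) >= 10:  # wait for a minimal history before scoring
        mean = statistics.mean(window)
        stdev = statistics.pstdev(window) or 1e-9
        anomalous = abs(value - mean) / stdev > THRESHOLD
    window.append(value)
    return anomalous

# Simulated vibration readings with one spike; in practice this loop reads a real sensor.
feed = [1.0, 1.1, 0.9, 1.05, 0.95] * 10 + [4.5] + [1.0, 1.1, 0.9] * 3
for i, reading in enumerate(feed):
    if check_reading(reading):
        print(f"anomaly at sample {i}: {reading}")
```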

This reality is the reason behind ThingWorx Analytics, which gives IIoT solution builders key capabilities to tackle the volume, velocity, and variety of industrial data. It positions companies so they can leverage their data and build incremental business models without having to staff up dramatically to support this. Although the idea of data is not new, the role of a data scientist within a company is changing, and quite dramatically as well.

The risk of not getting the edge right

If you poorly deploy your technologies at the edge, it can turn your data scientists into what’s effectively an expensive professional services organization, turning data modeling into a laborious process driven by human performance rather than the timeliness of your data.

But in business, the right data delivered at the wrong time is still a suboptimal outcome.

The answer to this is a suite of products that can help your team quickly use modeling tools, then take those models and build automation around them to keep your data moving at the same speed as your business.

ThingWorx Analytics does this in several steps, designed to automate tasks for data scientists that were previously manual and time-consuming. This accelerates their ability to construct and deploy automated advanced analytical capabilities within solutions. The key to this is context and understanding where your data exists when applied to your internal business processes and use cases.

Without this contextual information, you cannot find the actionable data you need to make informed, proactive decisions. If there is a gap between data and action in IIoT, what bridges that gap is context. This is where we can finally have that meaningful discussion about the use of machine learning within industrial applications.
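
ThingWorx’s internals aren’t reproduced here, but the general pattern described above — automating what a data scientist would otherwise do by hand, such as trying model configurations and keeping the best one — can be sketched with off-the-shelf tooling. The feature set, target, and parameter grid below are invented.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import GridSearchCV

# Hypothetical sensor features (temperature, vibration, load) and a time-to-failure target.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 3))
y = 100 - 5 * X[:, 1] + rng.normal(scale=2.0, size=500)  # horizon driven mostly by vibration

# Automated search over model configurations, replacing manual trial and error.
search = GridSearchCV(
    RandomForestRegressor(random_state=0),
    param_grid={"n_estimators": [50, 100], "max_depth": [3, 6, None]},
    cv=3,
)
search.fit(X, y)
print("best configuration:", search.best_params_)
```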

It’s important to point out that IIoT analytics look very different than previous generations of business intelligence (BI) tools. Most of the previous BI tools are still more business-assistive, providing better tools for a human to traverse data sets and build dashboards. They rely on humans to discover insights within the data. Machine learning can assist in the investigation of data often resulting in more insights from deeper and wider data sets.

New tools allow for pattern watching

The emergence of these autonomous learning technologies calls for an approach of its own. With autonomous learning you can track data streams in real time to watch for patterns and anomalies. You can move past basic analysis to drive predictions and optimizations. And you can alter business or operational processes in real time to maximize a benefit or minimize risk.

In reality, even teams of humans can never deliver what machine learning can in terms of new models. And as a result, BI is trying to be used in places it wasn’t really developed for. In certain operational areas of a business, there is value still for traditional BI solutions, but they are not built for IoT at their core. And one of the advantages that define solutions such as ThingWorx Analytics is years of IP development in machine learning technologies.

If it’s done right, machine learning can improve the impact of your data scientists, decrease the time to market for new analytics, and increase the availability of those analytics to more internal teams beyond just the data team.

This represents a unique opportunity for companies to benefit from their data quickly and in a cost-effective way. But in order to do so, efficiency needs to improve. And it turns out that even with new technologies like IoT, BI, and the cloud, when it comes to data sometimes there’s no place like home.

This article was produced in partnership with PTC. Learn more about how the Thingworx Analytics platforms works and receive important updates.

The post For your big data, sometimes there’s no place like home appeared first on ReadWrite.

IoT Analytics report: The journey towards successful IoT solutions https://readwrite.com/journey-towards-successful-iot-solutions/ Tue, 25 Apr 2017 02:57:14 +0000 https://readwrite.com/?p=97594 IoT solutions

The IoT Analytics team recently published an infographic highlighting the 5-step path towards successful implementation of IoT solutions. The insights are based on more than 30 expert interviews in IoT as well as recent research.

For more information, you may refer to the Guide to IoT Solution Development.

In this 31-page guide, you will find:

  • A benchmark of eight major IoT vendors along 15 components of an IoT solution
  • Key learnings from current IoT projects
  • Three deep dives on crucial IoT aspects like security, interconnectivity, and manageability

You can download the infographic here.

 


This post was produced in partnership with IoT Analytics.

The post IoT Analytics report: The journey towards successful IoT solutions appeared first on ReadWrite.

With enterprise IoT, the best is yet to come https://readwrite.com/enterprise-iot-the-best-is-yet-to-come/ Sat, 15 Apr 2017 00:04:08 +0000 https://readwrite.com/?p=97294

A lot has been said about the Internet of Things (IoT) — a broad development in various technologies across industries that is fundamentally changing the innovation cycle everywhere —  but how much is real?

“One of the things that we should grasp about the IoT is that we are currently in that stage when technology gets incredibly hyped,” says Jason Collins, Vice President of IoT Marketing at Nokia.

Continuing further, he describes the hype by recalling the early days of the internet, when static pages and hyperlinks did not yet ignite its full potential, and the crazy boom and predictable bust of web-based businesses came and went. Still, the world was left with the valuable piece of network and services infrastructure we now call the Internet.

“So the internet turned out to be kind of a big deal,” says Collins half-jokingly, before getting serious about how we can size the Internet of Things.

Keeping that outcome in mind, what is the potential size of the Internet of Things and how do we value it?

Valuing the Internet is tough.  Today we are going online with our computers and smartphones and connecting billions of nodes; however, the value will extend well beyond that. 

“Members of our Bell Labs team analyzed this and determined that it will be 36x the value of today’s Internet,” he says. “That potential value of the IoT is dependent upon the number of devices connected and users’ perceived and experienced value of IoT devices and applications.”

If you think about that potential, you quickly recognize that we’re in the very early stages of how this connected technology can change the very fundamentals of digital transformation and business growth in the next decades.

How can enterprises leverage this growth opportunity?

Prior to the dawn of this new machine-type (M2M) connectivity, there were two main drivers of business — developing products and services and the sales of those products and services.

But this approach is now getting a major upgrade thanks to IoT technology. Key to this pivotal transformation is the data being produced in torrents by the connected devices that are expanding rapidly across businesses.

“But while this new connected world seems to be allowing enterprises and their customers alike to benefit from a huge pool of data, it’s not as simple as that,” says Marc Jadoul, Market Development Director in IoT at Nokia.

Perhaps it’s best to think of this in terms of “analog to digital” — machines and networks that learn about their own behavior by gathering data and analyzing how to use it. He explained further that we should think of the IoT not merely as an environment of communicating things but as a “connective tissue” or a “global nervous system” that provides context and, why not, meaning. This is the first step towards getting value out of the IoT.

Building upon that, the IoT then provides a “platform to solve problems” like the Internet once did via search and discovery. “Platforms like Google not only gave us access to the information but provided context,” says Jadoul. “In that same sense, Uber has provided a disruptive model for public transport and Airbnb a new platform for guest housing.  They use connectivity and data to transform business models today and, eventually, you will see the IoT becoming an innovation platform in many other areas, like connected cars, digital healthcare, or smart homes.”  The possibilities are endless because big data and new services will be driving the growth. 

Wireless sensor networks are evolving into analytics-enabled applications, making IoT into a “bigger and richer experience than the current M2M,” says Jadoul.

However, digital transformation must go beyond the platform, the data, and the (still too often) siloed applications. It requires a shift in the culture and mindset of organizations in order to generate significant benefit from this technology.

Who’s leading the growth within an enterprise?

New innovation in M2M often came from internally focused and internally driven cost-savings and process-optimization efforts, a.k.a. command and control. This is what we often call the Industrial IoT, or Industry 4.0.

While the early days of the IIoT were focused on these drivers, a new emerging Enterprise IoT approach will enable greater growth through product and service innovation, and yet-unseen business models.  With that in mind, it isn’t surprising that the early enthusiasts of this new technology are not only on the traditional IT side of the corporate “houses,” but also in their product management ranks, the people who face the customers and are looking for portfolio innovation, an enhanced customer experience, and of course new revenue opportunities.

“The sooner that companies start seeing IoT as a catalyst for growth rather than a way for the IT guys to trim costs, the faster IoT will get off the ground in enterprises,” says Jadoul.

Where is IoT headed?

As connected technology matures and a shift in mindset occurs, IoT will create new value for its stakeholders.

“Companies have to start looking at solving business problems and extend their thinking beyond vertical, point applications,” says Lee L’Esperance, Business Modeling Principal at Nokia. “If they remain strictly verticalized, it’s siloed and the value is limited.” But seeing the benefit across traditional business groups, products and services will unlock true value, he adds.

IoT can be very impactful to business, but it needs to be architected to create a connective tissue rather than establishing point-to-point links. Architecting an IoT solution within a business context is about getting the business models right – and finding the sweet spots for creating value, growth, and ROI. We will explore developing business models in the next article and how you can create new value opportunities for your stakeholders.

This article was produced in partnership with Nokia.

The post With enterprise IoT, the best is yet to come appeared first on ReadWrite.

Looking along the horizon for the “smart” sea change in IoT https://readwrite.com/the-smart-sea-change-in-iot-cl4/ Wed, 12 Apr 2017 23:08:11 +0000 https://readwrite.com/?p=97192

We’ve all been inundated with the hype surrounding Internet of Things “smart stuff” and the impending arrival of our robot overlords, so we tend to minimize the mind-blowing wonder of the responsive and intelligent computing metamorphosis that is upon us.

For years, the IoT community has been saying that if we really want “things” to be of value, they cannot be dumb. The first wave was getting everything connected, and we have made headway there. The next step is to actually make “things” smarter. 

There are a variety of commercial solutions that do not really deliver on the promise of automating our way to more productive lives. And the concerns over properly securing our connected things still weigh heavily. But there really have been transformative leaps in computing capability and achievable functionality. The killer use case for IoT is on the horizon, but before defining what that is and describing how it is going to manifest, I think it’s important to broadly identify how we got here.

The “Trinity”

The impact of the open source movement in driving exponential leaps in technological advancement cannot be minimized. The algorithms and computing infrastructure that drive “smart” things — IoT, Artificial Intelligence, and machine learning capabilities — have been around for decades. Anyone at the NSA can tell you as much.

The difference now is in accessibility to the masses. These technologies were once jealously guarded, closed off from the wider world, and only available within formidable institutions possessing vast resources in both personnel and compute power. Open source changed all that. New things no longer have to be constructed from ground zero, thus supercharging the innovation cycle. The widespread access to knowledge bases and software allows anyone so inclined to build upon the shoulders of giants and leverage the wisdom of crowds.

The creative explosion fueled by open source helped give rise to the cloud, which is the second movement responsible for ushering in our new era of computing. Freed from the physical limitations and expense of individual server stacks and on-premise storage, the “app for everything” age dawned and the capacity for on-demand collection and consumption of big data was unleashed. Once we could scale compute power unconstrained by geography, our technology became mobile and the dream of smaller and increasingly powerful devices trafficking in colossal quantities of information became a reality.

Big data gives lifeblood to modern computing. But data does not do anything and, in itself, has no value. This brings us to the third movement in the “smart” revolution: analytics. The types of augmented computing that people encounter in everyday life now — voice recognition, image recognition, self-driving and driver-assisting cars — are founded in concepts that rose out of analytics and the pursuit of predictive analytics models, which was all the rage just a few short years ago.

The disheartening realization with predictive analytics was that, to train effective models, you need both massive amounts of data and scores of data scientists to continually build and maintain and improve data models. We were once again running up against the roadblocks of access and resource constraint.

And so we arrive at the present, where things are shifting in a new direction. The difference now is that we do not need to recruit an army of data scientists to build models; we have taught our programs to remove some of those roadblocks for themselves.

Inherent intelligence

Our AI-driven systems, especially Deep Learning systems, can now be fed millions upon millions of training sets, train in days/hours, and continuously re-train as more data becomes available. Open source tools and cloud computing are still important and evolving, and we still traffic in loads of data to perform lightning-fast analysis, but our programs now incorporate AI as the engine to make themselves “smarter.”
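
A small sketch of that “re-train as more data becomes available” loop, using a deliberately simple incremental learner (not a deep network) just to show the shape of the workflow; the batches here are synthetic.

```python
import numpy as np
from sklearn.linear_model import SGDClassifier

rng = np.random.default_rng(1)
model = SGDClassifier()
classes = np.array([0, 1])

# Each iteration stands in for a fresh batch of labelled data arriving from the field.
for batch in range(5):
    X = rng.normal(size=(200, 4))
    y = (X[:, 0] + X[:, 1] > 0).astype(int)
    model.partial_fit(X, y, classes=classes)  # update the existing model, don't start over
    print(f"batch {batch}: accuracy on this batch = {model.score(X, y):.2f}")
```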

Expertise from vastly different computing realms has congealed to imbue programs with previously unimagined capabilities. The paradox is that as the cloud becomes ever more powerful and less expensive, the smart IoT strategy is to move much of the first line of entry processing away from the cloud and to the edge. This serves two purposes: to enable on-device decisions without needing cloud intervention and to deliver edge patterns and analytics to the cloud for fast second-stage analytics. Tiny AI engines can now perform analysis in near real time on edge devices and “things” no larger than a matchbook. And as these points of computational power grow increasingly commonplace in ordinary objects — intelligent routers and gateways, autonomous vehicles, real-time medical monitoring devices — their potential functionality expands exponentially.

John Crupi, Vice President and Engineering System Architect, Greenwave Systems

Artificial intelligence at the edge

In the early days of IoT (aka M2M), the focus was on getting data up to the cloud when possible. FTPing log files every night was the rage. When General Electric came on the scene with the “industrial internet,” everyone began talking about real-time live data connectivity. That was a big jump from FTP, but people treated edge devices as simply “things” that transferred data to the cloud for analytics. We are now in the midst of an exponential reverse fan out of that thinking. Real-time requirements are redefining the paradigm. The cloud is now shifting into the role of IoT support and second-tier analytics, and the processing is getting pushed out to the edge.

For example, we have been working with a company developing a next-generation medical monitoring device. Initially, we assumed that with such a small device, we would send raw data from the device to the cloud for analysis. But that is not what was desired, nor is it what transpired. The company wanted the analytics on the monitor. They wanted the analytics and pattern detection to occur directly on the device, to take actions on the device, and for only “intelligent” (as opposed to raw) data to be sent up to the cloud. The model differed dramatically from standard industrial M2M operations — where everything would be connected, and batches of data coming in from all sources would be collected and processed on some set timeline at some central repository.

The whole purpose of connecting now is to obtain instantaneous precision results at the point of entry for immediate answers. Even the low latency involved in “traditional” cloud-processing with hundreds of thousands if not millions and billions of devices is not as efficient for real-time edge analytics as using this new architecture. In some cases, you can achieve a data reduction of 1,000x by just sending the analytics and patterns vs. raw data to the cloud. 
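
One way to picture that reduction: assume the device collapses each window of raw samples into a single compact record before uplink. The window size and the fields kept are arbitrary choices for this sketch.

```python
import statistics

def summarize(window):
    """Collapse a window of raw samples into one compact record for the cloud."""
    return {
        "n": len(window),
        "mean": statistics.mean(window),
        "min": min(window),
        "max": max(window),
        "stdev": statistics.pstdev(window),
    }

raw = [20.0 + 0.01 * i for i in range(1000)]  # 1,000 raw samples from one sensor window
record = summarize(raw)

print(record)
print(f"reduction: {len(raw)} samples -> {len(record)} fields "
      f"(~{len(raw) // len(record)}x fewer values sent upstream)")
```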

We no longer deal in dumb collection devices; we need them to do more than just curate. They must be artificially (and naturally) intelligent — capable of doing pattern recognition and analytics in their tiny engines. They push those results up to the cloud for other uses. As this ideal proliferates, so, too, do the possible applications.

As is perfectly embodied in the example of an autonomous car, this dual edge/cloud analytics model produces precision, real-time results that can be continually and automatically refined against ever-growing troves of more data, thus producing valuable, useable information and powering productive action. Even a year ago, I would have called B.S. on this notion for widespread IoT and AI integration — but edge computing and AI have really broken out of the lab and into our world. It will yield outcomes we have never seen before.

The killer use cases for IoT are manifesting through truly intelligent edge devices — in solutions that are purpose-built for specific problems or tasks, then interconnected and subjected to patterns that move beyond their initial application. As more and more smarter, AI-enabled “things” are incorporated into our everyday lives and operate at the edges of our inter-communicating networks, we will see things moving beyond merely being connected and into actively embodying intelligence. Smart stuff indeed.

This article is produced in partnership with Greenwave Systems.

The author is Vice President and Engineering System Architect at Greenwave Systems, where he guides development on the edge-based visual analytics and real-time pattern discovery environment AXON Predict. He has over 25 years of experience executing enterprise systems and advanced visual analytics solutions.

The post Looking along the horizon for the “smart” sea change in IoT appeared first on ReadWrite.

Not all that glitters in IoT is gold… but, you can make money with the right ideas https://readwrite.com/not-all-that-glitters-in-iot-is-gold-but-you-can-make-money-with-the-right-ideas/ Tue, 28 Mar 2017 00:43:59 +0000 https://readwrite.com/?p=96682

Monetization, it’s a funny word– filled with a lot of meaning and adopted as a jargon term of late, especially when applied to the tech world. This is particularly true for the Internet of Things (IoT) and, jargon or not, it won’t become a reality unless innovators find ways to make money with it. 

Most IoT initiatives are new ideas. The very concept is immature. If asked, different people will likely provide you a different definition of what the IoT is – from fitness trackers, to smartphone interactions at your favorite store, to industrial process monitoring.

It’s a lot, and it can make us a little more susceptible to chasing hype. It’s understandable, but folks making IoT devices, systems, and applications must look past the nifty things and focus on the pieces that can practically revolutionize how we interact with the world; things like industrial production and supply chain management.

That just leaves the question, “How does one make money in this new world of smart, connected devices?”

Let’s take a look at some realities of the IoT landscape and see what each means for those looking to capitalize on the potential.

It’s a new world for us all, sorta

The common theme for the different facets of IoT is the addition of smart, connected devices. This doesn’t mean the additions are limited to the most novel of new product ideas. There’s a lot of success to be found in adapting existing products for the IoT. This is especially an advantage with proven product lines perfected over years of use.

Here, the customer base and their market problems are well understood. Sales channels both exist and are probably pointed in the right direction. The foundation is already present; it’s just a matter of adding the right IoT functionality.

As an example, Schindler Elevator found great value in simply adding connected sensors to their elevators. The Schindler Digital program provides real-time data for field maintenance teams reducing repair costs and downtime. As an added benefit, it gives customers visibility to the status of their elevators. This was all accomplished without altering strategies like target customer or sales channel.  (Reference: https://www.t-systems.com/de/en/references/use-cases/use-case/schindler-internet-of-things-239450 )

In a nutshell:  Existing product lines have existing customers and sales channels. Don’t overlook using a smart-connected approach to make something good, better.

One word of caution: make sure the new add-on or refresh doesn’t alter the product or service so much that you must completely overhaul sales channels, support staffs, and infrastructure unless you are ready for these changes. That brings us to our next point…

New worlds can bring new challenges (and opportunities)

Fresh product scopes and game-changing new ideas often bring associated tasks, services, and add-ons that were not relevant or needed before. It can be daunting as these require new worker skills and infrastructure pieces to succeed. However, this transition also presents a great opportunity to take in new customers and generate new revenue streams from existing customers.

These new opportunities might include:

  • Installation and/or commissioning services
  • System integration consulting
  • Maintenance contracts – physical assets and software systems
  • Monitoring services
  • Service or data subscriptions
  • Automated secondary services

Most end customers don’t want to deal with the challenges of these new necessities but will find value in the insight and new functionality that connected products provide. They’re happy to pay product manufacturers to fulfill their needs from start to finish. Vendors can give their customers additional items of real value by providing more than just the physical product.

For example, adding a wireless supply-chain monitoring system to a manufacturing facility provides the opportunity to sell the product along with installation and/or commissioning plus ongoing system maintenance services.  This can be in addition to a subscription model for access to the data, or advanced amenities such as offering the automated delivery of key supplies before they run out.
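
The “automated delivery of key supplies before they run out” piece can be as simple as a days-of-supply rule evaluated against the monitoring data. A minimal sketch, with made-up stock readings, lead time, and safety stock:

```python
# Hypothetical daily stock readings for one consumable, oldest first.
stock_history = [480, 455, 431, 404, 380, 352, 330]  # units on hand
LEAD_TIME_DAYS = 5   # supplier delivery time (illustrative)
SAFETY_STOCK = 50    # buffer we never want to dip below (illustrative)

# Estimate daily consumption from the observed drawdown.
daily_use = (stock_history[0] - stock_history[-1]) / (len(stock_history) - 1)
days_remaining = (stock_history[-1] - SAFETY_STOCK) / daily_use

if days_remaining <= LEAD_TIME_DAYS:
    print(f"reorder now: ~{days_remaining:.1f} days of usable stock left")
else:
    print(f"ok: ~{days_remaining:.1f} days of usable stock left")
```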

In a nutshell:  Don’t overlook the need for new infrastructure to execute your plan as a product vendor. Missing pieces can kill an otherwise great product or, when filled, enable revenue. These new pieces can be something you either DIY or you partner up to provide. 

Jonathan Heath, Sr. Product Marketing Manager, Synapse Wireless

It takes a village… of many different platforms to develop applications

IoT adoption has been slower than many initially thought. One cause is the number of new elements that product makers have to add to their systems to create truly useful, end-to-end products. It’s not easy, and it means dealing with things they’ve had little experience doing before – cloud apps, data aggregation schemes, remote device management, and graphical interfaces.

IoT applications are complex enough that few can, or should, endeavor to create everything – from the thing to the cloud. Many developers are avoiding these challenges by using an ecosystem of partnerships that leverages expertise in specific pieces of the IoT value chain.

For example, a leader in rodent control recently released an automated, wireless system. During development, they realized their expertise did not lie in understanding the complexities involved in an entire solution; instead, they utilized tools and solutions from 3rd-party vendors. The SNAP® Things Platform by Synapse was used as a means to network and program custom edge functionality, while the Exosite Cloud Platform provided the tools for data aggregation, background admin, and user interfaces. If you’d like to learn more about this application, you can do so here: (https://exosite.com/casestudies/victor)

In a nutshell:  Don’t get stuck trying to do what someone else might already do much better. Focus on what makes your offering unique and differentiated. This gets you to market faster and allows you to evolve quicker. You have the joint innovation of all your partners – keenly focused on their core competencies.

Dream on! (But look for true value)

It’s ok to dream big with new technologies. That’s how we’ve gotten revolutionary jumps in innovation. (Cheers to you Philo Farnsworth, and not just for the hair.)

New tech gives the ability to do wild and crazy things once unimaginable. Many choose to jump at creating something truly unique in an attempt to make the next iPod or Nest thermostat. However, for each of these successful innovations there are hundreds, if not thousands, that fail. In many cases, the idea rocked – I’m looking at you, Apple Newton.  Successful products don’t have to be the glitziest, they just have to solve a customer’s real problems. When we start talking about the IoT, it usually comes down to the power of the data.

But, what data? Just because you can monitor something doesn’t mean you should, or that monitoring it constantly is best. You have to look for what provides valuable data for your customers. 

Let’s quickly look at ways solutions can provide real value beyond the most common goals of reducing costs, enabling a new task or lowering energy consumption:

  • Reduce Risk
    • Safety costs: IoT devices can provide a lot of help in not only collecting data about usage and movement of people, but wireless systems can actively participate in preventing dangerous situations.
    • Reduce downtime of key equipment: Ensure the right people know when a problem occurs, or when it might occur, and then assist in making the repair process go smoothly (diagnostics, automated parts replenishment, etc.).
    • Adherence to governmental or industry regulations: New policies and standards are pushing technology into new markets. IoT devices can form practical, cost saving means of adhering to regulations.
  • New Revenue Streams
    • Monitoring services: Either those that make the product or those that sell the product can offer services to provide context and actions.
    • Maintenance and Network management services: Offer services to watch over the network and fix issues as they occur or provide peace of mind that equipment will always be in working order
    • Add-on products and accessories: Even without direct integration, companies can offer additional solutions to start monitoring existing, deployed products.
  • Build partner ecosystem
    • Just like Voltron, combining powers can lead to awesomeness.
    • The offerings of two companies can be united to create one valuable offering in the eyes of the customer.
    • Example — Smart Cities: As streetlights and infrastructure pieces are installed, other vendors will be able to add their point-solutions to the mix. However, it goes without saying (even if it’s printed here), the key to success is both parties receiving a portion of the earnings or the final combo boosting each other’s sales accordingly.
  • Sticky products and brand loyalty
    • You can deepen a customer relationship by adding the ability to access or control unconnected things customers deal with every day.
    • Provide a compelling reason to use your standard product more frequently.
    • Example: Selling an equipment monitoring device and service when you own the repair company (this can be a big scale too).
  • Utilizing the data to expand existing business models
    • For you as the product maker:
      • Know thine customer! It gives a great advantage to know how to access markets and how to provide impactful marketing.
      • The data can give insight into developing the next generation along with accessory/complementary products and services.
    • For partners:
      • 3rd party partners might find a different use for the data you collect.
  • Don’t overlook controlling things, too
    • IoT is not just about data collection. There can be a lot of value in having IoT devices take actions as well. This can happen node-to-node, gateway-to-node, or cloud-to-node — the concept of distributed intelligence in action.

In a nutshell:  Just ‘cause you can, doesn’t mean you should. Take care to implement the things that bring real value to you and the customer. And don’t just think about monitoring things; the value might be in what that data allows you to control (for example, feedback from one sensor to shut down another machine). Bottom line, the initiative (product and/or service) doesn’t have to be revolutionary to make a big difference and thus make money.
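
In its simplest form, the parenthetical example above — feedback from one sensor shutting down another machine — might look like the sketch below. The sensor, the neighbouring controller, and the pressure threshold are all hypothetical placeholders.

```python
# Distributed-intelligence sketch: a reading from one node triggers an action on another node.
PRESSURE_LIMIT_PSI = 120.0  # illustrative safety threshold

class ConveyorController:
    """Stand-in for the controller of a neighbouring machine."""
    def __init__(self):
        self.running = True

    def shut_down(self, reason: str):
        self.running = False
        print(f"conveyor stopped: {reason}")

def on_pressure_reading(psi: float, conveyor: ConveyorController):
    """Edge rule evaluated locally — no round trip to the cloud."""
    if psi > PRESSURE_LIMIT_PSI and conveyor.running:
        conveyor.shut_down(f"upstream pressure {psi:.0f} psi exceeds {PRESSURE_LIMIT_PSI:.0f} psi")

conveyor = ConveyorController()
for reading in [96.0, 104.5, 131.2, 98.0]:  # simulated pressure sensor feed
    on_pressure_reading(reading, conveyor)
```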

What about security?

While it might initially seem like it doesn’t mesh with the other topics we’ve discussed, security is one very big reason the IoT has not been adopted as quickly as first expected. Even large corporations are trying to figure out what connecting their “things” to the internet can mean with regard to exposing both data and devices to malicious intent. IoT solution providers cannot ignore this. Products and services must have a great security story if they are going to be successful, and this story must be baked in, not just an afterthought. It can be used as a selling point, as customers increasingly treat security as an upfront decision factor.

In a nutshell: Do not forget to factor security as an integrated piece of an offering as this can really impact adoption rates. Use a strong story as a selling point, but ignore it and risk getting passed over.

The IoT has certainly added complexity to the process of creating useful and successful solutions in a newly forming marketspace. But, the new capabilities it provides have excited the innovation imagination of product makers. Through some helpful collaboration and a focus on providing valuable things, there will be a number of revolutionary changes to how we interact with the day-to-day world around us.

The author is Sr. Product Marketing Manager at Synapse Wireless. He has over 15 years’ experience in the design and production of networking equipment, with the last seven concentrated solely on the Internet of Things. Starting out as a hardware and software design engineer for the telecom industry, Jonathan joined Synapse Wireless in 2008 and contributed to the development of the first generation of what would later become the SNAP IoT platform for Things.

Currently, he serves as a Senior Product Marketing Manager, driving customer innovation by communicating the value of the SNAP Things Platform to the market and shaping its direction moving forward. Jonathan is the inventor of a patented networking software related to traffic prioritization and graduated first in his class from Mississippi State University with a degree in Electrical Engineering.

The post Not all that glitters in IoT is gold… but, you can make money with the right ideas appeared first on ReadWrite.

Industrial IoT all set to turbocharge lean manufacturing https://readwrite.com/industrial-iot-set-turbocharge-lean-manufacturing/ Sat, 25 Mar 2017 00:10:20 +0000 https://readwrite.com/?p=96609

Companies in the manufacturing sector for years have been striving for lean production or processes to create more efficient operations. One of the latest trends in technology, the emergence of the Internet of Things (IoT), could give lean efforts a major boost.

Lean manufacturing, a systematic method for eliminating waste within a manufacturing system, is based on the concept of making obvious what adds value by reducing everything else. It’s a management philosophy that stems mainly from the Japanese manufacturing sector, and specifically Toyota Production System, which focuses on the reduction of waste to improve overall customer value.

Lean encompasses a set of tools that help in the identification and steady reduction of waste. And as waste is eliminated, quality improves and at the same time production time and cost are reduced. The ultimate goal of lean is to get the right things to the right place at the right time and in the right quantity, in order to achieve perfect workflow while minimizing waste and being flexible.

The Internet of Things involves the linking of physical objects such as devices, consumer products, vehicles, corporate assets, buildings and other “things” via the Internet. These “smart” objects are embedded with electronics, sensors, actuators, software and network connectivity that allow them to gather and share a variety of data and respond to control messages.

The IoT enables connected objects to be sensed and controlled remotely via an existing network infrastructure. This connectivity creates opportunities for a more direct integration of physical objects with digital systems. The potential benefits include increased efficiency, improved product development and enhanced customer service—to name a few.

The potential scope of IoT is enormous. Research firm Gartner Inc. has estimated that 6.4 billion connected things were in use worldwide in 2016, up 30% from 2015, and that 5.5 million new things were being connected every day. The firm forecasts that the total number of connected things will reach 20.8 billion by 2020.

In the enterprise, Gartner considers two classes of connected things. One consists of generic or cross-industry devices used in multiple industries; these include such items as connected light bulbs and building management systems.

The other class includes vertical-specific devices such as specialized equipment used in hospitals and tracking devices in container ships. Connected things for specialized use are the largest category, but this is quickly changing with the increased use of generic devices, Gartner says.

Taking lean to the next level

Within the context of building IoT-based manufacturing solutions, IoT opens up all kinds of possibilities, such as the ability to monitor the performance of products after they have been purchased to ensure adequate maintenance and customer satisfaction, optimizing supply chain logistics and streamlining the distribution chain. Information about product usage can be fed back to companies so that they can analyze the data to make improvements in design and production.

With this constant exchange of data, combined with the new automation technologies that are emerging and advancement in data analytics, manufacturers can achieve the dream of the truly “smart factory”.

IoT intersects with lean methodology and has the potential to take lean to the next level. The information gleaned from connected devices, including users’ experiences with a variety of products, can be fed back to instrumented factories to provide unprecedented opportunities to enhance manufacturing processes and reduce waste.

As consulting firm Deloitte has stated, “in operating the existing business, IoT and analytics are helping companies to connect a diverse set of assets. This results in efficiency gains throughout the manufacturing process.”

The firm describes a number of areas in which efficiencies can be added. One is through the acceleration of planning and pre-manufacturing. The processes of choosing suppliers, considering risk and managing material costs can be fine-tuned through the interconnectivity IoT and analytics bring, Deloitte says.

“Analytics can deliver insight to help companies gain a better understanding of customer preferences and desires, potentially resulting in improved predictability and performance in the marketplace,” Deloitte says. “Understanding the products, and the specific features, that are being purchased allows companies to plan production to meet market needs.”

Another potential benefit of IoT is streamlining the manufacturing process, which is changing dramatically as more companies incorporate IoT and analytics capabilities. “Predictive tools and machine learning allow potential problems to be identified and corrected before they occur,” the firm says. “The value of lean manufacturing and just-in-time processes like Kaizen and Kanban improves exponentially” when intelligence obtained via IoT and analytics can be applied.
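As a rough illustration of the kind of predictive tooling Deloitte describes, the sketch below trains a simple failure-risk classifier on synthetic sensor history using scikit-learn. The feature names, thresholds and data are all invented for illustration; a real deployment would train on the plant's own telemetry and failure records:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Synthetic machine history: vibration (mm/s), bearing temperature (C), runtime hours.
X = np.column_stack([
    rng.normal(3.0, 1.0, 5000),
    rng.normal(60.0, 8.0, 5000),
    rng.uniform(0, 10_000, 5000),
])
# Toy label: machines that run hot and vibrate hard tend to fail soon.
y = ((X[:, 0] > 4.0) & (X[:, 1] > 68.0)).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_train, y_train)

print("held-out accuracy:", model.score(X_test, y_test))
# Score a live reading streamed from the shop floor.
print("failure risk:", model.predict_proba([[4.6, 72.0, 8200]])[0, 1])
```

A model like this would sit behind the maintenance schedule, flagging machines for inspection before a breakdown interrupts a just-in-time line.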

And a third area where IoT can add value is in improving post-manufacturing support and service. In the past, Deloitte says, manufacturers often lost track of their products once they were sold. Now, because of new levels of connectedness and the greater insights provided by IoT and analytics, manufacturers can gather information from their customers effectively while improving service and support in the aftermarket.

The benefits of IoT for lean manufacturing extend well beyond processes within a single organization. IoT can help optimize the interaction of manufacturers and their business partners, enhancing the flow of materials along the pipeline based on more accurate data on product demand and usage. An IoT service creation and enrichment platform such as Accelerite Concert can go a long way in making such collaborations happen.

Manufacturers will be able to fully realize production efficiencies that were extremely difficult and in some cases impossible to achieve through traditional, manual processes.

Dean Hamilton, Senior Vice President and General Manager of the Service Creation Business Unit, Accelerite

The vital need for analytics

Organizations that successfully integrate the Internet, mobile technology, business analytics, digital performance dashboards and other enabling technology with strategic improvement end up with a much more advanced version of lean and continuous improvement in general, according to Terence Burton, president and CEO of The Center for Excellence in Operations Inc., a management consulting firm.

Enterprises “need a higher order paradigm of lean to benefit from these complex emerging technology-enabled innovations in business models, rather than suffer the inevitable waste creep and margin erosion,” Burton says. “The Internet of Things will undoubtedly play a large role in evolving lean to a higher order, enterprise-wide and technology-enabled paradigm of improvement.”

The potential benefits IoT can deliver for manufacturers stem from improved availability of timely and precise data. The ability to instrument, at low cost, almost every aspect of the manufacturing process and to deliver that data quickly to business stakeholders via the Internet is already transforming business operations and business models. But the promise of an evolved “higher order paradigm of lean” is entirely dependent on manufacturers’ ability to derive meaningful insight from data.

As valuable as IoT data can be for manufacturers’ lean efforts, enormous volumes of information will be of little help without a timely and effective way of analyzing the data’s meaning and context.

Only advanced analytics and artificial intelligence (AI) technologies such as machine learning and predictive maintenance, combined with the flexibility, processing and storage capabilities of cloud computing, will give manufacturers the ability to extract full value from IoT data and leverage it as part of their lean methodologies.

The smart factories of tomorrow will need to deploy a next-generation, cloud-based, big-data analytics platform that enables them to use newly acquired information to the fullest. The platform should be capable of analyzing structured as well as unstructured data, both at-rest (in databases) and in-flight (from streaming data sources) and include a single tool for data acquisition, storage, transformation, AI and visualization.
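As a simple illustration of analyzing data “in flight,” the following sketch keeps a rolling window of recent sensor readings and flags values that drift well outside the norm before anything is written to a database. The window size, threshold and simulated stream are arbitrary placeholders, not tied to any particular platform:

```python
from collections import deque
from statistics import mean, pstdev

WINDOW = 50          # readings kept "in flight" before anything hits a database
window = deque(maxlen=WINDOW)

def ingest(value: float) -> bool:
    """Return True if the reading looks anomalous relative to the recent window."""
    is_anomaly = False
    if len(window) == WINDOW:
        mu, sigma = mean(window), pstdev(window)
        if sigma > 0 and abs(value - mu) > 3 * sigma:
            is_anomaly = True
    window.append(value)
    return is_anomaly

# Simulated sensor stream: steady readings followed by one spike.
stream = [20.0 + 0.1 * (i % 7) for i in range(200)] + [35.0]
flags = [ingest(v) for v in stream]
print("anomalies at positions:", [i for i, f in enumerate(flags) if f])
```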

Manufacturers need to be able to drill down into IoT data via easy-to-understand dashboards, so they can find patterns and detect anomalies that contribute directly to leaner operations. They need to be able to quickly identify useful correlations and make inferences that can lead to enhanced processes.
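Finding those correlations does not always require heavy machinery. The sketch below, using synthetic shop-floor data with made-up column names, ranks process variables by how strongly they correlate with a defect rate, the kind of quick exploration a dashboard might surface:

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(1)
n = 1000

# Synthetic shop-floor data; column names are illustrative only.
df = pd.DataFrame({
    "line_speed": rng.normal(100, 10, n),
    "oven_temp":  rng.normal(220, 5, n),
    "humidity":   rng.normal(45, 8, n),
})
# Toy relationship: defects rise with oven temperature.
df["defect_rate"] = 0.01 * (df["oven_temp"] - 210) + rng.normal(0, 0.02, n)

# Rank process variables by strength of correlation with the defect rate.
corr = df.corr()["defect_rate"].drop("defect_rate").sort_values(key=abs, ascending=False)
print(corr)
```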

While business intelligence (BI) and data visualization tools are nothing new, current technologies often require the work of data analysts, BI developers and ETL developers before insight can be exposed to business users. The next generation of analytics tools, such as Accelerite ShareInsights, will place more power in the hands of the business owners and subject-matter experts who fully understand factory processes, rather than data scientists and programmers. These tools will also be accessible to factory operations and development teams, who can help provide an integrated flow of data to make products and processes more efficient.

Ultimately, the most significant transformation in how lean methodologies will be applied to smart factories will come from the use of AI to perform sophisticated forms of big data analysis that are impossible for human analysts. AI algorithms now drive semi-autonomous vehicles; recommend what we should watch on TV, read or listen to; recognize our speech patterns and faces; diagnose our illnesses and so much more.

These algorithms are not just capable of learning; they can also detect patterns, correlations and anomalies in large data sets that would go undetected by humans. They are able to predict the behavior of complex, interconnected systems and recommend the optimal course of action to accomplish a particular goal.

This type of capability will be especially important as manufacturers move toward product personalization, where products can be catered to specific users and predictive insight will be needed to configure production lines and supply chains in the most efficient manner.

The next generation of IoT analytics will place the power of AI directly in the hands of business stakeholders to drive continuous optimization. AI-powered lean methodology will not simply be better at eliminating the waste that inevitably creeps into complex systems; it will predict that waste before it occurs and take steps to ensure that it never does.

Manufacturing in the future will be about building the product the customer wants at just the right time, and together lean processes, IoT, big data analytics and AI will allow the smart factories of tomorrow to operate with unprecedented efficiency.

This article was produced in partnership with Accelerite. The author is Senior Vice President and General Manager of the Service Creation Business Unit at Accelerite.

The post Industrial IoT all set to turbocharge lean manufacturing appeared first on ReadWrite.

Afero launches fast, low-cost IoT hub for the developer community https://readwrite.com/afero-launches-fast-low-cost-developers/ Thu, 02 Mar 2017 06:00:47 +0000 https://readwrite.com/?p=95826 afero

With the Internet of Things (IoT) market growing at a dizzying rate, companies require secure, scalable and flexible platforms to […]

The post Afero launches fast, low-cost IoT hub for the developer community appeared first on ReadWrite.


With the Internet of Things (IoT) market growing at a dizzying rate, companies require secure, scalable and flexible platforms to maximize the opportunity from connected technology.

Enter the Afero Platform, which aims to help firms take advantage of the exploding potential of IoT. The platform’s key strengths are its low price and its reliable and secure ecosystem of software, hardware, and cloud services that enables businesses to build better IoT products and applications.

Now the firm is launching a new addition to its platform geared towards the developer community. The Afero Developer Hub provides support for an inexpensive local, standalone hub.

Based on developer feedback, the new standalone hub was developed for companies moving beyond the initial proof-of-concept phase of development with the platform and on to implementation.

The Developer Hub provides Afero Hub functionality on inexpensive off-the-shelf hardware. The Afero Hub Software is available at no cost, and its preferred Raspberry Pi 3 platform is inexpensive, popular and plentiful. This allows any project to be accessible remotely from anywhere in the world.

Prior to the introduction of the Afero Developer Hub, connections between IoT devices and the cloud were made through the mobile app or the Afero Secure Hub, a standalone wireless gateway with a cellular connection to the internet.

However, not every project requires the power of the Afero Secure Hub, and mobile phone connections are not ideal for situations where constant connectivity is needed between a device and the cloud. So Afero took the next step in IoT development and made the platform even more flexible through the Afero Developer Hub.

ASR-1 is at the core

At the core of the Afero Platform is the tiny ASR-1 module. This module allows devices to communicate securely with the globally-distributed Afero Cloud which provides extensive product control and analytics.

With the new developer hub, businesses can now create a hub that connects ASR-1-equipped devices to the cloud using the Afero Hub Software package and any ARM-based computer running Debian Linux.
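As a hypothetical illustration (this is not Afero's own tooling, just a sanity check sketched around the stated prerequisites), the short Python script below verifies that a board is ARM-based and running a Debian-flavored OS before a developer follows the official setup instructions:

```python
import platform
from pathlib import Path

def check_prerequisites() -> bool:
    """Rough pre-flight check for the documented prerequisites: an ARM-based
    board running Debian Linux. Purely illustrative, not Afero's installer."""
    ok = True

    machine = platform.machine()
    if not machine.startswith(("arm", "aarch64")):
        print(f"warning: expected an ARM CPU, found {machine}")
        ok = False

    os_release = Path("/etc/os-release")
    text = os_release.read_text().lower() if os_release.exists() else ""
    if "debian" not in text and "raspbian" not in text:
        print("warning: this does not look like a Debian-based OS")
        ok = False

    return ok

if __name__ == "__main__":
    if check_prerequisites():
        print("Board looks suitable; continue with the Afero Hub Software setup.")
```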

This provides a particular advantage for companies utilizing Raspberry Pi computers, which pack power and flexibility at a low cost.

The ubiquity of the Raspberry Pi platform makes the Developer Hub easy and cost-effective to implement, requiring little development effort to deploy.

The “learning curve” required to install the software consists of basic Linux administration skills that most developers already know.

The new developer hub provides always-available communication between ASR-1 devices and a remotely roaming mobile app. Multiple hubs can be used, allowing projects to roam seamlessly between them.

In addition, new IoT projects launched on the Afero Platform will benefit from the Afero Developer Hub’s ability to facilitate prototyping of connected products.

The hub’s software is capable of creating a low-cost connection base with edge devices in scenarios where devices are scattered and connectivity is scarce.

And now with the Afero Developer Hub, companies will face fewer obstacles when moving from proof of concept into implementation, so that more visionary IoT innovations will be brought to life.

This article was produced in partnership with Afero.

The post Afero launches fast, low-cost IoT hub for the developer community appeared first on ReadWrite.
