Monday, 22 August 2016

Top 10 Machine Learning Algorithms

I will go over these algorithms one by one in a series of posts, explaining where each of them is used and how to apply it in Python (a small preview sketch follows the list below).

  1. Naïve Bayes Classifier Algorithm
  2. K Means Clustering Algorithm
  3. Support Vector Machine Algorithm
  4. Apriori Algorithm
  5. Linear Regression
  6. Logistic Regression
  7. Artificial Neural Networks
  8. Random Forests
  9. Decision Trees
  10. Nearest Neighbours
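
As a quick preview of the kind of Python usage those posts will include, here is a minimal sketch of the first algorithm on the list using scikit-learn's GaussianNB. The Iris dataset and the 80/20 split are just placeholders for illustration; each algorithm will get a fuller treatment in its own post.

```python
# Minimal preview sketch: Naive Bayes with scikit-learn on a toy dataset.
# The Iris data and the 80/20 split are illustrative placeholders only.
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB
from sklearn.metrics import accuracy_score

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42)

model = GaussianNB()            # assumes continuous, roughly Gaussian features
model.fit(X_train, y_train)     # estimates per-class means and variances
predictions = model.predict(X_test)

print("Test accuracy:", accuracy_score(y_test, predictions))
```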

Internet of things examples: 12 best uses of IoT in the enterprise

http://www.computerworlduk.com/galleries/cloud-computing/internet-of-things-best-business-enterprise-offerings-3626973/

10 IoT Startups

IoT is hot. And there is no shortage of companies in the space. We take a look at 10 IoT startups you need to know because they are doing something a little different or a little special.

http://www.informationweek.com/iot/10-iot-startups-you-need-to-know/d/d-id/1325235?image_number=2

Sunday, 21 August 2016

Gartner’s top 10 Internet of Things technologies for 2017 and 2018

Analyst firm says these technologies will enable organisations to unlock the full potential of the Internet of Things

Analyst firm Gartner has highlighted the top Internet of Things (IoT) technologies that should be on every organisation's radar through the next two years.
The technologies and principles of IoT will have a very broad impact on organisations, said Gartner, affecting business strategy, risk management and a wide range of technical areas such as architecture and network design.
"The IoT demands an extensive range of new technologies and skills that many organisations have yet to master," said Nick Jones, VP and distinguished analyst at Gartner. "A recurring theme in the IoT space is the immaturity of technologies and services and of the vendors providing them.
“Architecting for this immaturity and managing the risk it creates will be a key challenge for organisations exploiting the IoT. In many technology areas, lack of skills will also pose significant challenges."
Here are Gartner’s top ten IoT technologies for 2017 and 2018.

1. IoT security

The IoT introduces a wide range of new security risks and challenges to the IoT devices themselves, their platforms and operating systems, their communications, and even the systems to which they're connected.
Security technologies will be required to protect IoT devices and platforms from both information attacks and physical tampering, to encrypt their communications, and to address new challenges such as impersonating "things" or denial-of-sleep attacks that drain batteries.
IoT security will be complicated by the fact that many "things" use simple processors and operating systems that may not support sophisticated security approaches.
"Experienced IoT security specialists are scarce, and security solutions are currently fragmented and involve multiple vendors," said Jones. "New threats will emerge through 2021 as hackers find new ways to attack IoT devices and protocols, so long-lived ‘things’ may need updatable hardware and software to adapt during their life span." 

2. IoT analytics

IoT business models will exploit the information collected by "things" in many ways – for example, to understand customer behaviour, to deliver services, to improve products, and to identify and intercept business moments.
However, IoT demands new analytic approaches. New analytic tools and algorithms are needed now, but as data volumes increase through 2021, the needs of the IoT may diverge further from traditional analytics.
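
Gartner does not say what those new analytic tools look like, but as a toy illustration of the basic aggregation involved, the sketch below rolls raw device events up into per-device summaries that a business could act on. The event fields are invented for the example; real IoT analytics pipelines operate at far larger volumes than an in-memory dictionary.

```python
# Toy illustration: roll raw "thing" events up into per-device summaries.
# The event fields (device_id, kwh) are invented for this example.
from collections import defaultdict
from statistics import mean

events = [
    {"device_id": "meter-1", "kwh": 1.2},
    {"device_id": "meter-1", "kwh": 1.5},
    {"device_id": "meter-2", "kwh": 0.4},
]

usage = defaultdict(list)
for event in events:
    usage[event["device_id"]].append(event["kwh"])

for device_id, readings in usage.items():
    print(device_id, "avg kWh per reading:", round(mean(readings), 2))
```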

3. IoT device management

Long-lived nontrivial "things" will require management and monitoring. This includes device monitoring, firmware and software updates, diagnostics, crash analysis and reporting, physical management, and security management.
The IoT also brings new problems of scale to the management task. Tools must be capable of managing and monitoring thousands and perhaps even millions of devices.
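
As a sketch of what the management task looks like in data terms, the snippet below keeps a tiny device registry and flags devices that have missed their heartbeat or are running outdated firmware. The fields and the 10-minute threshold are assumptions for illustration, not any particular tool's schema.

```python
# Sketch of a minimal device registry; the fields and the 10-minute
# heartbeat threshold are illustrative assumptions, not a vendor's schema.
from datetime import datetime, timedelta, timezone

LATEST_FIRMWARE = "2.4.1"
HEARTBEAT_TIMEOUT = timedelta(minutes=10)

now = datetime.now(timezone.utc)
registry = {
    "sensor-001": {"firmware": "2.4.1", "last_seen": now},
    "sensor-002": {"firmware": "2.3.0", "last_seen": now - timedelta(hours=2)},
}

def needs_attention(info: dict, now: datetime) -> list:
    issues = []
    if info["firmware"] != LATEST_FIRMWARE:
        issues.append("firmware update required")
    if now - info["last_seen"] > HEARTBEAT_TIMEOUT:
        issues.append("missed heartbeat")
    return issues

for device_id, info in registry.items():
    problems = needs_attention(info, datetime.now(timezone.utc))
    if problems:
        print(device_id, "->", ", ".join(problems))
```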

4. Low-power, short-range IoT networks

Selecting a wireless network for an IoT device involves balancing many conflicting requirements, such as range, battery life, bandwidth, density, endpoint cost and operational cost.
Low-power, short-range networks will dominate wireless IoT connectivity through 2025, far outnumbering connections using wide-area IoT networks.
However, commercial and technical trade-offs mean that many solutions will coexist, with no single dominant winner and clusters emerging around certain technologies, applications and vendor ecosystems.

5. Low-power, wide-area networks

Traditional cellular networks don't deliver a good combination of technical features and operational cost for those IoT applications that need wide-area coverage combined with relatively low bandwidth, good battery life, low hardware and operating cost, and high connection density.
The long-term goal of a wide-area IoT network is to deliver data rates from hundreds of bits per second (bps) to tens of kilobits per second (kbps) with nationwide coverage, a battery life of up to ten years, an endpoint hardware cost of around $5, and support for hundreds of thousands of devices connected to a base station or its equivalent.
The first low-power wide-area networks (LPWANs) were based on proprietary technologies, but in the long term emerging standards such as Narrowband IoT (NB-IoT) will likely dominate this space.
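
To make the figures above concrete, a back-of-the-envelope calculation shows why such low data rates can still suit a battery-powered sensor that reports a few times an hour. The 50-byte payload, 300 bps rate and hourly reporting interval below are assumed for illustration, not taken from Gartner.

```python
# Back-of-the-envelope airtime estimate; the payload size, data rate and
# reporting interval are assumptions for illustration, not Gartner figures.
payload_bytes = 50          # one sensor reading plus headers (assumed)
data_rate_bps = 300         # low end of the "hundreds of bps" range
reports_per_day = 24        # one reading per hour (assumed)

airtime_per_report_s = payload_bytes * 8 / data_rate_bps
radio_on_time_per_day_s = airtime_per_report_s * reports_per_day

print(f"Airtime per report: {airtime_per_report_s:.2f} s")
print(f"Radio-on time per day: {radio_on_time_per_day_s:.1f} s")
# Roughly 1.3 s per report and about 32 s of transmission per day, which is
# why multi-year battery life is plausible at these data rates.
```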

6. IoT processors

The processors and architectures used by IoT devices define many of their capabilities, such as whether they can support strong security and encryption, how much power they consume, and whether they are sophisticated enough to run an operating system, updatable firmware, and embedded device management agents.
As with all hardware design, there are complex trade-offs between features, hardware cost, software cost, software upgradability and so on. As a result, understanding the implications of processor choices will demand deep technical skills.

7. IoT operating systems

Traditional operating systems (OSs) such as Windows and iOS were not designed for IoT applications. They consume too much power, need fast processors, and in some cases, lack features such as guaranteed real-time response.
They also have too large a memory footprint for small devices and may not support the chips that IoT developers use. Consequently, a wide range of IoT-specific operating systems has been developed to suit many different hardware footprints and feature needs.

8. Event stream processing

Some IoT applications will generate extremely high data rates that must be analysed in real time. Systems creating tens of thousands of events per second are common, and millions of events per second can occur in some telecom and telemetry situations.
To address such requirements, distributed stream computing platforms have emerged. They typically use parallel architectures to process very high-rate data streams to perform tasks such as real-time analytics and pattern identification.
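
As a single-process illustration of the pattern (real stream computing platforms of the kind Gartner describes are distributed across many machines), the sketch below buckets a stream of events into one-second windows and reports a count and average per window. The window size and the (timestamp, value) event shape are assumptions for the example.

```python
# Single-process illustration of windowed stream aggregation; distributed
# stream platforms parallelise this work across many machines.
# The one-second window and (timestamp, value) event shape are assumptions.
from collections import defaultdict

def windowed_stats(events, window_s=1.0):
    """Group (timestamp, value) events into fixed windows and summarise them."""
    windows = defaultdict(list)
    for timestamp, value in events:
        windows[int(timestamp // window_s)].append(value)
    for window_id in sorted(windows):
        values = windows[window_id]
        yield window_id, len(values), sum(values) / len(values)

stream = [(0.1, 5.0), (0.4, 7.0), (1.2, 6.0), (1.9, 8.0), (2.5, 4.0)]
for window_id, count, avg in windowed_stats(stream):
    print(f"window {window_id}: {count} events, avg value {avg:.1f}")
```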

9. IoT platforms

IoT platforms bundle many of the infrastructure components of an IoT system into a single product. The services provided by such platforms fall into three main categories: (1) low-level device control and operations such as communications, device monitoring and management, security, and firmware updates; (2) IoT data acquisition, transformation and management; and (3) IoT application development, including event-driven logic, application programming, visualisation, analytics and adapters to connect to enterprise systems.

10. IoT standards and ecosystems

Although ecosystems and standards aren't precisely technologies, most eventually materialise as application programming interfaces (APIs).
Standards and their associated APIs will be essential because IoT devices will need to interoperate and communicate, and many IoT business models will rely on sharing data between multiple devices and organisations.
Many IoT ecosystems will emerge, and commercial and technical battles between these ecosystems will dominate areas such as the smart home, the smart city and healthcare.
Organisations creating products may have to develop variants to support multiple standards or ecosystems and be prepared to update products during their life span as the standards evolve and new standards and related APIs emerge.
Source:
http://www.information-age.com/it-management/strategy-and-innovation/123460982/gartners-top-10-internet-things-technologies-2017-and-2018

The Growing Use of Artificial Intelligence - an HBR article




Artificial Intelligence Is Almost Ready for Business

MARCH 19, 2015

Artificial Intelligence (AI) is an idea that has oscillated through many hype cycles over many years, as scientists and sci-fi visionaries have declared the imminent arrival of thinking machines. But it seems we’re now at an actual tipping point. AI, expert systems, and business intelligence have been with us for decades, but this time the reality almost matches the rhetoric, driven by the exponential growth in technology capabilities (e.g., Moore’s Law), smarter analytics engines, and the surge in data.
Most people know the Big Data story by now: the proliferation of sensors (the “Internet of Things”) is accelerating exponential growth in “structured” data. And now on top of that explosion, we can also analyze “unstructured” data, such as text and video, to pick up information on customer sentiment. Companies have been using analytics to mine insights within this newly available data to drive efficiency and effectiveness. For example, companies can now use analytics to decide which sales representatives should get which leads, what time of day to contact a customer, and whether they should e-mail them, text them, or call them.
Such mining of digitized information has become more effective and powerful as more info is “tagged” and as analytics engines have gotten smarter. As Dario Gil, Director of Symbiotic Cognitive Systems at IBM Research, told me:
“Data is increasingly tagged and categorized on the Web – as people upload and use data they are also contributing to annotation through their comments and digital footprints. This annotated data is greatly facilitating the training of machine learning algorithms without demanding that the machine-learning experts manually catalogue and index the world. Thanks to computers with massive parallelism, we can use the equivalent of crowdsourcing to learn which algorithms create better answers. For example, when IBM’s Watson computer played ‘Jeopardy!,’ the system used hundreds of scoring engines, and all the hypotheses were fed through the different engines and scored in parallel. It then weighted the algorithms that did a better job to provide a final answer with precision and confidence.”
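Gil's description of hundreds of scoring engines weighted by past performance is essentially a weighted ensemble. The toy sketch below shows the basic idea of combining parallel scorers into a single answer with a confidence value; the engines, weights and scores are invented, and this is an illustration of the general technique, not IBM's actual implementation.

```python
# Toy weighted-ensemble sketch: several scoring engines rate each candidate
# answer, engines with better track records carry higher weights, and the
# weighted totals yield a final answer plus a confidence value.
# The engines, weights and scores here are invented for illustration.
def combine_scores(candidate_scores, engine_weights):
    """candidate_scores: {answer: {engine: score}}; returns (answer, confidence)."""
    totals = {}
    for answer, scores in candidate_scores.items():
        totals[answer] = sum(engine_weights[e] * s for e, s in scores.items())
    best = max(totals, key=totals.get)
    confidence = totals[best] / sum(totals.values())
    return best, confidence

engine_weights = {"keyword_match": 0.2, "type_check": 0.3, "passage_support": 0.5}
candidate_scores = {
    "Toronto": {"keyword_match": 0.9, "type_check": 0.2, "passage_support": 0.3},
    "Chicago": {"keyword_match": 0.6, "type_check": 0.8, "passage_support": 0.7},
}

answer, confidence = combine_scores(candidate_scores, engine_weights)
print(answer, round(confidence, 2))  # the answer with the highest weighted score
```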
Beyond the Quants
Interestingly, for a long time, doing detailed analytics has been quite labor- and people-intensive. You need “quants,” the statistically savvy mathematicians and engineers who build models that make sense of the data. As Babson professor and analytics expert Tom Davenport explained to me, humans are traditionally necessary to create a hypothesis, identify relevant variables, build and run a model, and then iterate it. Quants can typically create one or two good models per week.
However, machine learning tools for quantitative data – perhaps the first line of AI – can create thousands of models a week. For example, in programmatic ad buying on the Web, computers decide which ads should run in which publishers’ locations. Massive volumes of digital ads and a never-ending flow of clickstream data depend on machine learning, not people, to decide which Web ads to place where. Firms like DataXu use machine learning to generate up to 5,000 different models a week, making decisions in under 15 milliseconds, so that they can more accurately place ads that you are likely to click on.
Tom Davenport:
“I initially thought that AI and machine learning would be great for augmenting the productivity of human quants. One of the things human quants do, that machine learning doesn’t do, is to understand what goes into a model and to make sense of it. That’s important for convincing managers to act on analytical insights. For example, an early analytics insight at Osco Pharmacy uncovered that people who bought beer also bought diapers. But because this insight was counter-intuitive and discovered by a machine, they didn’t do anything with it. But now companies have needs for greater productivity than human quants can address or fathom. They have models with 50,000 variables. These systems are moving from augmenting humans to automating decisions.”
In business, the explosive growth of complex and time-sensitive data enables decisions that can give you a competitive advantage, but these decisions depend on analyzing at a speed, volume, and complexity that is too great for humans. AI is filling this gap as it becomes ingrained in the analytics technology infrastructure in industries like health care, financial services, and travel.
The Growing Use of AI
IBM is leading the integration of AI in industry. It has made a $1 billion investment in AI through the launch of its IBM Watson Group and has made many advancements and published research touting the rise of “cognitive computing” – the ability of computers like Watson to understand words (“natural language”), not just numbers. Rather than take the cutting edge capabilities developed in its research labs to market as a series of products, IBM has chosen to offer a platform of services under the Watson brand. It is working with an ecosystem of partners who are developing applications leveraging the dynamic learning and cloud computing capabilities of Watson.
The biggest application of Watson has been in health care. Watson excels in situations where you need to bridge between massive amounts of dynamic and complex text information (such as the constantly changing body of medical literature) and another mass of dynamic and complex text information (such as patient records  or genomic data), to generate and evaluate hypotheses. With training, Watson can provide recommendations for treatments for specific patients. Many prestigious academic medical centers, such as The Cleveland Clinic, The Mayo Clinic, MD Anderson, and Memorial Sloan-Kettering are working with IBM to develop systems that will help healthcare providers better understand patients’ diseases and recommend personalized courses of treatment. This has proven to be a challenging domain to automate and most of the projects are behind schedule.
Another large application area for AI is in financial services. Mike Adler, Global Financial Services Leader at The Watson Group, told me they have 45 clients working mostly on three applications: (1) a “digital virtual agent” that enables banks and insurance companies to engage their customers in a new, personalized way, (2) a “wealth advisor” that enables financial planning and wealth management, either for self-service or in combination with a financial advisor, and (3) risk and compliance management.
For example, USAA, the $20 billion provider of financial services to people that serve, or have served, in the United States military, is using Watson to help their members transition from the military to civilian life. Neff Hudson, vice president of emerging channels at USAA, told me, “We’re always looking to help our members, and there’s nothing more critical than helping the 150,000+ people leaving the military every year. Their financial security goes down when they leave the military. We’re trying to use a virtual agent to intervene to be more productive for them.” USAA also uses AI to enhance navigation on their popular mobile app. The Enhanced Virtual Assistant, or Eva, enables members to do 200 transactions by just talking, including transferring money and paying bills. “It makes search better and answers in a Siri-like voice. But this is a 1.0 version. Our next step is to create a virtual agent that is capable of learning. Most of our value is in moving money day-to-day for our members, but there are a lot of unique things we can do that happen less frequently with our 140 products. Our goal is to be our members’ personal financial agent for our full range of services.”
In addition to working with large, established companies, IBM is also providing Watson’s capabilities to startups. IBM has set aside $100 million for investments in startups. One of the startups that is leveraging Watson is WayBlazer, a new venture in travel planning that is led by Terry Jones, a founder of Travelocity and Kayak. He told me:
“I’ve spent my whole career in travel and IT. I started as a travel agent, and people would come in, and I’d send them a letter in a couple weeks with a plan for their trip. The Sabre reservation system made the process better by automating the channel between travel agents and travel providers. Then with Travelocity we connected travelers directly with travel providers through the Internet. Then with Kayak we moved up the chain again, providing offers across travel systems. Now with WayBlazer we have a system that deals with words. Nobody has helped people with a tool for dreaming and planning their travel. Our mission is to make it easy and give people several personalized answers to a complicated trip, rather than the millions of clues that search provides today. This new technology can take data out of all the silos and dark wells that companies don’t even know they have and use it to provide personalized service.”
What’s Next
As Moore’s Law marches on, we have more power in our smartphones than the most powerful supercomputers did 30 or 40 years ago. Ray Kurzweil has predicted that the computing power of a $4,000 computer will surpass that of a human brain in 2019 (20 quadrillion calculations per second). What does it all mean for the future of AI?
To get a sense, I talked to some venture capitalists, whose profession it is to keep their eyes and minds trained on the future. Mark Gorenberg, Managing Director at Zetta Venture Partners, which is focused on investing in analytics and data startups, told me, “AI historically was not ingrained in the technology structure. Now we’re able to build on top of ideas and infrastructure that didn’t exist before. We’ve gone through the change of Big Data. Now we’re adding machine learning. AI is not the be-all and end-all; it’s an embedded technology. It’s like taking an application and putting a brain into it, using machine learning. It’s the use of cognitive computing as part of an application.” Another veteran venture capitalist, Promod Haque, senior managing partner at Norwest Venture Partners, explained to me, “if you can have machines automate the correlations and build the models, you save labor and increase speed. With tools like Watson, lots of companies can do different kinds of analytics automatically.”
Manoj Saxena, former head of IBM’s Watson efforts and now a venture capitalist, believes that analytics is moving to the “cognitive cloud” where massive amounts of first- and third-party data will be fused to deliver real-time analysis and learning. Companies often find AI and analytics technology difficult to integrate, especially with the technology moving so fast; thus, he sees collaborations forming where companies will bring their people with domain knowledge, and emerging service providers will bring system and analytics people and technology. Cognitive Scale (a startup that Saxena has invested in) is one of the new service providers adding more intelligence into business processes and applications through a model they are calling “Cognitive Garages.” Using their “10-10-10 method” they deploy a cognitive cloud in 10 seconds, build a live app in 10 hours, and customize it using their client’s data in 10 days. Saxena told me that the company is growing extremely rapidly.
I’ve been tracking AI and expert systems for years. What is most striking now is its genuine integration as an important strategic accelerator of Big Data and analytics. Applications such as USAA’s Eva, healthcare systems using IBM’s Watson, and WayBlazer, among others, are having a huge impact and are showing the way to the next generation of AI.

Brad Power is a consultant who helps organizations that must make faster changes to their products, services, and systems in order to compete with start-ups and leading software companies.


Source: Harvard Business Review 
https://hbr.org/2015/03/artificial-intelligence-is-almost-ready-for-business