Posts

Showing posts from October, 2019

Sensors and Machine Learning: Glucose Monitoring with An AI Edge

“Medtronic’s mission is to alleviate pain, restore health, and extend life through the application of biomedical engineering,” explains Elaine Gee, PhD, Senior Principal Algorithm Engineer specializing in Artificial Intelligence at Medtronic. It’s a mission Gee is well equipped for. With over 15 years’ experience in modeling, bioinformatics, and engineering, she drives machine learning algorithm development and analytics to support next-generation medical devices for diabetes management. On behalf of AI Trends, Ben Lakin, from Cambridge Innovation Institute, sat down with Gee to discuss her most recent focus: algorithm development related to glucose sensing to improve the accuracy and performance of continuous glucose monitoring devices, also known as CGMs. Editor’s Note: Gee will be giving a featured presentation on Advancing Continuous Glucose Monitoring Sensor Development with Machine Learning at Sensors Sum

Collaborative Robots with AI are the Focus of MIT Researcher Julie Shah

By John P. Desmond, AI Trends Editor “I work on making robots better teammates,” Julie Shah told attendees at the 2019 AI World Conference & Expo in Boston. The MIT Associate Professor of Aeronautics and Astronautics described her work in a keynote talk entitled “Enhancing Human Capability with Intelligent Machine Teammates.” She said, “We’re trying to enhance human capability rather than replace humans.” Also Associate Dean of Social and Ethical Responsibilities of Computing at MIT, Shah directs the Interactive Robotics Group, which designs collaborative robot teammates that aim to enhance human capability. Prior to joining the MIT faculty, she worked at Boeing Research and Technology on robotics applications for aerospace manufacturing. About 1.6 million robots are operating in the world today, Shah said. However, the number of devices in US homes that could be defined as robots is about 30 million, including Roombas but not including Alexa, smart homes, or autonomous cars. She wonder

AI World Conference & Expo Hosts Its First AI Data Science Hackathon

By Benjamin Ross BOSTON—The AI World Conference & Expo featured its first-ever AI Data Science Hackathon last week, which gave data scientists and developers from across the ecosystem the opportunity to solve real-world data challenges in applying artificial intelligence (AI) and machine learning. Over the span of three days, teams worked to improve pipelines, datasets, tools, and other projects from a wide range of disciplines. Two teams gave reports on their work to the AI World audience: one team focused on strategic planning powered by AI in the cloud, and the other worked on a fractal AI model for versatility, speed, and efficiency. Team one—designated “AI-Driven Strategy”—discussed the benefits of strategic planning for businesses with the assistance of AI. “I believe two things about strategic planning in most organizations,” the team’s leader said during their report out. “[First,] it has the potential to give an organization a powerful competitive advantage, if it’s

Chaff Bugs and AI Autonomous Cars

By Lance Eliot, the AI Trends Insider In the movie remake of The Thomas Crown Affair, the main character opts to go into an art museum to ostensibly steal a famous work of art, and does so attired in the manner of The Son of Man artwork (a man wearing a bowler hat and an overcoat). Spoiler alert: he arranges for dozens of other men to come into the museum dressed similarly to him, thus confounding the efforts of the waiting police, who had been tipped that he would come there to commit his thievery. With many men serving as decoys, he pulls off the effort, and the police, exasperated at having to check the numerous decoys, are unable to nab him (he sneakily changes his clothes). This ploy was a clever use of deception. During World War II came the invention of chaff, which was also a form of deception. Radar had just emerged as a means to detect flying airplanes and therefore try to shoot them down more accurately. The radar device would send a si

What’s Next For Robotics: In The Field, Inferencing On The Edge

By Allison Proffitt Robots are a key application for AI, and in addition to an excellent plenary talk by Julie Shah of MIT, a whole track was dedicated to AI in robotics applications. Dan Kara, VP of robotics and intelligent systems for WTWH Media, outlined some of the challenges in building robots—not chatbots, he clarified, but robots that act in the physical world. “It seems like every year it’s just around the corner,” he said, but this year the tailwinds are picking up. Robotics is the foundation for much of our work thus far in artificial intelligence and machine learning, Kara argued. “It’s only been fairly recently that you’ve started getting artificial intelligence or machine learning moving off into different labs,” he said. “At one time, they were considered the same thing because that’s where the work was going.” Early research in facial recognition, accelerometers, natural language processing, very small cameras, and more “came out of work done in robotics labs. They’re

How to Use AI to Detect Soft Skills

The new world we live in gives us more help — and more doubts. Machines stand behind everything — and the scope of this everything is only growing. To what extent can we trust such machines? We are used to relying on them for market trends, traffic management, maybe even healthcare. Machines are now analysts, medical assistants, secretaries, and teachers. Are they reliable enough to work as HR specialists? As psychologists? What can they tell about us? Let’s see how text analysis can assess your soft skills and tell a potential employer whether you can join the team smoothly. Project Description In this project, we used text analysis techniques to analyze the soft skills of young men (aged 15–24) looking for career opportunities. What we had in mind was to perform a number of tests, or to choose the most effective one, to determine ground truth values. The tests we were experimenting with included: Mind Tools test — short and simple, but may be difficult for centennials. Test’s output i
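The project's actual pipeline is not shown in the excerpt, but the general idea of extracting soft-skill signals from free text can be sketched with a minimal keyword-lexicon approach. Everything here (the lexicons, the function name, the sample answer) is an illustrative assumption, not the project's real method, which would more plausibly use trained classifiers:

```python
from collections import Counter
import re

# Illustrative lexicons mapping soft skills to indicative words a
# candidate might use in a free-text answer (assumed, not from the project).
SKILL_LEXICONS = {
    "teamwork": {"team", "together", "collaborate", "we", "group"},
    "leadership": {"lead", "organized", "initiative", "mentored"},
    "communication": {"explained", "presented", "listened", "wrote"},
}

def soft_skill_scores(text: str) -> dict:
    """Count lexicon hits per skill, normalized by total word count."""
    words = re.findall(r"[a-z']+", text.lower())
    counts = Counter(words)
    total = max(len(words), 1)
    return {
        skill: sum(counts[w] for w in lexicon) / total
        for skill, lexicon in SKILL_LEXICONS.items()
    }

answer = "We worked together as a team and I presented our results."
scores = soft_skill_scores(answer)
print(scores)
```

A real system would replace the hand-built lexicons with features learned against the ground-truth test results the project describes.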

Why Better Data User Experience Means Better ROI

It should be the least controversial idea in the world: when people have easy access to quality information, they’ll make better decisions. But for so many of the world’s businesses, this quality information lives behind a veil that can only be pierced by trained professionals who specialize in data science or other niche topics. The user experience of business data generally suffers at companies seeking to make an impact — employees must interpret complicated graphs to arrive at actionable information, or decipher a host of complex acronyms and abbreviations. That’s why employees are likely leaning on domain experts to explain the nitty-gritty and provide the underlying analytics. This approach is less than fully productive because it doesn’t promote the worker autonomy that actualized businesses depend on in pursuit of a goal. What’s worse is that employees may also be dismissing the data altogether. This weakens an organiz

Neural Machine Translation

For centuries people have been dreaming of easier communication with foreigners. The idea of teaching computers to translate human languages is probably as old as computers themselves. The first attempts to build such technology go back to the 1950s. However, the first decade of research failed to produce satisfactory results, and the idea of machine translation was largely shelved until the late 1990s. At that time, the internet portal AltaVista launched a free online translation service called Babelfish — a system that became the forefather of a large family of similar services, including Google Translate. At present, modern machine translation systems rely on Machine Learning and Deep Learning techniques to improve the output and, ideally, tackle the issues of understanding context, tone, language registers, and informal expressions. The techniques that were used until recently, including by Google Translate, were mainly statistical. Although quite effective for related languages, they tended
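The statistical approach mentioned above boils down to estimating translation probabilities from parallel corpora. A heavily simplified sketch of that idea follows; it uses naive word co-occurrence counting over toy data, not a real alignment model such as IBM Model 1, and the corpus is invented for illustration:

```python
from collections import Counter, defaultdict

# Toy English-Spanish parallel corpus (illustrative data only).
corpus = [
    ("the cat", "el gato"),
    ("the dog", "el perro"),
    ("a cat", "un gato"),
]

# Count how often each target word appears in sentences aligned with
# each source word, then estimate p(target | source) by relative frequency.
cooc = defaultdict(Counter)
for src, tgt in corpus:
    for s in src.split():
        for t in tgt.split():
            cooc[s][t] += 1

def translation_prob(source_word: str, target_word: str) -> float:
    counts = cooc[source_word]
    total = sum(counts.values())
    return counts[target_word] / total if total else 0.0

# "gato" co-occurs with every sentence containing "cat", so it wins.
print(translation_prob("cat", "gato"))
```

Neural systems replace these sparse count tables with dense learned representations, which is what lets them generalize to context, tone, and phrasing that counting alone cannot capture.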

Why Digital Health Will Need Big Data to Support Its Infrastructure

Healthcare IT leaders must lead the charge to deploy big data analyses across the continuum of community medical services. Rapidly expanding medical needs and a whirlwind of technological innovations have created a mass of data that no healthcare organization can ever hope to manage manually, according to the American Medical Informatics Association (AMIA). As a result, a growing number of organizations recognize big data systems as a solution for maintaining information infrastructure. By compelling lawmakers to consider informatics when making decisions, IT leaders can ensure that the medical field meets goals for deploying big data technologies and improving community health outcomes. How Care Providers Use Big Data to Improve Public Health One example of how organizations use big data to improve public health outcomes is the United States Department of Veterans Affairs’ deployment of analytics to improve treatment outcomes and well-being for veterans. In 2018, nearly 5 mil

6 Tips for Image Optimization for Data Visualization

Visuals are used to attract attention but also to provide information. Data visualization is a basic skill of data scientists, helping them tell the story behind the data. The popularity of infographics and data visualizations on the Internet has grown exponentially in recent years. According to a report from Infographic World, visuals improve information retention by 400%. However, attractive images and graphics can be large in size and slow down your website. Therefore, it is crucial to optimize images for web usage. Read on to learn more about image optimization for data visualization. What Is Data Visualization? The term data visualization refers to the techniques and processes used to communicate information via visual content. Data scientists use graphs, charts, and infographics to convey information in a clear way. Presenting analytics visually often helps users understand complex concepts. Professionals often use data visualization to detect patterns in the data
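One standard optimization step behind advice like this is resizing charts to the dimensions they will actually be displayed at, since file size grows with pixel count. A small helper for computing the target dimensions might look like the following (the function name and bounding-box values are illustrative assumptions, and the actual resampling would be done by an image library):

```python
def fit_within(width: int, height: int, max_w: int, max_h: int) -> tuple:
    """Scale (width, height) down to fit inside (max_w, max_h)
    while preserving aspect ratio; never upscale."""
    scale = min(max_w / width, max_h / height, 1.0)
    return round(width * scale), round(height * scale)

# A 4000x3000 chart export served in an 800px-wide blog column:
print(fit_within(4000, 3000, 800, 800))  # (800, 600)

# An image already smaller than the box is left untouched:
print(fit_within(100, 50, 800, 800))     # (100, 50)
```

Serving the resized file instead of letting the browser shrink a full-resolution export is usually the single largest size saving for data-visualization images.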

How to Make AR, VR, and MR a Part of Your Digital Workplace

AR, VR, and MR have evolved greatly from sci-fi gadgets, gamer gimmicks, and entertainment devices, even surviving the Pokémon GO craze to be finally recognized as technologies bound to revolutionize the workplace. Currently, Ford, Boeing, Airbus, Coca-Cola, Siemens, and hundreds of global companies outside the consumer entertainment industry are busy testing and implementing AR and VR gear in every field of their operational processes, from manufacturing to customer engagement. Other large and mid-sized enterprises are expected to follow suit in the next five years. According to the IDC report, the five-year CAGR for AR and VR technologies will reach 78.3% by 2023. The largest investment share (80% by 2022) will come from commercial-sector companies, which are expected to allocate a lesser but still significant portion of funds to digital workplace transformation. To meet the burgeoning need for AR, VR, and MR technologies, an impressive array of advanced hardware has hit the

Anti-fraud Analytics Shine as AI in Banking Grows

While the finance and banking sector has a reputation for maintaining a relatively conservative, cautious approach toward major disruptions, the current push for digital transformation, CX prioritization, and data-driven automation has been leading to massive changes. AI integration is one of them. Chatbots and CX augmentations may well be the headline-makers of today, and a likely cutting-edge tech priority of the future, but right now risk management and regulatory compliance are #1 when it comes to funding. According to the Emerj 2019 AI in Banking report, risk, safety, and compliance AI alone currently accounts for more than half of the $3 billion invested. Fraud detection and cybersecurity with AI in finance With a 26% share of the funds raised by AI vendors working in finance (Emerj), fraud protection and cybersecurity are the top current uses and adoption opportunities for risk-related AI and ML in banking and finance. PwC’s Global Economic
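The core idea behind ML-based fraud detection is flagging transactions that deviate sharply from an account's normal behavior. Production systems use far richer models, but the intuition can be sketched with a simple z-score anomaly check (the threshold and the transaction history below are illustrative assumptions):

```python
import statistics

def flag_anomalies(amounts, threshold=2.5):
    """Flag amounts whose z-score against the history exceeds the
    threshold. A toy stand-in for ML-based fraud scoring."""
    mean = statistics.mean(amounts)
    stdev = statistics.stdev(amounts)
    return [a for a in amounts if abs(a - mean) / stdev > threshold]

# Nine routine purchases and one wildly out-of-pattern charge:
history = [42.0, 38.5, 51.0, 47.2, 44.8, 39.9, 43.1, 46.0, 41.3, 2500.0]
print(flag_anomalies(history))  # [2500.0]
```

Real fraud models also weigh merchant category, geography, velocity, and device fingerprints, which is why this problem area attracts the funding share the report describes.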

Multi-Party Privacy and AI Autonomous Cars

By Lance Eliot, the AI Trends Insider You are at a bar and a friend of yours takes a selfie that includes you in the picture. Turns out you’ve had a bit to drink and it’s not the most flattering of pictures. In fact, you look totally plastered. You are so hammered that you don’t even realize your friend is taking the selfie, and the next morning you don’t even remember there was a snapshot taken of the night’s efforts. About three days later, after becoming fully sober, you happen to look at the social media posts of your friend, and lo and behold, there’s the picture, posted for her friends to see. In a semi-panic, you contact your friend and plead with her to remove the picture. The friend agrees to do so. Meanwhile, it turns out that her friends happened to capture the picture, and many of them thought it was so funny that they re-posted it in other venues. It’s now on Facebook, Instagram, Twitter, etc. You look so ridiculous that it has gone viral. Som

5G Will Require Us to Reimagine Cybersecurity

We are on the cusp of 5G, or fifth-generation wireless, adoption. Many consider its development to be a technological race, one that countless organizations and countries are working to win. President Donald Trump explained this movement best: “The race to 5G is on, and America must win.” It doesn’t matter what country or region you’re from; just about everyone is focused on developing and upgrading wireless technologies to boost their capabilities. 5G is considered a revolutionary upgrade to mobile connectivity and wireless networks. It will push the entire technology field to new heights, affecting every industry, from manufacturing and retail to health care and finance. To understand why it’s going to be so impactful, one must consider what it offers. It’s also crucial to determine the cybersecurity and IT risks presented by these more robust networks. What 5G Networks Will Bring to the Table Nearly everyone is already familiar with wireless networks, particularly th

Quantum Computing and Blockchain: Facts and Myths

The biggest danger to Blockchain networks from quantum computing is its ability to break traditional encryption [3]. Google sent shockwaves around the internet when it claimed to have built a quantum computer able to solve formerly intractable mathematical calculations, with some fearing the crypto industry could be at risk [7]. Google states that its experiment is the first experimental challenge to the extended Church-Turing thesis — also known as the computability thesis — which claims that traditional computers can effectively carry out any “reasonable” model of computation. What is Quantum Computing? Quantum computing is the area of study focused on developing computer technology based on the principles of quantum theory. The quantum computer, following the laws of quantum physics, would gain enormous processing power through the ability to be in multiple states, and to perform tasks using all possible permutations simultaneously [5]. A Comparison of Classical and Quantum Compu
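The "multiple states" idea can be made concrete with a tiny classical simulation of a single qubit: the state is a pair of complex amplitudes, and a Hadamard gate puts a definite |0> into an equal superposition of |0> and |1>. This is an illustrative sketch of the math, not a quantum device, and note that simulating n qubits this way takes 2^n amplitudes, which is exactly why classical machines cannot keep up:

```python
import math

def apply_hadamard(state):
    """Apply the Hadamard gate to a single-qubit state (a, b)."""
    a, b = state
    h = 1 / math.sqrt(2)
    return (h * (a + b), h * (a - b))

def probabilities(state):
    """Measurement probabilities are the squared amplitude magnitudes."""
    return tuple(abs(amp) ** 2 for amp in state)

qubit = (1 + 0j, 0 + 0j)          # definite |0>
qubit = apply_hadamard(qubit)     # equal superposition of |0> and |1>
print(probabilities(qubit))       # approximately (0.5, 0.5)
```

Applying the Hadamard gate a second time returns the qubit to |0>, which illustrates that quantum gates are reversible rotations of the state, not random coin flips.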

How AI Powered Chatbots are Changing the Customer Experience

Chatbots have arrived. They’re no longer the domain of sci-fi movies or high-tech companies. They’ve gone mainstream. Last year, more than two-thirds of consumers reported interacting with a Chatbot. People are embracing them, too. 40% of consumers said they don’t care whether it’s a Chatbot or a human that helps them, as long as they get what they need. 47% of consumers say they are open to the idea of buying products or services from Chatbots. AI-Powered Chatbots Natural Language Processing (NLP) and Artificial Intelligence (AI) are two of the main reasons Chatbots are becoming more accepted. Some of the more advanced AI-fueled Chatbots make it difficult to know whether it’s a Chatbot or a real person. Machine Learning (ML) lets Chatbots improve over time as they analyze additional data to learn how to answer specific inquiries. They can handle a relentless volume of inquiries and can be programmed to recommend upsell opportunities. Most importantly, they take the burden off of overwo
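For contrast with the AI-powered systems described above, the simplest chatbots are pure keyword-rule engines. The sketch below shows that baseline; the rules, replies, and function name are illustrative assumptions, and an ML-based bot would replace the pattern list with a trained intent classifier:

```python
import re

# Minimal rule-based chatbot: match a keyword pattern, return a canned
# reply, fall back when nothing matches. Rules are illustrative only.
RULES = [
    (r"\b(hours|open)\b", "We are open 9am to 5pm, Monday through Friday."),
    (r"\b(refund|return)\b", "You can return items within 30 days."),
    (r"\b(human|agent)\b", "Connecting you to a human agent..."),
]
FALLBACK = "Sorry, I did not understand. Could you rephrase?"

def reply(message: str) -> str:
    for pattern, response in RULES:
        if re.search(pattern, message.lower()):
            return response
    return FALLBACK

print(reply("What are your hours?"))
print(reply("How do I get a refund?"))
```

The gap between this and an AI-powered chatbot is exactly the gap the article describes: rules break on any phrasing the author did not anticipate, while NLP models learn to map varied wording to the same intent.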