AI ML MarketPlace

  • How TD Ameritrade Uses AI To Hear The Voice Of The Customer
    By Thomas H. Davenport and Randy Bean

    TD Ameritrade has been garnering considerable public attention these days, much of it due to the high-profile television ad campaign known as “The Green Room.” The ads feature bearded actor Jim Conway calmly talking to investors, including a young married couple, a busy mom, and singer Lionel Richie, about their investment goals. The advertising campaign, which has been running since 2017 and was developed by the firm Havas New York, shows the “approachable Green Room guy” listening without judgment and engaging in discussion about finances and investing goals in “everyday language.” The ads suggest a focus on each customer’s financial situation and needs, and indeed “hearing the voice of the customer” is a long-term strategic focus for the firm. TD Ameritrade has embarked on a variety of ambitious analytics and AI initiatives to truly understand the voice of millions of customers, quickly respond to their needs, and tailor highly individualized investment recommendations.

    Analyzing the Voice of the Customer at TD Ameritrade

    The “voice of the customer” has long been a focus at TD Ameritrade. The company is an ardent follower of its Net Promoter Score, and CEO Tim Hockey has declared that winning on the client experience is the firm’s top priority. However, the actual voice of the customer has been an elusive entity, for TD Ameritrade and everyone else. Firms can pursue it through surveys and “customer journey” analyses, but it has always been difficult to know what the customer is thinking in detail. Call center conversations are, of course, the direct voice of the customer, but they are inevitably difficult to capture and analyze. The firm employs a large number of call center reps who handle millions of calls per year, and manually tracking the content of that many calls is not feasible. But AI can help.
TD Ameritrade has applied artificial intelligence to analyze call center conversations in order to improve the client experience. Beaumont Vance, TD Ameritrade’s Director of AI, Chat and Emerging Technology, has led a project to capture, analyze, and interpret a large sample of calls over the last six months, and the project is moving from pilot into production. The first step in the process is to convert speech into text with high accuracy, which is relatively straightforward these days. Then a Natural Language Processing (NLP) model reads through the transcripts, identifies topics mentioned on the call, and analyzes customer sentiment. The model’s analysis is then linked to the customer’s file with the company. The value of this combined information is that the actual voice of the customer can be linked to the record of activity and behaviors for that customer. For example, the company can determine which web page a customer visited immediately before calling the call center, by tracking their activity on the TD Ameritrade website (where on the site they went, what they did or tried to do), what they said on the call, what actions were taken after the call, and what longer-term follow-ups were made. By understanding which web pages might not be meeting customer needs, the firm can improve those pages quickly, which is more efficient for both the customer and the company. The goal, of course, is to better serve customers by better understanding what drives their actions and sentiment. Vance figures that there is 500% more information in the customer calls than in the traditional data upon which most companies rely to perform analytics. Assuming that TD Ameritrade can move beyond the successful pilot and analyze all customer call information, it could have an enormous positive impact on understanding customer sentiment and behavior.
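TD Ameritrade has not published the details of its pipeline, but the steps described above (transcribe a call, tag its topics, score sentiment, join the result to the customer's activity record) can be sketched roughly as follows. All function names, field names, and keyword lists here are illustrative assumptions, not the firm's actual implementation; a production system would use trained NLP models rather than keyword lookups.

```python
# Illustrative sketch of a call-analysis pipeline: transcript -> topics ->
# sentiment -> joined to the customer record. All names and keyword lists
# are hypothetical stand-ins for trained models.

POSITIVE = {"thanks", "great", "helpful", "resolved"}
NEGATIVE = {"frustrated", "confused", "problem", "error", "waiting"}
TOPIC_KEYWORDS = {
    "cost_basis": {"cost", "basis", "gains", "losses"},
    "login": {"password", "login", "locked"},
}

def extract_topics(transcript: str) -> list:
    """Tag the call with every topic whose keywords appear in the text."""
    words = set(transcript.lower().split())
    return [topic for topic, kws in TOPIC_KEYWORDS.items() if words & kws]

def score_sentiment(transcript: str) -> float:
    """Crude lexicon score in [-1, 1]; a real system would use an NLP model."""
    words = transcript.lower().split()
    pos = sum(w in POSITIVE for w in words)
    neg = sum(w in NEGATIVE for w in words)
    total = pos + neg
    return 0.0 if total == 0 else (pos - neg) / total

def analyze_call(transcript: str, customer_record: dict) -> dict:
    """Join the call analysis to the customer's activity record."""
    return {
        "customer_id": customer_record["id"],
        "last_web_page": customer_record.get("last_page"),  # page visited before the call
        "topics": extract_topics(transcript),
        "sentiment": score_sentiment(transcript),
    }
```

The joined record is what makes the analysis useful: a negative-sentiment call tagged "cost_basis" whose `last_web_page` is a positions page points directly at a page that is not answering the customer's question.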
As one example of achieving greater customer understanding, the company noticed from analysis of call text that many customers were calling to get information about the gains, losses, and cost basis of their investments. As a result, TD Ameritrade offered customers a new web page that provides that information through a service called Gainskeeper. Calls and office visits for that information declined, and the company could be sure that its investment in the service was valued.

Other AI Work at TD Ameritrade

While the “voice of the customer” analysis is perhaps the most ambitious AI project at TD Ameritrade, a variety of other projects and products at the firm make use of AI. Chatbots are in active use; Vance says that the company employs several popular chat platforms to better support its customers, such as Alexa, Facebook Messenger, Apple Business Chat, WeChat, Twitter, and so forth. The goal of these projects is to better serve customers on straightforward issues. Like similar projects at other companies, the chatbot work at TD Ameritrade has delivered considerable value, and will deliver even more as the technology matures. Consistent with the company’s interest in customer relationships, TD Ameritrade employs a tool to aggregate and analyze data about customer journeys, revealing just what channels and tasks were necessary for customers to accomplish their goals. Vance’s last job at TD Ameritrade was overseeing analytics for the retail segment of the company, and he’s a big advocate of automated machine learning as a way to deliver predictive models more efficiently and effectively; that technology is increasingly being used to meet internal needs for machine learning models. TD Ameritrade is also pursuing a variety of automation options, including robotic process automation. The potential efficiency advantages of these technologies are too great to ignore.
However, the goal of this work, like the other projects we’ve described, is not to automate lots of jobs, but rather to improve customer service and free employees to do more creative and unstructured tasks.

[Image: TD Ameritrade “Green Room” commercial, via YouTube]

Providing the best customer service in the brokerage and personal investing industry is not something that can be done with only human support. Artificial intelligence is necessary to make sense of millions of customer interactions and to help customers make better financial decisions. At TD Ameritrade, AI is making “hearing the voice of the customer” much more than a catchphrase.

Randy Bean is an industry thought-leader and author, and CEO of NewVantage Partners, a strategic advisory and management consulting firm which he founded in 2001. He is a contributor to Forbes, Harvard Business Review, MIT Sloan Management Review, and The Wall Street Journal. You can follow him at @RandyBeanNVP. Source: Forbes Read more »
  • I'm developing AI that can read emotions. It's not as creepy as it sounds
    Allowing computers to monitor and sense our emotions — rather than just track our everyday habits — seems creepy now. But as technology advances, consumers will grow to appreciate how artificial intelligence that can precisely gauge our thoughts and feelings will make our daily lives easier, with experiences that are more personalized, convenient and attuned to our emotions.

    AI is already a big part of everyday life. For example, Starbucks uses AI in its rewards program and its mobile app to track a customer's orders, the time of day they place them, the weather and more to customize recommendations. Amazon revolutionized retail in part by using customers' previous purchases to make recommendations about other products.

    These efforts are noteworthy, but they barely scratch the surface of how AI could be used to understand our wants and needs. Soon, AI-based customer service won't just assist humans — it will understand our feelings. With this information, companies can adjust their service to improve the customer experience. Consider how using AI to evaluate emotions could revolutionize in-person service. Can't find what you're looking for in a store? Sensors — such as microphones, cameras or facial scanners — can detect your frustration by analyzing your facial expressions and immediately ping a human or a robot to come help. Or imagine you're antsy about a restaurant's slow service. At the table, a small AI-equipped computer with the same sensors could evaluate your facial expressions or voice, note your distress, and signal for an employee to come assist. If the computer tagged you as particularly angry, the restaurant could offer a free treat.

    This type of AI will also transform shopping online. If you're scrolling through a website for the perfect outfit, for instance, your computer could use its forward-facing camera to pick up subtle facial cues — like furrowed eyebrows or slight pouts.
The site could then use that information, combined with data from your previous browsing behavior, to offer you options you'll like.

As a data scientist working on refining machines' ability to detect human emotions, I know these seemingly futuristic technologies are well within reach. I'm currently developing a comprehensive machine-learning model that learns over time and could eventually make machines perform better than a typical store attendant or call center employee. That may seem hard to believe, but machines don't have common human vulnerabilities like being tired, hungry or overworked. My AI model will take into account different visual, audio and language cues simultaneously — like tone of voice, body language and rhetoric — to perform an in-depth analysis of people's emotional states. This data-driven insight could eventually lead to AI that could enable businesses to understand how a customer feels in different situations, even if they know very little about him or her.

The prospect of omnipresent AI scanning faces and listening to voices sounds intrusive, and companies will have to put rigorous security measures in place to protect customers' information. But overall, consumers will enjoy the kind of service AI will enable. Just look at the popularity of home assistant robots like Amazon's Alexa. A generation ago, the idea of allowing a machine to monitor our personal conversations would have seemed ludicrous. Now, it's commonplace. Allowing these same home assistant robots to interpret our visual cues is a logical next step.

History has shown that wariness of new technology fades as its benefits emerge. People constantly evaluate the emotions of customers, colleagues and loved ones to make decisions. Robots simply automate this process. And the more data they have, the better they will be at it.

Source: CNN Business Read more »
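The multimodal model the author describes, combining visual, audio and language cues into one emotional read, is often built as "late fusion": each modality produces its own probability distribution over emotion labels, and the distributions are combined with learned weights. The sketch below is a hedged illustration of that idea only; the labels, weights, and input distributions are all made up, and a real system would learn the per-modality models and fusion weights from data.

```python
# Hypothetical late-fusion sketch: face, voice, and text each yield a
# probability distribution over emotion labels; a weighted average fuses
# them into one estimate. All numbers here are illustrative placeholders.

EMOTIONS = ("angry", "neutral", "happy")

def fuse(face: dict, voice: dict, text: dict,
         weights=(0.4, 0.3, 0.3)) -> dict:
    """Weighted average of per-modality emotion distributions."""
    return {
        e: (weights[0] * face.get(e, 0.0)
            + weights[1] * voice.get(e, 0.0)
            + weights[2] * text.get(e, 0.0))
        for e in EMOTIONS
    }

def top_emotion(face: dict, voice: dict, text: dict) -> str:
    """Return the label with the highest fused probability."""
    fused = fuse(face, voice, text)
    return max(fused, key=fused.get)
```

Late fusion is attractive here because each sensor stream can fail independently (a covered camera, a noisy room): a missing modality simply contributes a flat or empty distribution rather than breaking the whole model.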
  • DeepCubeA AI Algorithm Solves Rubik's Cube Without Human Assistance
    A paper written by researchers at the University of California, Irvine, and published Monday in Nature Machine Intelligence outlines the development of DeepCubeA, a computer algorithm that can solve a Rubik's Cube without human assistance in under a minute. The researchers developed a deep-reinforcement-learning algorithm that learns to solve the cube without human guidance or domain-specific knowledge: a deep neural network is trained from scratch to estimate how far any scrambled state is from the goal, and that estimate guides a search for a solution. The algorithm solved 100% of the 1,000 test trials -- with each cube being scrambled between 1,000 and 10,000 times from the completed state -- using the smallest possible number of moves 60.3% of the time. In 36.4% of the trials, the algorithm solved the puzzle using just two moves more than the possible minimum. In addition to the Rubik's Cube, the DeepCubeA algorithm successfully completed other types of puzzles, including various sliding-tile puzzles, Lights Out, and Sokoban, and it found minimum-length solutions for these far more frequently than for the Rubik's Cube. According to the scientists, "the generality of the core algorithm suggests that it may have applications beyond combinatorial puzzles, as problems with large state spaces and few goal states are not rare in planning, robotics and the natural sciences." By learning to solve a Rubik's Cube without being trained on human solutions, the DeepCubeA algorithm represents the gradual shift in machines from making carefully directed computations to making those which appear to resemble human-like reasoning and decision-making. Source: News18 In collaboration with HuntertechGlobal Read more »
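The published method pairs the learned cost-to-go estimate with a weighted A* search: states are expanded in order of (weighted) path cost plus estimated remaining cost. The toy sketch below shows that search structure on a much simpler puzzle (sorting a tuple with adjacent swaps), with a hand-coded estimate standing in for DeepCubeA's neural network; the puzzle, the heuristic, and the weight value are illustrative only.

```python
import heapq

# Toy weighted-A* sketch in the spirit of DeepCubeA: a cost-to-go estimate
# (here hand-coded, there a deep neural network) guides a best-first search.
# The "puzzle" is sorting a tuple using adjacent swaps.

GOAL = (0, 1, 2, 3)

def neighbors(state):
    """All states reachable by one adjacent swap."""
    for i in range(len(state) - 1):
        s = list(state)
        s[i], s[i + 1] = s[i + 1], s[i]
        yield tuple(s)

def cost_to_go(state):
    """Stand-in for a learned cost-to-go network: misplaced elements / 2."""
    return sum(a != b for a, b in zip(state, GOAL)) / 2

def weighted_a_star(start, lam=0.8):
    """Weighted A*: priority = lam * path_cost + estimated cost-to-go.

    Returns the list of states visited after each move, or None if no
    solution exists. Lowering lam trusts the heuristic more, trading
    optimality guarantees for fewer node expansions.
    """
    frontier = [(cost_to_go(start), 0, start, [])]
    seen = set()
    while frontier:
        _, g, state, path = heapq.heappop(frontier)
        if state == GOAL:
            return path
        if state in seen:
            continue
        seen.add(state)
        for nxt in neighbors(state):
            heapq.heappush(
                frontier,
                (lam * (g + 1) + cost_to_go(nxt), g + 1, nxt, path + [nxt]))
    return None
```

The interesting part of DeepCubeA is not the search, which is classical, but that the cost-to-go function is learned from scratch by working backwards from the solved state, with no human solutions in the training data.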
  • How Artificial Intelligence is solving different business problems
    Tech Mahindra has implemented an AI-based facial recognition system to register the attendance of employees, thereby reducing the workload of an HR associate.

    As soon as the words “AI” and “music” are used in the same sentence, one comes across skepticism. If robots are making call centre jobs obsolete, one shudders to think what would happen to all the musicians, who are underpaid as it is. “In the world of personalisation and on-demand services, music is one of the very few remaining static artefacts,” says Ken Lythgoe, head of business development at creative AI technology company MXX, based in London, England. The company has created the world’s first AI tech that allows individual users to instantly edit music to fit their own video footage, complete with rises and fades.

    According to Lythgoe, AI doesn’t need to be the enemy of music; instead of replacing us, AI can empower us. MXX’s AI tech listens to music and creates metadata based on its understanding of it. This data includes where it can edit in and out of sections, as well as what the sections might mean to a human, such as “building tension”, “climax”, “chorus” and “verse”. When the user provides a brief for the music they want, the AI can edit the original track to fit the brief.

    The UK is not alone. In Japan, known for its brilliant technology, beauty giant Shiseido recently introduced its first subscription service, a mobile application offering personalised, high-tech skincare to consumers in Japan for about $92 per month. The service, called Optune, is among the industry’s first Internet of Things (IoT) systems to pair augmented reality (AR) and artificial intelligence (AI) with a serum and moisturiser dispenser. Using 80,000 skincare patterns, the application works with iPhones, collecting facial data from the built-in camera. Data is analysed with AI, taking into account personal and environmental skin conditions.
Based on the result, a cartridge-loaded dispenser selects an appropriate formula for the user twice daily.

More businesses are finding it difficult to trust the quality of existing user information and are looking to use artificial intelligence to clean up large pools of data to make business sense. For instance, when Swedish media group Bonnier AB faced challenges in adhering to GDPR (the European Union’s General Data Protection Regulation) across its 180 companies, a solution developed by Accenture brought together its diverse data sources. The company deployed machine learning and artificial intelligence for faster compliance across these sources, which had largely been processed manually. Businesses are now eyeing a data strategy independent of IT strategy to get “actionable insight”. According to Sanjeev Vohra, group technology officer and global data business lead at Accenture Technology, this has made the technology services leader take a different approach to solving business problems by “putting artificial intelligence to data and not data into the AI”. Not that the approach is foolproof: Vohra noted that there have been cases where AI solutions, or bots, built using business data failed, proving that the existing data was “incomplete”. Accenture has been investing heavily in its innovation hub in Bengaluru over the past three years to use AI to make sense of data and clean large sets of user information, he said. “We have a big strategy on talent growth in data, as this is a hyper growth area for us globally. We will do this organically in India (we are already there in terms of skilling our talent) and in markets where we require certain complementary talent, we will go with inorganic growth,” Vohra said. For campus hires, Accenture has a “strong boot camp” to train people upfront to make them ready for jobs and the transformational work it focuses on.
Accenture takes people in the data strategy and architecture segment (one of its four broad segments) through its Data Master Architect programme, co-created with the Massachusetts Institute of Technology to equip people with the right skills, he said.

Tech Mahindra’s first HR humanoid, K2, is a present-day, very functional humanoid created by Tech Mahindra and deployed at its Noida Special Economic Zone campus in Uttar Pradesh. Company officials reckon that K2 is a perfect blend of knowledge and kindness: it will take over routine HR transactions and provide constant assistance to the HR team in creating an enhanced employee experience. K2 leverages artificial intelligence technology and initiates conversation without any need for wake-up commands. Keeping in mind the needs of the specially abled, K2 can respond to queries with a text display along with speech. It can address general and specific HR-related employee queries as well as handle personal requests for actions like providing payslips, tax forms and so on, enabling the HR team to focus on other important areas of employee development. Tech Mahindra has already implemented an AI-based facial recognition system to register the attendance of employees, thereby drastically reducing the time an associate spends updating timesheets. Recently, it also launched Talex, the world’s first AI-driven marketplace of talent, which maps the skills of the existing talent pool. Source: Financial Express Read more »
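Tech Mahindra has not described how its attendance system works internally, but face-recognition attendance pipelines typically reduce a camera frame to an embedding vector and match it against enrolled employees by cosine similarity. The sketch below illustrates only that matching step; the employee IDs, three-dimensional embeddings, and threshold are invented for illustration (real embeddings come from a face model and have hundreds of dimensions).

```python
import math

# Illustrative sketch of the matching step in a face-recognition attendance
# flow: compare a query embedding against enrolled embeddings by cosine
# similarity. All vectors and the threshold are hypothetical placeholders.

ENROLLED = {
    "emp001": [0.9, 0.1, 0.2],
    "emp002": [0.1, 0.8, 0.3],
}

def cosine(a, b):
    """Cosine similarity between two vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def mark_attendance(embedding, threshold=0.9):
    """Return the best-matching employee id, or None if nothing clears the threshold."""
    best_id, best_sim = None, threshold
    for emp_id, enrolled in ENROLLED.items():
        sim = cosine(embedding, enrolled)
        if sim > best_sim:
            best_id, best_sim = emp_id, sim
    return best_id
```

The threshold is the key operational knob: set too low, strangers get matched to employees; set too high, employees get rejected and fall back to manual check-in.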
  • 5 Ways Singularity University is Exploring Artificial Intelligence at Global Summit This August
    This is an issue many of us contemplate as our lives become increasingly intertwined with technology. AI is impacting nearly every major industry, which is why Singularity University focuses on AI in its programs, including this August’s Global Summit in San Francisco. The SU team is working at warp speed to ensure this year’s Global Summit is the best one yet, and is convening some of the finest minds from an array of fields to guide an exploration of the light and dark sides of AI. The next wave of presenters and sessions is now available, and what follows below is a glimpse of what to expect and who you’ll hear from in this year’s program when it comes to AI.

    AI 101
    Singularity University faculty member Nathana Sharma kicks off the AI conversation at Global Summit with a rapid-fire introduction to the fundamentals of machine learning and artificial intelligence.

    AI Cage Match
    Machine learning and AI represent humankind’s greatest exponential leap forward—or a threat to our jobs and our very autonomy. Which is it? Hear from Neil Jacobstein (Chair of AI and Robotics at SU), Naveen Jain (Viome), and more.

    Solving Today’s Problems with AI
    As these exponential technologies move beyond the lab and into the field, how do we know what kinds of problems next-generation software can solve? Come hear from a variety of practitioners building breakthrough applications that focus on real-world challenges. Neeti Mehta (Automation Anywhere), Mike Capps (Diveplane), Dr. Vasco Pedro (Unbabel), and Nathana Sharma (SU Faculty for Blockchain, Policy, Law & Ethics) will participate in this panel discussion.

    AI – Hope, Hype, Reality
    The vast majority of large companies say they’re deep into exploring machine learning and artificial intelligence—but few have a methodology to ensure successful initiatives. Gain insights about what AI can and can’t do, and how to implement AI programs that generate results.

    AI for Good
    How can AI and machine learning make the world a better place?
Leila Toplic (NetHope) will lead the way in this session, exploring ways in which “smart” software is being used to have an impact on some of humanity’s deepest challenges. It’s hard to believe that Global Summit is only two months away! Time is running out, and you won’t want to miss this opportunity to hear the latest from leaders in AI and on disruptive innovations in AR/VR, future of work, impact investing, and more. Get your ticket to Global Summit today, before they’re all gone, and get ready to join the Singularity University community for an unforgettable experience in San Francisco. Source: Futurism Read more »

Author: hits1k
