We live in a fast-moving, complex world of increasingly connected people and connected things that are creating vast new digital footprints. Digital technologies, data, AI and emerging collaborative platforms are the key enablers of the 4th industrial revolution.
“We stand on the brink of a technological revolution that will fundamentally alter the way we live, work, and relate to one another. In its scale, scope, and complexity, the transformation will be unlike anything humankind has experienced before. We do not yet know just how it will unfold, but one thing is clear: the response to it must be integrated and comprehensive, involving all stakeholders of the global polity, from the public and private sectors to academia and civil society.”
– Klaus Schwab, Founder and Executive Chairman of the World Economic Forum
Digital technologies are changing the way people within organizations work together. They are creating a culture in which business and technology leaders must join forces to realize value from all data. Insights from big data can enable all employees to make better decisions—deepening customer engagement, optimizing operations, preventing threats and fraud, and capitalizing on new sources of revenue. But escalating demand for insights requires a fundamentally new approach to architecture, tools and practices.
In this massively connected world, data and intelligent platforms can provide organizations with the foundation for every decision. However, there are big challenges in making use of so much information.
– Too much data creates an information overload.
– Organizing and storing all of this data can be problematic.
– Organizations don’t know how to use all of this data to create insight.
To organize and make sense of data, executives must address these challenges as leadership issues, not technical issues. To thrive, organizations need to process all the data available to provide specific insights and intelligent predictions that drive better business. This requires a strategic understanding of the potential of data science and cost-effective management of data and platforms, combined with the skills to deliver the right type of insight in the right way.
Using data science and machine learning capabilities, organizations can now massively improve marketing, create operational efficiency, build new business models, disrupt the competitive status quo of their industry and spark innovation. To achieve these disruptive benefits, business leaders and decision makers need in-depth knowledge of how to get better value from data and transform their information landscape in a way that delivers continual value.
Dispelling the myth that “business leaders don’t need to be technical”
“We expect our executives to have a strong understanding of the financial performance of their companies. Shareholders would find it strange – or more likely, unacceptable – if a CEO said, “I’m not financially-inclined” and passed along financial performance inquiries to his or her CFO. Similarly, CEOs in an increasingly digital world will struggle to say, “I’m not technical” and hand over mission-critical business questions for the engineers to answer.”
No matter your industry or the type of organization you work for, your world is driven by data. To deal with this data-driven world, business and technology professionals must be equipped to use data as a strategic resource.
Data Science Middle East (DSME) in partnership with EVERATI is excited to announce a 2-day Executive Masterclass in Dubai to learn about practical strategies, models, and techniques for building up a smart and intelligent enterprise.
By attending this masterclass, you will be able to contribute to the design of data strategy, ignite initiatives that leverage data as a strategic asset, and equip yourself to lead an analytics team to success.
This 2-day executive masterclass will help you address questions such as: Do I really need big data and advanced analytics capabilities in my organization, and how will they help the business? What are the benefits of having a big data capability, and the risks of not having one? What outcomes can the business expect? We will also explore several use cases that show the real value and impact of data science, machine intelligence and cloud technologies.
“There is a pressing need for more business people who can think quantitatively and make decisions based on data and analysis, and business people who can do so will become increasingly valuable.”
– Tom Davenport
Goals and Objectives of Masterclass
The Big Data and Advanced Analytics Masterclass will help business and technology leaders understand the commercial potential of data; the applications and use cases of data science; and the business-focused, applied know-how of cloud platforms, machine learning and the big data ecosystem. It will also help you address questions such as whether big data and advanced analytics capabilities are needed in your organization, how they can grow your business, and what the benefits and risks of adopting big data technologies are. By the end of the masterclass, participants will be able to:
– Explain Big Data, analytics and the Big Data ecosystem.
– Apply appropriate analytics tools and techniques to collect and analyse Big Data, and identify insights that can lead to actionable results.
– Leverage advanced analytics to create an organizational competitive advantage.
– Deploy the data analytics lifecycle to address Big Data analytics projects.
– Define machine learning, why it matters, and discuss its relationship to analytics, data science, and big data.
– Describe machine learning fundamentals, the importance of algorithms, and machine learning as a service.
– Highlight the key benefits that are being achieved with big data and how to identify, scope and secure support for a suitable pilot project.
An Economist Intelligence Unit survey shows that around 58% of organizations want to make their greatest digital transformation investments in big data and analytics. This masterclass is designed to help them do exactly that.
Who Should Attend Masterclass
The Big Data and Advanced Analytics Masterclass is perfect for CXOs, CTOs, CIOs and CSOs. It is equally suited to directors, non-technical executives and entrepreneurs, as well as business managers in business performance, analytics, customer service, marketing, finance, human resources and sales roles who want to achieve the highest standard of performance.
Attend this executive workshop to learn more about modern applications for big data, advanced analytics and machine learning, including recommendation systems, streaming analytics, deep learning and AI. You will also learn from the experiences of your peers and of senior data and analytics leaders who have successfully navigated both organizational and technological challenges to adopt big data strategies and have helped many organizations embark on their intelligent enterprise evolution.
The masterclass takes place on 16-17 August 2016 in Dubai. You can register online by visiting www.bigdatamasterclass.me. Registration can also be done by emailing Nitesh Marwaha at email@example.com or by calling us on +971 55 875 2588. For group discounts and corporate bookings, you can email your requirements to firstname.lastname@example.org
Let’s join hands to make a progressive and prosperous digitized world with the expertise and skill set needed to move forward. Be a part of this masterclass and practically learn how to accelerate the ROI of your big data and data science initiatives.
Machine learning (ML) is the unsung hero that powers many applications, systems, sensors, devices, and products. Machine learning is so pervasive that we can often assume its presence in most of the applications and systems without having to specifically call it out.
In simple terms, machine learning is a computer’s ability to learn from data, and it is one of the most useful tools we have to develop intelligent systems and applications. Machine learning is used widely today for all kinds of tasks, from churn prediction in large companies, to web search, to medical diagnostics, to robotics. It’s hard to find a field that cannot benefit from machine learning in one way or another.
Machine learning’s intuitive, versatile and robust approach to finding patterns in the available data makes it a priceless asset for anyone who wants to turn data into insights and predictions. What’s more, today it is more accessible than ever before, thanks to the variety of open source tools and programming languages.
What developers actually need to know about Machine Learning
Something is wrong in the way ML is being taught to developers.
Most ML teachers like to explain how different learning algorithms work and spend tons of time on that. For a beginner who wants to start using ML, being able to choose an algorithm and set parameters looks like the #1 barrier to entry, and knowing how the different techniques work seems to be a key requirement to remove that barrier. Many practitioners argue, however, that you only need one technique to get started: random forests. Other techniques may sometimes outperform them, but in general random forests are the most likely to perform best on a variety of problems (see “Do We Need Hundreds of Classifiers to Solve Real World Classification Problems?”), which makes them more than enough for a developer just getting started with ML.
We would further argue that you don’t need to know all the inner workings of (random forest) learning algorithms (and the simpler decision tree learning algorithms that they use). A high-level understanding of the algorithms, the intuitions behind them, their main parameters, their possibilities and limitations is enough. You’ll know enough to start practicing and experimenting with ML, as there are great open source ML libraries (such as scikit-learn in Python) and cloud platforms that make it super easy to create predictive models from data.
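To make this concrete, here is a minimal getting-started sketch using scikit-learn's random forest implementation; the dataset and parameter values are our own choices, purely for illustration:

```python
# Train a random forest without worrying about its inner workings,
# assuming scikit-learn is installed.
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# The one parameter worth knowing about first: n_estimators,
# the number of trees in the forest.
clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_train, y_train)
accuracy = clf.score(X_test, y_test)  # accuracy on held-out data
```

A high-level grasp of what `n_estimators` trades off (more trees cost more compute but give more stable predictions) is enough to start experimenting.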
So, if we just give an overview of only one technique, what else can we teach?
Deploying ML models into production
It turns out that, when using ML in real-world applications, most of the work takes place before and after the learning. ML instructors rarely provide an end-to-end view of what it takes to use ML in a predictive application that’s deployed in production. They explain one part of the problem, then assume you’ll figure out the rest and connect the dots on your own: for instance, between the ML libraries you were taught to use in Python, R, or MATLAB, and your production application developed in Ruby, Swift, C++, etc.
Fortunately, today there are new and accessible solutions to this “last-mile problem”. They revolve around the use of REST (HTTP) APIs. Models need to be exposed as APIs, and if scaling the number of predictions performed by a given model becomes an issue, these APIs can be served on multiple endpoints with load balancers in front. Platforms-as-a-service can help with that—here is some info about Microsoft Azure ML’s scaling capabilities, Amazon ML’s, and Yhat’s Analytics Load Balancer (which you can also run on your own private infrastructure/cloud). Some of these platforms allow you to use whatever ML library you want; others restrict you to their own proprietary ones. In our upcoming workshop, we’ve chosen to use Azure to deploy models created with scikit-learn into APIs, and also to demonstrate how Amazon and BigML provide an even higher level of abstraction (while still producing accurate models) that can make them easier to work with in many cases.
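As a sketch of what "exposing a model as an API" can look like, here is a minimal prediction endpoint built with Flask and scikit-learn; the route name and JSON schema are our own invention (Azure ML, Amazon ML and the other platforms mentioned each define their own), and this is not production-hardened code:

```python
# Minimal sketch of serving a scikit-learn model behind a REST endpoint.
# Assumes Flask and scikit-learn are installed.
from flask import Flask, jsonify, request
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier

iris = load_iris()
model = RandomForestClassifier(n_estimators=50, random_state=0)
model.fit(iris.data, iris.target)

app = Flask(__name__)

@app.route("/predict", methods=["POST"])
def predict():
    # Expects JSON like {"features": [5.1, 3.5, 1.4, 0.2]}
    features = request.get_json()["features"]
    label = int(model.predict([features])[0])
    return jsonify({"species": iris.target_names[label]})

# app.run(port=5000) would serve this locally; to scale, several such
# processes would sit behind a load balancer.
```

Any application, whatever language it is written in, can then consume the model by POSTing JSON to `/predict`.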
Deployment is not the only post-learning challenge in real-world ML. You should also find appropriate ways to evaluate and monitor your models’ performance/impact, before and after deployment.
The ML workflow diagram above also presents some of the steps to take before learning a model, which are about preparing the right dataset for the algorithms to run on. Before actually running any algorithm you need to…
– Define the right ML problem to tackle for your organization
– Engineer features, i.e. find ways to represent the objects on which you’ll be making predictions with ML
– Figure out when/how often you’ll need to make predictions, and how much time you’ll have for that (is there a way to do predictions in batches or do you absolutely need all your predictions to be real-time?)
– Collect data
– Prepare the actual dataset to run learning algorithms on, i.e. extract features from the “raw” collected data and clean it
– Figure out when/how often you’ll need to learn new/updated models, and how much time you’ll have for that.
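The data collection, feature extraction and cleaning steps above might look like this in pandas; the purchase-event data and the two engineered features are made up for illustration:

```python
import pandas as pd

# Hypothetical "raw" collected data: one row per purchase event.
raw = pd.DataFrame({
    "customer_id": [1, 1, 2, 2, 2, 3],
    "amount": [10.0, 25.0, 5.0, None, 8.0, 40.0],
})

# Clean: drop events with a missing amount.
clean = raw.dropna(subset=["amount"])

# Engineer features: one row per customer (the object we will make
# predictions on), summarizing that customer's purchase history.
features = (
    clean.groupby("customer_id")["amount"]
    .agg(n_purchases="count", total_spent="sum")
    .reset_index()
)
```

A learning algorithm then runs on `features`, one row per object to predict on, rather than on the raw event log.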
Operational Machine Learning Workshop for Developers
We are excited to officially announce today that, in collaboration with PAPIs, we are launching a new learning track called PAPIs Workshops. In partnership with leading education and industry organizations, we will offer practical and industry-focused learning programs in various locations around the world (starting with Madrid, London and Boston).
Most Machine Learning courses are given from the perspective of a Data Scientist and focus on the techniques and algorithms that allow machines to learn from data. This workshop takes the perspective of an application developer and instead provides an end-to-end view of ML integration into your applications. We’ll go all the way from data preparation to the integration of predictive models in your domain and their deployment in production.
Our first workshop is aimed at developers and is an agnostic introduction to operational Machine Learning with open source and cloud platforms. It’s a 2-day hands-on workshop given in a classroom setting. Day 1 covers an intro to ML, the creation, operationalization, and evaluation of predictive models. Day 2 features model selection, ensembles, data preparation, a practical overview of advanced topics such as unsupervised learning and deep learning, and methodology for developing your own ML use case.
We’re using Python with libraries such as Pandas, scikit-learn and SKLL, and cloud platforms such as Microsoft Azure ML, Amazon ML, BigML and Indico. We think these platforms are great for many organizations and real-world use cases, but even if for some reason you realize they may not be the perfect fit for you, we’d still recommend using them for learning and practicing ML. ML-as-a-Service makes it much quicker to set up work environments (e.g. Azure ML has the most popular libraries preinstalled and can run interactive Jupyter notebooks, which you can access from your browser), and also to experiment with ML at the higher levels of abstraction these platforms provide (e.g. combining one-click clustering, anomaly detection, and classification models with BigML, or quickly featurizing text and images with Indico’s Deep Learning API).
For more information on how to attend, participate or become a sponsor, please visit http://www.papis.io/workshops/operational-machine-learning
Special Offer – 30% Discount!
Please take a moment to register now and take advantage of the special 30% discount. Visit the event page and register before 22nd May to get 30% off. If you have any questions about the workshop or registration, please feel free to contact us at email@example.com.
Happy Machine Learning!
Dr. Louis Dorard and Ali Syed
In the last two years the world has produced more data than in all of human history. All this data and digital technologies powered by machine intelligence are changing the way everyone operates. For businesses and government services to compete in a connected economy they need to be able to make sense of all the valuable information that is being generated to unleash new waves of productivity, growth and innovation.
We’ve started Data Science Middle East (DSME) Foundation with the vision to create a regional collaboration on digital skills and data talent development. DSME brings together business and technology professionals, researchers, experts, practitioners, and industry leaders to promote digital data literacy, research and innovation through open projects, capacity building, and community engagements.
This initiative marks a big step forward in uniting the worlds of business leadership, digital transformation, data science and new talent development. With support from local governments, industry organizations, communities and academic institutions, DSME will offer a variety of data education opportunities and digital workforce skills development projects.
Data science isn’t just for data scientists. In a massively connected, data-driven world, it is imperative that the workforce of today and tomorrow is able to understand what data is available and use scientific methods to analyze and interpret it. DSME is here to help you learn and apply the art and science of turning data into meaningful insights and intelligent predictions.
This launch is an invitation to industry and academia: DSME is open for business. We look forward to working with all of you to put data at the core of economic development and innovation, with the aim of building a sustainable Middle East. If you are interested in collaborating with DSME or have exciting ideas for developing digital data talent, please contact us at firstname.lastname@example.org
We (the DSME and Everati team) are excited to confirm that the Middle East’s first professional workshop for learning the most in-demand skills of 2016 will be held in Dubai (UAE) on 25-27 April 2016. This is a great opportunity to practically learn data science and machine learning and advance your knowledge and career. Attend this workshop to learn the fundamentals of data science and machine learning, and leave armed with practical skills to extract value from data.
Everati, our regional partner for the data science workshops, is a UAE-based organization dedicated to the provision of global business information through premier, high-profile, specialized events. For more information on how to attend, participate or become a sponsor, please visit www.datasciencetraining.me
As data becomes more influential in shaping the work and strategies of so many industries, we want to be prepared to raise a generation with the skills and knowledge to work with data. For collaborations, partnership and joint industry programs, get in touch at email@example.com or drop me a line directly. We would love to work with you to enable digital and data-driven Middle East.
Ali Syed
Guest post by Dr. Mike Ashcroft
If you know all there is to know about Data Science and Machine Learning, you may want to stop reading now.
Oh good, we’re alone.
Despite the hype and confusion surrounding Data Science, the need for people who can interpret data and use it to find patterns and predictions that help organizations make informed business decisions is very real. Data Science is fueling the digital economy, and we need to move it to the very center of our business, research and social change endeavours. Data Science is bringing new levels of speed, relevance, and precision to the way we design and manage businesses and operating models. Machine Learning is without a doubt the core of Data Science and predictive analytics in general. In health care, Machine Learning is changing the way doctors identify people at risk of developing certain diseases; in retail, machine learning is used to analyze purchasing data to anticipate trends; CRM and marketing experts use it to tailor campaigns and offers.
Machine Learning is simple: We have the algorithms, we have the experience and, these days, we have the data. The ‘complicated’ mathematics behind the data revolution is really the cumulative application of basic techniques that can be understood in terms of mathematics we learnt in high school and early college. Providing this understanding is the most important facet of Data Science education: Only those who understand the tools they use are able to choose the appropriate technique for the tasks they face. I have never met an organization prepared to trust its data analysis to analysts who cannot explain why they use the techniques they do.
The Fundamentals of Machine Learning bootcamp is designed to give you this understanding. It will provide you with the ability to apply the most powerful techniques in Machine Learning, to select appropriate techniques for particular problems, and to say exactly what these techniques do and why they work in a way that is understandable to data analysis stakeholders.
The Fundamentals of Machine Learning bootcamp will take you through the conceptual and applied foundations of the subject. Topics covered include Machine Learning theory, types of learning, techniques, models and methods. Labs are designed to teach you, hands-on, how to use the R programming language and its packages to apply the main concepts and techniques of Machine Learning. In this bootcamp, our goal is to give you the basic skills you need to understand Machine Learning algorithms and models and interpret their output, which is important for solving a range of data science problems. This is an applied Machine Learning course, and we focus on the intuitions and practical know-how needed to get Machine Learning algorithms to work in practice, rather than on mathematical equations and derivations.
Using actual data, the bootcamp begins by reviewing important basic statistical methods. You will learn to use the popular statistical programming language R to build these simple models from the ground up. You will then see how these simple techniques can be improved, combined, augmented and adjusted to produce powerful statistical tools for different tasks in data analysis. In this way, you will learn to see advanced Machine Learning techniques not as black boxes, but as principled techniques used to unlock patterns from data.
Over the course of five days, over two dozen techniques will be examined, implemented through supervised exercises and tutorials, and compared. You will learn the relative advantages and disadvantages of different types of techniques in different contexts. You will see how some models are entirely data driven, while others can be used to encode defeasible expert knowledge. You will learn methods for validating selected models and techniques and for choosing among alternative methods.
As we proceed we discuss with examples the sorts of data that suit these different approaches, and you will continue to apply these techniques ‘live’ in R. All topic areas have practical exercises, where you implement the algorithms we are looking at, as well as analyze their outputs and their suitability to particular problems. It is an essential part of the course aims that you get real ‘hands on’ experience working with the techniques we cover, in the comfortable environment of a classroom where you can discuss and work through problems you encounter with the instructor (me). The purpose is to arm you with a set of tools that you know how to apply, how to explain and when to use, as well as their theoretical background.
Fundamentals of Machine Learning bootcamp is for students, researchers and professionals from industry, services, social and public sectors who wish to develop the ability to turn data into meaningful and actionable insights. The greatest care is taken to provide bootcamp participants with high quality instruction that makes the journey of understanding and using advanced data analysis tools as easy and enjoyable as possible. So join us, master the science behind ‘data science’ and equip yourself for a role in the data revolution.
Read and download the bootcamp brochure.
Special Offer – 25% Discount!
Please take a moment to register now and take advantage of the special 25% discount. Visit the event page and use the promo code FMLB100 to get 25% off. We encourage university students (postgraduate and PhD) and researchers to learn machine learning by offering them a special 50% discount. A 40% discount is also available for a limited number of seats, so I encourage you to register as soon as possible.
About the Author
The author is Dr. Mike Ashcroft, Lecturer in Machine Learning and Artificial Intelligence at Uppsala University in Sweden, and founder of data analytics company Inatas AB. He has worked in the Machine Learning field for over five years, developing cutting edge software, providing professional and university courses and performing specialist consulting work. He has extensive experience teaching and working both in Europe and Asia.
Guest post by Louis Dorard
When I started writing about the tools that make it easier to create and deploy predictive models in your business or in your app, I focused on two of them: Google Prediction API and BigML. There are actually many more great tools in this space (some of which only came out this year): Datagami, Dataiku, Indico, Intuitics, GraphLab, Openscoring, PredictionIO, RapidMiner, Yhat…
What better way to learn how to use them than from the very people who made them, through hands-on sessions illustrated with concrete case studies? Well, that’s what’s waiting for you at PAPIs.io on 17 and 18 November in Barcelona, Spain — right before Strata. There will be tutorials on the first day, which will close with Jeroen Janssens, author of Data Science at the Command Line, who’ll give an exclusive tutorial on using several of these tools to make predictions from the command line. At the end of the day you’ll be ready to use them at our (free) Hack Night fuelled by code, beer and patatas bravas!
The second day will be dedicated to expert perspectives from companies that are deploying a wide array of predictive apps. I feel that one of the key things needed by people who are curious about Predictive is to see what others are doing, what value they’re creating, and how. All the sessions selected by the Program Committee are very practical — no BS, marketing speeches or buzzword cover-ups. They fit within the following 4 categories: showcases of innovative real-world uses of Machine Learning (“Showcases”), challenges in creating and deploying predictive APIs and apps (“Challenges”), stories of non-data scientists who created predictive apps (“Predictive For All”), and demos of tools applied to concrete predictive use cases (“Tools In Action”).
PAPIs.io is also a place where you’ll hear thought leaders share their vision, with a keynote by Vijay Narayanan, Microsoft Director of Machine Learning and Data Solutions, a keynote by Andy Thurai, IBM Program Director for API, IoT and Connected Cloud, and a panel discussion on the future of predictive APIs.
Check out the full schedule and list of speakers on Lanyrd.
If you’re interested in attending the conference, I’ve come up with an exclusive offer for you that will only be valid for 24 hours: 40% off the regular registration price (75€ instead of 125€) plus a free copy of Bootstrapping Machine Learning in PDF/ePub/Mobi formats (worth $39) if you don’t already own it. For this, register for PAPIs using discount code BMLPAPIS and send your email receipt to firstname.lastname@example.org in order to get your free copy of Bootstrapping Machine Learning.
Hoping to meet you in person next month in Barcelona!
By Ali Syed
Throughout the world, organizations and leaders widely acknowledge that digital technologies are transformational. Most of them understand that we are not simply living through the digital revolution: we are living and breathing a moment in history that generations to come will look back on as one that fundamentally changed everything. Forever. Society is changing, technology is changing, business models are changing, and the way we interact and connect with each other is changing. It is no longer about becoming digital or not; it is about sustaining, succeeding and competing in a digital era.
In this digital era, organizations can no longer operate, sustain themselves and compete using traditional industrial-age thinking, systems, models, technologies and operating frameworks. Digital technologies are irrevocably changing the way organizations and institutions engage and interact with people, citizens, consumers and many other stakeholders. Traditional thinking, operating models, and value delivery channels are being disrupted, driving leaders and executives to reassess their strategies and plans. The combination of dealing with the complexities of a volatile digital world and the data deluge, together with the pressing need to stay competitive and relevant, has sharpened the focus on using digital technologies to transform.
“Digital technologies are assuming an increasingly prominent place in everyday life, both in the more traditional areas and in the field of new information and communication technologies. Digital is the common language for information, whether in the form of text, pictures or video images. Digitisation, i.e. the conversion of information into a string of 0s and 1s, provides a common denominator for telephone, television, radio, camera, camcorder or computer signals.” – Information society and a digital world. Report by Council of Europe, Committee on Science and Technology
Data as a fabric of the digital age underpins everything we do. It’s part and parcel of our digital existence. To succeed in a digital world we must master the art and science of managing, leveraging and applying data. Organizations need to become connected and data driven. To sustain and remain competitive they should know what is happening now, what is likely to happen next, and what actions should be taken to get the optimal results. The buzz word associated with this is ‘Big Data’. The trick however is to ignore the Big and just focus on the Data.
Data has become the new raw material: an economic input almost on a par with capital and labor. Organizations need data from multiple systems to make decisions. They need information in easy to read and consistent format to enable fast understanding and response. The path to sustainable and meaningful advantage is being able to find new ways of managing data, discovering what’s in it, finding patterns and predictions, and deciding what to do with all that. Fuelled by data deluge, predictive models and machine learning programs are being used to improve everything around us from the way we shop to the web experiences we enjoy, and the way we receive social and health care.
At the core of any digital transformation is the ability to think creatively and differently about digital technologies, based on the right mix of people, architectures, systems, processes, frameworks and experience of collaboration. This involves a mind-set change, and one of the areas where it must happen immediately is how organizations manage, process, and analyze data. Within organizations in every industry, in every part of the world, business and technology leaders are assessing how to get true value from the massive amounts of data they already have within and outside their organizations. At the same time, new technologies, comprising sensors and devices, are collecting more data than ever before.
One of the exciting changes in predictive analytics and machine learning during the last 3 years has been the growth of predictive APIs, applications and machine learning as a service (MLaaS) for analyzing and predicting from data. This is an emerging domain. Machine learning platforms of various sorts are revolutionizing many areas of business, public and social services, and predictive APIs (PAPIs) have the potential to bring these capabilities to an even wider range of applications.
Unless you’re a nerd or a developer, you’ve probably never paid much attention to the term “API,” an acronym for “Application Programming Interface.” However, if you’re a regular user of social media platforms, you’ve most likely used an application or service built on an API. Twitter, Facebook, LinkedIn, WordPress, WhatsApp, Uber, Amazon, Airbnb and thousands of other applications rely on APIs. What’s more, without APIs, the Apple App Store and the Android Marketplace would be very small!
APIs are carefully designed pieces of code that programmers create so that other applications can interact with their application and platform. APIs matter to every organization operating in the digital era: with them, organizations can develop platforms, applications and experiences that help us do our jobs effectively, market products and ideas better, drive revenue, and connect with consumers, customers and partners. Many companies have realized the opportunities that APIs offer and have launched their own platforms and applications to deliver products and services. The popularity of APIs isn’t limited to social media; APIs are strategic tools for unlocking business value. To see how extensive APIs have become, browse the API directory at ProgrammableWeb.
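The idea can be sketched in miniature: one component publishes a small, documented surface, and other programs interact only through that surface, never with the internals. The `PhotoAPI` class and its methods below are purely illustrative, not any real service's SDK:

```python
# Miniature illustration of an API: a component exposes a public surface;
# third-party code builds on that surface without touching internals.

import json


class PhotoAPI:
    """The 'public' surface one application exposes to other programs."""

    def __init__(self):
        self._photos = {}   # internal storage; callers never touch this
        self._next_id = 1

    def upload(self, title):
        """Store a photo record and return its id."""
        photo_id = self._next_id
        self._next_id += 1
        self._photos[photo_id] = {"id": photo_id, "title": title}
        return photo_id

    def get(self, photo_id):
        """Return the record as JSON, the lingua franca of web APIs."""
        return json.dumps(self._photos[photo_id])


# A third-party app built on top of the API, knowing nothing of internals:
api = PhotoAPI()
pid = api.upload("Sunset in Barcelona")
print(api.get(pid))  # {"id": 1, "title": "Sunset in Barcelona"}
```

Real web APIs expose the same kind of contract over HTTP (endpoints instead of methods, JSON over the wire), which is what lets thousands of independent apps build on platforms like Twitter or Uber.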
Just as conventional APIs make it easy for programmers to create applications, Predictive APIs are making machine learning simple and accessible to everyone. These APIs make it easier to apply machine learning to data, and thus to create Predictive Apps. In essence, they abstract away some of the complexities of creating and deploying machine learning models, making machine learning more accessible to developers. They also free developers to spend more time on user experience, design, data munging, experimenting and delivering value from data.
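The workflow such an API typically abstracts has two calls: train a model from labelled data, then request predictions. A minimal in-memory sketch of that contract is below; `PredictiveService` and its methods are hypothetical stand-ins (real MLaaS platforms expose the same steps over HTTP), and the "model" is just a 1-nearest-neighbour lookup to keep the example self-contained:

```python
# Sketch of the train-then-predict contract a predictive API exposes.
# The class is hypothetical; the "training" is a toy 1-nearest-neighbour.

class PredictiveService:
    """Stands in for a remote predictive API; keeps models in memory."""

    def __init__(self):
        self._models = {}

    def create_model(self, model_id, rows, labels):
        """'Train' a model: here, memorise the labelled rows."""
        self._models[model_id] = list(zip(rows, labels))

    def predict(self, model_id, row):
        """Return the label of the nearest stored row (squared distance)."""
        def dist(a, b):
            return sum((x - y) ** 2 for x, y in zip(a, b))
        examples = self._models[model_id]
        _, label = min(examples, key=lambda ex: dist(ex[0], row))
        return label


service = PredictiveService()
service.create_model("churn",
                     rows=[(1.0, 0.0), (0.0, 1.0)],
                     labels=["stays", "churns"])
print(service.predict("churn", (0.1, 0.9)))  # nearest to (0.0, 1.0): churns
```

The point is the shape of the interface, not the algorithm: the developer never sees model internals, which is exactly the complexity the API hides.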
In simple terms, machine learning is a computer’s ability to learn from data, and it is one of the most useful tools we have to develop intelligent systems and applications. Machine learning is used widely today for all kinds of tasks, from churn prediction in large companies, to web search, to medical diagnostics, to robotics. It’s hard to find a field that cannot benefit from machine learning in one way or another. Predictive analytics and machine learning are bringing new levels of speed, relevance, and precision to the way we design and manage operating models. In health care, machine learning is changing the way doctors identify people at risk of developing certain diseases; in retail, machine learning is used to analyze purchasing data to anticipate trends; CRM and marketing experts use it to tailor campaigns and offers.
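That core idea, a program improving its predictions from examples rather than from hand-coded rules, can be shown with a tiny perceptron. The churn data and feature names below are made up for illustration:

```python
# A minimal perceptron: the weights are learned from labelled examples,
# not programmed by hand -- the essence of machine learning.

def predict(weights, bias, features):
    """Classify: 1 if the weighted sum of features is positive, else 0."""
    score = bias + sum(w * x for w, x in zip(weights, features))
    return 1 if score > 0 else 0


def train_perceptron(examples, epochs=20, lr=0.1):
    """examples: list of (features, label) pairs with label in {0, 1}."""
    n = len(examples[0][0])
    weights, bias = [0.0] * n, 0.0
    for _ in range(epochs):
        for features, label in examples:
            error = label - predict(weights, bias, features)
            # Nudge the weights toward the examples we got wrong
            weights = [w + lr * error * x for w, x in zip(weights, features)]
            bias += lr * error
    return weights, bias


# Toy churn data: (monthly_usage_hours, support_tickets) -> churned?
data = [((9.0, 0.0), 0), ((8.0, 1.0), 0), ((1.0, 4.0), 1), ((0.5, 5.0), 1)]
w, b = train_perceptron(data)
print(predict(w, b, (7.0, 0.0)))  # heavy user, no tickets -> 0 (stays)
```

Churn prediction in a real company uses far richer data and models, but the principle is identical: the rules fall out of the data.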
Machine learning is fun once you know what it is and how to use it. Predictive APIs give all of us the opportunity to harness this powerful capability, with just enough math, to build great applications. With Predictive APIs, organizations can analyze data, predict future outcomes, and build smart, intelligent apps on top of machine learning algorithms.
“Machine learning, predictive analytics and APIs for that matter are not technologies of the future, but important technologies of the present.” – Janet Wagner, Machine learning and predictive analytics foster growth
Read this post to understand the business possibilities enabled by Predictive APIs. To learn how to use them and make machine learning work for you, I highly recommend Louis Dorard’s book “Bootstrapping Machine Learning”.
Who do you believe is ranked #1 on Kaggle?
A) A professor from Stanford with 20 years of experience in machine learning.
B) A Russian mathematician who solved college level math puzzles at the age of 3 and works for the KGB.
C) A Spaniard from Andalusia (where my hometown is and where everybody naps twice a day) who works for a hospital.
D) Chuck Norris
Of course, the answer was C. The point Francisco was making is that we don’t need machine learning gurus and academics; we need experts and professionals from other fields who know enough about machine learning to use it.
Since then, we have been collaborating on many initiatives to promote machine learning and make it simpler and more accessible. Earlier this year, Francisco introduced me to Louis Dorard, who is actively helping people with minimal coding experience exploit the power of machine learning using Predictive APIs. Over the last six months or so, the three of us discussed the need for a community and platform to bring together practitioners from industry, academia and public services to present new developments, identify new needs and trends, and discuss the challenges of building real-world predictive APIs and applications. Last month we announced that PAPIs 2014, the First International Conference on Predictive APIs and Apps, will be held in Barcelona on November 17-18: a technical and practical conference dedicated to Predictive APIs and Predictive Apps.
PAPIs ‘14 is the first International Conference on Predictive APIs and Apps. It will take place on 17-18 November 2014 (right before the O’Reilly Strata Conference) in Barcelona, Spain, where it will connect those who make Predictive APIs with those who use them to make Predictive Apps.
We want PAPIs to become an open forum for technologists, researchers and developers of real-world predictive APIs and applications to get together to learn and discuss new machine learning APIs, techniques, architectures, and tools to build predictive applications.
With the barrier to entry for machine learning dramatically lowered by predictive APIs, the time is now for all of us (programmers, researchers, business and technology professionals) to take advantage of machine learning to deliver real and meaningful economic and social value. Just remember, data alone is not enough. We need predictive APIs and applications to make it valuable, actionable and meaningful.
See you all at PAPIs in Barcelona!