MACHINE LEARNING FOR APPS
Welcome to the most comprehensive course on Core ML, one of Apple's hot new features for iOS 11. The goal of machine learning is to mimic the human mind. It can be used to identify things like objects in images, make predictions, and even analyze and identify speech.
Dive in and learn the core concepts of machine learning and start building apps that can think! In this course, you're going to learn everything you need to know to start building more intelligent apps and your own ML models.
WHY TAKE THIS COURSE?
Core ML is the first step if you want to start building apps with AI. Machine Learning opens up an entirely new world of opportunities that will take your apps to the next level.
Here are some of the things you’ll be able to do after taking this course:
- Learn to code how the PROs code – not just copy and paste
- Build Real Projects – You’ll get to build projects that help you retain what you’ve learned
- Build awesome apps that can make predictions
- Build amazing apps that can classify human handwriting
WHAT YOU WILL LEARN:
- Learn about the foundations of Machine Learning and Core ML
- Learn foundational Python
- Build a classification model that allows your apps to make predictions
- Build a neural network for your app that can classify human writing
- Learn Core ML concepts so you can build your own ML Model
- Utilize the power of Machine Learning and AI for use in iOS apps
- Learn how to pass images to Apple's pre-trained model – MobileNet
Don’t forget to join the free live community, where you can get free help anytime from other students.
Intro to Course
In this lesson, you will learn the basics of Machine Learning in general – what it is and why developers care.
In this lesson, you will learn the 5 main steps in Machine Learning and how we will utilize them in this course.
In this lesson, you will install Anaconda – an application that makes creating and switching between Python environments seamless on your Mac.
In this lesson, you will download and configure Atom – a fully hackable text editor we will use to write Python code in the following section.
Python Basics
In this lesson, you will learn how to create and work with variables in Python.
In this lesson, you will learn how to write and use functions, conditionals, and loops in Python.
In this lesson, you will learn how to create and use arrays and tuples in Python.
In this lesson, you will learn how to import modules (think frameworks) in Python to grant access to additional functionality.
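The Python fundamentals covered in this section can be sketched in a few lines. This is an illustrative summary, not code from the course itself:

```python
import math  # importing a module (think: framework)

# Variables
greeting = "Hello, ML"
count = 3

# A function using a conditional and a loop
def describe(numbers):
    """Return an 'even'/'odd' label for each number in a list."""
    labels = []
    for n in numbers:
        if n % 2 == 0:
            labels.append("even")
        else:
            labels.append("odd")
    return labels

# Lists (mutable) and tuples (immutable)
values = [1, 2, 3]
point = (2.0, 3.0)

print(describe(values))    # ['odd', 'even', 'odd']
print(math.hypot(*point))  # distance from origin: 3.605551275463989
```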
Building a Classification Model
In this lesson, you will become familiar with scikit-learn – a popular machine learning module in Python. You will learn what it is and why you should use it.
In this lesson, you will install scikit-learn and SciPy with Anaconda. SciPy is a library for scientific computing in Python.
In this lesson, you will be introduced to the Iris Dataset – a famous set of data used to classify three species of iris flower.
In this lesson, you will be given a definition and examples of Features & Labels – the two most important pieces of data required to train a machine learning model.
In this lesson, you will load the Iris dataset into your Python project, examine the data, and make the necessary preparations for the data to be used for model training.
In this lesson, you will learn about KNeighborsClassifier, create an instance of it, and train it with our array of training data.
In this lesson, you will test the accuracy of the Classification model using test data.
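The load–split–train–test workflow from the last few lessons looks roughly like the following. This is a minimal sketch of the standard scikit-learn pattern, not the course's exact code; the split ratio and `n_neighbors` value are arbitrary choices for illustration:

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.metrics import accuracy_score

# Features (flower measurements) and labels (species: 0, 1, or 2)
iris = load_iris()
X, y = iris.data, iris.target

# Hold out half the data so the model is tested on flowers it never saw
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.5, random_state=42)

# Train the classifier on the training half
clf = KNeighborsClassifier(n_neighbors=3)
clf.fit(X_train, y_train)

# Measure accuracy on the held-out half
predictions = clf.predict(X_test)
print(accuracy_score(y_test, predictions))
```

Because the Iris classes are well separated, even this simple model typically scores well above 90% on the held-out data.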
In this video, you will build your own KNeighborsClassifier class from scratch to understand how it works under the hood.
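A from-scratch k-nearest-neighbors classifier boils down to memorizing the training data, then letting the k closest examples vote. Here is one possible minimal implementation (class and method names are my own, not the course's):

```python
import math
from collections import Counter

class ScratchKNN:
    """A minimal k-nearest-neighbors classifier built from scratch."""

    def __init__(self, k=3):
        self.k = k

    def fit(self, X, y):
        # "Training" is simply memorizing the examples and their labels
        self.X, self.y = X, y
        return self

    def predict(self, rows):
        return [self._predict_one(row) for row in rows]

    def _predict_one(self, row):
        # Euclidean distance from this row to every stored example
        distances = [
            (math.dist(row, x), label) for x, label in zip(self.X, self.y)
        ]
        # Majority vote among the k nearest neighbors
        k_labels = [label for _, label in sorted(distances)[: self.k]]
        return Counter(k_labels).most_common(1)[0][0]

# Tiny usage example with two obvious clusters
clf = ScratchKNN(k=1).fit([[0, 0], [5, 5]], [0, 1])
print(clf.predict([[1, 1], [4, 4]]))  # [0, 1]
```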
Building a Convolutional Neural Network
In this lesson, you will be introduced to Keras – a robust, fully-featured Machine Learning framework you will use to create a neural network capable of classifying human handwriting.
In this lesson, you will learn about Convolutional Neural Networks (CNNs), how they work, and how we will use them.
In this lesson, you will install Keras using Anaconda then import it into your project.
In this lesson, you will learn what is needed to prepare data to enter a CNN.
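For 28x28 grayscale digit images (the MNIST handwriting format), the usual preparation is: add a channel dimension, scale pixels to 0–1, and one-hot encode the labels. A sketch of those steps with NumPy, using random stand-in data instead of the real dataset:

```python
import numpy as np

# Stand-in data: 100 fake 28x28 images with pixel values 0-255
# and integer labels 0-9 (assumed MNIST-like shapes)
images = np.random.randint(0, 256, size=(100, 28, 28))
labels = np.random.randint(0, 10, size=(100,))

# 1. Add a channel axis: CNNs expect (samples, height, width, channels)
x = images.reshape(-1, 28, 28, 1)

# 2. Scale pixel values into the 0-1 range as floats
x = x.astype("float32") / 255.0

# 3. One-hot encode labels (digit 3 -> [0,0,0,1,0,0,0,0,0,0])
y = np.eye(10)[labels]

print(x.shape, y.shape)  # (100, 28, 28, 1) (100, 10)
```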
In part 1 of this lesson, you will build and visualize a CNN in code and by observing diagrams.
In part 2 of this lesson, you will build and visualize a CNN in code and by observing diagrams.
In this lesson, you will train your CNN, evaluate its accuracy, and save the compiled model to your local disk.
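A small digit-classifying CNN in Keras might be assembled like this. The exact layer sizes are illustrative choices, not the course's architecture, and this sketch uses the Keras API bundled with TensorFlow:

```python
from tensorflow import keras
from tensorflow.keras import layers

# A small CNN for 28x28 grayscale digit images (assumed MNIST shapes)
model = keras.Sequential([
    keras.Input(shape=(28, 28, 1)),
    layers.Conv2D(32, kernel_size=3, activation="relu"),
    layers.MaxPooling2D(pool_size=2),
    layers.Conv2D(64, kernel_size=3, activation="relu"),
    layers.MaxPooling2D(pool_size=2),
    layers.Flatten(),
    layers.Dropout(0.5),
    layers.Dense(10, activation="softmax"),  # one score per digit 0-9
])

model.compile(optimizer="adam",
              loss="categorical_crossentropy",
              metrics=["accuracy"])

# Training and saving would then look like:
#   model.fit(x_train, y_train, epochs=5, validation_split=0.1)
#   model.save("handwriting.h5")
print(model.output_shape)  # (None, 10)
```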
In this lesson, you will learn how to use Anaconda to switch Python environments and convert your Keras model into a Core ML model for use in Xcode.
Building a Handwriting Recognition App
In this video, you will be introduced to the handwriting analysis app you'll build using your hand-rolled Core ML model.
In this lesson, you will build the interface of your app in Interface Builder and wire up the required @IBOutlets/Actions.
In this lesson, you will use the UIResponder touch-handling methods (such as touchesBegan and touchesMoved) to handle drawing on the screen.
In this lesson, you will import your Core ML model and read through the metadata to ensure that everything was created as expected.
In this lesson, you will utilize Core ML and Vision to make a prediction based on input sent in from a drawing on the screen.
In this lesson, you will process results returned from our Core ML request handler and write a function to convert the greatest value in our array of possible values into a presented digit on the screen.
Core ML Basics
In this video, you will be introduced to the Core ML app you'll build in this Target Topic. It's an amazing photo analysis app that uses machine learning to identify images with a certain level of confidence.
In this lesson, you will learn the basics of Machine Learning in general – what it is and why developers care.
In this lesson, you will learn about Core ML – Apple's Machine Learning framework.
In this lesson, you will create the Xcode project needed to build the Core ML photo analysis app.
In this lesson, you will build out ImageVC in Interface Builder and connect the required @IBOutlets to certain UI elements.
In this lesson, you will build ImageCell – the UICollectionViewCell that will hold an image for our Core ML model to analyze later on in this course. You will create the code subclass as well and link up any needed @IBOutlets.
In this lesson, you will create a helper file containing instances of UIImage created from our imported image files. This static data will be used to populate the UICollectionView in ImageVC.
In this lesson, you will create a custom UICollectionViewFlowLayout which will be used to set the UICollectionView to show a nice 3 column grid of square images.
In this lesson, you will visit developer.apple.com to select and download a pre-trained Core ML model for use in your project. You will learn how to import it successfully into Xcode and set it up for use as an ordinary Swift class.
In this lesson, you will learn how to pass images through a Core ML model using Core ML, Vision (an image-specific ML framework), and a series of requests, handlers, and results.
In this lesson, you will elegantly present the data returned by the Core ML model in the UILabel in the ImageVC interface when an ImageCell is selected.
In this video, you will be challenged to take what you've learned and add an extra feature to this Core ML powered app.