New Developer Features in iOS 11

By Michelle Russell
16 Aug '17

Apple has begun rolling out the iOS 11 developer beta for app developers interested in building apps for this new, smarter-than-ever iOS. iOS 11 brings exciting new features to its users, including machine learning with Core ML, Vision, and NLP, and deeper system-wide integration on iPad. Additionally, Apple is bringing augmented reality to everyone with ARKit for real-world interaction.


These new features let developers take their apps to the next level, with richer user interaction and smoother overall experiences. Updating your app with every iOS release is important for providing the best possible experience to your users. iOS app development for iPhone and iPad should always be handled by an efficient, knowledgeable team of developers to ensure success for your business.


The Features

Drag and Drop

    Drag and Drop allows users to move content between apps easily. Users can finally select multiple items at once instead of having to move items one at a time. This new feature is available across all iOS devices and can be adopted in your own app through a simple, powerful API.
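As a minimal sketch of that API, the snippet below makes an image view draggable by attaching a `UIDragInteraction`; the `PhotoViewController` and `imageView` names are illustrative, not from the original post.

```swift
import UIKit

// Sketch: making an image view a drag source with the iOS 11 API.
class PhotoViewController: UIViewController, UIDragInteractionDelegate {
    let imageView = UIImageView()

    override func viewDidLoad() {
        super.viewDidLoad()
        // Image views ignore touches by default; enable them so the
        // drag interaction can begin.
        imageView.isUserInteractionEnabled = true
        imageView.addInteraction(UIDragInteraction(delegate: self))
    }

    // Supply the items to drag; returning an empty array cancels the drag.
    func dragInteraction(_ interaction: UIDragInteraction,
                         itemsForBeginning session: UIDragSession) -> [UIDragItem] {
        guard let image = imageView.image else { return [] }
        return [UIDragItem(itemProvider: NSItemProvider(object: image))]
    }
}
```

Adopting the matching `UIDropInteractionDelegate` on a destination view completes the picture for receiving dropped content.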


Machine Learning with Core ML

    Arguably the biggest reveal at WWDC 2017, Apple is embracing the future of machine learning on its devices. Core ML is a framework that makes it easy for you to put ML models into your app. The API lets you load a trained model and make predictions with it. Expect to see Core ML-powered machine learning in many apps to come.
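A rough sketch of the load-and-predict flow is shown below. `FlowerClassifier` is a hypothetical model name: when you add an `.mlmodel` file to an Xcode project, Xcode generates a Swift class like this, with a typed `prediction` method matching the model's inputs and outputs.

```swift
import CoreML

// Sketch: loading a trained model and making a prediction with it.
// "FlowerClassifier" stands in for any Xcode-generated model class.
func classify(pixelBuffer: CVPixelBuffer) {
    let model = FlowerClassifier()
    do {
        // The generated prediction method's parameter names mirror the
        // model's declared inputs (here assumed to be an "image" input).
        let output = try model.prediction(image: pixelBuffer)
        print("Predicted label: \(output.classLabel)")
    } catch {
        print("Prediction failed: \(error)")
    }
}
```

Because the model runs on-device, predictions work offline and user data never leaves the phone.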


Machine Learning with Vision

    Vision brings computer vision to iOS. The supported features include face tracking, face detection, landmarks, text detection, rectangle detection, barcode detection, object tracking, and image registration.
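To illustrate one of those features, here is a small sketch of face detection on a `UIImage` using Vision's request/handler pattern; the function name is illustrative.

```swift
import UIKit
import Vision

// Sketch: detecting face bounding boxes in an image with Vision.
func detectFaces(in image: UIImage) {
    guard let cgImage = image.cgImage else { return }

    // The completion handler receives VNFaceObservation results.
    let request = VNDetectFaceRectanglesRequest { request, error in
        let faces = request.results as? [VNFaceObservation] ?? []
        for face in faces {
            // Bounding boxes are normalized (0...1), with the origin
            // at the lower-left corner of the image.
            print("Face at \(face.boundingBox)")
        }
    }

    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    try? handler.perform([request])
}
```

The other detectors listed above (barcodes, rectangles, text, and so on) follow the same pattern with a different `VNRequest` subclass.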


Machine Learning with NLP

    This Natural Language Processing (NLP) API supports features such as language identification, tokenization, lemmatization, part-of-speech tagging, and named entity recognition.
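As a short sketch, the iOS 11 NLP features are exposed through `NSLinguisticTagger`; the snippet below tokenizes a sample sentence and tags each word's part of speech.

```swift
import Foundation

// Sketch: word tokenization + part-of-speech tagging with
// NSLinguisticTagger (the iOS 11 NLP API).
let text = "Apple unveiled new machine learning tools at WWDC."

let tagger = NSLinguisticTagger(tagSchemes: [.lexicalClass], options: 0)
tagger.string = text

let range = NSRange(location: 0, length: text.utf16.count)
let options: NSLinguisticTagger.Options = [.omitPunctuation, .omitWhitespace]

// Enumerate word-level units; each callback gets a lexical-class tag
// such as Noun, Verb, or Adjective.
tagger.enumerateTags(in: range, unit: .word,
                     scheme: .lexicalClass, options: options) { tag, tokenRange, _ in
    let token = (text as NSString).substring(with: tokenRange)
    print("\(token): \(tag?.rawValue ?? "unknown")")
}
```

Swapping the scheme to `.nameType` or `.language` gives named entity recognition or language identification with the same enumeration call.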



Augmented Reality with ARKit

    ARKit “takes apps beyond the screen, freeing them to interact with the real world in entirely new ways.” Put simply, ARKit brings augmented reality to iOS devices. ARKit uses Visual Inertial Odometry (VIO) to accurately track the world around the device. Through scene understanding and lighting estimation, the iPhone and iPad can analyze the camera view, find horizontal planes (like tables and floors), and place virtual objects on smaller surfaces too. ARKit runs on devices with Apple A9 and A10 processors, which allows developers to take advantage of optimizations for ARKit in Metal, SceneKit, and third-party tools like Unreal Engine and Unity.
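The horizontal-plane detection described above can be sketched roughly like this, using an `ARSCNView` (the SceneKit-backed AR view); the controller name is illustrative.

```swift
import ARKit

// Sketch: running a world-tracking AR session with horizontal
// plane detection enabled.
class ARViewController: UIViewController, ARSCNViewDelegate {
    let sceneView = ARSCNView()

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        // World tracking uses Visual Inertial Odometry under the hood.
        let configuration = ARWorldTrackingConfiguration()
        configuration.planeDetection = .horizontal
        sceneView.session.run(configuration)
    }

    // Called when ARKit detects a new plane (e.g. a table or floor)
    // and adds an anchor node for it to the SceneKit scene.
    func renderer(_ renderer: SCNSceneRenderer,
                  didAdd node: SCNNode, for anchor: ARAnchor) {
        guard let plane = anchor as? ARPlaneAnchor else { return }
        print("Detected plane with extent \(plane.extent)")
    }
}
```

From the `ARPlaneAnchor` you can size a virtual object to the detected surface and attach it as a child of the anchor's node.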



Webilize is a leading web and app development company for small to medium-sized businesses and government organizations. Working with a strong development team when building your iOS app is the best way to ensure its success and ease of use for everyone. Get in touch today.



Do you have an idea for your next project? Talk to Webilize