Here are the iOS frameworks Apple’s Catalyst brings to Mac
Catalyst is here, Project Marzipan never existed and we may soon see tens of thousands of iPad apps ported easily across to the Mac.
What is Catalyst?
Catalyst is Apple’s well-thought-through system that lets developers easily port their iPad apps across to the Mac. It consists of new tools within Xcode (essentially you just need to tick a box) and built-in Mac support for a huge number of APIs that let your iOS apps run natively. I thought some readers might be interested to see a list of all of these.
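In practice, a Catalyst build is the same Swift codebase compiled for the Mac. Where behavior must differ, Swift’s standard conditional-compilation check for the Catalyst environment can be used – a minimal sketch (the function name here is illustrative, not an Apple API):

```swift
import Foundation

// Catalyst builds define the macCatalyst target environment, so code
// shared between the iPad and Mac versions of an app can branch at
// compile time. The function name is illustrative.
func platformDescription() -> String {
    #if targetEnvironment(macCatalyst)
    return "Running as a Mac Catalyst app"
    #else
    return "Running as a regular (non-Catalyst) build"
    #endif
}

print(platformDescription())
```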
AppKit for the Mac
AppKit is the framework Mac apps are built on, while UIKit is its counterpart on iOS.
However, Apple thinks over a million iPad apps could run on the Mac, so the company is adding dozens of iOS frameworks and libraries to macOS in order to make it much easier to take an iPad app and deploy it to a Mac, starting this fall.
So, which UIKit elements will be supported on Macs? Nearly all of them – the list includes almost all of Apple’s iOS APIs, bar the mobile-only ones. Here are all the UIKit APIs and libraries Apple will bring to macOS to help kick-start iPad apps reaching Macs:
Introduced in 2017, Identity Lookup is a framework that allows developers to help users identify unwanted messages.
This is the software that lets you sign in just once on an iOS device in order to access your choice of video streaming services from cable/TV providers on the device.
An iOS 12 framework that empowers apps to understand natural language text. It enables things like tokenization and the detection of names, languages and places in an app.
You can build apps that display iAds in a defined area of their user interface with this, which can be a nice little earner for developers.
Add sophisticated audio manipulation and processing capabilities to your app.
Push user-facing notifications to the user’s device from a server, or generate them locally from your app.
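The locally generated half of this works roughly as follows – a minimal sketch, with illustrative identifiers and text, and the UserNotifications import guarded so the snippet compiles on non-Apple platforms too:

```swift
import Foundation
#if canImport(UserNotifications)
import UserNotifications
#endif

// Small shared helper: converts minutes to the TimeInterval
// the notification trigger expects.
func reminderDelay(minutes: Int) -> TimeInterval {
    TimeInterval(minutes * 60)
}

#if canImport(UserNotifications)
// Ask permission, then schedule a one-off banner a minute out.
// Identifier and strings are illustrative.
func scheduleReminder() {
    let center = UNUserNotificationCenter.current()
    center.requestAuthorization(options: [.alert, .sound]) { granted, _ in
        guard granted else { return }
        let content = UNMutableNotificationContent()
        content.title = "Reminder"
        content.body = "Check the build."
        let trigger = UNTimeIntervalNotificationTrigger(
            timeInterval: reminderDelay(minutes: 1), repeats: false)
        center.add(UNNotificationRequest(identifier: "demo-reminder",
                                         content: content,
                                         trigger: trigger))
    }
}
#endif
```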
Display information about users’ contacts in a graphical interface.
OpenAL renders 3D sound quickly.
Metal-leveraging framework for drawing shapes, particles, text, images, and video in two dimensions.
Core Data is a framework that you use to manage the model layer objects in your application.
Libraries that provide high-performance, energy-efficient computation on the CPU by leveraging its vector-processing capability.
Neural text-to-speech means Siri’s voice is more natural and handles cadence better – the example Apple demonstrated at WWDC sounded very much more natural.
Access to calendar and reminders data so you can create, retrieve, and edit calendar items in your app.
Core Foundation is a library framework that provides fundamental software services useful to application services, application environments, and to applications themselves.
Lets you connect external devices to apps.
A set of frameworks that let you establish user identity, secure data and ensure code validity.
Allows apps to customize and extend the core networking features of iOS and OS X.
This provides video compression and decompression, and conversion between raster image formats, by using hardware encoders where available.
This is the code that provides an interface for viewing, selecting, and editing calendar events and reminders.
CoreText provides what you need to create text layouts and handle fonts in documents.
Want to look at a PDF? Want to let people interact with a PDF? That’s PDFKit.
An important API that coordinates the presentation of closed-captioned data with the relevant media files.
“Display map or satellite imagery from your app’s interface, call out points of interest, and determine placemark information for map coordinates,” Apple states.
What’s the model of this device?
Enable web views and services in an app.
This contains APIs developers can use in order to create user interfaces for audio units.
This sends specific types of notifications (Apple notes VoIP invites, watchOS complications and file provider notifications) to your app for processing.
You need this to serve up ads in an app, though you need to honor any opt-out instructions sensible users may have made.
Face and face landmark detection, text detection, barcode recognition, image registration, and general feature tracking.
Part of MusicKit, this controls playback of a user’s media within an app.
Use this to let other apps access documents and directories stored and managed by your app.
Display thumbnail images and full-size previews of documents.
This is kind of already available on both Macs and iOS, and enables apps to access a person’s contact information.
I’m guessing this may well end up being a widely used API. It provides tools and technologies for building games.
This provides access to user accounts stored in the Accounts database. Apple notes its use with Twitter here, but I imagine it will also be pretty handy to those using Sign in with Apple.
Support in-app purchases and interactions with the App Store.
Record or stream video from the screen, and audio from the app and microphone.
Educational apps may use ClassKit to assign activities and view student progress within apps.
Add leaderboards, achievements, matchmaking, challenges, and other interactive features to games.
Authenticate users with Touch ID or Face ID, or using their passcode.
Enterprise developers can use this to build access to Business Chat communications within their apps.
So important, I linked to its page.
Interfaces for audio content playback.
Access network services and handle changes in network configurations.
Where am I? That’s what CoreLocation figures out by finding the geographic location and orientation of a device.
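CoreLocation hands you latitude/longitude pairs, and CLLocation’s distance(from:) then gives you great-circle distances between them. As a rough standalone illustration of the underlying math (a haversine sketch, not CoreLocation’s actual implementation):

```swift
import Foundation

// Haversine great-circle distance between two coordinates, in metres.
// A rough standalone sketch of what CLLocation.distance(from:) returns;
// CoreLocation's own implementation may differ.
func distanceMetres(lat1: Double, lon1: Double,
                    lat2: Double, lon2: Double) -> Double {
    let r = 6_371_000.0                  // mean Earth radius in metres
    let toRad = Double.pi / 180
    let dLat = (lat2 - lat1) * toRad
    let dLon = (lon2 - lon1) * toRad
    let a = sin(dLat / 2) * sin(dLat / 2)
          + cos(lat1 * toRad) * cos(lat2 * toRad)
          * sin(dLon / 2) * sin(dLon / 2)
    return 2 * r * atan2(sqrt(a), sqrt(1 - a))
}

// London to Paris is roughly 344 km.
print(distanceMetres(lat1: 51.5074, lon1: -0.1278,
                     lat2: 48.8566, lon2: 2.3522))
```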
“You might use this data to identify devices that have already taken advantage of a promotional offer that you provide, or to flag a device that you’ve determined to be fraudulent,” Apple states.
Add contextual actions.
This important code lets apps write and read most image formats.
You’ll need this to speak to other devices connected using the Lightning connector or Bluetooth.
Speech recognition inside apps.
Use this to develop audio and movie software.
Access and manage key operating system services, such as launch and identity services.
Request and process Apple Pay payments and distribute Wallet passes.
Search for and display photos.
Put app and user data in iCloud.
Want your app to be searchable? You’ll use CoreSpotlight.
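Making content searchable boils down to wrapping it in a searchable item and handing it to the Spotlight index. A hedged sketch, with illustrative identifiers and a deliberately simple keyword helper (a real app would use richer metadata); the CoreSpotlight import is guarded so the snippet compiles on non-Apple platforms:

```swift
import Foundation
#if canImport(CoreSpotlight)
import CoreSpotlight
#endif

// Pure helper: derive simple search keywords from a title,
// skipping very short words. Deliberately simplistic.
func keywords(from title: String) -> [String] {
    title.lowercased()
        .split(separator: " ")
        .map(String.init)
        .filter { $0.count > 2 }
}

#if canImport(CoreSpotlight)
// Hypothetical indexing sketch: wrap a document in a searchable item
// so Spotlight can surface it. Identifiers are illustrative.
func index(title: String, id: String) {
    let attributes = CSSearchableItemAttributeSet(itemContentType: "public.text")
    attributes.title = title
    attributes.keywords = keywords(from: title)
    let item = CSSearchableItem(uniqueIdentifier: id,
                                domainIdentifier: "notes",
                                attributeSet: attributes)
    CSSearchableIndex.default().indexSearchableItems([item],
                                                     completionHandler: nil)
}
#endif
```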
Record audio, play it back, convert it between formats, and more with AudioToolbox.
The hint is in the name.
Build apps that will process still and video images with Apple’s built in CoreImage tools.
You’ll use this to create 3D experiences.
Want devices using your app to find each other? That’s what this does.
I owe so much to CoreGraphics. It handles 2D rendering, antialiased rendering, gradients, images, color management, PDF documents and loads more.
Use this to share bitmaps/objects between apps.
I owe a lot to CoreAudio too. It generated a revolution in music production and is the digital audio infrastructure of iOS and OS X.
CoreVideo is what makes iMovie work, pretty much.
You’ll need this to make apps that speak to Bluetooth devices.
What on earth do those error codes mean?
Use this to efficiently process media samples and manage queues of media data.
Send and receive data.
Use this so users don’t have to jump through hoops to authenticate.
Build Metal apps fast with this – it’s essential.
Work with time-based audiovisual media on iOS, macOS, watchOS and tvOS.
A graphics rendering and animation infrastructure to animate visual elements of apps, aka CoreAnimation.
Create data that can be exchanged between your app and other apps.
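Whatever the transport (pasteboard, shared files, extensions), the payload is usually a serialized blob, and a common Swift pattern for producing one is Codable-encoded JSON. A minimal sketch with an illustrative payload type – not a specific Apple API:

```swift
import Foundation

// An illustrative payload type; conforming to Codable gives us JSON
// encoding and decoding for free, which suits app-to-app data exchange.
struct NotePayload: Codable, Equatable {
    let title: String
    let body: String
}

func encode(_ note: NotePayload) throws -> Data {
    try JSONEncoder().encode(note)
}

func decode(_ data: Data) throws -> NotePayload {
    try JSONDecoder().decode(NotePayload.self, from: data)
}

let original = NotePayload(title: "Hello", body: "Shared between apps")
let data = try! encode(original)
print(try! decode(data) == original)   // the payload round-trips intact
```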
Want to get the full power of the GPU? Use Metal.
NB: This list may be erroneous, poorly explained or incomplete. Some of these APIs were already available on both AppKit and UIKit, but I guess this has now become official. This list comes from a State of the Union WWDC video that is publicly available; it shows up briefly at around 33:21.