Today we will look at the AI-related technologies and products mentioned on Day 2.
Assisted programming
Code Completion
Swift Assist
Hardware foundation
On-device machine learning (on-device ML) relies mainly on Apple's hardware support: unified memory and the machine-learning accelerators in the CPU, GPU, and Neural Engine, which together provide efficient, low-latency inference.
Vision
Text extraction
Face detection
Body pose recognition
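As an illustration of the first capability, here is a minimal sketch of text extraction with the Vision framework's `VNRecognizeTextRequest` (the function name `recognizeText` is my own; error handling is kept minimal):

```swift
import Vision
import CoreGraphics

// Sketch: extract recognized text strings from a CGImage using Vision.
func recognizeText(in cgImage: CGImage) throws -> [String] {
    let request = VNRecognizeTextRequest()
    request.recognitionLevel = .accurate        // favor accuracy over speed

    let handler = VNImageRequestHandler(cgImage: cgImage)
    try handler.perform([request])

    // Each observation may carry several candidates; take the top one.
    let observations = request.results ?? []
    return observations.compactMap { $0.topCandidates(1).first?.string }
}
```

Face detection and body pose recognition follow the same request/handler pattern, using `VNDetectFaceRectanglesRequest` and `VNDetectHumanBodyPoseRequest` respectively.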
Translation
Simple UI
Flexible APIs
Efficient batching
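The "simple UI" side can be shown with SwiftUI's `translationPresentation` modifier, which presents the system translation overlay for a string. A minimal sketch (the view and its text are hypothetical):

```swift
import SwiftUI
import Translation

// Sketch: present the system translation sheet for a piece of text.
struct TranslateDemoView: View {
    @State private var showTranslation = false
    private let sample = "Hello, world!"

    var body: some View {
        VStack {
            Text(sample)
            Button("Translate") { showTranslation = true }
        }
        // Shows Apple's built-in translation UI for `sample`.
        .translationPresentation(isPresented: $showTranslation, text: sample)
    }
}
```

For the flexible API and batching, the framework also offers `TranslationSession`, which can translate many strings in one batch without presenting UI.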
Create ML
Training models with your own data
Object tracking
Data source exploration
Time series models
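Training with your own data can be as short as a few lines with the Create ML framework on macOS. A hedged sketch (the directory paths are hypothetical placeholders; the training folder is assumed to contain one subfolder per label):

```swift
import CreateML
import Foundation

// Sketch (macOS only): train an image classifier from labeled folders,
// where each subfolder name is treated as a class label.
let trainingDir = URL(fileURLWithPath: "/path/to/TrainingData")  // hypothetical path
let classifier = try MLImageClassifier(
    trainingData: .labeledDirectories(at: trainingDir)
)

// Export the trained model for use with Core ML.
try classifier.write(to: URL(fileURLWithPath: "/path/to/Classifier.mlmodel"))
```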
Running models on the device
Various models downloaded from Hugging Face can run locally: Whisper, Stable Diffusion, Mistral, Llama, Falcon, CLIP, Qwen, and OpenELM.
Training (models can be trained on macOS)
Preparation (convert the model using Core ML Tools)
Integration (run the model using Core ML)
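The integration step above can be sketched in Swift with the Core ML `MLModel` API. This assumes the model has already been converted with Core ML Tools and compiled to an `.mlmodelc` bundle; the path is a hypothetical placeholder:

```swift
import CoreML
import Foundation

// Sketch: load a converted, compiled Core ML model at runtime.
let compiledURL = URL(fileURLWithPath: "/path/to/Model.mlmodelc")  // hypothetical path

let config = MLModelConfiguration()
config.computeUnits = .all  // allow CPU, GPU, and Neural Engine

let model = try MLModel(contentsOf: compiledURL, configuration: config)
print(model.modelDescription)  // inspect the model's inputs and outputs
```

Setting `computeUnits = .all` lets Core ML schedule work across the hardware foundation described earlier: CPU, GPU, and Neural Engine.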
Research contributions
MLX
MLX is an array framework similar to NumPy, designed for efficient and flexible machine learning on Apple Silicon, introduced by Apple’s machine learning research team.
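MLX also ships a Swift package (mlx-swift) with the same NumPy-like array model. A minimal sketch, assuming the `MLXArray` initializer from a Swift `[Float]` and the free `eval` function from that package:

```swift
import MLX

// Sketch: NumPy-style arrays with lazy evaluation on Apple Silicon.
let a = MLXArray([1, 2, 3] as [Float])
let b = MLXArray([4, 5, 6] as [Float])

let c = a + b   // computation is recorded lazily
eval(c)         // materialize the result on the device
print(c)
```

As in the Python version, operations build a lazy graph that is only computed when evaluated, which lets MLX fuse work and exploit unified memory.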
CoreNet
CoreNet is a deep neural network toolkit that allows researchers and engineers to train standard and novel models, from small to large scale, for a variety of tasks, including foundation models (such as CLIP and LLMs), object classification, object detection, and semantic segmentation.
OpenELM
OpenELM is a family of efficient language models with an open training and inference framework. OpenELM uses a layer-wise scaling strategy to efficiently allocate parameters across the layers of the Transformer model, thereby improving accuracy. For example, with a budget of approximately one billion parameters, OpenELM improves accuracy by 2.36% over OLMo while requiring half as much pretraining data.
More research contributions can be viewed at https://machinelearning.apple.com/.