<h1>Background Launching</h1>
<p>2023-06-30 · David G. Young · tech@davidgyoung.com · davidgyoungtech.com</p>
<p>The most common way to start a mobile app is to tap on its icon on your phone. The app’s user interface springs to life, and you can interact with it for a while before moving on to other things. Once the app is not visible, it is considered to be in the background. On both iOS and Android, apps in the background may continue running (with some restrictions.)</p>
<p>But it is also possible to launch an app automatically, going directly to the background without the app ever appearing on the screen. Understanding the details of how this works is important for developers building apps that need to run passively in the background. This is especially true because both iOS and Android can kill your backgrounded app at any time if memory or battery are low. Even if you don’t care to launch an app automatically, you might wish to resume an app’s background tasks automatically after the operating system temporarily killed it due to low resources.</p>
<h2 id="alternate-ways-to-launch">Alternate Ways to Launch</h2>
<p>There are a number of ways to launch apps in the background. Here are a few examples:</p>
<p>iOS:</p>
<ul>
<li>CoreLocation region events (enter/exit geofence regions, enter/exit Bluetooth beacon regions.)</li>
<li>CoreBluetooth events (device detection, connect/disconnect and other events.)</li>
<li>CallKit events (incoming Voice over IP calls)</li>
<li>Push Notification events (incoming push notification directed to your app)</li>
<li>Periodic Update Events (typically daily)</li>
</ul>
<p>Android:</p>
<ul>
<li>Bluetooth events (device detection)</li>
<li>Call events (incoming calls of various types)</li>
<li>Phone Boot</li>
<li>Power Connected</li>
<li>Periodic Job Scheduler Events (every ~15 minutes max)</li>
</ul>
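<p>As a concrete sketch of the “Phone Boot” trigger, an app can declare a <code class="language-plaintext highlighter-rouge">BroadcastReceiver</code> for the boot event in its manifest. The receiver class name <code class="language-plaintext highlighter-rouge">MyBootReceiver</code> below is hypothetical; the permission and action names are the standard Android ones:</p>
<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code><uses-permission android:name="android.permission.RECEIVE_BOOT_COMPLETED"/>
...
<receiver android:name=".MyBootReceiver" android:exported="false">
<intent-filter>
<action android:name="android.intent.action.BOOT_COMPLETED"/>
</intent-filter>
</receiver>
</code></pre></div></div>
<p>When the phone finishes booting, Android launches the app in the background and calls the receiver’s <code class="language-plaintext highlighter-rouge">onReceive</code> method.</p>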
<h2 id="ios-launch-sequence">iOS Launch Sequence</h2>
<p>Most iOS apps implement a custom <code class="language-plaintext highlighter-rouge">AppDelegate</code> class which is the central point for receiving global events within the app. Regardless of how an app is launched, it always triggers AppDelegate’s <code class="language-plaintext highlighter-rouge">didFinishLaunchingWithOptions</code> method:</p>
<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>@UIApplicationMain
class AppDelegate: UIResponder, UIApplicationDelegate {
func application(_ application: UIApplication, didFinishLaunchingWithOptions launchOptions: [UIApplication.LaunchOptionsKey: Any]?) -> Bool {
NSLog("App just launched!")
return true
}
}
</code></pre></div></div>
<p>After this method completes, additional callback methods may be fired, depending on what triggered the launch. And if there is a root view (something most apps have), the app will attempt to start that up, too. (Although if the app was launched in the background, it will not actually be shown on the screen.)</p>
<p>The details of creating any views on app launch depend on the configuration of your app in Xcode (for views shown by configuration) or on code you put in the <code class="language-plaintext highlighter-rouge">didFinishLaunchingWithOptions</code> method above (for views shown programmatically). It is important to understand that on iOS, this same process tries to create the same views regardless of whether your app is launched in the foreground or the background.</p>
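<p>One way to observe this difference at runtime (a sketch, not something iOS requires) is to check the application state inside <code class="language-plaintext highlighter-rouge">didFinishLaunchingWithOptions</code>. It reports <code class="language-plaintext highlighter-rouge">.background</code> when the app was launched without being shown:</p>
<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>func application(_ application: UIApplication, didFinishLaunchingWithOptions launchOptions: [UIApplication.LaunchOptionsKey: Any]?) -> Bool {
if application.applicationState == .background {
NSLog("App was launched into the background")
}
else {
NSLog("App was launched into the foreground")
}
return true
}
</code></pre></div></div>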
<h2 id="android-launch-sequence">Android Launch Sequence</h2>
<p>On Android, there is no commonly used equivalent of the iOS <code class="language-plaintext highlighter-rouge">AppDelegate</code> to receive launch and other central events. Android apps more typically have a number of <code class="language-plaintext highlighter-rouge">Service</code> and <code class="language-plaintext highlighter-rouge">BroadcastReceiver</code> components declared in AndroidManifest.xml to handle background events. These might look something like this:</p>
<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code><manifest package="com.example.myapp"
xmlns:tools="http://schemas.android.com/tools"
xmlns:android="http://schemas.android.com/apk/res/android">
...
<application android:icon="@mipmap/icon" android:label="@string/app_name" android:theme="@style/AppTheme">
<service android:enabled="true" android:exported="false" android:name=".MyService"/>
<receiver android:name=".MyBroadcastReceiver" android:exported="false"/>
<activity android:name=".MainActivity" android:exported="true">
<intent-filter>
<action android:name="android.intent.action.MAIN"/>
<category android:name="android.intent.category.LAUNCHER"/>
</intent-filter>
</activity>
</application>
</manifest>
</code></pre></div></div>
<p>However, Android also has what is known as an <code class="language-plaintext highlighter-rouge">Application</code> class that is used by all apps even if they do not customize it. The <code class="language-plaintext highlighter-rouge">Application</code> class always gets a call to the <code class="language-plaintext highlighter-rouge">onCreate</code> method as the very first callback to app code during the launch process, both for foregrounded and backgrounded apps. While it is not as common for Android developers to make a custom version of this, you can do so by simply adding an <code class="language-plaintext highlighter-rouge">android:name=</code> attribute to the <code class="language-plaintext highlighter-rouge"><application></code> tag in AndroidManifest.xml, and providing a corresponding Kotlin or Java implementation for this class.</p>
<p>Here’s the manifest change:</p>
<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code><manifest package="com.example.myapp"
xmlns:tools="http://schemas.android.com/tools"
xmlns:android="http://schemas.android.com/apk/res/android">
...
<application android:name=".MyApplication" android:icon="@mipmap/icon" android:label="@string/app_name" android:theme="@style/AppTheme">
...
</application>
</manifest>
</code></pre></div></div>
<p>And here is the class definition:</p>
<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>class MyApplication: Application() {
...
override fun onCreate() {
super.onCreate()
Log.d(TAG, "App just launched!")
}
}
</code></pre></div></div>
<p>The above <code class="language-plaintext highlighter-rouge">onCreate</code> method will get called whenever and however your app gets launched, even before any <code class="language-plaintext highlighter-rouge">Service</code>, <code class="language-plaintext highlighter-rouge">Activity</code>, or <code class="language-plaintext highlighter-rouge">BroadcastReceiver</code> gets created. It is super useful for centralizing app logic in a similar way to iOS, especially from the background.</p>
<p>Unlike iOS, Android typically does not directly start views from the <code class="language-plaintext highlighter-rouge">Application</code> class. This happens from <code class="language-plaintext highlighter-rouge">Activity</code> classes, which are created by the operating system <em>after</em> the <code class="language-plaintext highlighter-rouge">Application</code> class. Whether an <code class="language-plaintext highlighter-rouge">Activity</code> gets created depends on how the app is launched. If you launch by tapping on the app’s icon, Android creates an <code class="language-plaintext highlighter-rouge">Intent</code> that starts up the <code class="language-plaintext highlighter-rouge">Activity</code> with the <code class="language-plaintext highlighter-rouge">android.intent.action.MAIN</code> and <code class="language-plaintext highlighter-rouge">android.intent.category.LAUNCHER</code> declarations in your manifest. If the app is not launched this way, then that <code class="language-plaintext highlighter-rouge">Activity</code> won’t get started up at all.</p>
<p>Android always uses Intents to start up app components, and in addition to <code class="language-plaintext highlighter-rouge">Activity</code> instances, they can trigger <code class="language-plaintext highlighter-rouge">Service</code> and <code class="language-plaintext highlighter-rouge">BroadcastReceiver</code> instances. But the <code class="language-plaintext highlighter-rouge">Application</code> instance is special – there is one and only one of them, and it always gets created before anything else. The very first time you start an <code class="language-plaintext highlighter-rouge">Activity</code>, <code class="language-plaintext highlighter-rouge">Service</code> or <code class="language-plaintext highlighter-rouge">BroadcastReceiver</code> when the app is not yet running, an <code class="language-plaintext highlighter-rouge">Application</code> is created, then the Intent target gets created (if it doesn’t already exist). The second time you do so, no new <code class="language-plaintext highlighter-rouge">Application</code> gets created. In this way, the Android lifecycle is a bit more complex than iOS.</p>
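<p>To make this concrete, here is a sketch of explicitly starting each component type with an <code class="language-plaintext highlighter-rouge">Intent</code>, using the hypothetical component names from the manifest above:</p>
<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>// from inside an Activity or other Context:
// starts MyService, creating the Application first if the app is not yet running
startService(Intent(this, MyService::class.java))
// delivers an Intent to MyBroadcastReceiver, with the same launch rule
sendBroadcast(Intent(this, MyBroadcastReceiver::class.java))
// starts MainActivity, again creating the Application first if needed
startActivity(Intent(this, MainActivity::class.java))
</code></pre></div></div>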
<p>Expert’s note: For the sake of completeness, it is worth mentioning that there are rare instances where two copies of a single Android app can be running simultaneously with two Application instances. The typical case is apps that use special Service declarations in their manifest, indicating that the services must run in a separate process. If you aren’t doing that, it is safe to assume your app will always have exactly one Application instance.</p>
<h2 id="ios-and-android-launch-differently">iOS and Android Launch Differently</h2>
<p>Note that there is a fundamental difference between iOS and Android background launches. On iOS, the normal views (either <code class="language-plaintext highlighter-rouge">UIViewController</code> instances or <code class="language-plaintext highlighter-rouge">SwiftUI</code> views) get created, and your logic inside those views executes just as if your app was launched by tapping on the icon. The only difference is that the views exist only in the background, and are invisible to the user.</p>
<p>On Android, background launches normally do not create views. Any code that you want to execute on a background launch should not be inside an <code class="language-plaintext highlighter-rouge">Activity</code>, but inside an <code class="language-plaintext highlighter-rouge">Application</code>, <code class="language-plaintext highlighter-rouge">Service</code> or <code class="language-plaintext highlighter-rouge">BroadcastReceiver</code>.</p>
<h2 id="app-suspension-and-destruction">App Suspension and Destruction</h2>
<p>On both iOS and Android, launched apps continue to exist in memory even when they are no longer visible on the screen (or never have been).</p>
<p>On iOS, apps will typically be “suspended” by the operating system a few seconds after they go to the background, unless you make some declarations and do some tricks to get background running time. But suspended doesn’t mean the app is destroyed. It will stick around in memory with the same <code class="language-plaintext highlighter-rouge">AppDelegate</code> instance in case the user starts interacting with it again or an event fires to give it more running time. Only if memory gets low, the user kills the app, or the phone restarts is the app destroyed. At that point, the <code class="language-plaintext highlighter-rouge">AppDelegate</code> instance is gone and the whole process has to start all over again.</p>
<p>On Android, apps are not quickly suspended in the background, although if the phone is motionless with the screen off, “Doze” mode will shut down the CPU and the app won’t get any running time until the phone wakes up. Doze mode aside, code in <code class="language-plaintext highlighter-rouge">Activities</code> and <code class="language-plaintext highlighter-rouge">Services</code> can continue to run, although newer versions of Android will destroy them after about 10 minutes in the background (foreground services are a notable exception).</p>
<p>Even if the Android operating system decides to kill app components after extended background time, the <code class="language-plaintext highlighter-rouge">Application</code> object is typically not destroyed along with the <code class="language-plaintext highlighter-rouge">Activity</code>, <code class="language-plaintext highlighter-rouge">Service</code> and <code class="language-plaintext highlighter-rouge">BroadcastReceiver</code> instances, and continues to exist. The only time the <code class="language-plaintext highlighter-rouge">Application</code> object is destroyed is on full app destruction – this happens in low memory or battery conditions. This can also happen in other cases due to manufacturer-specific app killers in closed-source forked variants of Android – something particularly problematic with Chinese OEMs like Huawei, Redmi and OnePlus.</p>
<h2 id="background-launching-example---ios-location-beacon">Background Launching Example - iOS Location Beacon</h2>
<p>Starting up an app in the background on iOS can be tricky – you may have to declare specific background modes in the Info.plist and obtain specific runtime permissions from the user first. For Geofences and Bluetooth Beacons to wake an app, for example, you have to get “always” location permission from the user.</p>
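<p>As a sketch of the Info.plist side of that setup (the usage strings below are placeholders you should replace with your own), the location usage description keys must be present before iOS will even show the permission prompts:</p>
<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code><key>NSLocationWhenInUseUsageDescription</key>
<string>This app needs your location to detect nearby beacons.</string>
<key>NSLocationAlwaysAndWhenInUseUsageDescription</key>
<string>This app needs background location access to detect beacons when it is not on the screen.</string>
</code></pre></div></div>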
<p>If you have all of the above set up, you can then use the <code class="language-plaintext highlighter-rouge">CoreLocation</code> framework to monitor a beacon region:</p>
<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code> func application(_ application: UIApplication, didFinishLaunchingWithOptions launchOptions: [UIApplication.LaunchOptionsKey: Any]?) -> Bool {
let locationManager = CLLocationManager() // in real code, store this in a property so it is not deallocated
locationManager.delegate = self
locationManager.startMonitoring(for: CLBeaconRegion(uuid: UUID(uuidString: "2F234454-CF6D-4A0F-ADF2-F4911BA9FFA6")!, identifier: "my region"))
...
}
func locationManager(_ manager: CLLocationManager, didDetermineState state: CLRegionState, for region: CLRegion) {
if (state == .inside) {
NSLog("App is detecting Bluetooth Beacons matching the identifier pattern in \(region)")
}
else {
NSLog("App is NOT detecting Bluetooth Beacons matching the identifier pattern in \(region)")
}
}
</code></pre></div></div>
<p>In the above code, we start “monitoring” beacon regions using iBeacon – looking for when beacons become visible (“inside region”) and invisible (“outside region”). The code simply logs each time an inside/outside transition happens.</p>
<p>But the magical thing about iBeacon on iOS is that the transition between inside to outside or outside to inside can launch an app into the background. This is true even if the user killed the app manually, or has rebooted the phone since the last time the app was used.</p>
<p>Here’s the sequence:</p>
<ol>
<li>The iOS operating system’s Bluetooth radio detects the previously registered beacon identifier pattern in an over-the-air advertisement.</li>
<li>The iOS operating system’s CoreLocation framework launches the app.</li>
<li>The app starts up, calling the <code class="language-plaintext highlighter-rouge">didFinishLaunchingWithOptions</code> method of the AppDelegate, which itself sets up the CoreLocation delegate for any queued callbacks from the launch.</li>
<li>CoreLocation calls the <code class="language-plaintext highlighter-rouge">didDetermineState</code> delegate method to let the app know about the changes in beacon detections.</li>
</ol>
<h2 id="background-launching-example---android-location-beacon">Background Launching Example - Android Location Beacon</h2>
<p>Background startup on Android can be similarly tricky, requiring you to configure the manifest properly and obtain specific runtime permissions from the user first. For Bluetooth Beacons to wake an app, for example, you have to get <code class="language-plaintext highlighter-rouge">BLUETOOTH_SCAN</code> and <code class="language-plaintext highlighter-rouge">ACCESS_BACKGROUND_LOCATION</code> permissions from the user. Once this is set up, you can scan for Bluetooth beacons like this:</p>
<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>class MyApplication: Application() {
...
override fun onCreate() {
super.onCreate()
Log.d(TAG, "App just launched!")
val filters = ArrayList<ScanFilter>()
// Scan for any Bluetooth device with a specific hardware address.
val filter = ScanFilter.Builder().setDeviceAddress("aa:bb:cc:dd:ee:ff").build()
filters.add(filter)
val settings = ScanSettings.Builder().setScanMode(ScanSettings.SCAN_MODE_LOW_LATENCY).setCallbackType(ScanSettings.CALLBACK_TYPE_ALL_MATCHES).build()
val scanPendingIntent = PendingIntent.getBroadcast(this, 0, Intent(this, MyBroadcastReceiver::class.java), PendingIntent.FLAG_UPDATE_CURRENT or PendingIntent.FLAG_MUTABLE)
val bluetoothManager = this.getSystemService(Context.BLUETOOTH_SERVICE) as BluetoothManager
bluetoothManager.bluetoothAdapter.bluetoothLeScanner.startScan(filters, settings, scanPendingIntent)
}
}
class MyBroadcastReceiver: BroadcastReceiver() {
override fun onReceive(context: Context, intent: Intent) {
val bleCallbackType = intent.getIntExtra(
BluetoothLeScanner.EXTRA_CALLBACK_TYPE,
-1
)
if (bleCallbackType != -1) {
val scanResults = intent.getParcelableArrayListExtra<ScanResult>(BluetoothLeScanner.EXTRA_LIST_SCAN_RESULT)
for (scanResult in scanResults ?: ArrayList()) {
Log.d(TAG, "The app just detected a Bluetooth beacon: ${scanResult.scanRecord}")
}
}
}
}
</code></pre></div></div>
<p>In the above code, we start scanning for any Bluetooth device matching a specific hardware MAC address and log a message whenever one is detected. Note that the code to set this up is lower level than on iOS, because Android has no native support for Bluetooth Beacons. For an easier setup that looks a lot more like iOS, you can try my <a href="https://altbeacon.github.io/android-beacon-library/">Android Beacon Library</a>. But I want to show the nitty gritty details here so you can see how background launching works. Internally, that library does the equivalent of the above.</p>
<p>Just like on iOS, the code above will auto launch the app in the background on a new Beacon detection.</p>
<p>Here’s the sequence:</p>
<ol>
<li>The Android operating system’s Bluetooth radio sees an advertisement with a Bluetooth hardware identifier matching a <code class="language-plaintext highlighter-rouge">ScanFilter</code> set up in the above code.</li>
<li>The Android operating system’s Bluetooth component creates an <code class="language-plaintext highlighter-rouge">Intent</code> based on the <code class="language-plaintext highlighter-rouge">scanPendingIntent</code> in the above code. This Intent matches <code class="language-plaintext highlighter-rouge">MyBroadcastReceiver</code>, so Android knows to launch that component.</li>
<li>The app starts up, calling the <code class="language-plaintext highlighter-rouge">onCreate</code> method of the <code class="language-plaintext highlighter-rouge">Application</code> class.</li>
<li>Android constructs <code class="language-plaintext highlighter-rouge">MyBroadcastReceiver</code> and calls the <code class="language-plaintext highlighter-rouge">onReceive</code> method to let the app know about the changes in beacon detections.</li>
</ol>
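<p>For completeness, the permissions mentioned at the top of this section must also be declared in the manifest, something like the sketch below. The exact set varies by Android version, and the scan and location permissions must additionally be granted by the user at runtime:</p>
<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code><uses-permission android:name="android.permission.BLUETOOTH_SCAN"/>
<uses-permission android:name="android.permission.ACCESS_FINE_LOCATION"/>
<uses-permission android:name="android.permission.ACCESS_BACKGROUND_LOCATION"/>
</code></pre></div></div>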
<h2 id="keeping-the-app-running-in-the-background">Keeping the App Running in the Background</h2>
<p>As discussed before, both iOS and Android have limits to how long apps can run in the background. Even though your app is launched into the background, it may only get a few seconds of run time after the launch before it is suspended. Extending this run time is possible. Using advanced techniques, you can keep apps running in the background pretty much forever. Those details are a topic for another post.</p>
<h2 id="what-launched-my-app">What Launched My App?</h2>
<p>If you have multiple events launching your app in the background things can get confusing. During debugging you may wonder, “what launched my app?” How you figure this out is quite different between iOS and Android. Because it is much simpler on iOS, let’s start there.</p>
<p>On iOS, you can simply check the options dictionary in <code class="language-plaintext highlighter-rouge">didFinishLaunchingWithOptions</code>:</p>
<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>func application(_ application: UIApplication, didFinishLaunchingWithOptions launchOptions: [UIApplication.LaunchOptionsKey: Any]?) -> Bool {
if let launchOptionKey = launchOptions?.keys.first {
if launchOptionKey == .location {
NSLog("This app was launched by CoreLocation.")
}
NSLog("Launch option info is: \(String(describing: launchOptions?[launchOptionKey]))")
}
else {
NSLog("This app was launched manually by the user")
}
...
</code></pre></div></div>
<p>You can see the full list of <code class="language-plaintext highlighter-rouge">UIApplication.LaunchOptionsKey</code> values <a href="https://developer.apple.com/documentation/uikit/uiapplication/launchoptionskey">here</a>. This documents some of the several ways that iOS apps may be launched into the background.</p>
<p>Also note that the LaunchOptions dictionary includes both keys and values. The values may give more specifics about why the launch type indicated by the key happened (e.g. a specific Geofence entry/exit for location, a specific URL being handled by your app, etc.) The details of the information you get vary by launch type.</p>
<h2 id="determining-app-launch-reason-on-android">Determining App Launch Reason on Android</h2>
<p>On Android, figuring out why an app was launched is not so simple. The <code class="language-plaintext highlighter-rouge">Application#onCreate</code> method has no equivalent to the iOS LaunchOptions mentioned above. In fact, there are no public APIs that you can use to see why the app was launched from inside this method.</p>
<p>This is true because of the differences in Android’s architecture, which organizes apps into <code class="language-plaintext highlighter-rouge">Activity</code>, <code class="language-plaintext highlighter-rouge">Service</code>, and <code class="language-plaintext highlighter-rouge">BroadcastReceiver</code> components. Any of these three component types can be started up via an Android <code class="language-plaintext highlighter-rouge">Intent</code>, and if the parent app is not already running, the <code class="language-plaintext highlighter-rouge">Application</code> is created first and <code class="language-plaintext highlighter-rouge">Application#onCreate</code> is called. Only after this method finishes executing does Android start up the actual component being launched.</p>
<p>If you have an app that creates these three types of components and want to know which started up the app in the <code class="language-plaintext highlighter-rouge">Application</code> class, you can add code to these components to <strong>tell</strong> the <code class="language-plaintext highlighter-rouge">Application</code> class how it was launched in a separate method that executes <strong>after</strong> <code class="language-plaintext highlighter-rouge">onCreate</code>. The example below shows how this might be done. Note that a new custom <code class="language-plaintext highlighter-rouge">onComponentStart</code> method in your Application class will log each time an app component starts up, and indicate whether this launched the entire app, or whether the app was already started when the component started. Again, this new <code class="language-plaintext highlighter-rouge">onComponentStart</code> call will be <em>after</em> <code class="language-plaintext highlighter-rouge">onCreate</code>, because that is just how Android works.</p>
<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>// Modify your application class like this:
class MyApplication: Application() {
var launchComponent: Any? = null
fun onComponentStart(component: Any, intent: Intent?) {
var componentStartType = "started after previous app launch"
if (launchComponent == null) {
componentStartType = "launched app"
launchComponent = component
}
// Component launched app: MainActivity with intent Intent { act=android.intent.action.MAIN cat=[android.intent.category.LAUNCHER] flg=0x10000000 cmp=org.altbeacon.beaconreference/.MainActivity }
// Component started after previous app launch: MainActivity with intent Intent { flg=0x10000000 cmp=org.altbeacon.beaconreference/.MainActivity }
Log.d(TAG, "Component $componentStartType: ${component.javaClass.simpleName} with intent: $intent ")
}
...
}
// add this to each Activity or Service
class MainActivity : AppCompatActivity() {
override fun onCreate(savedInstanceState: Bundle?) {
super.onCreate(savedInstanceState)
(this.application as MyApplication).onComponentStart(this, intent)
...
}
...
}
// add this to each Broadcast Receiver
class MyBroadcastReceiver: BroadcastReceiver() {
override fun onReceive(context: Context, intent: Intent) {
(context.applicationContext as MyApplication).onComponentStart(this, intent)
...
}
}
</code></pre></div></div>
<p>Take care that if your app uses third party libraries that expose other <code class="language-plaintext highlighter-rouge">Activity</code>, <code class="language-plaintext highlighter-rouge">Service</code>, and <code class="language-plaintext highlighter-rouge">BroadcastReceiver</code> components, then the above changes won’t be enough – you’ll also need to modify those third party components to do the same.</p>
<h1>Forever Ranging</h1>
<p>2023-02-10</p>
<p>Apple has always placed restrictions on how long iPhone apps can range for Bluetooth beacons in the background. While apps can range for unlimited periods of time as long as they are in the foreground (visible on the screen), once they go to the background, ranging updates generally stop after 10 seconds. They may resume again for another 10 seconds on certain events like a beacon region entry or exit, but it won’t last long.</p>
<h2 id="make-it-last">Make It Last</h2>
<p>The good news is that it is possible to keep ranging going forever. Setting it up is a bit tricky, and it will use significant battery if you are around beacons for long periods of time. But it is legal for App Store distribution provided that your app makes it clear that it uses location in the background for an approved purpose.</p>
<p>Here’s what you need to do:</p>
<h3 id="app-setup">App Setup:</h3>
<ol>
<li>Put the following declaration in your Info.plist
<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code><key>UIBackgroundModes</key>
<array>
<string>location</string>
</array>
</code></pre></div> </div>
</li>
<li>Write code to obtain <code class="language-plaintext highlighter-rouge">NSLocationAlways</code> permission from the user.</li>
</ol>
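<p>Step 2 might look like this minimal sketch. iOS requires “when in use” authorization to be granted before it will show the “always” upgrade prompt:</p>
<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>let locationManager = CLLocationManager()
// first prompt: when-in-use authorization
locationManager.requestWhenInUseAuthorization()
// later, once when-in-use is granted, prompt to upgrade to always
locationManager.requestAlwaysAuthorization()
</code></pre></div></div>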
<h3 id="starting-ranging">Starting Ranging</h3>
<p>When it’s time to start ranging, you’ll need to perform some additional steps.</p>
<ol>
<li>Start beacon ranging using the normal APIs.</li>
<li>Start location updates at the lowest power setting (basically just using cell radio) with:
<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code> locationManager.pausesLocationUpdatesAutomatically = false
locationManager.desiredAccuracy = kCLLocationAccuracyThreeKilometers
locationManager.distanceFilter = 3000.0
if #available(iOS 9.0, *) {
locationManager.allowsBackgroundLocationUpdates = true
} else {
// not needed on earlier versions
}
// start updating location at beginning just to give us unlimited background running time
locationManager.startUpdatingLocation()
</code></pre></div> </div>
</li>
<li>Start a background task:
<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code> // assumes your class defines a property: var backgroundTask: UIBackgroundTaskIdentifier = .invalid
NSLog("Attempting to extend background running time")
self.backgroundTask = UIApplication.shared.beginBackgroundTask(withName: "DummyTask", expirationHandler: {
NSLog("Background task expired by iOS.")
UIApplication.shared.endBackgroundTask(self.backgroundTask)
})
var lastLogTime = 0.0
DispatchQueue.global().async {
let startedTime = Int(Date().timeIntervalSince1970) % 10000000
NSLog("*** STARTED BACKGROUND THREAD")
while(true) {
DispatchQueue.main.async {
let now = Date().timeIntervalSince1970
let backgroundTimeRemaining = UIApplication.shared.backgroundTimeRemaining
if abs(now - lastLogTime) >= 2.0 {
lastLogTime = now
if backgroundTimeRemaining < 10.0 {
NSLog("About to suspend based on background thread running out.")
}
if (backgroundTimeRemaining < 200000.0) {
NSLog("Thread \(startedTime) background time remaining: \(backgroundTimeRemaining)")
}
else {
//NSLog("Thread \(startedTime) background time remaining: INFINITE")
}
}
}
sleep(1)
}
NSLog("*** EXITING BACKGROUND THREAD")
}
</code></pre></div> </div>
</li>
</ol>
<p>The above will keep giving you ranging updates forever, even in the background, until your phone reboots, the user kills your app, or the operating system kills your app due to low memory.</p>
<p>The data from the location update is not needed, but turning on location updates is a signal to the operating system (combined with the location background mode we set up in the Info.plist) that it should let your app keep running in the background indefinitely. The background task lets iOS know that the app is processing this data and therefore needs the running time.</p>
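<p>For reference, the “normal APIs” for starting ranging in step 1 look roughly like this sketch (the UUID is just an example):</p>
<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>locationManager.startRangingBeacons(satisfying: CLBeaconIdentityConstraint(uuid: UUID(uuidString: "2F234454-CF6D-4A0F-ADF2-F4911BA9FFA6")!))

func locationManager(_ manager: CLLocationManager, didRange beacons: [CLBeacon], satisfying beaconConstraint: CLBeaconIdentityConstraint) {
NSLog("Ranged \(beacons.count) beacons")
}
</code></pre></div></div>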
<h2 id="battery-impact">Battery Impact</h2>
<p>Whenever you have beacon ranging enabled on iOS, the Bluetooth chip is doing a scan at the highest duty cycle. The radio receiver is on 100% of the time. While modern iPhones and their batteries are pretty tolerant to Bluetooth scanning for limited periods of time, if you leave this on all day, the user will notice the battery draining about twice as quickly as normal.</p>
<p>So be careful with this – don’t just turn on ranging forever without letting the user know (and getting their consent), because doing so will use a lot of battery.</p>
<p>The background location update we requested, on the other hand, uses very little battery. This is because the desired accuracy we requested, <code class="language-plaintext highlighter-rouge">kCLLocationAccuracyThreeKilometers</code>, only uses the cell radio for location. Each time the cell tower changes, the app gets a location update from that cell tower. Since the cell radio is on anyway, this causes no additional battery drain.</p>
<h2 id="safe-for-the-app-store">Safe for the App Store</h2>
<p>All of the above are legal techniques appropriate for apps published in the App Store. However, because you are requesting a location background mode in Info.plist, your app must have some obvious user-facing benefit for using location in the background. If you have an app that navigates you around an obstacle course or tracks your umbrella to keep you from leaving it behind, this will probably be okay.</p>
<p>But if your app is a video game that passively looks for beacons to send location-targeted advertising to the user, that will probably be rejected.</p>
<h1>The Rise and Fall of the Foreground Service</h1>
<p>2022-06-25</p>
<p><img src="/images/services.png" alt="" width="320px" /></p>
<p>Once upon a time, building an Android app to do work in the background was easy: Just build an Android Service, and you could run whatever code you wanted for as long as you wanted. But things started to change with Android 8 and again with Android 12. <strong>Today, Android Services are often more trouble than they are worth.</strong></p>
<h2 id="the-rise-of-the-foreground-service">The Rise of the Foreground Service</h2>
<p>With Android versions 8-11, Google limited Android Services to run only about 10 minutes in the background before being force stopped by the operating system. Yet Google left a big loophole with what is known as a Foreground Service. These once rarely used service types were introduced back in 2009 with Android 2.0. They work like a regular service except they also show a persistent notification to indicate to the user that the service is running.</p>
<p>Foreground Services have always been confusing to developers, not least because the name is super misleading. Running a Foreground Service does nothing to alter whether your app is treated as being “in the foreground” by Android. An Android app is considered in the foreground if and only if the screen is unlocked and one of the app’s activities is visible on the screen (not just a notification). If either of these conditions is false, the operating system treats the app as being in the background, and any restrictions for backgrounded apps still apply – regardless of whether a Foreground Service is active. This is a critical distinction, because over the years Android has added lots of little restrictions on what apps can do when they are not in the foreground.</p>
<p><strong>But Google’s Android 8 loophole for Foreground Services made them super popular.</strong> A Foreground Service, unlike a regular Service, could run for an unlimited amount of time in the background. An app would just have to make this obvious to the user by having that persistent notification visible. This is what Google Maps does to keep running navigation in the background on those multi-hour car trips.</p>
<h2 id="the-fall-of-the-foreround-service">The Fall of the Foreground Service</h2>
<p>This worked great for four years, until Google cracked down on Services again last year with Android 12. The new restrictions limit when an app may start a Foreground Service. If the app is in the foreground already, no problem. But <strong>if the app is in the background, then starting a foreground service is generally prohibited</strong> – unless one of a dozen often <a href="https://developer.android.com/guide/components/foreground-services#background-start-restriction-exemptions">obscure special cases</a> applies.</p>
<p>What’s worse, it is often unclear whether a special case applies, and there is no API to ask Android if one applies. You just have to try to start the service and see if Android throws an exception:</p>
<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>try {
val intent = Intent(context, MyService::class.java)
context.startForegroundService(intent)
}
catch (e: ServiceStartNotAllowedException) {
// This happens on Android 12+ if you try to start a service from the background
// without a qualifying event. Why was it not allowed? What the heck do we do now?
}
</code></pre></div></div>
<p>This can be very frustrating. For example, the <a href="https://developer.android.com/guide/components/foreground-services#background-start-restriction-exemptions">Android docs</a> say that any event requiring a BLUETOOTH_SCAN permission should allow a Foreground Service to be started. But whenever I tried the above code right after receiving a Bluetooth scan callback, I always got the exception. After spending hours digging through Android Open Source files, I saw no evidence of any special privileges granted by Bluetooth scans – only by Bluetooth connection events. So I needed alternatives.</p>
<h2 id="the-death-of-the-android-service">The Death of the Android Service?</h2>
<p>Android’s official docs suggest using WorkManager, which is just a fancy backward-compatible wrapper around scheduled jobs. A JobService is a specialized type of Android Service designed for one-time or periodic tasks lasting up to a few minutes.</p>
<p>But sometimes you need to perform tasks all the time in near real-time. This is often the case with Bluetooth or other radio signal-based apps that require you to perform work at the very moment another radio happens to be nearby. You can’t just wait up to 15 minutes for the Android job scheduler to kick in and hope that the external radio will still be around at that time.</p>
<p>Given that Android Services have become such a pain, why do we use them at all?</p>
<p>When I originally posed this question, it was rhetorical. I believed that Android Services were the only way to do work in the background, or that they had special privileges that code running outside a service does not. But then I checked my premises. As it turns out, that’s simply not true. Android Services are not needed to run code in the background, and using them grants no special background-running capabilities. In fact, as described above, using an Android Service unleashes a whole host of new restrictions on running in the background.</p>
<p>The only reason apps use Android Services for background work is because Android’s original designers intended them to be used that way, and designed a bunch of APIs into the SDK to make it easy to set up componentized background code and have each component be able to communicate with other components. This made great sense back when Android Services were flexible and unhindered by today’s restrictions. Why not use the tools that Android designers laid out for you? As people did this, more and more tutorials were written and questions answered on StackOverflow.com. Today, Android developers generally believe that if you want to run code in the background, you must use some kind of Android Service, even if it is relegated to a JobService managed by the WorkManager.</p>
<p>But as Yoda would say, you must unlearn. You don’t have to use Android Services. The constructs made sense once upon a time, but today they are like an over-regulated municipality. The Android citizenry needs to wake up, and move outside the city limits to regain its freedom.</p>
<h2 id="an-alternative-approach">An Alternative Approach</h2>
<p>Here’s an example of what you can do instead: use a combination of BroadcastReceivers and threads. Much like a Service, a BroadcastReceiver can be used to receive all kinds of sensor, radio, or other events (BOOT_COMPLETED is a super important one) and trigger your app to start executing code. You declare one in your AndroidManifest.xml like this:</p>
<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code><receiver android:name="MyBroadcastReceiver" android:exported="false">
<intent-filter>
<action android:name="android.intent.action.BOOT_COMPLETED"/>
</intent-filter>
</receiver>
</code></pre></div></div>
<p>And then write code to back it like this:</p>
<div class="language-kotlin highlighter-rouge"><div class="highlight"><pre class="highlight"><code> <span class="kd">class</span> <span class="nc">MyBroadcastReceiver</span><span class="p">:</span> <span class="nc">BroadcastReceiver</span><span class="p">()</span> <span class="p">{</span>
<span class="k">override</span> <span class="k">fun</span> <span class="nf">onReceive</span><span class="p">(</span><span class="n">context</span><span class="p">:</span> <span class="nc">Context</span><span class="p">?,</span> <span class="n">intent</span><span class="p">:</span> <span class="nc">Intent</span><span class="p">?)</span> <span class="p">{</span>
<span class="k">if</span> <span class="p">(</span><span class="nc">Intent</span><span class="p">.</span><span class="nc">ACTION_BOOT_COMPLETED</span><span class="p">.</span><span class="nf">equals</span><span class="p">(</span><span class="n">intent</span><span class="o">?.</span><span class="n">action</span><span class="p">))</span> <span class="p">{</span>
<span class="nc">Log</span><span class="p">.</span><span class="nf">d</span><span class="p">(</span><span class="nc">TAG</span><span class="p">,</span> <span class="s">"The phone just booted."</span><span class="p">)</span>
<span class="c1">// Set up for any other programmatic broadcasts here</span>
<span class="kd">val</span> <span class="py">filter</span> <span class="p">=</span> <span class="nc">IntentFilter</span><span class="p">(</span><span class="nc">BluetoothAdapter</span><span class="p">.</span><span class="nc">ACTION_CONNECTION_STATE_CHANGED</span><span class="p">)</span>
<span class="n">context</span><span class="o">?.</span><span class="nf">registerReceiver</span><span class="p">(</span><span class="k">this</span><span class="p">,</span> <span class="n">filter</span><span class="p">)</span>
<span class="p">}</span>
<span class="c1">// This cannot execute for more than 10 seconds otherwise Android kills your app</span>
<span class="p">}</span>
<span class="p">}</span>
</code></pre></div></div>
<p>In the example above, BOOT_COMPLETED is especially useful because, as of Android 8, many other broadcasts can no longer be registered in the manifest — you have to register for them programmatically. The boot receiver gives you an easy way to do that so your app is always listening for them.</p>
<p>The trick with a BroadcastReceiver is that the operating system gives you only 10 seconds to exit the onReceive method (called on the main thread) before it kills your app with an Application Not Responding condition. But don’t worry – just start a new thread. Then you can execute whatever code you want for as long as you want. Like this:</p>
<div class="language-kotlin highlighter-rouge"><div class="highlight"><pre class="highlight"><code><span class="kd">val</span> <span class="py">executor</span> <span class="p">=</span> <span class="nc">Executors</span><span class="p">.</span><span class="nf">newFixedThreadPool</span><span class="p">(</span><span class="mi">1</span><span class="p">)</span>
<span class="n">executor</span><span class="p">.</span><span class="nf">execute</span><span class="p">(</span><span class="nc">Runnable</span> <span class="p">{</span>
<span class="k">while</span> <span class="p">(</span><span class="k">true</span><span class="p">)</span> <span class="p">{</span>
<span class="nc">Log</span><span class="p">.</span><span class="nf">d</span><span class="p">(</span><span class="nc">TAG</span><span class="p">,</span> <span class="s">"I will log this line every 10 seconds forever"</span><span class="p">)</span>
<span class="nc">Thread</span><span class="p">.</span><span class="nf">sleep</span><span class="p">(</span><span class="mi">10000</span><span class="p">)</span>
<span class="p">}</span>
<span class="p">})</span>
</code></pre></div></div>
<p>The code above doesn’t do much – it just prints a log line every 10 seconds forever. But you can do whatever you want here – scan for Bluetooth devices, make network calls, etc. And because the code runs on its own thread, not the main thread, it doesn’t violate any Android rules. Voila! We are now running code indefinitely regardless of Android’s service restrictions!</p>
<p>I used these same techniques to make the Android Beacon Library be able to range for Bluetooth Beacons continually without an Android Foreground Service.</p>
<p>You can try this example yourself in <a href="https://github.com/davidgyoung/Serviceless">this GitHub repo</a>. The app has no UI – just a blank white screen. But run the code and view LogCat and watch its logging keep going. The code in the repo organizes things a bit more by adding a custom Application class, and moving all the initialization to the onCreate method instead of the BroadcastReceiver. Putting code in the Application onCreate method is a great way to ensure code is executed just once each time something starts the app (unless you configure a multi-process app – and please don’t!) If you run this code it will keep logging every 10 seconds forever. If you reboot the phone, it will automatically restart and do the same.</p>
<p>Of course, this roll-your-own approach doesn’t mean that the code actually will run forever. The operating system will still go into deep sleep and shut down the CPU at various times, causing the code to pause for awhile before resuming. (This is also the way code running inside Android Services behaves.) And your app can still crash, get terminated due to low memory, or be shut down by proprietary task killers (also just like regular Android Services). But these are normal caveats, and there are solutions for all these issues. The key point is that the sample code doesn’t break any Android rules – it is perfectly legal to not use Android Services for long-running background work.</p>
<p>Some will certainly argue that such a roll-your-own approach is a bad practice, and that you should use standard design frameworks like Android Services whenever possible. While this may be true, it is also true that the increasing restrictions that Google has placed on Android Services mean that “whenever possible” is becoming increasingly rare.</p>
Eddystone is Dead, Long Live Eddystone!2021-10-09T00:00:00+00:00http://davidgyoungtech.com/2021/10/09/eddystone-is-dead-long-live-eddystone<p>If you’ve used Google products for more than a few years, you probably know to approach them with caution. A Google product can work great and have a loyal following, but that doesn’t mean the company won’t <a href="https://killedbygoogle.com">axe it with short notice like hundreds of other products</a>, leaving users howling with frustration.</p>
<p>For users of Google’s Bluetooth beacon platform, 2021 was the year to howl. In April, the company shut down its cloud-based system for managing beacons, along with the Google Play Services features that delivered associated content to third-party Android and iOS apps. The system was never hugely popular, but folks who built their apps and products on top of these services were thrown under the bus by the Silicon Valley giant.</p>
<h2 id="eddystone-is-not-deprecated">Eddystone is Not Deprecated</h2>
<p>This abandoned system is often confused with Google’s Eddystone Bluetooth beacon standard. Many people mistakenly think the Eddystone format itself was deprecated by Google. Not so. This open beacon standard was released in 2015 to compete with Apple’s proprietary iBeacon standard (long ago <a href="http://www.davidgyoungtech.com/2013/10/01/reverse-engineering-the-ibeacon-profile">reverse engineered and published</a>) and the open source <a href="https://github.com/AltBeacon/spec">AltBeacon standard.</a></p>
<p>Beacon formats are called “standards” because they are just simple blueprints that anybody can implement. Think of it like a Phillips screwdriver. While any one company can stop making screws or tools that meet the Phillips screwdriver standard, everybody else can just keep making and using them. So it is with Eddystone and its Eddystone-UID, Eddystone-TLM, Eddystone-URL variants.</p>
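<p>These blueprints really are just published byte layouts. As a concrete illustration, here is a hypothetical Java helper (my own, not from any SDK) that unpacks an Eddystone-UID frame using the offsets from the published spec: byte 0 is the 0x00 frame type, byte 1 is a TX power byte, bytes 2–11 are the 10-byte namespace, and bytes 12–17 are the 6-byte instance id:</p>

```java
// Hypothetical helper (not from any SDK): unpack an Eddystone-UID frame from
// the raw service data bytes advertised under the Eddystone 0xFEAA service UUID.
public class EddystoneUidParser {

    // Returns {namespaceHex, instanceHex}, or null if this is not a UID frame.
    public static String[] parse(byte[] serviceData) {
        if (serviceData == null || serviceData.length < 18 || serviceData[0] != 0x00) {
            return null; // byte 0 is the frame type; 0x00 means Eddystone-UID
        }
        return new String[] {
            hex(serviceData, 2, 12),   // bytes 2-11: 10-byte namespace
            hex(serviceData, 12, 18)   // bytes 12-17: 6-byte instance id
        };
    }

    private static String hex(byte[] bytes, int from, int to) {
        StringBuilder sb = new StringBuilder();
        for (int i = from; i < to; i++) {
            sb.append(String.format("%02x", bytes[i]));
        }
        return sb.toString();
    }
}
```

<p>Feed it the service data bytes from any scanner on any platform and you get back the two identifiers as hex strings – no Google service required.</p>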
<p>The Eddystone and AltBeacon beacon formats are in the public domain (they carry open source licenses from Google and Radius Networks, respectively) and nobody can stop folks from using them. And to be clear, Google never even suggested people should stop. To this day, Google continues to <a href="https://github.com/google/eddystone">publish the open Eddystone standards on their Github account</a>. The only thing they have deprecated is their beacon cloud platform, which worked with not just Eddystone but also iBeacon and AltBeacon.</p>
<h2 id="eddystone-without-google-services">Eddystone Without Google Services</h2>
<p>Bluetooth beacons, Eddystone or otherwise, continue to work fine on Android phones. Eddystone beacon formats not only work fine without Google’s services, they are even more valuable, more stable, and more secure. That’s because you no longer have to worry about Google sharing your beacon data or throwing your company under the bus by shutting down required services. Any time you build a system that relies on third party services like the Google Beacon Platform, you are exposing yourself to a huge risk. If that company abandons the service or changes the terms in some unacceptable way, you have to start over.</p>
<p>Back when Google was first developing Eddystone, I was working for Radius Networks (which had developed the open source AltBeacon standard) and we had several collaborative meetings with Google’s Eddystone team. I quickly realized that Google strategists were using beacons as a vehicle to push people into their cloud services, probably so that they could use and monetize third-party beacon networks themselves. To me, this sounded like a terrible idea for the beacon community.</p>
<p>At the time, my boss told me that this was the end of the <a href="https://altbeacon.github.io/android-beacon-library/">Android Beacon Library</a>, my open-source SDK that had become the de-facto standard on Android – Google, he said, was taking over. I smiled to myself, thinking of Google’s short attention span. Six years later, the Google behemoth has thrown in the towel, but the Android Beacon Library (developed largely by one guy) is still going strong. Oh, and did I mention my Android Beacon Library still supports Eddystone, too?</p>
<h2 id="the-death-of-eddystone-eid">The Death of Eddystone-EID</h2>
<p>There is one Google beacon format that is dead for most people – Eddystone-EID. This is a special format that uses a rotating crypto hash of the beacon identifier to make it harder for other people to spoof a beacon or freeload on its transmissions without permission. It relies on a piece of server software called a “trusted resolver” to convert the scrambled identifier transmitted over the air into an identifier that is consistent and usable.</p>
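<p>To make the rotation idea concrete, here is a simplified Java sketch of the concept – my own illustration, not the actual EID derivation defined in the spec – in which a time counter is encrypted with a shared identity key and truncated, so only a resolver holding the same key can map the transmitted value back to a stable identity:</p>

```java
import javax.crypto.Cipher;
import javax.crypto.spec.SecretKeySpec;
import java.util.Arrays;

// Simplified illustration of an EID-style rotation (NOT the real spec algorithm):
// encrypt a time counter with a shared identity key and truncate the result.
public class RotatingId {

    public static byte[] compute(byte[] identityKey, int timeCounter) {
        try {
            byte[] block = new byte[16];
            for (int i = 0; i < 4; i++) { // counter in the last 4 bytes, big-endian
                block[12 + i] = (byte) (timeCounter >>> (8 * (3 - i)));
            }
            Cipher cipher = Cipher.getInstance("AES/ECB/NoPadding");
            cipher.init(Cipher.ENCRYPT_MODE, new SecretKeySpec(identityKey, "AES"));
            return Arrays.copyOf(cipher.doFinal(block), 8); // truncated, like the 8-byte EID
        } catch (Exception e) {
            throw new IllegalStateException(e);
        }
    }
}
```

<p>Successive identifiers look unrelated to an observer without the key; the resolver simply recomputes the expected value for each registered key and time window, which is exactly the server-side work a “trusted resolver” performs.</p>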
<p>While Eddystone-EID is an open standard, it is pretty much worthless without a “trusted resolver”. Because Google’s deprecated cloud platform was the only publicly available “trusted resolver” for Eddystone-EID, if you want to use this format today, you have to build your own. Building a trusted resolver is a non-trivial exercise – I know because I had to build my own to test the Eddystone-EID beacons I was making before Google opened theirs up to the public. I don’t recommend building one yourself.</p>
<h2 id="why-google-cloud-services-abandoned-eddystone">Why Google Cloud Services Abandoned Eddystone</h2>
<p>While Google engineers had some great ideas with Eddystone, some of the company’s ideas were terrible. The original idea of the Eddystone-URL was called the “physical web” meaning that you would connect the World Wide Web with the physical world by having beacon transmitters send out a URL to bring up on your phone. A good example of where this might be useful is at a historical signpost. Instead of a $2,000 bronze plaque with 200 words, you could just put up a $20 beacon and have it send a link to your phone, allowing you to bring up the Wikipedia page with as much info as you’d ever want to know.</p>
<p>But then Google’s ad team got their hands on the engineers’ idea. They pushed a product called Google Nearby that would show beacon-based notifications directly on your Android phone, and bundled this with Google Play Services, which is installed on most Android phones outside of China. You can guess what happened next – users were bombarded with spammy notifications about shoe sales. In the days where notification fatigue was still setting in, Google was on the cutting edge of annoying us. Somebody in the company with half a brain finally shut down this service in 2018. But come on, who couldn’t see this coming?</p>
<p>As for the rest of Google’s beacon cloud platform, nobody can say for sure why it finally got the axe. Why does Google kill any of the hundreds of products it abandons? There are certainly equipment and labor costs to keeping beacon services going – the operations team needs to keep the servers up and the support team needs to answer developer questions. If Google Nearby had not been a big disaster and if the cloud platform had otherwise been more popular it might have had a chance. But like at any big company, the final decision usually comes down to the lack of an executive willing to defend a product.</p>
<h2 id="should-you-still-use-bluetooth-beacons-and-eddystone">Should You Still Use Bluetooth Beacons and Eddystone?</h2>
<p>Yes! Google’s short attention span shouldn’t influence the rest of us. There are all kinds of new and exciting use cases for bluetooth beacons that haven’t even been considered yet. Not only are iBeacon and AltBeacon formats great to use in the future, most Eddystone formats are too (except maybe Eddystone-EID).</p>
<p>Eddystone-UID, in fact, has some niche advantages over both AltBeacon and iBeacon. When making iOS apps, the iBeacon format is almost always the best choice as it is detected very quickly. But iBeacon has the disadvantage of only having four usable bytes (the major and minor) for data transfer on iOS. The AltBeacon format gives you much more space for data transfer, but it cannot be detected by a backgrounded app on iOS. Eddystone-UID, however, offers the best of both worlds: more space for data transfer, and it can be detected in the background on iOS. Because of this advantage, I regularly use Eddystone-UID on new projects where background data transfer on iOS is important.</p>
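<p>To see why those byte counts matter, here is a hypothetical Java helper (for illustration only) that packs an application value into iBeacon’s major and minor fields – everything must squeeze into two unsigned 16-bit integers, while an Eddystone-UID instance id alone gives you six bytes:</p>

```java
// Hypothetical helper: pack a value into iBeacon's major/minor fields.
// Only 4 bytes (two unsigned 16-bit integers) are usable for app data on iOS.
public class MajorMinorPacker {

    // Returns {major, minor}; throws if the value needs more than 4 bytes.
    public static int[] pack(long value) {
        if (value < 0 || value > 0xFFFFFFFFL) {
            throw new IllegalArgumentException("value must fit in 4 bytes");
        }
        int major = (int) ((value >>> 16) & 0xFFFF);
        int minor = (int) (value & 0xFFFF);
        return new int[] { major, minor };
    }
}
```

<p>Anything larger than 32 bits – a device serial number plus a sensor reading, say – simply won’t fit, which is exactly when the roomier Eddystone-UID layout earns its keep.</p>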
<p>Just because Google’s web services can’t find a use for Eddystone doesn’t mean that you can’t. The Phillips screw company has long since moved on from their namesake screw patent from 75 years ago. But their screw standard remains the most popular in the world today. Likewise, the Eddystone standard will remain widely used outside of Google for many decades to come.</p>
Is That a Scanner in Your Pocket?2021-08-15T00:00:00+00:00http://davidgyoungtech.com/2021/08/15/is-that-a-scanner-in-your-pocket<h2 id="bluetooth-scanning-with-embedded-systems">Bluetooth Scanning With Embedded Systems</h2>
<p>Since the invention of Bluetooth LE, the most common use case has been to use a large device like a phone or laptop to scan for and connect to small peripheral devices like earpieces, health trackers and various remote sensors. But some of the more innovative projects turn this idea upside down, building small devices that do Bluetooth LE scanning themselves, looking for phones, laptops, cars, and even other small peripherals that happen to be nearby.</p>
<p>During the pandemic, countries like Singapore used small Bluetooth scanners like this to bring automated contact tracing to elderly residents without smart phones. All kinds of other use cases are possible including making hardware devices sense their environment and even giving them social awareness.</p>
<p>It’s common knowledge in the tech community that while Android and iOS devices are usually used in Bluetooth LE central mode (scanning for and connecting to peripherals), they can also work in peripheral mode (advertising themselves so others may scan for them.) What tech folks less often realize is that many embedded platforms allow the same kind of dual operation. A small plastic package nearby might be scanning you right now.</p>
<h3 id="choose-your-foundation-carefully">Choose Your Foundation Carefully</h3>
<p>If you want to build a product that relies on distributed Bluetooth scanning, you’ve got to pick your hardware platform wisely as it will be the foundation of your system. Special care should be taken to figure out your budget for per-unit hardware, up-front hardware design and manufacturing, software development, and power. You also need to figure out how you are going to access the information gathered by the device. You may be able to use onboard networking (e.g. WiFi), cellular, or special radio networks like SigFox or LoRa. In a pinch you can transfer data over Bluetooth connections, too.</p>
<p>Below is a comparison of some of the different scanning platforms available, showing the code needed to set up scanning on each.</p>
<h4 id="ios-and-android">iOS and Android</h4>
<p>These platforms excel in that they require no custom manufacturing, they are easy to program, have integrated cellular and WiFi radios, and they have relatively large built-in batteries. Android models may be quite cheap as well. It’s really hard to compete with existing manufacturing economies of scale. Take advantage of this, and don’t build your own hardware if you don’t have to.</p>
<p>Of the two platforms, iOS is much less flexible for scanning. You cannot control the scan rate, and unless your code is running with a visible app on the screen, the scan rate will be low. If the screen is on, your battery drain will be terrible. If you can live with these serious limitations, you can start a scan like this (using Swift):</p>
<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>let centralManager = CBCentralManager(delegate: self, queue:DispatchQueue.global(qos: .default))
centralManager.scanForPeripherals(withServices: nil, options: [CBCentralManagerScanOptionAllowDuplicatesKey: true])
...
// When devices are detected, this method will be called
func centralManager(_ central: CBCentralManager, didDiscover peripheral: CBPeripheral, advertisementData: [String : Any], rssi RSSI: NSNumber) {
if let serviceData = advertisementData[CBAdvertisementDataServiceDataKey] as? [NSObject:AnyObject] {
// TODO: do something with the advertisementData here
}
}
</code></pre></div></div>
<p>Android, by contrast, gives you much more flexibility. You can scan with the screen off at higher rates. All kinds of form factors are available and you can buy very cheap reference designs. Just be sure to test them for quality before you go too far. With Android, starting a scan looks like this (using Java):</p>
<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>BluetoothManager bluetoothManager = (BluetoothManager) context.getApplicationContext().getSystemService(Context.BLUETOOTH_SERVICE);
BluetoothAdapter bluetoothAdapter = bluetoothManager.getAdapter();
BluetoothLeScanner scanner = bluetoothAdapter.getBluetoothLeScanner();
scanner.startScan(scanCallback);
...
ScanCallback scanCallback = new ScanCallback() {
// When devices are detected, this method will be called
@Override
public void onScanResult(int callbackType, ScanResult scanResult) {
// TODO: do something with the scanResult here
}
...
};
</code></pre></div></div>
<h4 id="raspberry-pi">Raspberry Pi</h4>
<p>A number of Raspberry Pi form factors are available with Bluetooth LE built in. Most models also have WiFi connectivity for offloading data. The hardware is quite cheap and requires no custom manufacturing unless you want a snazzier case than you can buy off the shelf. Programming is almost as easy as on iOS and Android. While the BlueZ Bluetooth stack is fiddly and not well understood, experienced Linux C programmers are not difficult to find and can get the job done.</p>
<p>The real drawback is power. These devices are designed for a wired USB power supply. If you can plug them into a wall, great. If not, expect to pair them with a large capacity USB battery. Battery consumption is much worse than an Android or iOS device – neither the hardware nor Linux are optimized to save power.</p>
<p>Here’s how you can start a scan with BlueZ (using C):</p>
<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>le_set_scan_parameters_cp scan_params_cp;
memset(&scan_params_cp, 0, sizeof(scan_params_cp));
scan_params_cp.type = 0x00;
scan_params_cp.interval = htobs(0x0010);
scan_params_cp.window = htobs(0x0010);
scan_params_cp.own_bdaddr_type = 0x00;
scan_params_cp.filter = 0x00;
struct hci_request scan_params_rq = ble_hci_request(OCF_LE_SET_SCAN_PARAMETERS, LE_SET_SCAN_PARAMETERS_CP_SIZE, &status, &scan_params_cp);
ret = hci_send_req(device, &scan_params_rq, 1000);
le_set_scan_enable_cp scan_cp;
memset(&scan_cp, 0, sizeof(scan_cp));
scan_cp.enable = 0x01;
scan_cp.filter_dup = 0x00;
struct hci_request enable_adv_rq = ble_hci_request(OCF_LE_SET_SCAN_ENABLE, LE_SET_SCAN_ENABLE_CP_SIZE, &status, &scan_cp);
ret = hci_send_req(device, &enable_adv_rq, 1000);
uint8_t buf[HCI_MAX_EVENT_SIZE];
evt_le_meta_event * meta_event;
le_advertising_info * adv_info;
int len;
while ( true ) {
int len = read(device, buf, sizeof(buf));
if ( len >= HCI_EVENT_HDR_SIZE ) {
meta_event = (evt_le_meta_event*)(buf+HCI_EVENT_HDR_SIZE+1);
// When devices are detected, this event will be triggered
if ( meta_event->subevent == EVT_LE_ADVERTISING_REPORT ) {
uint8_t reports_count = meta_event->data[0];
void * offset = meta_event->data + 1;
while ( reports_count-- ) {
adv_info = (le_advertising_info *)offset;
// TODO: do something with the adv_info here
}
}
}
}
</code></pre></div></div>
<h4 id="nordic-semiconductor">Nordic Semiconductor</h4>
<p>The Nordic chips in the nRF52x family are the most widely used Bluetooth chipsets for Internet of Things projects. Unlike the higher-level platforms above, this is just a raw Bluetooth chip that allows custom programming. The chips have an embedded ARM processor that can be programmed in C using the Nordic SDK. While it is not terribly hard to find a programmer to do this work, it is much harder than for the options above. Engineers that write this kind of software rarely speak English, even those who claim it is their native tongue. Expect to deal with some serious neckbeard types here.</p>
<p>While you can buy battery-powered Nordic development kits as circuit boards the size of a deck of cards, these are not cheap and almost never are used in production. Nordic chips are almost always used on custom hardware, meaning you must design and build your own printed circuit board with all the other computer components (including connectivity to offload data) you need added on. You’ll have to build your own power supply and provide your own battery, too. This requires serious hardware engineering effort, a fairly large up-front R&D cost, and a contract manufacturer. But once it is done, the per-unit costs can be quite low.</p>
<p>Programming with the Nordic SDK is notoriously difficult due to sparse documentation that consists of a few sample applications, terse API docs, and Q&A forums filled with posts from frustrated developers. Most samples deal with using Nordic chips as a peripheral (meaning the chip advertises itself to be scanned by a phone or a laptop). But it is also quite possible to program a Nordic chip to do BLE scanning. Like this (using C):</p>
<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>NRF_BLE_SCAN_DEF(m_scan);
...
ble_gap_scan_params_t scan_param = {
.active = 0,
.interval = 4000,
.window = 4000,
.timeout = BLE_GAP_SCAN_TIMEOUT_UNLIMITED,
.scan_phys = BLE_GAP_PHY_1MBPS,
.filter_policy = BLE_GAP_SCAN_FP_ACCEPT_ALL,
.extended = 0,
.report_incomplete_evts = 0,
.channel_mask = {0x00, 0x00, 0x00, 0x00, 0x00}
};
nrf_ble_scan_params_set(&m_scan, &scan_param);
ret_code_t err_code = nrf_ble_scan_start(&m_scan);
// When devices are detected, this function will be called:
static void ble_evt_handler(ble_evt_t const *p_ble_evt, void *p_context) {
ble_gap_evt_t * p_gap_evt = &p_ble_evt->evt.gap_evt;
switch (p_ble_evt->header.evt_id) {
case BLE_GAP_EVT_ADV_REPORT: {
const ble_gap_evt_adv_report_t *p_adv_report = &p_gap_evt->params.adv_report;
const ble_data_t *adv_data = &p_adv_report->data;
// TODO: do something with the adv_data here
break;
}
}
}
</code></pre></div></div>
<h4 id="esp32">ESP32</h4>
<p>The ESP32 is the new kid on the block and an attractive alternative to Nordic for an embedded system. It is a family of chips designed by Espressif systems based on the Xtensa microprocessor, and the most popular variants come with both Bluetooth LE and WiFi on board, including integrated antennas. While designing a custom PCB based on the ESP32 still requires hardware engineering and custom manufacturing, for projects requiring WiFi connectivity the onboard WiFi radio simplifies this process. In addition, the development kits are relatively cheap with a small form factor suitable for production in some situations. These development kits require USB power from a wired source or a battery.</p>
<p>Finding a programmer may not be easier than with Nordic, but the platform’s popularity with hobbyists means you may be able to find somebody younger and cheaper to do the work than the crusty neckbeards who dominate Nordic development.</p>
<p>Starting a Bluetooth scan is relatively simple. Like this (using C):</p>
<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>static esp_ble_scan_params_t ble_scan_params = {
    .scan_type = BLE_SCAN_TYPE_ACTIVE,
    .own_addr_type = BLE_ADDR_TYPE_PUBLIC,
    .scan_filter_policy = BLE_SCAN_FILTER_ALLOW_ALL,
    .scan_interval = 4000,
    .scan_window = 4000,
    .scan_duplicate = BLE_SCAN_DUPLICATE_DISABLE
};
// Setting the scan params triggers the callback below, which starts the scan
esp_err_t scan_ret = esp_ble_gap_set_scan_params(&ble_scan_params);

// When scanning starts and devices are detected, this function will be called:
static void esp_gap_cb(esp_gap_ble_cb_event_t event, esp_ble_gap_cb_param_t *param) {
    switch (event) {
        case ESP_GAP_BLE_SCAN_PARAM_SET_COMPLETE_EVT:
            esp_ble_gap_start_scanning(0); // duration 0 = scan indefinitely
            break;
        case ESP_GAP_BLE_SCAN_RESULT_EVT: {
            esp_ble_gap_cb_param_t *scan_result = (esp_ble_gap_cb_param_t *)param;
            switch (scan_result->scan_rst.search_evt) {
                case ESP_GAP_SEARCH_INQ_RES_EVT: {
                    uint8_t *adv_data = scan_result->scan_rst.ble_adv;
                    int adv_length = scan_result->scan_rst.adv_data_len;
                    // TODO: do something with the adv_data here
                    break;
                }
            }
            break;
        }
    }
}
</code></pre></div></div>
<h4 id="bluenrg">BlueNRG</h4>
<p>Like Nordic and ESP32, this is a programmable SoC that requires you to build your own PCB with supporting hardware. The chipset is provided by ST Microelectronics, and it is much less common. Its lack of popularity is likely driven by its limited flexibility – it provides little control over scans. The chip has all the disadvantages of its Nordic and ESP32 competitors without their ecosystems. You probably won’t choose to use this chip – but you may find yourself doing so because somebody else has made that choice for you. You are unlikely to find a programmer who has used it before, so find somebody with embedded programming experience and pay for them to learn how to use it.</p>
<p>The platform lets you start a scan like this (using C):</p>
<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>uint8_t ret = aci_gap_start_observation_proc(
    0x4000, /* Scan interval */
    0x4000, /* Scan window */
    0,      /* Do a passive scan */
    0,      /* Use public address */
    1,      /* Ignore duplicates */
    0       /* Don't filter scan results */
);

// When devices are detected, this function will be called:
void hci_le_advertising_report_event(
    uint8_t advertisement_count,
    Advertising_Report_t *advertisements)
{
    for (int i = 0; i < advertisement_count; i++) {
        Advertising_Report_t advertisement = advertisements[i];
        // TODO: Do something with the advertisement here
    }
}
</code></pre></div></div>
<h3 id="got-the-power">Got the Power?</h3>
<p>When it comes to Bluetooth scanning, any of the above platforms can get the job done.</p>
<p>But that scanning comes with a cost – the battery. Scanning is typically 10-100 times more energy intensive than advertising, mostly because the radio needs to be turned on only for tiny intervals in order to advertise, but for long periods of time to scan. Powering up the Bluetooth radio and leaving it on to scan will noticeably drain even the relatively large battery in your phone. For a small embedded device, scanning will suck the life out of a coin cell before a day is out. If you want to scan, be prepared to provide some real power: a large battery, a wired supply, or regular recharging via a manual procedure or a small solar array.</p>
<p>There are many options to deal with this, even if you have to rely on batteries. You can save power by limiting scanning to specific times that are important. You can convince users of your system to put your device on the charger. And you can squeeze as much life out of your device as possible by fitting it with as big a battery as you can.</p>
<p>If you can work with one of the platforms above, and can find a way to deliver the power, the possibilities will be as limitless as the creativity you bring to the table.</p>
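<p>To make the battery trade-off concrete, here is a back-of-the-envelope calculation in Java. The current-draw numbers are illustrative assumptions, not measurements from any particular chip, but they show why duty cycling the scan matters so much:</p>

```java
public class ScanBudget {
    // Estimate battery life in hours given battery capacity (mAh),
    // current while scanning (mA), current while asleep (mA), and the
    // fraction of time spent scanning (0.0 - 1.0).
    static double batteryLifeHours(double capacityMah, double scanMa,
                                   double sleepMa, double dutyCycle) {
        double avgMa = scanMa * dutyCycle + sleepMa * (1.0 - dutyCycle);
        return capacityMah / avgMa;
    }

    public static void main(String[] args) {
        // Illustrative numbers: a CR2032 coin cell (~230 mAh), ~10 mA with
        // the radio on, ~0.01 mA asleep.  Continuous scanning lasts under
        // a day:
        System.out.println(batteryLifeHours(230, 10.0, 0.01, 1.0));  // 23.0
        // Scanning 5% of the time stretches that to about 451 hours
        // (roughly 19 days):
        System.out.println(batteryLifeHours(230, 10.0, 0.01, 0.05));
    }
}
```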
Nag OS2021-08-07T00:00:00+00:00http://davidgyoungtech.com/2021/08/07/nag-os<h1 id="new-beacon-permissions-in-android-12">New Beacon Permissions in Android 12</h1>
<p>There’s a new Android version on the horizon, and with it comes a lot more nagging.</p>
<p>The Android 12 beta <a href="https://developer.android.com/about/versions/12/features/bluetooth-permissions">includes a new permission</a> called BLUETOOTH_SCAN. Any app that wants to detect bluetooth beacons must ask the user for permission when the app runs. This new permission is <strong>in addition</strong> to the existing ACCESS_FINE_LOCATION permission and (if an app needs to detect beacons from the background) ACCESS_BACKGROUND_LOCATION, each of which <strong>also</strong> requires asking the user permission when the app is running.</p>
<p>The new permission appears to be designed to let apps that use Bluetooth for connectivity avoid obtaining location permission – they can obtain the new BLUETOOTH_SCAN permission instead of ACCESS_FINE_LOCATION, provided they do not infer location from the beacons. They may also need BLUETOOTH_CONNECT and BLUETOOTH_ADVERTISE if they want to connect to other Bluetooth devices or advertise themselves.</p>
<p>But apps needing to detect bluetooth beacons can’t switch to just the BLUETOOTH_SCAN permission because the Android team says it will filter out bluetooth beacons from the scan results if ACCESS_FINE_LOCATION is not also granted. For apps detecting beacons, yet another permission is now required.</p>
<p>Those who care about privacy might see this as a good thing. It’s always good to ask permission. But is it? Always?</p>
<p>Asking permission is generally a good thing, but asking has costs. Nobody wants to be badgered over and over for subtle variations of the same permission. “Mom, can I go outside?” “Mom, can I go in the back yard?” “Mom, can I play ball in the back yard?”</p>
<p>With Android it’s much worse, because the requests are much harder to understand. Who wants to read a paragraph explaining why a simple app feature requires an arcane Android permission? Who wants to read three paragraphs explaining why a single app feature requires three different arcane Android permissions? Installing a new app shouldn’t be like reading a contract from a lawyer.</p>
<p>The problem <a href="2019/10/18/permission-denied.html">has been growing for years</a>. Unfortunately, obtaining Android permissions has become such an onerous process that a significant part of an app’s codebase must be devoted to obtaining permissions. Many apps have resorted to an onboarding screen with a grid of all the permissions that must be obtained, color coded by which are required, which are optional, which have been granted, and which have been denied. Even users who want to grant permissions sometimes fail due to accidental screen taps, blocking further prompts. In the worst case, they find it impossible to hunt through Android’s permissions settings to find the missing permission that must be granted, abandoning the app entirely.</p>
<p>App developers who want to avoid this new permission can do so – for now. Apps not targeting Android 12 can continue working under the old permissions structure, even when running on phones running newer Android versions. But there are limits. As of August 2021, the Google Play Store requires all apps to target at least Android 10 for new submissions. Assuming Google bumps this requirement by one version per year, within two years all beacon apps will need to ask users for the new BLUETOOTH_SCAN permission.</p>
<p>Here’s a summary of how Android’s permission scheme relating to bluetooth beacons has changed over the years.</p>
<h4 id="android-45-2014-2015">Android 4/5 (2014-2015)</h4>
<ul>
<li>BLUETOOTH (install time)</li>
<li>BLUETOOTH_ADMIN (install time)</li>
</ul>
<h4 id="android-6-9-2016-2018">Android 6-9 (2016-2018)</h4>
<ul>
<li>BLUETOOTH (install time)</li>
<li>BLUETOOTH_ADMIN (install time)</li>
<li>ACCESS_COARSE_LOCATION or ACCESS_FINE_LOCATION (runtime)</li>
</ul>
<h4 id="android-10-11-2019-2020">Android 10-11 (2019-2020)</h4>
<ul>
<li>BLUETOOTH (install time)</li>
<li>BLUETOOTH_ADMIN (install time)</li>
<li>ACCESS_FINE_LOCATION (runtime)</li>
<li>ACCESS_BACKGROUND_LOCATION (runtime – with options for one time, while using, always)</li>
</ul>
<h4 id="android-12-2021">Android 12 (2021)</h4>
<ul>
<li>BLUETOOTH (install time)</li>
<li>BLUETOOTH_ADMIN (install time)</li>
<li>BLUETOOTH_SCAN (runtime)</li>
<li>ACCESS_FINE_LOCATION (runtime)</li>
<li>ACCESS_BACKGROUND_LOCATION (runtime – with options for one time, while using, always)</li>
</ul>
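<p>The tables above can be encoded as a helper that computes which runtime permissions a beacon app must request for a given API level. This is a sketch of the decision logic only – the permission strings are the standard Android ones, and install-time permissions like BLUETOOTH and BLUETOOTH_ADMIN are omitted because they never require a runtime prompt:</p>

```java
import java.util.ArrayList;
import java.util.List;

public class BeaconPermissions {
    // Runtime permissions needed for beacon scanning, keyed off the
    // Android API level (Android 6 = 23, Android 10 = 29, Android 12 = 31).
    // If background detection is needed, ACCESS_BACKGROUND_LOCATION is
    // additionally required on Android 10+.
    static List<String> runtimePermissions(int sdkInt, boolean background) {
        List<String> perms = new ArrayList<>();
        if (sdkInt >= 23) {
            // Android 6-9 also accepted ACCESS_COARSE_LOCATION here
            perms.add("android.permission.ACCESS_FINE_LOCATION");
        }
        if (sdkInt >= 29 && background) {
            perms.add("android.permission.ACCESS_BACKGROUND_LOCATION");
        }
        if (sdkInt >= 31) {
            perms.add("android.permission.BLUETOOTH_SCAN");
        }
        return perms;
    }
}
```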
How Far Can You Go?2020-05-15T00:00:00+00:00http://davidgyoungtech.com/2020/05/15/how-far-can-you-go<h1 id="the-challenge-of-bluetooth-distance-estimation">The Challenge of Bluetooth Distance Estimation</h1>
<p>Estimating distance with Bluetooth has long been a source of befuddlement and controversy. Developers often have trouble making it work and there is debate about whether it works well enough to be useful. That debate has been rekindled by its use in contact tracing apps designed to fight infectious disease.</p>
<h2 id="bluetooth-distance-estimates-101">Bluetooth Distance Estimates 101</h2>
<p>The idea is simple. A radio transmitter sends out a signal to a receiver. The receiver measures the strength of the signal. When the transmitter is close, the signal is strong. When the transmitter is far, the signal is weak.</p>
<p><img src="/images/signal-vs-distance.png" alt="Signal vs. Distance Graph" width="320" style="float: left; margin: 10px; " /></p>
<p>Reversing this logic, if we see a weak signal we might assume the transmitter is far and a strong signal that the transmitter is near. But what signal level indicates near and what signal indicates far?</p>
<p>The answer to that question depends on the strength of the transmitter and how good the receiver is at picking it up. The solution to this problem is to measure the signal at a known distance. A distance of one meter is typically used because it is close enough for the signal to be strong but far enough to avoid “near field” effects that can make radio signals unpredictable.</p>
<p>So a simple distance estimation might work like this: if the known measured signal level at 1 meter is -50dBm, and we see a slightly weaker signal, then the estimated distance is just a bit further than one meter. To aid in this calculation, a Bluetooth transmitter can actually send a data packet containing this “measured power” or “tx power” value. That way the receiver knows the proper reference point for different transmitters with different power levels.</p>
<h2 id="the-math">The Math</h2>
<p>Converting this information to a specific distance is theoretically possible because the energy in radio waves falls off predictably with distance – in free space, with the square of the distance. A graph of signal strength versus distance shows a curve like the one above.</p>
<p>The units are decibels relative to one milliwatt (dBm), and values are negative, with more negative values indicating a weaker signal. This is a logarithmic unit, so a decline of 10 dB in signal level (e.g. from -50 dBm to -60 dBm) means that the power has declined by a factor of 10 – only 10% of the signal power remains.</p>
<p>The graph above can be explained by theoretical physics using this equation: <img src="/images/formula.png" alt="Distance Formula" width="100px" style="display: inline-block;" /> where p is the measured power at 1 meter, s is the signal strength and n is a constant that describes how easily the signal passes through the air.</p>
<p>While that equation works, it is typically not as good at estimating distance as power functions using a curve fitting technique. The reason the equation offered by physics doesn’t work quite as well in practice is because there are other factors going on that involve more than radio signal theory (more on that below.) Here is an equation with a similarly shaped curve derived experimentally using a Nexus 4 as a Bluetooth receiver. This is Java code:</p>
<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>protected static double calculateDistance(int measuredPower, double rssi) {
    if (rssi == 0) {
        return -1.0; // if we cannot determine distance, return -1.
    }
    double ratio = rssi * 1.0 / measuredPower;
    if (ratio < 1.0) {
        return Math.pow(ratio, 10);
    }
    else {
        double distance = (0.89976) * Math.pow(ratio, 7.7095) + 0.111;
        return distance;
    }
}
</code></pre></div></div>
<p>The above equation estimates distance using the measured power at one meter (received from the transmitter via a field in a transmitted data packet) and the signal strength measured by the Bluetooth chip. The signal strength is measured on a phone using a field called the Received Signal Strength Indicator with units in dBm. (More on that later.)</p>
<h2 id="does-this-work">Does this work?</h2>
<p>Roughly, yes. At short distances of a few meters, the above equation can give you an approximate distance estimate. The closer you are to the one meter reference point, the more accurate the estimate will be. Under ideal conditions (clear line of sight, no reflective surfaces nearby) you might get an estimate of 1m +/- 0.5 meters when you are really one meter away.</p>
<p>Do not expect this to be able to tell you the difference between being 1.1 or 1.2 meters away. It’s just not that accurate.</p>
<p>The further away you go, the less accurate the distance estimates get. At a real distance of 2 meters, the calculated distance estimate might be 2 +/- 1 meter. Note that the margin of error has doubled. The same thing happens the further you get away – the margin of error keeps getting bigger. Eventually, when you are over 10 meters away, the distance estimate has very little accuracy. It can tell you that you are far away (e.g. > 8 meters), but it can’t tell you how far.</p>
<p><img src="/images/estimate-vs-actual.png" alt="Estimated vs. Actual Graph" width="320" style="float: left; margin: 10px; " /></p>
<p>The reason this happens is that the signal power falls off steeply with distance. Bluetooth radios are weak transmitters to begin with. Maximum range is typically around 100 meters. But before you get anywhere near that distance, the signal is so weak that it is barely detectable. At a range of 30 meters, the signal level typically declines to about -100 dBm. This is typically the weakest signal level that a receiver can hear. At 40 meters, the receiver will fail to receive most packets because the signal is so weak it is indiscernible from background radio noise. The packets it does receive have the same measured signal level of about -100 dBm as seen at 30 meters – the receiver just gets fewer of them, and the ones that do arrive were lucky to be heard.</p>
<p>Distance estimates simply cannot tell the difference between a transmitter 30 meters away and 40 meters away because the signal level looks nearly identical to the receiver. It is similarly weak in both cases.</p>
<h2 id="is-this-even-useful">Is This Even Useful?</h2>
<p>Absolutely! If what you care about is telling whether a transmitter is nearby (within a couple of meters), this works quite well. Also, if you care which of two transmitters is closer at short range, Bluetooth distance estimates can also work very well. Such rough distance information is extremely valuable for many purposes.</p>
<p>But it all depends on your use case. If you need very accurate distance info (if it is important to distinguish 1.5 meters from 1.8 meters) this will never work for you. Likewise, if you need to measure specific distances at larger ranges (10 meters or more) this will fail.</p>
<p>Set your expectations realistically.</p>
<h2 id="usage-for-contact-tracing">Usage for Contact Tracing</h2>
<p>Contact Tracing apps often seek to determine how close two people came to assess infection risk. Did they come within 2 meters? </p>
<p>Some folks pooh-pooh Bluetooth distancing for this purpose because, they argue, it is not accurate enough to tell if two people are within 2 meters and will lead to lots of false positives and negatives.</p>
<p>This criticism is simply wrong. Yes, Bluetooth distancing can give you a very good indication that two people came within about 2 meters of each other. The distance estimate won’t be exact. But the 2 meter criterion is rather arbitrary anyway. Is there really zero risk of infection at 2.1 meters and an infinitely higher risk at 1.9 meters? Of course not! For the purpose of a contact tracing tool, what is important is to know whether two people came “pretty close”. Yes, Bluetooth distance estimates can answer that question. They are not perfect. But no measurement with any tool is perfect.</p>
<p>For the above to work, there are many complicating factors that should be realized, some of which must be factored in to the distance estimates. More on those complications below. </p>
<h2 id="going-beyond-one-dimension">Going Beyond One Dimension</h2>
<p>The technique described so far, based on a single signal measurement, only works in one dimension. While you might be able to estimate the distance between a receiver and transmitter, you don’t know where they are in two-dimensional space. You can’t plot the receiver’s position within a room on an X/Y grid based on one measurement. A popular math exercise is to use multiple transmitters and “trilaterate” the X/Y coordinates of the receiver. While this is a fun exercise, it is rarely practical because the margin of error in the distance estimate gets too large once you are more than a couple of meters away. For this to work reliably in practice, you need a really small room.</p>
<p>An alternative technology called RSSI fingerprinting uses completely different techniques to locate a radio device in two-dimensional space.</p>
<h2 id="misusing-bluetooth-distance-estimates">Misusing Bluetooth Distance Estimates</h2>
<p>There are many cases of people misusing this technology in ways that give it a bad name. The United Nations recently released an app called <a href="https://play.google.com/store/apps/details?id=app.onepointfive">One Point Five</a> that notifies you every time it estimates that another Bluetooth device came within 1.5 meters, to warn of infection risk. Critics panned the app for high false positive and false negative rates.</p>
<p>The implementation is horrendous. The app looks at every Bluetooth signal regardless of the device type, does not use a reference power value to do the distance estimate, and does nothing to correct for receiver efficiency variations.</p>
<p>As a result, the app goes off all the time at all kinds of distances as people walk by Bluetooth-enabled parking meters or garage door openers. And the app often never alerts on close contacts with other people’s phones, because those phones happen not to be emitting anything over Bluetooth. This happens mainly because the app promises notifications based on proximity to other phones that don’t have the app installed – a promise on which it can never hope to deliver.</p>
<p>The design for this project was terribly flawed from the start and it never should have been released. But it is a huge mistake to dismiss a technology because some crappy apps misuse it. Just because some people don’t know how to use a tape measure to properly cut a piece of wood doesn’t mean the rest of us should give up on tape measures.</p>
<h2 id="complicating-factors">Complicating Factors</h2>
<p>There are lots of factors that complicate Bluetooth distance estimates. Each one might merit a blog post on its own, so we can only introduce them here:</p>
<h3 id="phone-hardware-and-bluetooth-stack-variations">Phone Hardware and Bluetooth Stack Variations</h3>
<ul>
<li><strong>Non-Spherical Antenna Patterns</strong>: antennas are not equally good at transmitting or receiving signals in every direction. Every phone model has a different antenna and the signal will be weaker in some directions. When graphed, this typically looks like a heart-shape. On one side there is a pronounced recession with a weaker transmission/reception. Depending on how you point your phone, signal levels will change.</li>
<li><strong>RSSI Sensor</strong>: the signal measurement is done by a Bluetooth chip that typically uses an 8-bit analog-to-digital converter to get a value of 0 to 255, where 255 is the strongest signal. There is code in the Bluetooth stack to convert these values to a value in dBm. How well this measurement and conversion represent a true value in dBm depends on all kinds of engineering factors. How much care did the phone designers take here? Did they simply copy the conversion from a completely different phone model without adjusting it for new hardware differences? Often, particularly on cheap Android phones, the answer to the last question is certainly yes.</li>
<li><strong>Bluetooth Channel Rotation</strong>: Bluetooth LE uses three different advertising channels, each of which uses a slightly different radio frequency. When scanning for advertisements, each phone model rotates between these frequencies at a different rate, ranging from a fraction of a second (iPhone) to 10 seconds (Samsung). Because the antenna is tuned for peak detection at a single frequency, the measured RSSI will vary by a few dB depending on which channel it is on. This can have a big effect on distance estimates, and there is no way to read the channel you are on, so the only way around this is to sample data over longer periods of time than the worst-case rotation rate across all channels (e.g. 30 seconds).</li>
<li><strong>Phone Case</strong>: A phone’s case (especially if metal) and component placement will affect the signal strength and antenna pattern. A user-installed phone case can do the same, although the more typical ones made of polymers typically have a minor effect. But you never know when a teenager has metal “bling” embedded in the case right by that Bluetooth antenna.</li>
</ul>
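<p>The channel rotation problem above suggests a simple mitigation: average RSSI samples over a window longer than the worst-case rotation period, so the per-channel differences wash out. This is a hypothetical sketch – the class and its API are invented for illustration:</p>

```java
import java.util.ArrayDeque;
import java.util.Deque;

public class RssiAverager {
    // Rolling average of RSSI samples over a fixed time window (e.g. 30
    // seconds, long enough to span any phone's advertising-channel
    // rotation).  Each stored sample is a {timestampMs, rssi} pair.
    private final Deque<double[]> samples = new ArrayDeque<>();
    private final long windowMs;

    RssiAverager(long windowMs) {
        this.windowMs = windowMs;
    }

    void add(long timeMs, double rssi) {
        samples.addLast(new double[] { timeMs, rssi });
        // Drop samples that have aged out of the window
        while (!samples.isEmpty() && timeMs - samples.peekFirst()[0] > windowMs) {
            samples.removeFirst();
        }
    }

    double average() {
        if (samples.isEmpty()) return Double.NaN;
        double sum = 0;
        for (double[] s : samples) sum += s[1];
        return sum / samples.size();
    }
}
```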
<h3 id="environmental-factors">Environmental Factors</h3>
<ul>
<li>
<p><strong>Radio Noise</strong>: Bluetooth LE uses the same radio spectrum as WiFi, Zigbee and other consumer electronics technologies and this space is often congested. Other radio signals at different frequencies often bleed into this space as well. I live across the street from a military base with powerful antennas that make packet reception rates much lower in my house than when I am in other locations. Radio noise can both prevent packets from being received and affect the RSSI measurement itself. The further you are away (the weaker the signal) the more effect radio noise has on your measurements. Once Bluetooth LE signals drop below about -100 dBm, they reach what is called the “noise floor”. This means the receiver can no longer distinguish the 1s and 0s being transmitted from background noise.</p>
</li>
<li>
<p><strong>Obstructions</strong>: Walls, furniture, shelving, plants, the human body and other objects all absorb or reflect radio energy to some degree (more than air, anyway). In general, plastic and wood tend to be fairly transparent to Bluetooth frequencies. Metal is much less so. Whenever there are obstructions between the transmitter and the receiver, the signal level will go down relative to what it would have been with a clear line of sight. How much so depends on the materials in the obstructions and if the radio waves can find a way around them.</p>
</li>
<li>
<p><strong>Reflections</strong>: All objects, but particularly metal ones tend to reflect radio waves. The effect is not as simple as visible light in a mirror, but the result is analogous. This can amplify signals if extra radio energy arrives at the receiver due to reflections.</p>
</li>
</ul>
<h3 id="human-factors">Human Factors</h3>
<ul>
<li><strong>Human body</strong>: The human body is an obstruction – a big bag of conductive salt water. A crowd will absolutely attenuate signals. A dense crowd more so.</li>
<li><strong>Pocket and Purse</strong>: If a person has a phone in a pocket then the radio signal on the opposite side of the body will be weaker than on the side where the phone is not obscured by the body. A purse has more complex effects depending on the materials and other contents of the purse. Since people tend to move around and take their phones out of their pocket or purse, this effect usually changes over time. Depending on your use case, the fact that people move their phones around can be either a blessing or a curse.</li>
</ul>
<h2 id="the-curse-of-fragmentation">The Curse of Fragmentation</h2>
<p>When Apple unveiled the Bluetooth LE iBeacon format back in June 2013, Bluetooth distancing was put on the map. Since then an explosion of phone models has complicated the situation. This makes it impossible to get accurate absolute distance values from Bluetooth distance estimations unless you correct for device variations.</p>
<h3 id="transmitter-variations">Transmitter Variations</h3>
<p>Apple’s plan for handling hardware transmitter variations was called calibration. An iPhone app would be used to collect signal levels when the phone was held precisely 1 meter away. This procedure would give you an averaged value over, say, 30 seconds. The reference RSSI at 1 meter could then be programmed into the beacon transmission as a one byte value.</p>
<p>A beacon with a strong transmitter might be detected at -55 dBm at 1 meter. One with a weak transmitter might be detected at -65 dBm at 1 meter. But because this expected 1 meter value is sent in the transmission, the receiver can adjust accordingly.</p>
<h3 id="receiver-variations">Receiver Variations</h3>
<p>The calibration plan above worked great for a short time, because in June 2013 there were exactly two models of phones that could detect an iBeacon – the iPhone 4S and the iPhone 5. When it came time to calibrate your beacon, you used one of these two phones to do the reference measurement. As chance would have it, both phones measured about the same signal level when seeing the same beacon.</p>
<p>Since the only phones that could detect iBeacon signals were also the iPhone 4S and 5 this system worked pretty well.</p>
<p>But the very next month, in July 2013, this perfect world started to crumble. Google released Android 4.3 making the Nexus 4 the first Android phone to support Bluetooth LE and beacon detection. Over the next 7 years, 18 additional iPhone models would come out along with thousands of Android models. Every single one of these devices has differences in the Bluetooth chip, radio circuitry, antenna and case that affect the signal level the phone receives.</p>
<p>This makes it impossible for a beacon calibration performed on one phone to apply to a different phone.</p>
<p>There have long been dreams for a public database of device-specific adjustment factors so a correction could be applied. The open source Android Beacon Library even includes a mechanism to download such a database as it is updated after new Android models are released. But the difficulty of measuring the model-specific differences, a dearth of volunteer submissions, and a staggering number of Android models, all prevented this system from becoming a success.</p>
<p>One report from way back in 2015 counted <a href="https://www.zdnet.com/article/android-fragmentation-there-are-now-24000-devices-from-1300-brands/">24,000 Android models from 1300 brands</a>. The numbers have only gone up from there.</p>
<p>While Apple devices tend to deviate less from the old iPhone 4S and 5 reference signal levels, there are significant variations, especially on iPad models. Android phones, however, are all over the map. While many are excellent Bluetooth LE receivers, some are horrible. I have a Huawei P9 Lite that can’t detect a typical beacon more than 20 feet away. Its detected signal level at short range is often 15 dB weaker than iPhone models.</p>
<h3 id="solving-for-fragmentation">Solving for Fragmentation</h3>
<p>In order to solve the problem of fragmentation on distance estimates, we need a good database of phone models vs. transmitter strength and receiver efficiency. Such a database might indicate that for an Android Nokia 3800GS with <a href="https://en.wikipedia.org/wiki/Type_Allocation_Code">Type Allocation Code</a> 1234, the transmitter power deviates +2 dBm from a reference value and the receiver efficiency deviates -1 dBm from a reference value. (This is just an example – that’s not even a real phone.) Taking the time to properly measure each device might take 30 minutes.</p>
<p>There is no way that such a database can ever hope to cover every phone model out there. But they can cover the most common ones. Singapore’s Trace Together team, for example, <a href="https://raw.githubusercontent.com/opentrace-community/opentrace-calibration/master/src/images/raw_rssi_chart.png">took these kinds of measurements</a> using an antenna chamber for a few of the most popular phones in the Singapore market. Unfortunately, they did not capture separate transmitter power and receiver sensitivity measurements.</p>
<p>The fact that Google and Apple’s Exposure Notification Service includes a transmitter power byte in the encrypted metadata gives some hope that the Silicon Valley giants plan to work on such a database. With only ~20 iPhone models supporting BLE, clearly Apple has a much easier job than Google. But as a nearly trillion dollar company, Google certainly has the resources to do measurements on at least a few hundred of the most popular devices in the market today, and support a team that updates this database in the future. Will they do this? If so, will they make the data public?</p>
<h3 id="a-ray-of-hope">A Ray of Hope?</h3>
<p>Without a doubt, the device fragmentation described above is the biggest obstacle to reliable Bluetooth distance estimation for many use cases.</p>
<p>For all the trouble that the coronavirus pandemic has brought upon the world, it has had the effect of energizing the tech community around solving some of these technical problems. Will the current pandemic inspire creation of a public database of device-specific corrections that can be used to solve the problem of fragmentation in distance estimates?</p>
<p>This is far from certain, but the current situation offers the best hope of a solution that we’ve had in many years.</p>
Hacking The Overflow Area2020-05-07T00:00:00+00:00http://davidgyoungtech.com/2020/05/07/hacking-the-overflow-area
<h1 id="the-secrets-of-ios-bluetooth-advertising-in-the-background">The Secrets of iOS Bluetooth Advertising in the Background</h1>
<p>Developers of Bluetooth apps on iOS have long struggled with the platform’s limited functionality
in the background. In particular, iOS apps can’t freely emit Bluetooth advertisements when they are not
in the foreground. On iOS, being in the foreground means that the app has to be visible on
the screen with the screen turned on. This limits full functionality to when people are actually interacting
with the app.</p>
<p>When an iOS app is in the foreground, it can emit an iBeacon advertisement and it can emit a GATT service UUID.
In the background, it can do neither of these things. But backgrounded iOS apps are allowed to host Bluetooth
services. And because hosting a Bluetooth service means that you have to “advertise” that service to others,
Apple must do <em>something</em> to allow this. That something is called the “Overflow Area”. And understanding how
it works unlocks a world of possibilities for sending and receiving Bluetooth advertising data between iOS apps
in the background.</p>
<h2 id="overflow-area-101">Overflow Area 101</h2>
<p>Typically, when a Bluetooth LE peripheral advertises itself to others, it transmits a distinct Service UUID to
let others know it is there. A standard advertisement for a 128-bit service UUID consists of a packet type 0x07
followed by the 128 bits of the service UUID, transmitted least-significant byte first. A service with UUID 00000000-0000-0000-0000-000000000039 has an
advertising packet that looks like this:</p>
<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>07 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00
</code></pre></div></div>
<p>Simple right?</p>
<p>This is a Bluetooth standard. An advertisement like that is supported on all platforms including iOS and Android.</p>
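<p>To make the byte ordering concrete, here is an illustrative sketch (not from any SDK) that builds this advertising data structure; note that Bluetooth transmits the 128-bit UUID least significant byte first:</p>

```java
import java.util.UUID;

public class ServiceAdExample {
    // Builds the payload of a "complete list of 128-bit service UUIDs"
    // advertising structure (AD type 0x07). The UUID bytes are reversed
    // because Bluetooth sends them least significant byte first.
    static byte[] serviceUuidAd(UUID uuid) {
        byte[] ad = new byte[17];
        ad[0] = 0x07;
        long lo = uuid.getLeastSignificantBits();
        long hi = uuid.getMostSignificantBits();
        for (int i = 0; i < 8; i++) {
            ad[1 + i] = (byte) (lo >>> (8 * i)); // low 64 bits first
            ad[9 + i] = (byte) (hi >>> (8 * i));
        }
        return ad;
    }

    public static void main(String[] args) {
        // The UUID from the example above
        byte[] ad = serviceUuidAd(UUID.fromString("00000000-0000-0000-0000-000000000039"));
        StringBuilder sb = new StringBuilder();
        for (byte b : ad) sb.append(String.format("%02x ", b));
        System.out.println(sb.toString().trim());
        // prints: 07 39 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00
    }
}
```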
<p>Apple lets a foregrounded app emit an advertisement like above, but move it to the background and this no longer
works. Instead, the advertisement is moved to the “Overflow Area” so it looks like this:</p>
<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>ff 4c 00 01 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 80
</code></pre></div></div>
<p>The above advertisement is no longer type 0x07 – it is now type 0xff, the manufacturer advertisement type. The
<code class="language-plaintext highlighter-rouge">4c 00</code> bytes correspond to Apple’s manufacturer code of 0x004C as assigned by the Bluetooth SIG. The next 17 bytes are
manufacturer data, and this type of packet can be used any way the manufacturer wants. There is no standard here
once you get past the <code class="language-plaintext highlighter-rouge">4c 00</code>. A manufacturer advertisement can be used for anything the transmitter and receiver want.</p>
<p>But what do the rest of the bytes mean? How does this advertisement work? Here’s how Apple’s documentation describes it:</p>
<blockquote>
<p>Any service UUIDs contained in the value of the CBAdvertisementDataServiceUUIDsKey key that don’t fit in the allotted space go to a special “overflow” area. These services are discoverable only by an iOS device explicitly scanning for them.
While your app is in the background, the local name isn’t advertised and all service UUIDs are in the overflow area.</p>
</blockquote>
<p><a href="https://developer.apple.com/documentation/corebluetooth/cbperipheralmanager/1393252-startadvertising">Reference</a></p>
<p>That’s super vague. Typical Apple! Always hiding implementations in proprietary code.</p>
<p>To find out how this works, I had to reverse engineer it. I wrote an <a href="https://github.com/davidgyoung/BackgroundAdvertiser">iOS app</a> that sequentially
advertises service UUIDs starting with 00000000-0000-0000-0000-000000000000, then
00000000-0000-0000-0000-000000000001, etc. I then used an <a href="https://github.com/davidgyoung/AdvetiserAnalyzer">Android app</a> to detect these advertisements and print out the
patterns in the 17 data bytes of the overflow area as binary.</p>
<p>From this, I learned that the overflow area works this way:</p>
<ol>
<li>The first manufacturer data byte is always 0x01. This lets you know the manufacturer data is an “overflow area” advertisement.</li>
<li>The next 16 bytes are a 128-bit bitmask. Each service UUID you advertise will cause exactly one of those 128 bits to be set to 1. In the example shown above, the very last bit in the bitmask is set to 1.</li>
<li>There is a one-to-one mapping between a service UUID and the bit position it sets in this bitmask. This mapping is consistent across iOS devices. The conversion from a service UUID to a bit position is done by some proprietary Apple hashing algorithm.</li>
<li>Because there are a huge number of possible 128-bit UUIDs – 2^128 (about 10^38) – many different service UUIDs necessarily map to the same bit position.</li>
</ol>
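<p>Putting those pieces together, a platform-neutral sketch can parse an overflow area advertisement and list which bitmask positions are set. The numbering convention below (position = byte index × 8 + bit index counting from the least significant bit) is my own choice for illustration; Apple does not document one:</p>

```java
import java.util.ArrayList;
import java.util.List;

public class OverflowAreaAd {
    // Parses a raw overflow area advertising structure:
    // ff 4c 00 01 followed by 16 bitmask bytes. Returns the list of set bit
    // positions (byteIndex * 8 + bitIndexFromLsb, an illustrative convention),
    // or null if this is not an overflow area advertisement.
    static List<Integer> setBitPositions(int[] ad) {
        if (ad.length != 20 || ad[0] != 0xff || ad[1] != 0x4c
                || ad[2] != 0x00 || ad[3] != 0x01) {
            return null;
        }
        List<Integer> positions = new ArrayList<>();
        for (int i = 0; i < 16; i++) {
            for (int b = 0; b < 8; b++) {
                if ((ad[4 + i] & (1 << b)) != 0) {
                    positions.add(i * 8 + b);
                }
            }
        }
        return positions;
    }

    public static void main(String[] args) {
        // The example advertisement from above: only the last bit is set
        int[] example = {0xff, 0x4c, 0x00, 0x01,
                0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00,
                0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x80};
        System.out.println(setBitPositions(example)); // prints: [127]
    }
}
```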
<h2 id="how-ios-uses-the-overflow-area">How iOS Uses the Overflow Area</h2>
<p>When an iOS device is scanning for Bluetooth services, it can specify the service UUID it is looking for like this:</p>
<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>let serviceUuid = CBUUID(string: "00000000-0000-0000-0000-000000000039")
centralManager?.scanForPeripherals(withServices: [serviceUuid],
options: [CBCentralManagerScanOptionAllowDuplicatesKey: true])
</code></pre></div></div>
<p>That code will give a callback when a standard type 0x07 service advertisement for that UUID is seen:</p>
<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>func centralManager(_ central: CBCentralManager, didDiscover peripheral: CBPeripheral,
advertisementData: [String : Any], rssi RSSI: NSNumber) {
}
</code></pre></div></div>
<p>The same code will also give a callback if it sees an overflow advertisement with the corresponding bit in its bitmask set for
that service UUID <strong>but only if the screen is on, and beacon ranging is enabled.</strong></p>
<p>Yes, scanning for overflow area service advertisements only works if the screen is illuminated on the receiving device. It doesn’t
matter if the app is in the foreground (visible) or in the background with another app or the springboard visible. The
phone doesn’t even need to be unlocked. If the screen is on, locked or not, an app scanning for the overflow advertisement will get
a callback each time it is detected. Beacon ranging must also be enabled for this to work.</p>
<p>The above screen-on restriction is why full background-to-background Bluetooth data exchange is often considered impossible on iOS. While such an exchange is possible when both apps are in the background, it works only if the device receiving the advertisement has the screen on. Fortunately, there are tricks that can make the screen go on temporarily. (See note.)</p>
<blockquote>
<p>Note: With default configuration, an iPhone screen will turn on briefly each time a notification is received – every time the user gets an email or SMS message, for example. While this happens frequently enough on its own, a guaranteed event can also be triggered from within an app by sending a local notification. So long as the iOS device is not in do not disturb mode, any notification will cause the screen to illuminate for 10 seconds, and overflow advertisements to get delivered during that time. During these intervals,
a backgrounded iOS app can discover services advertised from nearby backgrounded iOS apps.</p>
</blockquote>
<h2 id="how-ios-handles-collisions">How iOS Handles Collisions</h2>
<p>Since many service UUIDs share each bit position in the overflow area bitmask, what happens if an iOS app scanning for a
service UUID encounters another backgrounded iOS app that is advertising a different service UUID that uses the same bit position?</p>
<p>The answer is that iOS will give a scanning callback for the colliding but different service UUID. This won’t happen often.
But programmers should realize that they may scan for their service only to get a callback for detecting a backgrounded iOS
device advertising a completely different service that just happens to collide in the overflow area’s bitmask.</p>
<h2 id="hacking-the-overflow-area">Hacking the Overflow Area</h2>
<p>Now that we know how the overflow area works, we can use it for all kinds of other things, <strong>including data exchange between backgrounded iOS apps.</strong></p>
<h3 id="using-overflow-area-on-other-platforms">Using Overflow Area on Other Platforms</h3>
<p>This is easy to do. Using the info described so far you can make an Android device (or other non iOS device) detect a service UUID of interest on a backgrounded iOS device. Just write code that looks for any overflow area advertisement (<code class="language-plaintext highlighter-rouge">ff 4c 00 01</code>), look for the bit position a known service uses and adjust your detection code to verify that bit is set in the bitmask. The scanning device can connect to the advertising device knowing it likely hosts the service of interest.</p>
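<p>As a sketch of that detection logic (the helper name and bit-numbering convention are my own; on Android the byte array would come from <code>ScanRecord.getManufacturerSpecificData(0x004c)</code>, which strips the <code>4c 00</code> company prefix):</p>

```java
public class OverflowBitCheck {
    // appleMfgData: Apple manufacturer-specific data, company ID already
    // stripped, so byte 0 is the 0x01 overflow area marker and bytes 1-16
    // are the 128-bit bitmask.
    // bitPosition: the position (0-127) you determined empirically for your
    // service UUID, numbered here as byteIndex * 8 + bitIndexFromLsb.
    static boolean isOverflowBitSet(byte[] appleMfgData, int bitPosition) {
        if (appleMfgData == null || appleMfgData.length < 17 || appleMfgData[0] != 0x01) {
            return false; // not an overflow area advertisement
        }
        int byteIndex = 1 + bitPosition / 8;
        return (appleMfgData[byteIndex] & (1 << (bitPosition % 8))) != 0;
    }

    public static void main(String[] args) {
        byte[] data = new byte[17];
        data[0] = 0x01;          // overflow area marker
        data[16] = (byte) 0x80;  // last bit of the bitmask set
        System.out.println(isOverflowBitSet(data, 127)); // prints: true
        System.out.println(isOverflowBitSet(data, 0));   // prints: false
    }
}
```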
<h3 id="detecting-any-backgrounded-ios-service-advertiser">Detecting Any Backgrounded iOS Service Advertiser</h3>
<p>iOS apps are not allowed to receive non-iBeacon advertisements in the background unless they specify the particular service UUIDs they are looking for. But knowing that there are only 128 bits in the overflow area bitmask, you can write an app that scans for every possible bit in the bitmask. This will give a callback for any iOS device advertising a service in the background regardless of service UUID. </p>
<h3 id="background-ios-data-exchange">Background iOS Data Exchange</h3>
<p>The overflow area can be manipulated programmatically to put any pattern you want (except all zeroes) into the 16-byte bitmask area. These data can then be transmitted from a backgrounded iOS app.
The data can be received by iOS, Android and other devices, even in the background. On iOS, backgrounded reception does require that the screen be on. But again, this can be forced periodically to do a quick read by sending a local notification.</p>
<p>Another limitation is that the overflow area is shared between all apps on the phone. Any app that advertises a Bluetooth service in the background will set (usually one) bit in the bitmask. A second app on the same phone has no way of knowing this is happening. So while an app can guarantee a pattern of 1s is set in the bitmask, it cannot guarantee any 0s. In practice, it is rare for an iOS device to be running any backgrounded apps advertising Bluetooth. So while you will usually get all 0s in the bitmask in positions you do not set, this is not guaranteed.</p>
<p>Setting up data exchange is a bit tricky. The key is to generate a table of 128 different service UUIDs known to occupy a distinct position in the bitmask. Fortunately, I have already done that for you. See the code in my <a href="https://github.com/davidgyoung/BackgroundAdvertiser/blob/master/BackgroundAdvertiser/OverflowArea/OverflowAreaUtils.swift">OverflowAreaUtils class</a>.</p>
<p>Using this utility, an iOS receiver can scan for 128 different UUIDs, one for each position in the overflow area’s bitmask. </p>
<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>centralManager?.scanForPeripherals(withServices: OverflowAreaUtils.allOverflowServiceUuids(),
options: [CBCentralManagerScanOptionAllowDuplicatesKey: true])
</code></pre></div></div>
<p>When it gets a callback, the callback will provide a list of all UUIDs it found. You can convert this list to a 128 bit number:</p>
<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>func centralManager(_ central: CBCentralManager, didDiscover peripheral: CBPeripheral, advertisementData: [String : Any], rssi RSSI: NSNumber) {
    if let overflowIds = advertisementData[CBAdvertisementDataOverflowServiceUUIDsKey] as? [CBUUID] {
        NSLog("Overflow Area bitmask as binary string: \(OverflowAreaUtils.overflowServiceUuidsToBinaryString(overflowUuids: overflowIds))")
    }
}
</code></pre></div></div>
<p>On the transmission side, an iOS advertiser can generate a 128 bit number and then convert any set bits to a corresponding service UUID to be advertised:</p>
<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>// Binary for the ASCII sequence: "OverflowAreaWoot"
let binaryString = "01001111011101100110010101110010011001100110110001101111011101110100000101110010011001010110000101010111011011110110111101110100"
let adData = [CBAdvertisementDataServiceUUIDsKey :
OverflowAreaUtils.binaryStringToOverflowServiceUuids(binaryString: binaryString)]
peripheralManager?.startAdvertising(adData)
</code></pre></div></div>
<p>The code shown above will transmit the 128-bit ASCII string, “OverflowAreaWoot” from a backgrounded iOS app to another backgrounded iOS app with the screen on.</p>
<p>Using code like above, you can effectively exchange anything you can fit in 128 bits of data between backgrounded iOS apps in a single packet. And because overflow advertisements are sent out at 1Hz, you can send more data by altering the advertisement in time. (<strong>NOTE: this has been blocked by iOS 14, see below.</strong>) You just have to make sure the receiving iOS device has the screen turned on to receive it. </p>
<p>Oh, and don’t forget: you cannot send 128 bits of zeroes. If you don’t advertise at least one service (setting at least one bit position to 1), no advertisement will go out.</p>
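<p>For illustration, here is one way to build such a 128-character binary string from 16 ASCII characters (a hypothetical helper, not part of OverflowAreaUtils):</p>

```java
public class OverflowEncoder {
    // Converts exactly 16 ASCII characters into a 128-character binary string
    // (8 bits per character, most significant bit first) suitable for passing
    // to a routine like binaryStringToOverflowServiceUuids.
    static String asciiToBinaryString(String s) {
        if (s.length() != 16) {
            throw new IllegalArgumentException("need exactly 16 ASCII characters");
        }
        StringBuilder sb = new StringBuilder(128);
        for (char c : s.toCharArray()) {
            String bits = Integer.toBinaryString(c);
            sb.append("00000000".substring(bits.length())).append(bits); // zero-pad to 8 bits
        }
        return sb.toString();
    }

    public static void main(String[] args) {
        System.out.println(asciiToBinaryString("OverflowAreaWoot"));
    }
}
```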
<h3 id="the-tragedy-of-the-commons">The Tragedy of the Commons</h3>
<p>As discussed above, the overflow area is a shared resource between all apps. Any app can set one or more bits. No app can know what bits other apps on the phone have set. </p>
<p>While the typical iOS device will have no backgrounded apps setting bits in the overflow area, this is not guaranteed. Apps that want to reliably exchange data using the techniques described above might consider logic to account for bit collisions with other apps. Given that most apps using the overflow area for its intended purpose will typically not set more than one bit, it is possible to periodically alter the overflow area data exchange to shift the transmission left or right in the bitmask to avoid collisions with a bit or two that are stuck in the on position by other apps.</p>
<p>But an app that manipulates the overflow area by setting multiple bits is effectively polluting a common resource. It is “overgrazing the commons” as British economist William Forster Lloyd described.</p>
<p>This is no big deal if only one app does it per phone.</p>
<p>However, if two apps on the same phone try to use the data exchange technique at the same time, both will fail. If you plan to use this, realize it will only work until some other app on the phone tries to do the same thing.</p>
<p><strong>UPDATE: May 28, 2020</strong>: This post has been updated to clarify that beacon ranging must also be enabled for overflow area advertisements to be discovered when the app is not in the foreground.</p>
<p><strong>UPDATE: September 7, 2020</strong>: A full reference application is available that shows how you can make two iOS devices exchange data in the background. <a href="https://github.com/davidgyoung/OverflowAreaBeaconRef">See here</a>.</p>
<p><strong>UPDATE: November 13, 2020</strong>: Apple has changed how this behaves as of iOS 14: starting with that OS version, you cannot change the bluetooth services advertised by your app when it is in the background. Doing so has no effect on the overflow area advertisement until the app is brought to the foreground. So while you can set up an overflow area advertisement to work in the background as described in this blog post, that setup must happen while the app is in the foreground; it will then continue working with the app in the background. Unfortunately, this means you cannot change the overflow area advertisement while the app is in the background – you must get the user to bring the app back to the foreground to do so.</p>
Hacking With Contact Tracing Beacons2020-04-24T00:00:00+00:00http://davidgyoungtech.com/2020/04/24/hacking-with-contact-tracing-beacons
<p><img src="/images/covid-transmitter.png" alt="BeaconScope Transmitter" width="320" style="float: right; margin: 10px; " />
<img src="/images/covid-receiver.png" alt="BeaconScope Receiver" width="320" style="float: right; margin: 10px; " /></p>
<p>When Google and Apple announced a common specification for pandemic contact tracing on April 10, it offered hope for a universal system. Currently, dozens of projects around the world are working on mutually incompatible systems, specifically targeting national populations, individual provinces or even employees of specific companies.</p>
<p>The big problem with the proposals from Google and Apple is that they are so far just vaporware. Promised SDKs have not been published as of this writing, and cannot be used. (UPDATE 4/30/2020: Apple released SDKs in Xcode 11.5 beta for running on iOS 13.5 beta 2.) What’s more, the latest version of iOS, 13.4.1, disallows transmitting the Exposure Notification Service beacon bluetooth advertising packet in the common specification. Apple will have to release a 13.5 version of iOS before this will work. When Apple’s SDK is released and delivered in a future version of Xcode, Apple is expected to block direct transmission and detection of this beacon format by third party apps, forcing them to use higher-level APIs. (UPDATE 4/30/2020: This expected blocking is in place as of iOS 13.5 beta 2.)</p>
<p>Android, however, is another story. While Google has likewise not released any new SDKs that support the proposed APIs (although a Google Play Services update is expected for this), Android already supports sending and detecting the Exposure Notification Service beacon advertisement that <a href="https://www.blog.google/documents/62/Exposure_Notification_-_Bluetooth_Specification_v1.1.pdf">the bluetooth specification</a> envisions.</p>
<h2 id="exposure-notification-service-beacon">Exposure Notification Service Beacon</h2>
<p>The common system relies on a bluetooth packet that will be sent out of both Android and iOS phones. The packet is a GATT service advertisement with attached data and looks like this:</p>
<hr />
<table class="table table-bordered">
<thead>
<tr>
<th style="text-align: center">length</th>
<th style="text-align: center">type</th>
<th style="text-align: center">UUID</th>
<th style="text-align: center">length</th>
<th style="text-align: center">type</th>
<th style="text-align: center">UUID</th>
<th style="text-align: center">rolling proximity identifier</th>
<th style="text-align: center">metadata</th>
</tr>
</thead>
<tbody>
<tr>
<td style="text-align: center">0x03</td>
<td style="text-align: center">0x03</td>
<td style="text-align: center">0xfd6f</td>
<td style="text-align: center">0x17</td>
<td style="text-align: center">0x16</td>
<td style="text-align: center">0xfd6f</td>
<td style="text-align: center">16 bytes</td>
<td style="text-align: center">4 bytes</td>
</tr>
</tbody>
</table>
<hr />
<p>The 16-bit GATT service UUID 0xfd6f identifies a transmission from the phone as an Exposure Notification Service advertisement (formerly known as the Contact Detection Service until a recent rebranding – do marketing folks think “detection” is a naughty word?). The 16 bytes of attached data are the identifier of the transmitting device – a “rolling proximity identifier” as the spec describes. An app transmitting this packet is supposed to change this identifier every 15 minutes based on a cryptographic algorithm. The final four bytes are “encrypted metadata” which include versioning information as well as a tx power value that indicates how strong the bluetooth signal might be at a known distance.</p>
<p>GATT service advertisements are typically used to advertise connectable Bluetooth LE GATT services – a little program that you can connect to over bluetooth to exchange data. But in this case, there is no such service. The advertisement itself indicates it is not connectable. The entire purpose of the advertisement is to announce its presence and deliver this identifier. It is therefore a Bluetooth LE beacon advertisement, much like Google’s Eddystone family of Bluetooth beacon advertisements, which are also based on GATT service advertisements.</p>
<h2 id="exposure-notification-beacons-in-action">Exposure Notification Beacons in Action</h2>
<p>Today, you can use the free and open-source Android Beacon Library to send and receive this beacon format. You can even try it out without writing any code by using my off-the-shelf <a href="https://play.google.com/store/apps/details?id=com.davidgyoungtech.beaconscanner&hl=en_US">BeaconScope</a> mobile app. This app is based on the same library, and can both send and receive this Exposure Notification Service beacon advertisement.</p>
<h2 id="making-your-own-app">Making Your Own App</h2>
<p>If you want to write your own app to do this, you need the 2.17 version of the Android Beacon Library. With that you can make a transmitter like this:</p>
<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>String uuidString = "01020304-0506-0708-090a-0b0c0d0e0f10";
Beacon beacon = new Beacon.Builder()
.setId1(uuidString)
.build();
// This beacon layout is for the Exposure Notification service Bluetooth Spec
BeaconParser beaconParser = new BeaconParser()
.setBeaconLayout("s:0-1=fd6f,p:-:-59,i:2-17,d:18-21");
BeaconTransmitter beaconTransmitter = new
BeaconTransmitter(getApplicationContext(), beaconParser);
beaconTransmitter.startAdvertising(beacon);
</code></pre></div></div>
<p>That layout string above is what tells the library how to understand this new beacon type. The layout “s:0-1=fd6f,p:-:-59,i:2-17,d:18-21” means that the advertisement is a GATT service type (“s:”) with a 16-bit service UUID of 0xfd6f (“0-1=fd6f”), a single 16-byte identifier in byte positions 2-17 of the advertisement (“i:2-17”), and a 4-byte data field in positions 18-21 (“d:18-21”). The “p:-:-59” indicates that there is no unencrypted measured power calibration reference transmitted with this beacon, and that the library should default to using a 1-meter reference of -59 dBm for its built-in distance estimates.</p>
<p>You can use similar code to detect these beacons like this:</p>
<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>BeaconManager beaconManager = BeaconManager.getInstanceForApplication(this);
beaconManager.getBeaconParsers().add(new BeaconParser().setBeaconLayout("s:0-1=fd6f,p:-:-59,i:2-17,d:18-21"));
...
beaconManager.startRangingBeaconsInRegion(new Region("all exposure beacons", null));
...
@Override
public void didRangeBeaconsInRegion(Collection<Beacon> beacons, Region region) {
for (Beacon beacon: beacons) {
Log.i(TAG, "I see an Exposure Notification Service beacon with rolling proximity identifier "+beacon.getId1());
}
}
</code></pre></div></div>
<p>You can also make the above transmitter and receiver work indefinitely in the background by using a Foreground Service. The library documentation describes how to set this up <a href="https://altbeacon.github.io/android-beacon-library/foreground-service.html">here</a>.</p>
<p>What the above shows is just raw transmission and reception. It doesn’t show how to handle the “rolling proximity identifiers” inside these beacons.</p>
<h2 id="how-the-identifiers-work">How the Identifiers Work</h2>
<p>Much of the proposal from Google and Apple is devoted to <a href="https://www.blog.google/documents/60/Exposure_Notification_-_Cryptography_Specification_v1.1.pdf">how to handle these identifiers</a> in a decentralized, privacy-friendly way. The transmitted “rolling proximity identifier” is a 16-byte GUID. But each app transmitter is supposed to change its transmitted identifier approximately every 15 minutes. At any given time, a single device’s identifier is based on a temporary exposure key (which the phone generates and stores daily), and a cryptographic algorithm that derives the 16-byte “rolling proximity identifier” from that key.</p>
<p>Because the identifiers appear to be randomly changing every 15 minutes, it is supposed to be impossible to tell which identifiers came from which device (or even which rotated identifiers came from the same device) – unless you have the device’s daily key.</p>
<p>If medical tests confirm that a device’s owner is infected with novel coronavirus, that owner can optionally publish his or her temporary exposure keys from the last few weeks. These temporary exposure keys allow all other mobile phones in this system to look for a match. The apps on these phones can then re-run the cryptographic algorithm with these published keys, allowing them to see if any of the “rolling proximity identifiers” match ones sent out by the device owned by the infected user. If there is a match, the app knows the specific time when it saw the infected user, for how long, and how strong the bluetooth signal was at those times, giving an idea of how close the two people came.</p>
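<p>The timing side of this scheme is easy to sketch. The published cryptography specification derives identifiers from 10-minute interval numbers, with each temporary exposure key covering a rolling period of 144 such intervals (one day); the constants below come from that spec, while the helper names are my own:</p>

```java
public class EnIntervals {
    static final long INTERVAL_SECONDS = 60 * 10; // ENIntervalNumber granularity
    static final long TEK_ROLLING_PERIOD = 144;   // intervals covered by one daily key

    // ENIntervalNumber as defined in the published crypto spec:
    // the number of 10-minute intervals since the Unix epoch
    static long enIntervalNumber(long unixSeconds) {
        return unixSeconds / INTERVAL_SECONDS;
    }

    // First interval number covered by the temporary exposure key
    // that is valid at the given time
    static long tekStartInterval(long unixSeconds) {
        return (enIntervalNumber(unixSeconds) / TEK_ROLLING_PERIOD) * TEK_ROLLING_PERIOD;
    }

    public static void main(String[] args) {
        long t = 1587686400L; // 2020-04-24 00:00:00 UTC
        System.out.println(enIntervalNumber(t)); // prints: 2646144
        System.out.println(tekStartInterval(t)); // prints: 2646144
    }
}
```

<p>Note that the actual identifier derivation (HKDF key expansion plus AES encryption of the interval number) is specified on top of these interval numbers; the sketch above covers only the time bookkeeping.</p>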
<h2 id="can-you-build-your-own-implementation">Can You Build Your Own Implementation?</h2>
<p>On Android, yes, you can roll your own implementation of this today. In addition to the transmitter and receiver code shown above, you will also need to create your own implementation of the key generation, identifier rotation, key sharing and matching algorithm.</p>
<p>There are lots of reasons you might want to do so:</p>
<ol>
<li>You want a test tool to see how this system works, or to see if nearby devices are using it.</li>
<li>You don’t want to wait for Google Play Services update with Google’s implementation, and want to make your own now.</li>
<li>You want to provide an implementation for Android users who will never get the Google Play Services update. (e.g. Phones sold in China, Amazon Fire Tablets, newer Huawei phones sold outside China.)</li>
<li>You want to build this into your own app so if users don’t update Google Play Services, or deny it permission to perform this function, your app can still provide the functionality.</li>
</ol>
<p>If you decide to proceed, be forewarned that the spec is not final, so it may be subject to change. In the past few days, the spec was updated to add the Encrypted Metadata field and change much of the terminology.</p>
<h2 id="multiple-installations-on-the-same-phone">Multiple Installations on the Same Phone</h2>
<p>If you do build your own implementation, and a user later installs both your version and that inside Google Play Services, both will work at the same time. To other devices, two implementations on a single phone will appear to be two different phones (although over short intervals they will share the same bluetooth MAC address so it is theoretically possible to know they are the same phone.) The consequences of two copies running on the phone are little different than carrying two phones in your pocket.</p>
<h2 id="hacking-on-ios">Hacking on iOS</h2>
<p>Equivalent hacking on iOS is currently impossible – at least on the transmission side. Apple’s iOS APIs prevent any 3rd party app from making the phone transmit the kind
of advertisement shown in the spec. While apps can transmit GATT service advertisements, they can’t attach data. This is because the <code class="language-plaintext highlighter-rouge">CBAdvertisementDataServiceDataKey</code> that associates service data to an advertisement is read-only on iOS. You simply can’t set the data needed to advertise one of these beacons.</p>
<p>What iOS can do is detect such advertisements with CoreBluetooth – for now at least. (UPDATE 4/30/2020: This no longer works on iOS 13.5 beta 2. It works on earlier OS versions.) Here’s code that shows how you can do that:</p>
<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>let exposureNotificationServiceUuid = CBUUID(string: "FD6F")
centralManager?.scanForPeripherals(withServices: [exposureNotificationServiceUuid], options: [CBCentralManagerScanOptionAllowDuplicatesKey: true])
...
func centralManager(_ central: CBCentralManager, didDiscover peripheral: CBPeripheral, advertisementData: [String : Any], rssi RSSI: NSNumber) {
if let advDatas = advertisementData[CBAdvertisementDataServiceDataKey] as? NSDictionary {
if let advData = advDatas.object(forKey: CBUUID(string: "FD6F")) as? Data {
let hexString = advData.map { String(format: "%02hhx", $0) }.joined()
let proximityId = String(hexString.prefix(32))
let metadata = hexString.suffix(8)
NSLog("Discovered Exposure Notification Service Beacon with Proximity ID \(proximityId), metadata \(metadata) and RSSI \(RSSI)")
}
}
}
</code></pre></div></div>
<p>The code above shows you how to start a scan for a service advertisement of the proper FD6F type. When one is detected, it pulls out the
service advertising data, splits it into the proximity id and metadata from the spec, and prints these out as hex bytes to the log.</p>
<p>There are a few caveats to this code:</p>
<ol>
<li>It will only receive constant updates when the app is in the foreground – meaning the device screen is unlocked, turned on and the app is visible.</li>
<li>In the background, the app will get at most one detection callback. That is because iOS ignores the <code class="language-plaintext highlighter-rouge">CBCentralManagerScanOptionAllowDuplicatesKey</code> when an app is not in the foreground, and only gives you the first detection. While this is good enough to build a scanning test tool on iOS, it makes it impossible for third party apps to develop background detectors.</li>
</ol>
<p>There is also some risk that a future iOS update will block the above code from working. An iOS update expected in May will allow the operating system (but likely not 3rd party apps) to transmit the new beacon type. But it is also likely that iOS will update CoreBluetooth in this same release to filter out receiving these advertisements using code like that shown above, so it no longer works. (UPDATE 4/30/2020: Indeed, this blocking is confirmed as of iOS 13.5 beta 2.) Apple did exactly that for iBeacon advertisements. CoreBluetooth APIs filter out any data bytes matching the iBeacon advertisement spec – the array of advertising data is truncated to zero bytes. Time will tell, but it is entirely likely they will do the same for this new beacon type.</p>
<h2 id="is-it-safe-to-hack-with-this">Is It Safe to Hack With This?</h2>
<p>One good thing about Apple and Google’s system is that it is resistant to interference. In general, you don’t need to worry about causing problems by building an Android app that transmits garbage identifiers. While other phones using the system will store your garbage identifiers, and they will take up a tiny amount of space on phones, they will never match a reported positive contact, so the consequences will be nil.</p>
<h2 id="hack-responsibly">Hack Responsibly</h2>
<p>While it is safe to experiment, with enough effort, it is possible to maliciously try to interfere with the system. You could, for example, build an app that listens for real identifiers from these beacons in one location, then send them over the internet to a different phone, then re-transmit them in another location. Such a “replay attack” would make the system believe that one person was in contact with people in a different location. In most cases, this wouldn’t really matter, but it could affect individual contact reports. And if done on a broad scale, it could make the system less reliable. Please don’t do that.</p>
Saving The World With Bluetooth2020-04-11T00:00:00+00:00http://davidgyoungtech.com/2020/04/11/saving-the-world-with-bluetooth<p>Can technology help save lives during the coronavirus pandemic? This exciting idea has inspired a number of urgent projects worldwide ranging from an open source ventilator to numerous contact tracing mobile apps.</p>
<p>The idea of contact tracing apps is simple: help people keep track of when they may have come in contact with others infected with novel coronavirus.</p>
<p>If one of the app users later tests positive for the novel coronavirus, the data collected by the app can be used to identify everybody who came in contact with the infected person, letting them know of the risk. A variety of apps released and in flight vary greatly in the technologies used and how private data are managed, with Singapore’s TraceTogether and MIT’s Safe Paths drawing a quick buzz. Not to be left out, Apple and Google announced they will work together to tie this functionality into their mobile platforms.</p>
<p>In this blog post, I evaluate the technical approaches used in the Singapore and MIT apps and analyze their potential effectiveness. I will then discuss how proposals by Google and Apple may change the landscape.</p>
<h2 id="tracetogether-ios-and-android"><a href="https://tracetogether.zendesk.com/hc/en-sg">TraceTogether</a> (iOS and Android)</h2>
<p><strong>Developer:</strong> Singapore Ministry of Health</p>
<p><strong>Availability:</strong> Currently released in the iOS and Android app stores</p>
<p><strong>Eligibility:</strong> Activation requires an SMS-capable phone with a +65 (Singapore) country code</p>
<h3 id="overview">Overview</h3>
<p><img src="/images/trace_together.png" alt="TraceTogether" width="320" style="float: right; margin: 10px; " /></p>
<p>The TraceTogether app works by silently finding other app users nearby, and recording when you came in contact with them. Looking at the user interface, you have no indication it is doing anything helpful – it just silently collects information. If you trust the app to do what it says, you might expect to get a call or text from the Ministry of Health if somebody you came near gets a positive coronavirus test result.</p>
<h3 id="technology">Technology</h3>
<p>The developers give a basic overview of how they say it works:</p>
<blockquote>
<p>TraceTogether uses Bluetooth to perform handshakes with other TraceTogether phones. Your Bluetooth-enabled phone is capable of connecting to multiple Bluetooth devices simultaneously, e.g. smart watch and wireless headphones. The different connections are separate and should not be affected by or affect TraceTogether… We use the Bluetooth Relative Signal Strength Indicator (RSSI) readings between devices across time to approximate the proximity and duration of an encounter between two users. This proximity and duration information is stored on one’s phone for 21 days on a rolling basis — anything beyond that would be deleted. No location data is collected.</p>
</blockquote>
<p>Since the Android and iOS apps require a mobile phone with a Singapore country code to activate, I couldn’t experiment with them myself. And because the source code was not open source as of this writing (despite the authors’ promises to release it), the easiest way to see how it works is to analyze the Android APK file. This analysis reveals a number of insights:</p>
<ol>
<li>The app advertises a Bluetooth LE GATT Service that exposes readable characteristics. It does not use Bluetooth beacons, and its advertisements are not beacons because they do not contain a unique identifier.</li>
<li>The app connects to each other device it sees that also hosts the app, reading a GATT characteristic that reveals a numeric identifier for the other user. This identifier is supposedly anonymized so only the Ministry of Health knows who it is.</li>
<li>The signal strength on the Bluetooth connection during the communication gives an indication of how far away the other person was – e.g. more or less than 6 feet.</li>
</ol>
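<p>The distance inference in the last step can be sketched with the standard log-distance path-loss model. The constants below (the expected RSSI at one meter and the path-loss exponent) are illustrative assumptions of mine, not values extracted from the app:</p>

```python
def estimate_distance_m(rssi, measured_power=-59, path_loss_exponent=2.0):
    """Rough distance estimate from an RSSI reading using the
    log-distance path-loss model. measured_power is the RSSI
    expected at 1 meter (a per-device calibration constant)."""
    return 10 ** ((measured_power - rssi) / (10 * path_loss_exponent))

def within_contact_range(rssi, threshold_m=1.8):
    """Flag a reading as a close contact (roughly 6 feet)."""
    return estimate_distance_m(rssi) <= threshold_m
```

<p>With real hardware, RSSI is noisy enough that readings are usually smoothed over many samples before any distance threshold is applied.</p>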
<p>Adding a timestamp of when the user was detected, you end up with four pieces of information:</p>
<ol>
<li>Who the other user was</li>
<li>About how far away that user was</li>
<li>When the app came into contact with that user</li>
<li>How long the apps were in proximity</li>
</ol>
<p>By creating a log of this information, the app can (in theory) provide a pretty good record of who the user came into contact with.</p>
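<p>That log can be sketched as a simple list of contact records. The field names and filter thresholds below are mine for illustration, not TraceTogether’s:</p>

```python
from dataclasses import dataclass

@dataclass
class ContactRecord:
    anonymized_id: str   # identifier read from the other phone's GATT characteristic
    distance_m: float    # approximate distance inferred from RSSI
    first_seen: float    # unix timestamp of first detection
    duration_s: float    # how long the two phones stayed in proximity

contact_log = []

def record_contact(anonymized_id, distance_m, first_seen, duration_s):
    contact_log.append(ContactRecord(anonymized_id, distance_m, first_seen, duration_s))

def close_contacts(max_distance_m=2.0, min_duration_s=300):
    """Contacts both close enough and long enough to matter for tracing."""
    return [c for c in contact_log
            if c.distance_m <= max_distance_m and c.duration_s >= min_duration_s]
```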
<p>That’s the theory. In reality, there are a number of flaws in its implementation that limit its usefulness. Let’s start with the fact that it doesn’t really work well on iOS:</p>
<blockquote>
<p>TraceTogether works best in the foreground. We recommend users keep TraceTogether open in meetings and crowded places. iOS users can also activate the in-app “Power Saver Mode” (see images) to keep TraceTogether in the foreground with a dimmed screen while communicating with other TraceTogether-enabled devices. <a href="https://tracetogether.zendesk.com/">TraceTogether Zendesk Site</a>, as posted April 4 2020</p>
</blockquote>
<p>Wow. Requiring users to keep the app visible in the foreground on their phone screens is a pretty big ask. Few people will actually do that.</p>
<p>Perhaps the development team cut corners on iOS because Apple only accounts for <a href="https://gs.statcounter.com/vendor-market-share/mobile/singapore/2019">35 percent of market share in Singapore</a>. Android, they say, does work in the background. Except the problem there is that Android is notoriously unreliable when it comes to establishing the kind of Bluetooth connections the app requires. As I have documented in a <a href="http://www.davidgyoungtech.com/2019/05/21/broken-connection">prior blog post</a>, the GATT service connection failure rate on Android devices is around 20 percent, even with retries.</p>
<p>So there is a 20 percent chance that an Android phone won’t be able to get the identity of other users in the vicinity even if they have the app. Likewise, other Android users have a 20 percent chance of not seeing your app. And that all assumes that everybody has the app installed properly, their batteries are charged, with Bluetooth on, location on, and proper permissions granted.</p>
<p>In practice, many of these things won’t be true, bringing the effectiveness rate probably well below 50 percent across both iOS and Android even for people who have it installed.</p>
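<p>The arithmetic behind that estimate is simple compounding of independent failure chances. A quick sketch, using the 20 percent GATT failure rate from the text plus a purely hypothetical success rate for the other preconditions (Bluetooth on, permissions granted, battery charged):</p>

```python
def effective_rate(gatt_success, other_factors_success):
    """Chance that phone A successfully logs phone B: the GATT
    connection must work AND everything else must be in order."""
    return gatt_success * other_factors_success

# GATT alone: each direction of an Android-to-Android encounter
# succeeds 80% of the time, so mutual logging happens ~64% of the time.
mutual = effective_rate(0.80, 1.0) ** 2

# Fold in a hypothetical 80% rate for the other preconditions and the
# overall effectiveness drops to ~41%, below the 50 percent mark.
realistic = effective_rate(0.80, 0.80) ** 2
```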
<h3 id="data-collection">Data Collection</h3>
<p>As soon as you launch the app, it prompts you to enter a Singapore phone number so you can receive an activation code via SMS message. Entering this code will register your phone number (and presumably the app’s “anonymous” installation identifier that is shared with other app users in the vicinity) with the Ministry of Health. Supposedly, all of the data are then stored exclusively on your phone and are not shared without your consent.</p>
<blockquote>
<p>All of this data is stored only on your phone, and not shared with MOH. Should MOH need the data for contact tracing, they will seek your consent to share it with them. <a href="https://tracetogether.zendesk.com/">TraceTogether Zendesk Site</a>, as posted April 4 2020</p>
</blockquote>
<p>In fact, you have no obvious way of accessing your own data, as it is locked up inside your phone. Only the Singapore Ministry of Health can access these data, supposedly with your consent, perhaps in the case where you get a positive coronavirus test. This would allow health workers to de-anonymize the contacts inside your phone by linking them up with the phone numbers they collected, enabling them to alert those contacts of their possible exposure.</p>
<h2 id="covid-safepaths-ios-and-android"><a href="https://github.com/tripleblindmarket/covid-safe-paths">Covid SafePaths</a> (iOS and Android)</h2>
<p><strong>Developer:</strong> MIT</p>
<p><strong>Availability:</strong> Currently released in the iOS App Store and Google Play Store as “PrivateKit”</p>
<p><strong>Eligibility:</strong> Open</p>
<h3 id="overview-1">Overview</h3>
<p><img src="/images/safe_paths.jpg" alt="Covid SafePaths" width="320" style="float: right; margin: 10px; border:1px solid #000000;" /></p>
<p>This app relies entirely on tracking your location coordinates over time to remember where you have been over the past few weeks. Its current release doesn’t directly use Bluetooth or any other means to scan for other users in the vicinity. It is designed simply to record where you have gone so you can later know if you have crossed paths with somebody who has tested positive for the novel coronavirus.</p>
<p>Anybody who has ever used Google Timeline – a feature in Google Maps that tracks your location history – will be familiar with how this app works. You can see where you have been recently as measured by your phone’s location sensors.</p>
<p>The contact tracing comes in when you try to match your path with publicly published paths of people who have tested positive for the novel coronavirus. These published paths can be from independent sources or from data exports from the app itself. The app doesn’t directly let you publish your path, but allows you to export your path for trusted sources to publish, presumably after they have confirmed your infection status.</p>
<p>The app has a feature where you can import these public data sets for your area and check for overlaps in time and space to where you have gone. The idea here is that you can compare where you have been at the same time with other folks who have tested positive, giving you an idea of where you have been put at risk.</p>
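<p>That overlap check can be sketched as a time-and-space match between two location traces. The 100-meter radius and one-hour window below are illustrative thresholds of mine; the coarse accuracy of passive location fixes forces the radius to be large:</p>

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two lat/lon points."""
    r = 6371000  # mean earth radius in meters
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def paths_overlap(my_path, published_path, radius_m=100, window_s=3600):
    """Each path is a list of (timestamp, lat, lon) points. Returns the
    points in my_path that fall within radius_m and window_s of any
    point on the published (infected) path."""
    hits = []
    for t1, la1, lo1 in my_path:
        for t2, la2, lo2 in published_path:
            if abs(t1 - t2) <= window_s and haversine_m(la1, lo1, la2, lo2) <= radius_m:
                hits.append((t1, la1, lo1))
                break
    return hits
```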
<h3 id="technology-1">Technology</h3>
<p>Unlike Singapore’s Ministry of Health, the MIT initiative actually has released their code as open source, making it much easier to see what they are doing. The iOS and Android builds largely share a common codebase, given that the app is built on React Native. That is a real strength in the sense that it makes development faster, but it is quite limiting in the sense that it waters everything down to common functionality that both platforms can support in an equivalent way. That isn’t so important for location tracking, but it is important when you get into Bluetooth proximity.</p>
<p>And while the app doesn’t yet support detecting proximity to other users by Bluetooth, there are clearly plans to add this feature. A <a href="https://github.com/tripleblindmarket/covid-safe-paths/pull/193/files">recent change</a> added advertising of a custom Bluetooth beacon format that sends out a rotating universally unique identifier (UUID), and remembers what identifier the app was advertising at any given time. The rotation presumably is used as a privacy mechanism to prevent casual Bluetooth sniffers from tracking individuals using this rotating UUID, although rotation cannot protect against determined listeners.</p>
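<p>The rotation scheme described can be sketched as follows: keep advertising a random identifier, generate a fresh one on an interval, and log which identifier was active when, so past sightings can still be matched later. The 15-minute interval is my assumption, not a value from the project:</p>

```python
import time
import uuid

class RotatingAdvertiser:
    """Advertise a random UUID, rotating it every interval_s seconds,
    while remembering which UUID was active during which period."""

    def __init__(self, interval_s=900):
        self.interval_s = interval_s
        self.history = []  # list of (start_time, uuid) pairs
        self._rotate(time.time())

    def _rotate(self, now):
        self.current = uuid.uuid4()
        self.history.append((now, self.current))

    def current_uuid(self, now=None):
        """Return the UUID to advertise, rotating first if it is stale."""
        now = now if now is not None else time.time()
        if now - self.history[-1][0] >= self.interval_s:
            self._rotate(now)
        return self.current
```

<p>The retained history is what lets a later lookup answer “was I the device that advertised identifier X at time T?” without ever broadcasting a stable identity.</p>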
<p>Presumably, a future release will also add detection and tracking of these UUIDs, so you can tell who you came into contact with. But again, this has not been built yet. The idea may be to get the beacon transmitters out there now in case folks don’t upgrade the app later.</p>
<p>Aside from the fact that no Bluetooth detection has been built yet, there is another glaring hole in the app’s Bluetooth support. This Bluetooth advertisement is only enabled on Android. Why? Probably because iOS simply doesn’t let you advertise beacons when the app is in the background – you have to use other techniques to get around this. For these reasons it is unclear what this project hopes to accomplish with its Android-only advertisement, and whether it has any thoughts on how to deliver equivalent functionality on iOS.</p>
<p>But the bigger technical drawback is the entire path overlap approach. Location sensors on phones have quite poor accuracy in real world conditions. Sure if you have Google Maps running to give you driving directions, it’s pretty accurate. But that is because the GPS is fired up at a 100 percent duty cycle the whole time. An app simply can’t keep this going in the background without draining the user’s battery quickly. Folks who use driving directions on a long road trip know that they need to keep the phone charging to keep it from going dead.</p>
<p>The app is designed to save your battery by only passively using location sensors. When you aren’t using the GPS for other purposes like driving directions, it shuts off, falling back to secondary location sensors: WiFi hotspots, Bluetooth beacons or cell towers. Accuracy drops from a few meters down to tens or hundreds of meters. The app might know you were in a supermarket at a specific time of a specific day, but it has no idea where you were in that supermarket.</p>
<p>So, by itself, this kind of information isn’t very useful for contact tracing. Without the GPS being on (which it almost never is) the app can’t tell you if you were ever within, say, 6 feet of an infected person. While it might tell you if you were once in the same building at the same time as somebody else who tested positive, in any city with a significant infection rate, this condition would be true for a very large percentage of people. The bottom line is that the technology employed so far is way too prone to generating false positives.</p>
<h3 id="data-collection-1">Data Collection</h3>
<p>Where the MIT app shines is in privacy protection. Unlike Google Timeline, the MIT app doesn’t share your entire location history with the Silicon Valley behemoth. They don’t require registration with any servers, and they claim that your entire history is only stored locally on your phone. Only you can decide you want to share the data it collected, perhaps in the event that you test positive and want to help warn other folks by publishing where you’ve been.</p>
<p>But while this privacy protection is laudable, the fact that the app’s current technology does not allow effective contact tracing means that the MIT app is little more than an academic exercise in privacy protections in a location app.</p>
<h2 id="is-this-the-best-we-can-do">Is This the Best We Can Do?</h2>
<p>Both of these apps have gotten a lot of attention, but given the quality of what they do, this attention isn’t really deserved.</p>
<p>Singapore’s Ministry of Health got attention because they were the first to release an app. They got more attention by promising to release it as “open source” – something they still haven’t done three weeks later as of this writing. (<strong>EDIT:</strong> the code has now been posted <a href="https://github.com/opentrace-community">here</a>) The iOS version of the app expects users to leave their phone turned on all the time with the app visible on the screen for “best” results. Without a proper design that allows iOS Bluetooth tracking to work in the background, their app design is simply a non-starter.</p>
<p>The MIT app, meanwhile, has no working Bluetooth tracking at all, relying instead on inaccurate location measurements typically without aid of the battery-hungry GPS radio. This simply isn’t accurate enough to be a useful contact tracing tool.</p>
<p>There are other apps out there, too, that have been more solid in their technical approach. A Stanford University effort called <a href="https://www.covid-watch.org/">Covid Watch</a>, for example, tries to work around the distinct Bluetooth limitations of Android and iOS by building a system that uses the best of each platform’s capabilities, and uses tricks to make them cooperate when both kinds of devices are nearby. But even this approach still relies on unreliable GATT connections on Android.</p>
<p>Make no mistake: contact tracing apps can do better. It is possible for both Android and iOS apps to use Bluetooth to both advertise in the background and simultaneously scan for other devices doing the same. It is possible to measure with some degree of certainty if they are less than 6 feet apart. Apps can do this without requiring unreliable GATT connections on Android and without relying on background beacon broadcasts currently impossible on iOS. Such apps can record these identified devices in a privacy-friendly way, allowing a reliable way of later finding contacts that might have spread the disease.</p>
<h2 id="the-problem-of-adoption">The Problem of Adoption</h2>
<p>No matter how good the technology of an app, there must be common adoption for it to make a difference. If only 1 percent of people in a city install a particular contact tracing app, it simply won’t be useful. Even if 60 percent of people install apps, they still won’t be useful if they are fragmented between several different apps that don’t work together. The well-meaning efforts by dozens of app developers to build contact tracing efforts may be for naught.</p>
<h2 id="apple-and-googles-response">Apple and Google’s Response</h2>
<p>On April 10, Apple and Google issued a joint statement that they will produce APIs that work on both platforms for Bluetooth contact tracing. The common system plans to rely on a <a href="https://covid19-static.cdn-apple.com/applications/covid19/current/static/contact-tracing/pdf/ContactTracing-BluetoothSpecificationv1.1.pdf">custom Bluetooth beacon advertisement</a> with a rotating identifier, and always-on operation for users who opt in. This system would eliminate many of the technical problems of the apps described above. The companies pledge to have the APIs available sometime in May.</p>
<p><img src="/images/custom_beacon_format.png" alt="Custom Beacon Format" width="640" style="float: right; margin: 10px;" /></p>
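<p>Based on my reading of that specification, the advertisement is a BLE service-data structure carrying a 16-byte rolling identifier under a dedicated 16-bit service UUID. The sketch below approximates that layout; treat the exact fields as my interpretation rather than a reference implementation:</p>

```python
import os

CONTACT_TRACING_SERVICE_UUID = 0xFD6F  # 16-bit service UUID assigned for the system

def build_advertisement(rolling_proximity_id: bytes) -> bytes:
    """Assemble a BLE service-data advertising structure carrying the
    16-byte rolling proximity identifier."""
    assert len(rolling_proximity_id) == 16
    uuid_le = CONTACT_TRACING_SERVICE_UUID.to_bytes(2, "little")
    payload = uuid_le + rolling_proximity_id
    # AD structure: length byte, AD type 0x16 (Service Data - 16-bit UUID), payload
    return bytes([len(payload) + 1, 0x16]) + payload

adv = build_advertisement(os.urandom(16))
print(len(adv))  # 20 bytes: 1 length + 1 type + 2 UUID + 16 identifier
```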
<p>But here’s the elephant in the room: the proposal is impossible to implement on today’s iPhones running the latest iOS 13.4.1. The operating system has for years prohibited exactly the kind of Bluetooth advertisement mentioned in the proposal. That means Apple has to change their operating system for this to work. They then need to release a new operating system version (iOS 13.5?), get people to install it, and get them to opt in to the process. Given that over 90 percent of iPhone users typically upgrade to the latest operating system, this may eventually work out OK. But it will still take time, and not everybody will install the upgrade.</p>
<p>On Android, no new operating system changes are required. An app meeting the specifications can be built today (see how in the <a href="http://www.davidgyoungtech.com/2020/04/24/hacking-with-contact-tracing-beacons">next blog post</a>), although it may have trouble staying running in the background on cheaper phone models with custom battery savers. The biggest obstacle on Android is interoperability. Until Apple gets an operating system update out there, Android-only solutions will work poorly in places like America, where iPhones are popular, especially in coastal cities. A system that helps alert you to exposure from only half the people you encounter isn’t very useful.</p>
<h2 id="its-still-worth-trying">It’s Still Worth Trying</h2>
<p>No app can ever hope to provide a perfect solution. Bluetooth stacks sometimes go down. Phone batteries sometimes go dead. Even well-designed apps sometimes crash.</p>
<p>It’s not surprising that well-meaning teams, building as quickly as possible to get apps to the public, have not yet approached what’s achievable. Developers must temper their enthusiasm to build just anything in favor of solutions that can actually make a difference. But that shouldn’t stop us all from trying.</p>
Permission Denied2019-10-18T00:00:00+00:00http://davidgyoungtech.com/2019/10/18/permission-denied<h2 id="the-mobile-location-permission-crackdown">The Mobile Location Permission Crackdown</h2>
<p>Since beacon-enabled apps burst on the scene in 2013, both Android and iOS have been steadily cracking down on use of the technology in the background.
This has come in response to privacy concerns over apps that quietly use the technology to track users in the background.</p>
<p>This fall marks a big milestone in the way iOS and Android behave, with new restrictions showing up in the latest operating systems from
both platforms. For iOS the changes are big and potentially alarming from the end-user perspective.</p>
<h2 id="ios-13-changes">iOS 13 Changes</h2>
<p>Users who grant an app permission to always access location are periodically presented with a warning dialog reminding them this is happening. iOS 13 now adds a map showing the specific locations
where the app read their location.</p>
<table>
<thead>
<tr>
<th style="text-align: center"><img src="/images/used-location-in-background.png" alt="" width="320px" /></th>
</tr>
</thead>
<tbody>
<tr>
<td style="text-align: center"><em>New iOS Background Location Usage Warning</em></td>
</tr>
</tbody>
</table>
<p><br /></p>
<p>Each dot represents a place where the phone was when an app accessed its location – either a lat/lon coordinate or a beacon. Even if your app just detected a beacon, the map coordinate of where it did this is plotted.</p>
<p>For many apps, the new dialog is unnecessarily alarming. Think about an app that checks your location periodically to tell you where you last parked your car. Even if the app doesn’t share the location with anyone, it will need to
track wherever you drive just to know where you last were. Such a dialog implies to many users that this location information may be being transmitted off the phone for nefarious purposes, even if the app does none of these
questionable things. The knee-jerk user response is to ask, “why do they need to track me?” and then deny always permission. Users don’t realize how the app works and why background location access is important.</p>
<p>In theory, this is where the customizable text message comes in, allowing an explanation of purpose. But good luck writing such an explanation in one or two sentences that the user will understand. The more likely response will
be to deny the permission, then complain that the app doesn’t work when the user later presses the “find my car” button. Here comes a one-star review in the App Store!</p>
<p>Even if the user isn’t suspicious about the app, seeing that dialog over and over gets annoying. Unfortunately, the only way to shut it up permanently is by denying background location access by tapping “When in use”. Some folks will certainly pick that option for that reason.</p>
<p>Of course, some apps do behave badly, which is precisely why Apple added this scary-looking dialog. Honest app developers who need to track location in the background for legitimate purposes are collateral damage.</p>
<p>Unfortunately, this is not the only iOS 13 change. When prompting the user for location permission, iOS now offers a third option in addition to “Allow always” and “Allow when in use”: “Allow Once”. The “Allow Once” option gives the app the ability to access your location on this app launch, but not the next one. If the user selects “Allow Once”, the next time the app is launched the user will be prompted again.</p>
<table>
<thead>
<tr>
<th style="text-align: center"><img src="/images/ios10-location-promopt.png" alt="" width="320px" /></th>
<th style="text-align: center"><img src="/images/ios13-initial-location-prompt.png" alt="" width="320px" /></th>
</tr>
</thead>
<tbody>
<tr>
<td style="text-align: center"><em>iOS 8-12</em></td>
<td style="text-align: center"><em>iOS 13</em></td>
</tr>
</tbody>
</table>
<p><br /></p>
<p>But what about always access for getting location in the background? Note that the new dialog above doesn’t even give that option. This is true even if the code presented the dialog by specifically requesting always authorization with <code class="language-plaintext highlighter-rouge">locationManager.requestAlwaysAuthorization()</code>. The user has to grant one of these two options first, and once one is granted, the app may then request always authorization:</p>
<table>
<thead>
<tr>
<th style="text-align: center"><img src="/images/ios13-location-background-prompt.png" alt="" width="320px" /></th>
</tr>
</thead>
<tbody>
<tr>
<td style="text-align: center"><em>Switching to Always Permission</em></td>
</tr>
</tbody>
</table>
<p>This two-step process is unfortunately cumbersome and will cause many users to be annoyed enough to deny always permission.</p>
<p><br /></p>
<h2 id="android-10-changes">Android 10 Changes</h2>
<p>Changes in Android 10 are a bit of catch-up relative to iOS. Android 10 brings a new, separate background location permission to the platform, allowing the user to decide whether to grant “all the time” or “only while using the app” location permission.</p>
<table>
<thead>
<tr>
<th style="text-align: center"><img src="/images/android-9-location-prompt.png" alt="" width="320px" /></th>
<th style="text-align: center"><img src="/images/android-permission-dialog.png" alt="" width="320px" /></th>
</tr>
</thead>
<tbody>
<tr>
<td style="text-align: center"><em>Android 6-9</em></td>
<td style="text-align: center"><em>Android 10</em></td>
</tr>
</tbody>
</table>
<p><br /></p>
<p>Apple introduced the equivalent changes in iOS 8 back in 2014. But for Android apps that were built to expect permission to track beacons in the background at all times once the user gives initial consent, this is still a
big change. Even if the app asks for background “all the time” permission, the user might grant permission only while using the app. Apps must take care to expect this possibility, detect when the user has granted only “only while using the app” permission, and ask the user to upgrade the foreground-only location
permission. The dialog below shows how that would look:</p>
<table>
<thead>
<tr>
<th style="text-align: center"><img src="/images/android-switch-to-always.png" alt="" width="320px" /></th>
</tr>
</thead>
<tbody>
<tr>
<td style="text-align: center"><em>Switching to All The Time Permission</em></td>
</tr>
</tbody>
</table>
<p><br /></p>
<p>Note that unlike iOS, the Android permission request dialogs don’t show a user-customizable justification section. As a result, it is a good idea to present your own pop-up first, setting the expectation that you are about to ask for location permission and explaining the reasoning.</p>
<p>The second change in Android 10 is the new background usage warning dialog. If your app accesses the location in the background and does not send a visible notification to the user, Android will now issue its own notification to warn the user much like iOS:</p>
<table>
<thead>
<tr>
<th style="text-align: center"><img src="/images/android-background-warning.png" alt="" width="320px" /></th>
</tr>
</thead>
<tbody>
<tr>
<td style="text-align: center"><em>New Android Background Location Usage Warning</em></td>
</tr>
</tbody>
</table>
<h2 id="how-we-got-here">How We Got Here</h2>
<p>For those curious about how we got to this point, here is a table that shows the evolution of location permission changes with operating system releases.</p>
<table>
<thead>
<tr>
<th style="text-align: center">Location Permission Restrictions</th>
<th style="text-align: center">Android 4-5</th>
<th style="text-align: center">Android 6-9</th>
<th style="text-align: center">Android 10+</th>
<th style="text-align: center">iOS 7</th>
<th style="text-align: center">iOS 8</th>
<th style="text-align: center">iOS 9-12</th>
<th style="text-align: center">iOS 13+</th>
</tr>
<tr>
<th style="text-align: center"> </th>
<th style="text-align: center">2013</th>
<th style="text-align: center">2015</th>
<th style="text-align: center">2019</th>
<th style="text-align: center">2013</th>
<th style="text-align: center">2014</th>
<th style="text-align: center">2016</th>
<th style="text-align: center">2019</th>
</tr>
</thead>
<tbody>
<tr>
<td style="text-align: center">Dynamic Prompt Required?</td>
<td style="text-align: center">NO</td>
<td style="text-align: center">YES</td>
<td style="text-align: center">YES</td>
<td style="text-align: center">NO</td>
<td style="text-align: center">YES</td>
<td style="text-align: center">YES</td>
<td style="text-align: center">YES</td>
</tr>
<tr>
<td style="text-align: center">Background/Foreground Separate?</td>
<td style="text-align: center">NO</td>
<td style="text-align: center">NO</td>
<td style="text-align: center">YES</td>
<td style="text-align: center">NO</td>
<td style="text-align: center">YES</td>
<td style="text-align: center">YES</td>
<td style="text-align: center">YES</td>
</tr>
<tr>
<td style="text-align: center">Background Use Warning Dialog</td>
<td style="text-align: center">NO</td>
<td style="text-align: center">NO</td>
<td style="text-align: center">YES</td>
<td style="text-align: center">NO</td>
<td style="text-align: center">NO</td>
<td style="text-align: center">YES</td>
<td style="text-align: center">YES</td>
</tr>
<tr>
<td style="text-align: center">Background Use Warning w/ Map</td>
<td style="text-align: center">NO</td>
<td style="text-align: center">NO</td>
<td style="text-align: center">NO</td>
<td style="text-align: center">NO</td>
<td style="text-align: center">NO</td>
<td style="text-align: center">NO</td>
<td style="text-align: center">YES</td>
</tr>
<tr>
<td style="text-align: center">Allow Once Offered</td>
<td style="text-align: center">NO</td>
<td style="text-align: center">NO</td>
<td style="text-align: center">NO</td>
<td style="text-align: center">NO</td>
<td style="text-align: center">NO</td>
<td style="text-align: center">NO</td>
<td style="text-align: center">YES</td>
</tr>
<tr>
<td style="text-align: center">Background Requires Second Step</td>
<td style="text-align: center">NO</td>
<td style="text-align: center">NO</td>
<td style="text-align: center">NO</td>
<td style="text-align: center">NO</td>
<td style="text-align: center">NO</td>
<td style="text-align: center">NO</td>
<td style="text-align: center">YES</td>
</tr>
</tbody>
</table>
<p><br /></p>
<p>As you can see, Android 10 is currently where iOS was in 2016. Is that a bad thing? Maybe – if you think Apple’s onerous new permissions process and frightening map dialog are a good thing.
For lemming-like users who tend to grant any permissions requested just to play a sketchy game app, Apple’s approach might be a good thing. But for thoughtful app developers trying to make apps that legitimately use location in the background, Apple’s new restrictions are nothing less than a nightmare.</p>
<hr />
<p>This post has been updated to reflect that Android 10 shows a warning dialog for background usage.</p>
Broken Connection No More2019-10-08T00:00:00+00:00http://davidgyoungtech.com/2019/10/08/broken-connection-no-more<h2 id="hope-for-androids-bluetooth-le-reliability-problem">Hope for Android’s Bluetooth LE Reliability Problem?</h2>
<p>Android has long been known to have problems establishing and maintaining Bluetooth LE connections. In a previous post,
I shared data showing that <a href="/2019/05/21/broken-connection">Android has a stunning 20 percent failure rate</a> connecting to a GATT service under real-world conditions,
compared to just 3 percent on iOS.</p>
<p>It’s not clear what all the factors behind this reliability problem are, but one known contributor is Android’s use of a crazy setting for the Bluetooth link supervision timeout. This
setting decides how long to wait after losing contact with a BLE peripheral before Android gives up. For Android 4.3-9.x, this was hard-coded to 20 seconds.
That is a crazily long period compared with the iOS setting of 750 milliseconds. Because it is hard-coded in Android, you cannot change this setting. As a result, Android apps can’t be notified of a broken BLE connection
until 20 seconds have passed. What’s worse, client code can’t even try to reconnect until those 20 seconds are up. For a user staring at a spinner on a screen,
20 seconds is an eternity.</p>
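<p>The practical effect of that timeout is easy to put in numbers: an app cannot even begin reconnecting until the stack declares the link dead, so the supervision timeout sets a hard floor on recovery time. A sketch (the half-second reconnect figure is an arbitrary illustration, not a measured value):</p>

```python
def minimum_recovery_s(supervision_timeout_s, reconnect_s=0.5):
    """Earliest a dropped BLE link can be back up: the stack reports the
    break only after the supervision timeout expires, and only then can
    the app attempt to reconnect."""
    return supervision_timeout_s + reconnect_s

print(minimum_recovery_s(20.0))   # Android 4.3-9.x: a 20.5-second floor
print(minimum_recovery_s(0.75))   # iOS: a 1.25-second floor
```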
<p>The effect of this setting was noted by Andreas Schweizer in a <a href="https://blog.classycode.com/a-short-story-about-android-ble-connection-timeouts-and-gatt-internal-errors-fa89e3f6a456">blog post</a> two years ago, and several folks have reported the problem to Google as a bug.</p>
<p>Not much changed until last month when Google released Android 10. This is the first Android version to include a year-old <a href="https://android.googlesource.com/platform/packages/apps/Bluetooth/+/d32b2d46167122e876455ed70598b331fc692771%5E%21/#F0">commit that reduces the supervision timeout from 20 seconds to 5 seconds.</a>
It’s still hard-coded, and five seconds is still quite a long time compared with the 750ms on iOS, but it is at least an improvement that offers hope for more reliable Bluetooth LE connections on Android.</p>
<p>Using the same data set I used to calculate the 20 percent GATT connection failure rate on Android, very preliminary results with Android 10 devices offer hope for improvement. The numbers so far are super small – only 7 devices with Android 10 have
used the GATT service so far – but every one of those attempts was successful. Cross your fingers that those very early numbers hold that trend.</p>
<p>Don’t get too excited, though. Even if Android 10 does offer significant reliability improvements, it will take years before that update gets to most users. As we all know, Android has a huge problem with OEMs that rarely or never upgrade the operating system on their handsets.
If past experience is a guide to the future, it will be five years before 90 percent of Android devices have Android 10. That’s how long it took to reach the 90 percent distribution of Android 5.0 that exists today.</p>
<p>Five years is a long time to cross your fingers. But with luck, by 2024, the worst of our Android Bluetooth connection reliability problems may be behind us.</p>
<p><strong>UPDATE: April 13, 2020</strong>: Bad news: several months of data collection has not shown statistically significant improvement in BLE connection reliability on Android 10.</p>
Broken Connection2019-05-21T00:00:00+00:00http://davidgyoungtech.com/2019/05/21/broken-connection<h2 id="androids-bluetooth-le-reliability-problem">Android’s Bluetooth LE Reliability Problem</h2>
<p>Building Bluetooth apps on Android has always been tricky. Those of us who have worked on Android since its earliest days have been hounded by frustrations of the platform. These range from buggy Bluetooth stacks to fragmentation caused by manufacturers that each build their phones just a little differently.</p>
<p>Most complaints about Bluetooth on Android are vague and anecdotal. With no hard data to back up the problems, it is never clear if the real problem stems from operator error, a buggy app, or the hardware of a crappy off-brand phone. It has always been easy to dismiss the complainers as malingerers. But months of real-world data collection across a variety of devices provides evidence that Android itself is the real problem.</p>
<p>The data show that even at close range, Android Bluetooth connections fail a whopping 20 percent of the time, compared to less than 2 percent on iOS. This sizable difference is big enough to feel in daily use. On iOS, Bluetooth connections just work. On Android, they usually work, but they aren’t as reliable.</p>
<p><img src="/images/broken_connection/image4.png" style="width: 624px; height: 385px" />
<img src="/images/broken_connection/image2.png" style="width: 624px; height: 385px" /></p>
<p>The first graph above shows the percentage success and failure rate on each platform. The second graph shows the raw counts of successes and failures recorded. The total numbers are lower on Android because most users are on the iOS platform.</p>
<h3 id="data-collection">Data Collection</h3>
<p>Using a local automated parking system, I collected the data between late 2016 and early 2019. The system provides automated commercial garage access, using mobile phones to open the entry and exit gates. The apps for iOS and Android use mobile data networks to authorize gate openings when a mobile network is available. But often, when exiting a cavernous parking garage, cell connectivity is spotty at best. That’s where Bluetooth comes in. We fitted garages with Linux-based Bluetooth relays near the exits hosting a custom Bluetooth LE GATT service. When the mobile app can’t reach the server, it connects to this Bluetooth service to authorize opening the gate, allowing the customer to exit the garage.</p>
<p>This usually works great. Within a couple of seconds of tapping an exit button, the gate simply goes up, and a happy customer drives away.</p>
<p>Sometimes, however, this fails. Anecdotal reports of drivers getting error messages (especially on Android) led me to collect metrics in the app and report them to a server.</p>
<p>The data collected include not just success/fail status, but the specific failure condition as well as the phone manufacturer, model, and operating system version. This allows us to compare failure rates across not just Android and iOS, but across different operating systems and versions.</p>
<p>The data show that there is no significant difference across different iPhone models and iOS versions. All perform quite well with very low failure rates of less than 2 percent.</p>
<p>On Android, much higher failure rates are the norm, even on high end phones and newer operating system versions. Surprisingly, success rates on Android 6-9 are not noticeably better than Android 5. This is bad news as it indicates the situation on Android is not improving with newer operating system releases.</p>
<p><img src="/images/broken_connection/image5.png" style="width: 624px; height: 385px" />
<img src="/images/broken_connection/image7.png" style="width: 624px; height: 385px" /></p>
<p>Newer Android hardware models also do not show reliability improvements. Grouping the Samsung data by hardware generation (e.g. Galaxy Note 8, Galaxy S8 Edge, and Galaxy S8 are lumped together as generation 8) shows the failure rate is not improving.</p>
<p><img src="/images/broken_connection/image3.png" style="width: 624px; height: 385px" />
<img src="/images/broken_connection/image6.png" style="width: 624px; height: 385px" /></p>
<p>The detailed failure rates across Android models and operating system versions should be taken with a grain of salt. The data were collected at garages in and around Washington, DC where iPhones are far more prevalent than Android models, making the Android data points relatively sparse.</p>
<p><img src="/images/broken_connection/image1.png" style="width: 624px; height: 385px" />
<img src="/images/broken_connection/image8.png" style="width: 624px; height: 385px" /></p>
<p>The more you drill down into the Android data, the fewer data points you have, making the statistical significance questionable. So while it is possible to conclude with confidence that Android is far worse than iOS and that things are not getting better, it is hard to say anything about which Android models and manufacturers are better than others.</p>
<p>For this reason, two versions of each graph are shown, one with total counts of successes and failures, and the other showing percentages. But because we have very few samples for the new Galaxy S10 or for Huawei and Yulong devices (rare in Washington, DC), don’t jump to conclusions about these outlier success and failure rates. When you see a count below 50 or so in the second graph, know that the percentages in the first graph really don’t mean anything.</p>
<p>Those caveats aside, for broader conclusions this data set is very useful. The Bluetooth service is identical in all cases. The radio distance and operating conditions are all very similar. The one variable that changes is the type of phone.</p>
<h3 id="whats-wrong-with-android">What’s Wrong With Android?</h3>
<p>Clearly the problem is with Android as a whole, not with specific manufacturer implementations. There are a number of ways the app can fail to communicate with the Bluetooth service, but looking at the error codes, by far the most common problem is establishing and maintaining a Bluetooth LE connection. On both iOS and Android, a Bluetooth LE GATT service communication sequence involves a number of steps:</p>
<ol>
<li>
<p>The phone scans for and detects a nearby Bluetooth device</p>
</li>
<li>
<p>The phone connects to the device</p>
</li>
<li>
<p>The phone exchanges data with the device</p>
</li>
<li>
<p>The phone disconnects</p>
</li>
</ol>
<p>On Android, the vast majority of the failures are on step 2. The connection attempts typically time out, although they sometimes fail with a connection error. The second most common problem is an unexpected disconnect while performing step 3. When either of these problems happen, the app is programmed to retry (up to 10 times), and it is not uncommon for the connection to fail several times in a row, or to have to repeatedly reconnect after an unexpected disconnect. These same things happen on both Android and iOS, but on Android these connection problems happen much, much more frequently.</p>
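<p>The retry counting described above can be sketched as follows. This is a hypothetical illustration of the policy, not the app’s actual closed-source code; <code>connectOnce</code> stands in for one complete connect-and-exchange attempt against the GATT service.</p>

```javascript
// Hypothetical sketch of the retry policy described above: an operation only
// counts as a failure if it still has not succeeded after every attempt.
// connectOnce stands in for a single GATT connect + data exchange attempt.
function attemptWithRetries(connectOnce, maxAttempts = 10) {
  for (let attempt = 1; attempt <= maxAttempts; attempt++) {
    if (connectOnce()) {
      return { success: true, attempts: attempt };
    }
  }
  return { success: false, attempts: maxAttempts };
}
```

<p>On iOS a single attempt almost always suffices; on Android it is common for several attempts in a row to fail before one finally succeeds, which is why even “successful” Android operations are slower.</p>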
<p>What’s more, an operation is not counted as a failure in the data set used for comparison in this article if the data exchange ultimately succeeds on retries. Anecdotal testing shows that retries on iOS are rare but on Android extremely common. This means the raw Android failure rate is probably higher than 20 percent, and even in success cases, the common retries cause Android operations to be much slower than on iOS.</p>
<p>Without a doubt, the Android Bluetooth stack has a much higher failure rate establishing a Bluetooth LE connection than iOS, and once it establishes one, it is much more likely to drop the connection.</p>
<p>I don’t know the specific reasons why this is true. A Bluetooth packet sniffer might reveal some insights, but it certainly wouldn’t help anything without changes to Android. These problems have been there since Bluetooth LE support was added in version 4.3 nearly six years ago. Clearly, fixing this is not high on Google’s priority list. Knowing the details of why it is unreliable won’t change anything for apps using public versions of Android.</p>
<p><strong>UPDATE: <a href="/2019/10/08/broken-connection-no-more">Changes in Android 10 offer hope for the future</a></strong></p>
<h3 id="is-bluetooth-useless-on-android">Is Bluetooth Useless on Android?</h3>
<p>Unreliable as Bluetooth LE connections are, there is still plenty of use for Bluetooth on Android. Lots of functions, like audio streaming to speakers, use Bluetooth Classic and have nothing to do with Bluetooth LE connections. Not all Bluetooth LE use cases even require connections. Bluetooth LE beacons, for example, are connectionless and work quite well on Android.</p>
<p>Some applications that make only occasional Bluetooth LE connections, like configuring a fitness tracker, work fine. You might have to hit retry a few times before it works. That is a bit annoying, but certainly acceptable.</p>
<p>But for use cases where frequent Bluetooth GATT connections are required and need to be fast and highly reliable, Android will likely disappoint. Imagine the stress of repeatedly seeing “connection failed” while tapping the exit button to get out of a parking garage as the cars keep lining up behind you. At least the driver can hit the intercom button for assistance. Other use cases don’t have that option.</p>
<h3 id="what-you-can-do">What You Can Do</h3>
<p>If you are designing a system using Bluetooth LE on Android, and it needs to be highly reliable, avoid Bluetooth LE connections if possible. Can you use one-way beaconing to accomplish your goal? If you can, do it.</p>
<p>If not, can you use two-way beaconing to accomplish your goal? While difficult to design, I have seen this approach work reliably. Unfortunately it is limited to low data throughput use cases.</p>
<p>If you cannot avoid Bluetooth LE connections, ask yourself this question: Is a 20 percent failure rate acceptable for my use case? Will my Android users be understanding if they sometimes have to hit retry a few times, and even then it still might not work? For some use cases, this still might be acceptable.</p>
<p>But for use cases where a high degree of reliability is critical, don’t even try it. The common approach to Bluetooth development is to build iOS first and then expect Android to be “just the same.” That approach will not make you, your boss, or your customers happy.</p>
<h3 id="addendum-methodology-discussion">Addendum: Methodology Discussion</h3>
<p>For those interested in further details about how the data were collected and a discussion of other possible explanations for the discrepancy between iOS and Android, read on.</p>
<p>One other thing that all Android devices used in this data set have in common that could contribute to the problem is the Android app code itself. Perhaps it is not Android at fault, but the programmer (me) who built the closed-source app in a way that made it buggy and unreliable. Like most engineers, I am always paranoid about possibilities like this. After reviewing Android BLE GATT programming best practices and pitfalls, I rewrote the app’s Android Bluetooth client to make extra sure these best practices were followed to the letter, being especially careful about thread handling. I then released an update to the Google Play Store and crossed my fingers for the failure rates to go down. They did not. While a few specific errors did decrease in frequency or go away entirely (those associated with threading on Android’s fiddly Bluetooth LE APIs), these were rare. What’s more, the fixes did not significantly affect the overall Android numbers. While other undiscovered bugs may certainly remain, the fact that I also wrote the reliable iOS app suggests that programmer error is unlikely to explain the difference between the iOS and Android error rates.</p>
<p>Could the Linux service be the culprit? Perhaps Linux does not like communicating with an Android client. The Linux computer is a Raspberry Pi 3 with a built in BLE controller, running the BlueZ Bluetooth stack, and the Node JS Bleno GATT module. The same code also runs on MacOS with a different Bluetooth chip, and a different Bluetooth stack. With the service running on MacOS, the system shows similar Android connection problems. I have also seen that these kinds of connection problems exist for Android on other projects I have built, using Bluetooth services hosted on Windows, iOS, MacOS, and embedded devices.</p>
The Rise Of The Nasty Forks2019-04-30T00:00:00+00:00http://davidgyoungtech.com/2019/04/30/the-rise-of-the-nasty-forks<p>When Google released the first Android phone back in 2008, one of the primary selling points to the geekiest among us was the openness of the platform. Unlike Apple’s closed-source walled garden known as iOS, Android was open source, free for anyone to review and modify, and it was much more flexible in the way it could be used. iOS, by contrast, was riddled with maddening rules about what apps were and were not allowed to do.</p>
<p>Over the years, this openness was abused by many app developers. Unlike iOS, Android allowed apps to spawn constantly running background services, burning the CPU, powering up the GPS, and transmitting loads of data over the cell connection. The end result was much shorter battery life on Android devices.</p>
<p>Google began to get serious about combating these abuses starting with Android 6. New features like Doze and App Standby helped tame battery usage, and in Android 8, Google finally outlawed the long-running background service.</p>
<p>For several phone manufacturers, especially in China, these changes were too late. An especially large crop of abusive battery-draining apps hit the Chinese market well before Google’s first battery saving changes in Android 6. </p>
<p>Chinese manufacturers, many building low-end devices most susceptible to battery drain, took matters into their own hands. They deeply forked their Android implementations to go beyond simple UI skin modifications, adding power management features that brute-force kill apps that have been running in the background for more than a few minutes and block them from starting again without a manual launch.</p>
<p>Just like Apple’s iOS, these forked operating system changes are closed source. But worse than iOS, the rules are either very poorly documented or not documented at all. That makes building apps that work within the varying rules extremely difficult, essentially requiring reverse engineering each forked OS version. And worst of all, the specific rules are different for each manufacturer.</p>
<p>Back before the OnePlus One burst on the scene, this was a problem only in the Chinese market. But since then, Oppo, the maker of the OnePlus line, as well as Huawei and Xiaomi, have expanded sales to global markets. Finnish upstart HMD Global, building off of cheap Chinese reference designs and base firmware, has recently joined this party, selling phones under the Nokia brand but with similar Chinese power-saver badness built in, thanks to firmware supplied by contract manufacturers based in China. Today, forked Android power savers are a global problem.</p>
<p>The folks at dontkillmyapp have done a great job of documenting restrictions as they become known. You can check out their work here: <a href="https://dontkillmyapp.com">https://dontkillmyapp.com</a></p>
<p>So why doesn’t this keep your email from synchronizing? The three big Chinese manufacturers all work with app whitelists. Apps that are on the whitelist (think Gmail, Spotify, Twitter, etc.) are exempt from restrictions that keep them from running in the background. This is why you are still able to get email notifications on a Huawei phone. But any app not lucky enough to be on the whitelist gets killed quickly after leaving the foreground. It doesn’t matter if you follow all of Google’s open source Android rules and best practices for battery-friendly background activity: using Foreground Services and the Job Scheduler. On these custom Android forks, your app will be killed anyway.</p>
<p>What can you do about it? Nothing, short of warning your app users. </p>
<p>You can detect if your app is running on a phone made by Oppo, Huawei or ZTE and tell users that certain background features provided by your app won’t work. You might tell the user that the features can be enabled by whitelisting the app. You can even give instructions telling users exactly how to do this.</p>
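<p>A minimal version of that detection might look like the sketch below. The manufacturer list is an assumption based on the vendors discussed in this post, not an exhaustive registry; in a real Android app the string would come from <code>android.os.Build.MANUFACTURER</code>. The sketch is written in JavaScript to keep it platform-neutral.</p>

```javascript
// Hypothetical warning check. On Android, `manufacturer` would be read from
// android.os.Build.MANUFACTURER. The list below is an assumption based on the
// vendors discussed in this post; extend it as new restrictions become known.
const AGGRESSIVE_KILLER_MANUFACTURERS = ['oppo', 'oneplus', 'huawei', 'xiaomi', 'zte'];

function shouldWarnAboutBackgroundRestrictions(manufacturer) {
  return AGGRESSIVE_KILLER_MANUFACTURERS.includes(manufacturer.trim().toLowerCase());
}
```

<p>When the check returns true, the app can show its whitelisting instructions instead of silently promising background features that the OS will kill.</p>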
<p>But <em>most</em> users will never do this. They will ignore warnings and instructions until the app fails to deliver promised functionality. Then they will leave you a one star review in the Play Store with a terse line saying the app doesn’t work. </p>
<p>The brute force option is to block your app from being downloaded from the Google Play Store into devices made by these manufacturers. A slightly less draconian alternative is to design a second “foreground only” experience for the app that gets triggered when it is run on such Chinese devices. But this approach runs the risk of getting bad reviews if the Google Play listing makes promises for mainstream phones that can’t be delivered on Chinese phones. This latter problem argues for a completely different app.</p>
<p>What you should <strong>not</strong> do is let your customers and your bosses think that your app can work the same way on these devices. What you should also not do is bang your head against the desk trying to force your app to work on these devices when the manufacturer is secretly working against you. Either approach is doomed to frustration and failure.</p>
Capturing Mobile Phone Numbers2018-07-13T00:00:00+00:00http://davidgyoungtech.com/2018/07/13/capturing-mobile-phone-numbers<p>Ever want to write an iOS app that determines the phone number of the device on which it is running? Well, it turns
out you can’t, at least not with iOS APIs. But throw a cloud server into the mix and there is a way!</p>
<h2 id="the-problem">The Problem</h2>
<p>The problem is that Apple disallows iOS apps from accessing the device phone number for privacy reasons.<br />
Clearly, the idea is to block sketchy app developers from harvesting your phone number and selling it to
spammers. But in their typical heavy-handed approach, Apple blocks all access to the device phone number. An
app cannot get it even if they ask the user permission.</p>
<p>Frustratingly, this same restriction applies to enterprise devices, so corporate apps can’t even access the phone
number of the device that the company itself owns! It used to be that enterprise apps could use private APIs to
access this functionality (enterprise apps don’t need to go through App Store review, so they can get away with using
private APIs) but as of iOS 11, Apple has locked down all known private APIs to access the phone number so these cannot
be used anymore.</p>
<p>Like it or not, this blocks legitimate uses of phone number capture. A common workaround is to ask the user to type in their
phone number, something that is tedious, error-prone, and subject to users intentionally providing false numbers.
WhatsApp, for example, requires you to type in your phone number, then go through a process to verify the phone number by having their
servers send you a SMS code that you are later required to enter to complete your registration.</p>
<h2 id="the-solution">The Solution</h2>
<p>Fortunately, there is an alternative that requires no data entry by the user and does a pretty good job of ensuring that
the phone number captured is for the iPhone – or at least for a mobile phone in the user’s possession. The idea is to have the app
send a SMS message to a server with the app’s unique installation code. The app can then query the same server to see if it has
recently received a SMS message from a device with this unique app installation code. If it has, it can read the phone number that
sent it. Here’s a <a href="https://vimeo.com/269664301">demo video</a> showing the process.</p>
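<p>The round trip hinges on a tiny message format. Here is a sketch of composing and parsing that SMS body; the <code>device_uuid:</code> prefix matches the example messages used later in this post, but these helpers are hypothetical illustrations, not the production code.</p>

```javascript
// Hypothetical helpers for the round trip described above: the app composes
// an SMS body carrying its installation identifier, and the server later
// matches that identifier back to the phone number that sent it.
function composeSmsBody(installationUuid) {
  return `device_uuid:${installationUuid}`;
}

function extractInstallationUuid(smsBody) {
  const match = /^device_uuid:(.+)$/.exec(smsBody);
  return match ? match[1] : null; // null if the user mangled the message
}
```

<p>Keeping the parser strict about the prefix means a user-edited or malformed message simply produces no match instead of a bogus database row.</p>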
<p>The good news is that using tools at AWS, you can set up this whole process for free. Amazon will assign a “long code” phone
number to your AWS account upon request. In theory these are limited to sending a maximum of 200 messages per day – but for this
use case, we don’t send any messages – everything is inbound! So the whole process can work on a free-tier at Amazon. Of
course, if you process a huge volume of SMS messages, you’ll start incurring costs on your AWS computing resources. But it
would take an app with many millions of users to rack up any significant charges.</p>
<p>The main disadvantage is that the user must still send the SMS. The iOS app can bring up a view with a pre-formatted message to send to the server, but
it can’t actually send the SMS unless the user hits the send button. The user could, of course, choose not to send it. The user could edit the message
so the app install identifier is changed. The user could also look at the message, and then send it from a different phone. So this solution can’t force
users to give you their phone number. But it can make it super easy for users who do want to share it to do so.</p>
<h2 id="aws-architecture">AWS Architecture</h2>
<p>The diagram below shows the AWS components needed to do this for you.</p>
<p><img src="/images/phone_number_capture.png" alt="architecture" style="width:750px;border-style:solid;border-width:5px;" /></p>
<p>Using “AWS Pinpoint” you can request a free 10 digit US phone
number to receive SMS messages. Amazon then lets you set up a Simple Notification Service (SNS) “Topic” and configure it to receive all of these
incoming SMS messages. The messages can then be configured to flow into an AWS Lambda. That’s basically a tiny “serverless” cloud app that executes
to do something with the SNS data whenever it comes in. We’ll have that Lambda insert the phone number and the installation identifier from the app
that sent the SMS message into a Dynamo database.</p>
<p>The system also uses a second Lambda to query the phone number from the database, fronted by an AWS “API Gateway” so that our iOS app can ask our
Amazon cloud system to check if a SMS has come in for the device, and if so, what phone number it came from. By polling this endpoint for a brief time
after we send the SMS, we will get back the phone number as soon as it arrives at the server. In my tests, this
whole process takes about 5-10 seconds.</p>
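<p>The client-side polling can be sketched like this. The endpoint and the <code>{ phoneNumber }</code> response shape are assumptions to be adapted to your API Gateway setup; the real app is an iOS app, but the logic is the same in any language.</p>

```javascript
// Hypothetical polling loop: call the query endpoint until the SMS has been
// recorded (fetchStatus resolves to { phoneNumber } or null), or give up.
// fetchStatus would wrap something like GET <api-gateway-url>?device_uuid=...
async function pollForPhoneNumber(fetchStatus, { maxTries = 20, delayMs = 1000 } = {}) {
  for (let i = 0; i < maxTries; i++) {
    const result = await fetchStatus();
    if (result && result.phoneNumber) {
      return result.phoneNumber;
    }
    await new Promise(resolve => setTimeout(resolve, delayMs)); // wait before retrying
  }
  return null; // the SMS never arrived within the polling window
}
```

<p>With a one-second delay and twenty tries, this covers the observed 5-10 second arrival time with room to spare, and gives up cleanly if the user never sent the message.</p>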
<h2 id="setting-this-up-on-aws">Setting this up on AWS</h2>
<h3 id="step-1-request-a-phone-number">STEP 1: Request a phone number</h3>
<p>First, we’ll need a phone number on AWS to receive our SMS messages. For traditional
10-digit US phone numbers (also known as a “long code”), this is absolutely free, but you
are limited to sending 200 messages per day. If you want to go beyond this, you need a
short code, for which you must pay. For our use case, we will not be sending any messages
at all, so this works fine.</p>
<ol>
<li>Go to https://console.aws.amazon.com and log in or create an account</li>
<li>Fill out a request to get a “long code” phone number assigned to your AWS account as described <a href="https://docs.aws.amazon.com/pinpoint/latest/userguide/channels-sms-awssupport-long-code.html">here</a> (Note that you can only have 5 long codes associated with your account.)</li>
<li>Expect to receive a half dozen questions about your AWS ticket to request the long code. The agent
assigned to your case will be trying to determine if you are a SMS spammer. Your job is to reply and convince them
you are not.</li>
<li>Wait to get a number assigned.</li>
</ol>
<p>When I made my request, Amazon said, “Your Dedicated Long Code for US destinations has been moved to the implementation stage. This process can take 2 to 3 weeks. We will send another message when the implementation is complete.”
Wow, that’s slow! Fortunately, they did better in practice. I requested a number on Friday evening and had it assigned by Tuesday evening. If you want to move forward before Amazon completes this assignment, and you have access to an Android device, you can use my <a href="https://github.com/davidgyoung/sms2sns">free SMS to Amazon SNS forwarder app</a> that will let you use your Android device’s phone number to forward SMS messages to AWS.</p>
<h3 id="step-2-make-a-sns-topic">Step 2: Make a SNS Topic</h3>
<p>We will use this Simple Notification Service topic to receive any messages from our SMS
number. The SNS topic allows us to hook in to other AWS services from SMS. For details,
see <a href="https://docs.aws.amazon.com/pinpoint/latest/userguide/settings-account.html#settings-account-sms-number-2way">here</a></p>
<ol>
<li>Log in to https://console.aws.amazon.com</li>
<li>Tap Application Integration -> Simple Notification Service</li>
<li>Tap Create Topic</li>
<li>Fill out the following fields:
Topic: phonenumbercatcher,
Display name: (blank)</li>
<li>
<p>Tap Create</p>
<p><img src="/images/create_sns_topic.png" alt="create sns topic" style="width:750px;border-style:solid;border-width:5px;" /></p>
</li>
</ol>
<h3 id="step-3-create-a-new-dynamodb-table">Step 3: Create a new DynamoDB Table</h3>
<p>This database table will hold the phone numbers captured</p>
<ol>
<li>Log in to https://console.aws.amazon.com</li>
<li>Tap Database -> DynamoDB -> Create Table</li>
<li>Set the following values:
table name: DevicePhoneNumbers
primary_key: DeviceUuid</li>
<li>Tap Create</li>
<li>
<p>Wait for the table creation to finish.</p>
<p><img src="/images/create_dynamodb.png" alt="create dynamo db" style="width:750px;border-style:solid;border-width:5px;" /></p>
</li>
</ol>
<h3 id="step-4-create-a-new-lambda">Step 4: Create a new Lambda</h3>
<p>This lambda will be responsible for inserting new rows into the database whenever a new
SMS message comes in. The database will hold the origination phone number, the device
identifier and some timestamps about when the message came in. The Lambda code is in Node.js,
because it is by far the easiest for integration – you can simply paste the code into a text
block. The same functionality can be implemented with Java, Go, Python or other AWS supported
languages.</p>
<ol>
<li>Log in to https://console.aws.amazon.com</li>
<li>Tap Compute -> Lambda -> Create Function</li>
<li>
<p>Select “Author from scratch” then enter the following values:</p>
<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>Name: phoneNumberCatcher
Runtime: Node.js 6.10
Role: Create new role from template(s)
Role Name: phoneNumberCatcherRole
</code></pre></div> </div>
</li>
<li>Under Policy Templates, choose “Simple Microservice Permissions”, and “Dynamo DB Full Access”</li>
<li>
<p>Tap Create function</p>
<p><img src="/images/create_lambda.png" alt="create labmda" style="width:750px;border-style:solid;border-width:5px;" /></p>
</li>
<li>Once the Lambda is created, you’ll be presented with a screen where you can actually paste in the code we want to execute. Since we have selected Node.js, we can paste a simple code snippet inline that will take the parameters from SNS and insert them into our DynamoDB table we made above. <br />
Copy and paste the following code and put it into the code entry field: (Paste code from <a href="https://github.com/davidgyoung/phone-number-capture-ios/blob/master/AWS/PhoneNumberCatcher.js">PhoneNumberCatcher.js</a>)</li>
<li>
<p>Once it is there, hit the orange Save button in the upper right.</p>
<p><img src="/images/lambda_code_edit.png" alt="labmda code edit" style="width:750px;border-style:solid;border-width:5px;" /></p>
</li>
</ol>
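<p>To give a feel for what the pasted code does, here is a hypothetical sketch of the insert Lambda’s core step: pulling the phone number and device UUID out of the SNS event and shaping the DynamoDB item. The linked <code>PhoneNumberCatcher.js</code> is the real code; the actual write would go through the AWS SDK’s DynamoDB client, which is omitted here so only the pure data-shaping step is shown.</p>

```javascript
// Hypothetical sketch of the insert Lambda's core logic. SNS delivers the
// inbound SMS as a JSON string in event.Records[0].Sns.Message; the SMS body
// itself carries "device_uuid:<id>". The real write would be performed with
// the AWS SDK's DynamoDB client against the DevicePhoneNumbers table.
function buildPhoneNumberItem(event, now = new Date()) {
  const msg = JSON.parse(event.Records[0].Sns.Message);
  return {
    DeviceUuid: msg.messageBody.split(':')[1], // key of the DynamoDB table
    PhoneNumber: msg.originationNumber,        // the number that sent the SMS
    UpdatedAt: now.toISOString(),
  };
}
```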
<h3 id="step-5-hook-up-sns-to-the-lambda">Step 5: Hook up SNS to the Lambda</h3>
<p>This configuration will make it so the Lambda above is executed each time a new SNS message is added (which
comes from SMS.)</p>
<ol>
<li>Go to https://console.aws.amazon.com</li>
<li>Tap app integration -> Simple Notification Service</li>
<li>Tap on topics</li>
<li>Check the checkbox next to the phonenumbercatcher topic</li>
<li>
<p>Hit the Actions button and choose “Subscribe to topic”</p>
<p><img src="/images/subscribe_to_topic.png" alt="subscribe to topic" style="width:750px;border-style:solid;border-width:5px;" /></p>
</li>
<li>In the dialog that pops up, choose the following:
<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>Protocol: AWS Lambda
Endpoint: phoneNumberCatcher (choose yours from the picklist)
Version or Alias: default
</code></pre></div> </div>
</li>
<li>
<p>Tap “Create subscription”</p>
<p><img src="/images/subscribe_to_topic2.png" alt="subscribe to topic part 2" style="width:750px;border-style:solid;border-width:5px;" /></p>
</li>
</ol>
<h3 id="step-6-test-sns-integration-with-your-database">Step 6: Test SNS integration with your database</h3>
<ol>
<li>Return to the SNS console as in the previous step, and tap on the phonenumbercatcher topic, then hit the “Publish to Topic” button at the top of the screen</li>
<li>
<p>Edit the following fields:</p>
<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>Subject: (leave this blank)
Message Format: JSON
Message:
{
"default": "{\"originationNumber\": \"+1XXX5550100\",\"messageBody\": \"device_uuid:abcd123456\",\"inboundMessageId\":\"cae173d2-66b9-564c-8309-21f858e9fb84\",\"messageKeyword\": \"device_uuid\",\"destinationNumber\": \"+1XXX5550199\"}"
}
</code></pre></div> </div>
<p>The message you see above has a bunch of backslashes in it because it is JSON encoded inside a string. The “default” key tells AWS what the SNS message should be for default processors. The value must be a string, so in order to embed the same kind of JSON data that a SMS message would produce, we have to put backslashes in front of all the quotation marks in our JSON data. For now, don’t worry about this too much. Just trust that this is what the SNS message will look like when it gets converted from a SMS message sending the text “device_uuid:abcd123456”.</p>
</li>
<li>
<p>Scroll to the bottom of the screen and tap “Publish Message”.</p>
<p><img src="/images/topic_test.png" alt="topic test" style="width:750px;border-style:solid;border-width:5px;" /></p>
</li>
</ol>
<p>If all goes well, this should insert a new row into the DynamoDB. To check this:</p>
<ol>
<li>Go to https://console.aws.amazon.com</li>
<li>Tap Database -> DynamoDB -> Tables, and select your table from the list</li>
<li>
<p>Tap the “Items” tab. If it worked, you should see one row in the table with the phone number and device uuid.</p>
<p><img src="/images/database_result.png" alt="database test result" style="width:750px;border-style:solid;border-width:5px;" /></p>
</li>
</ol>
<h3 id="troubleshooting">Troubleshooting</h3>
<p>If you don’t see the expected results in the previous section, it’s time to troubleshoot. You can do this by checking the CloudWatch logs, which get generated whenever our lambda is invoked.</p>
<ol>
<li>Go to https://console.aws.amazon.com</li>
<li>Tap Management Tools -> CloudWatch, then hit the Logs menu item in the left-hand column</li>
<li>You should see a list that includes /aws/lambda/phoneNumberCatcher. If you do, tap on it. If you don’t, then this means your lambda is not being invoked. Go back to the “Hooking up SNS to the Lambda” section and verify everything is set up properly.</li>
<li>You should see a list of log files by timestamp. Tap on the one with the latest timestamp, and look at the entries for any clues about what went wrong. Once you fix any setup errors, go back to the previous section and test again until you have it working correctly.</li>
</ol>
<h3 id="step-7-querying-for-the-phone-number">STEP 7: Querying for the phone number</h3>
<p>So far, we’ve built something that can take incoming phone numbers and device UUIDs and throw them into a database, but we have no way to get them out. What we now need is a web service that our app can call to get the phone number from our DynamoDB based on its device UUID. For that, we’ll make another lambda that simply queries the database.</p>
<ol>
<li>Log in to https://console.aws.amazon.com</li>
<li>Tap Compute -> Lambda -> Create Function</li>
<li>
<p>Select “Author from scratch” then enter the following values:</p>
<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>Name: PhoneNumberQuery
Runtime: Node.js 6.10
Role: Create new role from template(s)
Role Name: phoneNumberQueryRole
</code></pre></div> </div>
</li>
<li>Under Policy Templates, choose “Simple Microservice Permissions”</li>
<li>Tap Create function</li>
<li>Just like before, once the Lambda is created, you can paste in this code: (Paste code from <a href="https://github.com/davidgyoung/phone-number-capture-ios/blob/master/AWS/PhoneNumberQuery.js">PhoneNumberQuery.js</a>)</li>
<li>
<p>Tap Save</p>
<p><img src="/images/create_lambda_query.png" alt="create lambda query" style="width:750px;border-style:solid;border-width:5px;" /></p>
</li>
</ol>
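<p>The pasted PhoneNumberQuery.js is the source of truth. Conceptually, though, its job splits into two small pure steps that are easy to reason about: pulling the device UUID out of the incoming request, and shaping the reply the way API Gateway’s Lambda Proxy Integration expects (an object with a statusCode and a string body). The sketch below illustrates those two steps only; it is not the actual file, and the field names mirror this tutorial’s data.</p>

```javascript
// Sketch of the request/response plumbing around the DynamoDB lookup.
// With Lambda Proxy Integration, API Gateway delivers the raw HTTP
// request, so the POST body arrives as a string that must be parsed.
function parseDeviceUuid(event) {
  return JSON.parse(event.body).device_uuid;
}

// The proxy integration expects { statusCode, body } back, with body
// itself a JSON string. "item" stands in for the DynamoDB row.
function buildProxyResponse(item) {
  if (!item) {
    return { statusCode: 404, body: JSON.stringify({ error: 'not found' }) };
  }
  return { statusCode: 200, body: JSON.stringify({ device: item }) };
}

// Between these two steps, the real lambda queries DynamoDB using
// the device_uuid as the key.
```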
<h3 id="step-8-creating-an-api-gateway">Step 8: Creating an API Gateway</h3>
<p>An API gateway exposes your Lambda as a web service. Create one like this:</p>
<ol>
<li>Go to https://console.aws.amazon.com/</li>
<li>Select Networking and Content Delivery -> API Gateway</li>
<li>Choose to Create a New API.</li>
<li>
<p>On the API creation screen fill out the following fields:</p>
<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>Type: New API
API Name: PhoneNumberQueryAPI
Description: (leave blank)
Endpoint Type: Regional
</code></pre></div> </div>
</li>
<li>
<p>Tap “Create API”</p>
<p><img src="/images/create_api_query.png" alt="create api query" style="width:750px;border-style:solid;border-width:5px;" /></p>
</li>
<li>You will see an API editor screen. Under the “Actions” pull-down menu, choose “Create Method”, then in the picklist choose “POST”.</li>
<li>Update the following fields:
<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>Integration Type: Lambda
Lambda: PhoneNumberQuery
Lambda Proxy Integration: CHECKED
</code></pre></div> </div>
</li>
<li>
<p>Tap “Save”</p>
<p><img src="/images/create_post.png" alt="create post" style="width:750px;border-style:solid;border-width:5px;" /></p>
</li>
<li>
<p>Using the “Actions” pull-down menu, select Deploy. In the dialog that pops up, enter:</p>
<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>Deployment stage: [New Stage]
Stage name: test
Stage description: (leave blank)
Deployment description: (leave blank)
</code></pre></div> </div>
</li>
<li>
<p>Tap Deploy</p>
<p><img src="/images/deploy_api.png" alt="deploy api" style="width:750px;border-style:solid;border-width:5px;" /></p>
</li>
<li>
<p>Wait for the spinner to complete. When done, you’ll see a new stage has been created, and the URL for your resource will be available. It should give you an invoke URL that looks something like this:
https://asdfasdfaa.execute-api.us-east-1.amazonaws.com/test</p>
<p><img src="/images/staging_url.png" alt="staging url" style="width:750px;border-style:solid;border-width:5px;" /></p>
</li>
</ol>
<h3 id="testing-the-lookup-api">Testing the Lookup API</h3>
<p>You can use the <code class="language-plaintext highlighter-rouge">curl</code> command line tool (on Mac or Linux, or in Cygwin on Windows) to test whether the API can look up a device by UUID. Before you try the command below, be sure to replace the URL with the invoke URL you received in the last step.</p>
<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>$ curl -XPOST https://REPLACEWITHYOURPREFIX.execute-api.us-east-1.amazonaws.com/test -d '{"device_uuid":"abcd123456"}'
{"device":{"lambda_receive_time":"Mon Apr 30 2018 18:12:16 GMT+0000 (UTC)","sns_publish_time":"2018-04-30T18:11:19.909Z","origination_number":"+1XXX5550100","device_uuid":"abcd123456"}}
</code></pre></div></div>
<p>The second line is an example of the output you might see if it works.</p>
<h2 id="connecting-to-this-from-ios">Connecting to this from iOS</h2>
<p>On the iOS side, the code needed to do this is pretty simple. You basically need to do three things:</p>
<ol>
<li>Generate an app identifier</li>
<li>Send an SMS message to a known phone number at Amazon, with the above identifier</li>
<li>Repeatedly send an HTTP request to try to read the captured phone number from Amazon</li>
</ol>
<h3 id="generating-an-app-identifier">Generating an app identifier</h3>
<p>While iOS does have unique identifiers like the UDID (unique device identifier) and the IDFA (identifier for advertisers),
the easiest way to get an app identifier is to use the iOS UUID generator and save the result to
persistent storage. This way, the first time your app runs it will generate a UUID and then keep
using it forever. The code below does this:</p>
<div class="language-javascript highlighter-rouge"><div class="highlight"><pre class="highlight"><code> <span class="kd">var</span> <span class="nx">deviceUuid</span><span class="p">:</span> <span class="nb">String</span> <span class="p">{</span>
<span class="kd">get</span> <span class="p">{</span>
<span class="k">if</span> <span class="kd">let</span> <span class="nx">val</span> <span class="o">=</span> <span class="nx">UserDefaults</span><span class="p">.</span><span class="nx">standard</span><span class="p">.</span><span class="nx">string</span><span class="p">(</span><span class="na">forKey</span><span class="p">:</span> <span class="dl">"</span><span class="s2">deviceUuid</span><span class="dl">"</span><span class="p">)</span> <span class="p">{</span>
<span class="k">return</span> <span class="nx">val</span>
<span class="p">}</span>
<span class="kd">let</span> <span class="nx">val</span> <span class="o">=</span> <span class="nx">UUID</span><span class="p">().</span><span class="nx">uuidString</span>
<span class="nb">self</span><span class="p">.</span><span class="nx">deviceUuid</span> <span class="o">=</span> <span class="nx">val</span>
<span class="k">return</span> <span class="nx">val</span>
<span class="p">}</span>
<span class="kd">set</span> <span class="p">{</span>
<span class="nx">UserDefaults</span><span class="p">.</span><span class="nx">standard</span><span class="p">.</span><span class="kd">set</span><span class="p">(</span><span class="nx">newValue</span><span class="p">,</span> <span class="na">forKey</span><span class="p">:</span> <span class="dl">"</span><span class="s2">deviceUuid</span><span class="dl">"</span><span class="p">)</span>
<span class="p">}</span>
<span class="p">}</span>
</code></pre></div></div>
<h3 id="sending-a-sms-message">Sending a SMS message</h3>
<p>Again, you cannot programmatically send an SMS message with iOS code. You can, however, prepare a
message body and a destination phone number and present them to the user for them to approve
and then gesture to send. The code below prepares this message with the device ID
in the message body, populates the destination phone number, then presents the compose view.</p>
<div class="language-javascript highlighter-rouge"><div class="highlight"><pre class="highlight"><code> <span class="nb">self</span><span class="p">.</span><span class="nx">composeVC</span> <span class="o">=</span> <span class="nx">MFMessageComposeViewController</span><span class="p">()</span>
<span class="nb">self</span><span class="p">.</span><span class="nx">composeVC</span><span class="p">.</span><span class="nx">messageComposeDelegate</span> <span class="o">=</span> <span class="nb">self</span>
<span class="c1">// Configure the fields of the interface.</span>
<span class="nb">self</span><span class="p">.</span><span class="nx">composeVC</span><span class="p">.</span><span class="nx">recipients</span> <span class="o">=</span> <span class="p">[</span><span class="nb">self</span><span class="p">.</span><span class="nx">AWSPhoneNumber</span><span class="p">]</span>
<span class="nb">self</span><span class="p">.</span><span class="nx">composeVC</span><span class="p">.</span><span class="nx">body</span> <span class="o">=</span> <span class="dl">"</span><span class="s2">device_uuid:</span><span class="se">\</span><span class="s2">(self.deviceUuid)</span><span class="dl">"</span>
<span class="nb">self</span><span class="p">.</span><span class="nx">composeVC</span><span class="p">.</span><span class="nx">disableUserAttachments</span><span class="p">()</span>
<span class="c1">// Present the view controller modally.</span>
<span class="nb">self</span><span class="p">.</span><span class="nx">present</span><span class="p">(</span><span class="nb">self</span><span class="p">.</span><span class="nx">composeVC</span><span class="p">,</span> <span class="nx">animated</span><span class="p">:</span> <span class="kc">true</span><span class="p">,</span> <span class="nx">completion</span><span class="p">:</span> <span class="nx">nil</span><span class="p">)</span>
</code></pre></div></div>
<h3 id="polling-for-sms-message-receipt">Polling for SMS message receipt</h3>
<p>We can use a URLSession and a URLSessionDataTask to asynchronously call the server to see
if it has received the SMS. Here is code that will do that:</p>
<div class="language-javascript highlighter-rouge"><div class="highlight"><pre class="highlight"><code> <span class="kd">let</span> <span class="nx">session</span> <span class="o">=</span> <span class="nx">URLSession</span><span class="p">(</span><span class="nx">configuration</span><span class="p">:</span> <span class="nx">URLSessionConfiguration</span><span class="p">.</span><span class="k">default</span><span class="p">)</span>
<span class="kd">var</span> <span class="nx">dataTask</span><span class="p">:</span> <span class="nx">URLSessionDataTask</span><span class="p">?</span>
<span class="kd">var</span> <span class="nx">request</span> <span class="o">=</span> <span class="nx">URLRequest</span><span class="p">(</span><span class="nx">url</span><span class="p">:</span> <span class="nx">URL</span><span class="p">(</span><span class="nx">string</span><span class="p">:</span> <span class="dl">"</span><span class="se">\</span><span class="s2">(server)</span><span class="se">\</span><span class="s2">(DeviceApi.ServicePath)</span><span class="dl">"</span><span class="p">)</span><span class="o">!</span><span class="p">,</span> <span class="nx">cachePolicy</span><span class="p">:</span> <span class="nx">NSURLRequest</span><span class="p">.</span><span class="nx">CachePolicy</span><span class="p">.</span><span class="nx">reloadIgnoringCacheData</span><span class="p">,</span> <span class="nx">timeoutInterval</span><span class="p">:</span> <span class="nx">TimeInterval</span><span class="p">(</span><span class="mi">10</span><span class="p">))</span>
<span class="nx">request</span><span class="p">.</span><span class="nx">httpMethod</span> <span class="o">=</span> <span class="dl">"</span><span class="s2">POST</span><span class="dl">"</span>
<span class="kd">var</span> <span class="nx">responseError</span><span class="p">:</span> <span class="nb">String</span><span class="p">?</span> <span class="o">=</span> <span class="nx">nil</span>
<span class="kd">var</span> <span class="nx">bodyData</span><span class="p">:</span> <span class="nx">Data</span><span class="o">!</span> <span class="o">=</span> <span class="nx">nil</span>
<span class="k">do</span> <span class="p">{</span>
<span class="nx">bodyData</span> <span class="o">=</span> <span class="k">try</span> <span class="nx">JSONSerialization</span><span class="p">.</span><span class="nx">data</span><span class="p">(</span><span class="na">withJSONObject</span><span class="p">:</span> <span class="p">[</span><span class="dl">"</span><span class="s2">device_uuid</span><span class="dl">"</span><span class="p">:</span> <span class="nx">deviceUuid</span><span class="p">],</span>
<span class="na">options</span><span class="p">:</span> <span class="nx">JSONSerialization</span><span class="p">.</span><span class="nx">WritingOptions</span><span class="p">.</span><span class="nx">prettyPrinted</span><span class="p">)</span>
<span class="p">}</span>
<span class="k">catch</span> <span class="p">{</span>
<span class="nx">NSLog</span><span class="p">(</span><span class="dl">"</span><span class="s2">Can't serialize post data</span><span class="dl">"</span><span class="p">)</span>
<span class="p">}</span>
<span class="nx">request</span><span class="p">.</span><span class="nx">httpBody</span> <span class="o">=</span> <span class="nx">bodyData</span>
<span class="nx">dataTask</span> <span class="o">=</span> <span class="nx">session</span><span class="p">.</span><span class="nx">dataTask</span><span class="p">(</span><span class="kd">with</span><span class="p">:</span> <span class="nx">request</span><span class="p">)</span> <span class="p">{</span>
<span class="nx">data</span><span class="p">,</span> <span class="nx">response</span><span class="p">,</span> <span class="nx">error</span> <span class="k">in</span>
<span class="nx">NSLog</span><span class="p">(</span><span class="dl">"</span><span class="s2">Back from request</span><span class="dl">"</span><span class="p">)</span>
<span class="kd">let</span> <span class="nx">response</span> <span class="o">=</span> <span class="nx">response</span> <span class="k">as</span><span class="p">?</span> <span class="nx">HTTPURLResponse</span>
<span class="kd">var</span> <span class="nx">jsonDict</span><span class="p">:</span> <span class="p">[</span><span class="nb">String</span><span class="p">:</span><span class="nx">Any</span><span class="p">]?</span> <span class="o">=</span> <span class="nx">nil</span>
<span class="k">if</span> <span class="kd">let</span> <span class="nx">data</span> <span class="o">=</span> <span class="nx">data</span> <span class="p">{</span>
<span class="k">do</span> <span class="p">{</span>
<span class="k">if</span> <span class="kd">let</span> <span class="nx">str</span> <span class="o">=</span> <span class="nb">String</span><span class="p">(</span><span class="na">data</span><span class="p">:</span> <span class="nx">data</span><span class="p">,</span> <span class="na">encoding</span><span class="p">:</span> <span class="nb">String</span><span class="p">.</span><span class="nx">Encoding</span><span class="p">.</span><span class="nx">utf8</span><span class="p">)</span> <span class="p">{</span>
<span class="nx">NSLog</span><span class="p">(</span><span class="dl">"</span><span class="s2">JSON from server: </span><span class="se">\</span><span class="s2">(str)</span><span class="dl">"</span><span class="p">)</span>
<span class="p">}</span>
<span class="k">if</span> <span class="kd">let</span> <span class="nx">result</span> <span class="o">=</span> <span class="k">try</span> <span class="nx">JSONSerialization</span><span class="p">.</span><span class="nx">jsonObject</span><span class="p">(</span><span class="na">with</span><span class="p">:</span> <span class="nx">data</span><span class="p">,</span> <span class="na">options</span><span class="p">:</span> <span class="nx">JSONSerialization</span><span class="p">.</span><span class="nx">ReadingOptions</span><span class="p">.</span><span class="nx">mutableContainers</span><span class="p">)</span> <span class="k">as</span><span class="p">?</span> <span class="p">[</span><span class="nb">String</span><span class="p">:</span><span class="nx">Any</span><span class="p">]</span> <span class="p">{</span>
<span class="nx">jsonDict</span> <span class="o">=</span> <span class="nx">result</span>
<span class="p">}</span>
<span class="k">else</span> <span class="p">{</span>
<span class="kd">let</span> <span class="nx">message</span> <span class="o">=</span> <span class="dl">"</span><span class="s2">Cannot decode json due to nil deserilization result</span><span class="dl">"</span>
<span class="nx">NSLog</span><span class="p">(</span><span class="nx">message</span><span class="p">)</span>
<span class="nx">jsonDict</span> <span class="o">=</span> <span class="p">[</span><span class="dl">"</span><span class="s2">error</span><span class="dl">"</span><span class="p">:</span> <span class="nx">message</span><span class="p">]</span>
<span class="p">}</span>
<span class="p">}</span>
<span class="k">catch</span> <span class="p">{</span>
<span class="nx">responseError</span> <span class="o">=</span> <span class="dl">"</span><span class="s2">Cannot decode json due to exception</span><span class="dl">"</span>
<span class="p">}</span>
<span class="p">}</span>
<span class="k">else</span> <span class="p">{</span>
<span class="nx">responseError</span> <span class="o">=</span> <span class="dl">"</span><span class="s2">Response body is unexpectedly nil</span><span class="dl">"</span>
<span class="p">}</span>
 <span class="p">}</span>
 <span class="nx">dataTask</span><span class="p">?.</span><span class="nx">resume</span><span class="p">()</span> <span class="c1">// the task does not start until resume() is called</span>
</code></pre></div></div>
<p>If the above works, jsonDict will contain the JSON response from the server, and you can access the result with jsonDict[“device”][“phone_number”] (after appropriate nil checking and typecasting). If we can read this
phone number, then everything worked!</p>
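<p>The same nil checking and typecasting can be illustrated in a few lines of Node.js. This is a hedged sketch: the nested field names follow this tutorial’s data, and “phone_number” is the key the app expects the lookup service to return.</p>

```javascript
// Safely dig device.phone_number out of the server's JSON response.
// Returns null instead of throwing if any level is missing or the
// body is not valid JSON -- the same defensive checks the Swift
// code must make before using the value.
function extractPhoneNumber(jsonText) {
  let parsed;
  try {
    parsed = JSON.parse(jsonText);
  } catch (e) {
    return null; // not valid JSON
  }
  const device = parsed && parsed.device;
  return (device && typeof device.phone_number === 'string')
    ? device.phone_number
    : null;
}
```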
<p>You can see a full iOS demo app that accomplishes that in <a href="https://github.com/davidgyoung/phone-number-capture-ios">this repository</a>.</p>
<p>Trouble with this tutorial? Please open a GitHub issue in the repo above.</p>
Building Voice Bots With Amazon Alexa
2018-04-23
http://davidgyoungtech.com/2018/04/23/building-voice-bots-with-amazon-alexa
<p>This tutorial gives you an introduction to Amazon’s voice bot technologies, which include Amazon Alexa and Amazon Web Services Lex. We’ll discuss the differences between the two with a primary focus on Alexa. To illustrate how the technologies work, we’ll show you how to build a custom Alexa skill. Basic programming knowledge is helpful, although if you’re not a coder, the conceptual sections will still be useful for understanding how voice bot technologies work.</p>
<h2 id="alexa-skills-overview">Alexa Skills Overview</h2>
<p>Building an Alexa skill allows you to add new functionality to Amazon’s Alexa voice assistants, allowing Alexa users around the world to access your new functionality. Building a skill for Alexa is kind of like building an app for a mobile phone. When you make an app, you put it in Apple’s App Store so consumers can install it on their iPhones, or into the Google Play Store so consumers can install it on their Android phones. With Alexa skills, you put the skill into Amazon’s skill catalog, then any owner of Amazon Echo and other Alexa devices can enable the skill through the companion mobile app or the Alexa skills web page.</p>
<div style="float: right; padding-left: 10px;"><img src="/images/bus-alexa.png" alt="Alexa" style="display: block; width:250px;border-style:solid;border-width:5px;" /></div>
<h2 id="alexa-skills-vs-amazon-lex">Alexa Skills vs. Amazon Lex</h2>
<p>Alexa Skills are similar to Amazon Lex voice bots. Both let you create voice-activated digital assistants. Alexa is a consumer service available to the general public, and making a custom Alexa skill basically lets you bolt on your voice assistant features to the same Alexa that millions of people use around the world.</p>
<div style="float: left; padding-right: 10px;"><img src="/images/iphone-bus-status.png" alt="Alexa" style="display: block; width:250px;border-style:solid;border-width:5px;" /></div>
<p>Amazon Lex, on the other hand, is an Amazon web service that you can use to build your own <em>private</em> specialized voice assistant that acts much like Alexa, without its default behavior or even necessarily being publicly available. Using Amazon Lex, for example, you could build a voice assistant just for employees of your company, and only answering your own custom voice commands (so you don’t have to worry about your colleagues using it to order a new toaster from Amazon on the company account.) Lex voice bots are typically used with the iOS or Android Amazon SDKs, so you can add voice assistant features into your own mobile app.</p>
<p>Both Alexa Skills and Amazon Lex use the same basic voice recognition technology and artificial intelligence to convert human speech into something a computer can process, and both kinds of voice assistants are built from the same Amazon Web Services building blocks. The sections below are specific to Alexa, but when you hear the terms “Intent”, “Slot” and “Lambda” with respect to Alexa, know that the same concepts apply when building Lex bots.</p>
<h2 id="ways-to-test-your-alexa-skill">Ways to Test Your Alexa Skill</h2>
<p>Alexa skills are consumer products. Much like a mobile app, you have to submit an Alexa skill for certification to Amazon before consumers can find and enable it on their Echo voice assistants.</p>
<p>You’ll need to do plenty of testing of your skill before you get to that point, so Amazon has a couple of web-based tools for developers that let you try out your skill in the browser to get it working before submitting it for review. These tools can be a bit limiting. They cannot access the consumer’s location and are not always listening – you have to type in text or hit a button to get them listening to your voice before they’ll process voice commands. Fortunately you can also use hardware devices for testing including a real Amazon Echo speaker, or the iOS and Android Alexa apps. These hardware devices provide a much more realistic conversation flow. To use hardware devices during development, you simply start a beta test, and send invites to the Amazon account email addresses of the users you want to give it a try on their Echo speakers or Android Alexa apps.</p>
<p>Finally, there is also a more sophisticated browser-based simulator called EchoSim at https://echosim.io/. This more closely simulates an Echo speaker with fancy graphics, but you still have to hold down a button on the web page to get the virtual speaker to listen (there is no “hey, Alexa” trigger). And it doesn’t maintain sessions, so you have to verbally re-activate your skill with each command. To use it to test a custom skill, you have to log in to it with your Amazon account, get invited to the beta test for the skill, and enable the skill on the EchoSim virtual speaker using the iOS or Android Alexa companion app.</p>
<p>Here’s a summary of the options:</p>
<table>
<thead>
<tr>
<th>Option</th>
<th style="text-align: center">Language parsing?</th>
<th style="text-align: center">Voice testing?</th>
<th style="text-align: center">Access Location?</th>
<th style="text-align: center">Maintains Session?</th>
<th style="text-align: center">Hey Alexa Trigger?</th>
</tr>
</thead>
<tbody>
<tr>
<td>AWS Lambda Tester</td>
<td style="text-align: center">no</td>
<td style="text-align: center">no</td>
<td style="text-align: center">no</td>
<td style="text-align: center">no</td>
<td style="text-align: center">no</td>
</tr>
<tr>
<td>Alexa Simulator</td>
<td style="text-align: center">yes</td>
<td style="text-align: center">yes</td>
<td style="text-align: center">no</td>
<td style="text-align: center">poorly</td>
<td style="text-align: center">no</td>
</tr>
<tr>
<td>EchoSim Webpage</td>
<td style="text-align: center">yes</td>
<td style="text-align: center">yes</td>
<td style="text-align: center">yes</td>
<td style="text-align: center">no</td>
<td style="text-align: center">no</td>
</tr>
<tr>
<td>Alexa Mobile Apps</td>
<td style="text-align: center">yes</td>
<td style="text-align: center">yes</td>
<td style="text-align: center">yes</td>
<td style="text-align: center">yes</td>
<td style="text-align: center">no</td>
</tr>
<tr>
<td>Echo Speaker</td>
<td style="text-align: center">yes</td>
<td style="text-align: center">yes</td>
<td style="text-align: center">yes</td>
<td style="text-align: center">yes</td>
<td style="text-align: center">yes</td>
</tr>
</tbody>
</table>
<h2 id="creating-a-new-alexa-skill">Creating a New Alexa Skill</h2>
<p>The sections below will walk you through setting up the building blocks needed to create your own custom Alexa skill.</p>
<h3 id="initial-setup">Initial Setup</h3>
<p>Go to https://developer.amazon.com and sign up for a free Amazon developer account. This is distinct from your login to the Amazon Web Services console (which you’ll also need later), so even if you have built Amazon cloud services before and have an AWS account, you’ll also need an Amazon Developer account.</p>
<p>Think of the AWS console as the back-end tool suite that you use to build any Amazon cloud system. The Amazon Developer Console, on the other hand, is where you go to put a consumer-facing interface on something you’ve built in the Amazon cloud. In this case, we’ll be building an Alexa voice interface.</p>
<p>A big part of the setup process involves filling out a web form on Amazon rather than writing code in a programming language. This makes it easier for beginners to set up, but it makes the process harder to explain, and if Amazon decides to rework its website (as it did in April 2018), the screens may change somewhat. The basic concepts, however, remain the same.</p>
<p>Once you are logged in to the Amazon Developer console, you can go to this URL to create a new Alexa skill: https://developer.amazon.com/alexa/console/ask, then click the Create Skill button. You’ll be taken to a screen that allows you to name your skill.</p>
<p><img src="/images/skill-name.png" alt="setting the skill name" style="width:750px;border-style:solid;border-width:5px;" /></p>
<p>Give the skill the name “Bus Status” then submit the form. You’ll then be asked if you want to use a predefined template to create your skill. We want to create a custom one.</p>
<p><img src="/images/skill-model.png" alt="adding sample utterances" style="width:750px;border-style:solid;border-width:5px;" /></p>
<p>Choose “Custom”, then hit the Create Skill button. This will create a skeleton skill. To make it do anything, we need to fill out information in four areas of the checklist shown on the screen:</p>
<ul>
<li>Invocation Name</li>
<li>Intents, Samples and Slots</li>
<li>Build Model</li>
<li>Endpoint</li>
</ul>
<p><img src="/images/skill-configuration.png" alt="configuring the skill" style="width:750px;border-style:solid;border-width:5px;" /></p>
<h3 id="invocation-name">Invocation Name</h3>
<p>Tap the invocation section to bring up a screen like below.</p>
<p><img src="/images/skill-invocation-name.png" alt="setting the invocation name" style="width:750px;border-style:solid;border-width:5px;" /></p>
<p>The invocation name is a short set of words that activates your skill. Think of it like a web site’s domain name. Choosing a good invocation name is an important marketing decision, because consumers will have to say “Alexa, ask [invocation name] …” to use your skill.</p>
<p>Fortunately, for demo purposes, we don’t have to worry about marketing. In the Skill Invocation Name, type “david’s bus status”, then hit the Save Model button at the top of the screen.</p>
<h3 id="intents">Intents</h3>
<p>Now is where we get into defining the type of questions our skill will answer. The basic idea of our example skill is that it will tell us when the next bus is leaving for a specific destination. So we might ask:</p>
<blockquote>
<p>Alexa, ask David’s bus status when is the next bus to Washington Union Station?</p>
</blockquote>
<p>or</p>
<blockquote>
<p>Alexa, ask David’s bus status when the next bus leaves for BWI Airport?</p>
</blockquote>
<p>Collectively, all questions of this type are asking the same thing. We refer to this as an “intent”. We can create a new intent to answer this type of question in the Intent section of the web portal. Click on that section, and enter the intent name of “NextBusIntent”, then click “Create Custom Intent”.</p>
<p><img src="/images/skill-add-intent.png" alt="creating a new intent" style="width:750px;border-style:solid;border-width:5px;" /></p>
<h3 id="samples">Samples</h3>
<p>A sample is a set of words with placeholders that define the actual natural language that Alexa will map to this intent. When supplying samples, you should give as many examples as you can think of, as these will be used to train Alexa’s neural network to recognize the phrases people might speak when they want this kind of information.</p>
<p>Following our examples above, we will enter two samples:</p>
<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>next bus to {Destination}
next bus leaves for {Destination}
</code></pre></div></div>
<p>The curly braces enclose a placeholder called a “Slot”. In the case of this intent, there is only one slot called Destination. We’ll define the possible values for this slot later.</p>
<p>Notice that the sample utterances do not start with “Alexa” and do not include question words like “when is the”. Leaving these out generally gives better results, as it will match a wider variety of phrasings. If you have several intents, however, you may need to add more words so that Alexa can easily discern one type of intent from another.</p>
<p>In this case, we’ve only defined two samples. For a real skill, you probably want to have several more, to cover as many possible phrasings as you can think of.</p>
<h3 id="slots">Slots</h3>
<p>As you type in the above samples, the console will detect your slot name in the curly braces and prompt you to confirm an existing Slot Type or define a new one. Choose to create a new one as shown here:</p>
<p><img src="/images/skill-add-sample-utterance.png" alt="adding a sample utterance" style="width:750px;border-style:solid;border-width:5px;" /></p>
<p>When you’ve added all your sample utterances, the screen should look like this:</p>
<p><img src="/images/skill-add-sample-utterances.png" alt="adding sample utterances" style="width:750px;border-style:solid;border-width:5px;" /></p>
<p>You may notice that the slot type has not been defined for Destination. We’re going to define a custom slot type to hold all our favorite destinations. In the left-hand bar, you’ll see “Slot Types (0)” with a blue + Add button by it. Press that add button to create a new Slot Type.</p>
<p>We’ll define our slot type to be named Destination, and for now we’ll assign two values to it:</p>
<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>Washington Union Station
BWI Airport
</code></pre></div></div>
<p>A real custom slot might have dozens or hundreds of values. But let’s just keep it to these two so that things are simple. Here’s what the screen should look like:</p>
<p><img src="/images/skill-custom-slot.png" alt="defining a custom slot" style="width:750px;border-style:solid;border-width:5px;" /></p>
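<p>Behind the web form, the console is assembling a JSON interaction model from your entries. As a rough sketch (the field names follow the general shape of the Alexa Skills Kit schema and may differ by console version), our skill’s model looks something like this:</p>

```javascript
// Approximate interaction model the console builds from our entries:
// the invocation name, the NextBusIntent with its samples, and the
// custom Destination slot type with its two values.
const interactionModel = {
  languageModel: {
    invocationName: "david's bus status",
    intents: [
      {
        name: 'NextBusIntent',
        slots: [{ name: 'Destination', type: 'Destination' }],
        samples: [
          'next bus to {Destination}',
          'next bus leaves for {Destination}'
        ]
      }
    ],
    types: [
      {
        name: 'Destination',
        values: [
          { name: { value: 'Washington Union Station' } },
          { name: { value: 'BWI Airport' } }
        ]
      }
    ]
  }
};
```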
<p>With these steps done, you should now be able to build your Alexa model. Click the button that says Build Model. After a few seconds, you should see a successful result as indicated by a dialog like this:</p>
<p><img src="/images/skill-build-model-success.png" alt="building the model" style="width:750px;border-style:solid;border-width:5px;" /></p>
<h2 id="processing-questions">Processing Questions</h2>
<p>While we have built a skill model, it does not do anything. To make it actually answer questions, we must write the code that processes them. The software that does this is called an Endpoint. Amazon provides two ways to write the code that processes questions:</p>
<ol>
<li>An HTTPS web service</li>
<li>An AWS Lambda</li>
</ol>
<p>Whichever way you choose, the Endpoint will receive the Alexa request (broken up by intents, slots and slot values), then process it to formulate a proper response. The first option requires you to build your own web service, which can be hosted at AWS or another cloud provider, or can be running on a server sitting in your basement – all that is required is that the server have a publicly available URL.</p>
<p>The second option is to use an AWS Lambda. If you haven’t used these before, they are a surprisingly simple way of building code that runs on the internet. The basic idea is that you don’t run a web application server all the time. Instead, you deploy your function to Amazon as a lambda, and whenever somebody wants to use it, Amazon starts an instance of it on its cloud servers just to handle that single request. Once the request is handled, the lambda instance goes away. This is sometimes called “serverless” computing.</p>
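<p>Either way, the request your Endpoint receives is a JSON document. The heavily abbreviated sketch below is my own illustration, not a complete envelope – the real request also carries version, session and context data – but it shows where the intent name and slot values live:</p>

```json
{
  "request": {
    "type": "IntentRequest",
    "intent": {
      "name": "BusStatusIntent",
      "slots": {
        "Destination": { "name": "Destination", "value": "BWI Airport" }
      }
    }
  }
}
```

<p>Your Endpoint’s job is simply to pull out the intent name and slot values from a document like this and return a response document containing the text Alexa should speak.</p>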
<h3 id="creating-a-lambda">Creating a Lambda</h3>
<p>For this example, we’ll set up an AWS Lambda to process the Alexa queries. This requires going into the AWS console, which uses a separate login from the Amazon developer console described earlier. If you don’t have an account already, you’ll have to set one up, and Amazon requires that you have a credit card on file for any charges you generate. Fortunately, testing a basic Alexa skill can be done entirely with free-tier resources.</p>
<p>Now is the time we need to choose the language we will use for our Lambda. Options include various flavors of Java, Node, Python, C# and Go. The simplest Lambdas may be built by pasting a short snippet of JavaScript/Node source code into the web form. But this won’t work for Alexa skill endpoints, because these require that the Alexa Skills Kit library be bundled with the lambda source code. As a result, we actually have to set up a local build environment to generate the package to be used for the Lambda.</p>
<h4 id="making-a-javamaven-lambda-build-environment">Making a Java/Maven Lambda Build Environment</h4>
<p>I can almost hear you groaning as I write this. Don’t worry – for this example, we will keep things as simple as we possibly can so as to focus on Alexa programming and not on Java or lambda programming. We’ll build the lambda using Java 8 and package it with the Alexa Skills Kit using Maven. To proceed, you will need to install two things on your workstation: Java 8 and Apache Maven. Even if you are not a Java programmer, the same concepts apply to the other languages as well.</p>
<p>Once you have Java and Maven installed, download a very simple project template from <a href="https://github.com/davidgyoung/alexa-bus-status/archive/master.zip">here</a>. You’ll need to unzip the downloaded file. You can then verify it builds, by going into the project folder on the command line and running <code class="language-plaintext highlighter-rouge">mvn assembly:assembly -DdescriptorId=jar-with-dependencies package</code>. If it works, you will see the target/busstatus-1.0-jar-with-dependencies.jar get generated. If this does not work, you may need to troubleshoot your Java and Maven installation. Google is your friend.</p>
<p>Before we can actually build our lambda, <strong>you need to modify the source code to set the Alexa skill id</strong> so the lambda has permission to serve it. To do this, simply open the included BusStatusSpeechletRequestStreamHandler.java file, and change the line below to reference the proper skill identifier that you find at the developer console (https://developer.amazon.com/alexa/console/ask) by drilling down into your skill’s Endpoint section.</p>
<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>supportedApplicationIds.add("amzn1.ask.skill.a780447f-537e-4b0d-8ff8-51de06655ae5");
</code></pre></div></div>
<p>Once you have edited this and saved the file, rebuild the target with <code class="language-plaintext highlighter-rouge">mvn assembly:assembly -DdescriptorId=jar-with-dependencies package</code> so we have a jar file that is ready to upload to Amazon.</p>
<h4 id="configuring-the-lambda-on-aws">Configuring the Lambda on AWS</h4>
<p>Log in to https://console.aws.amazon.com, and create a new account if needed.</p>
<p>When you get to the main menu, click on “Lambda” under the “Compute” section. This brings up the “Lambda Management Console”. You’ll see a button in the upper-right corner that says “Create function”. Click on it.</p>
<p>We will be authoring our Lambda from scratch so you can see how it works. Fortunately, that is the default option. We’ll name our Lambda “BusStatusProcessor”.</p>
<p>When defining the Lambda, we also have to select an AWS Role for it to execute under. This gets into complex AWS permission and security issues, so we’ll keep it as simple as possible so as not to go off on a huge tangent. We will choose the option to “Create new role from template(s)”, then under the Policy Templates section we’ll choose “Simple Microservice permissions”. This will provide the Lambda enough permissions to serve our question responses. If you need to make the Lambda more sophisticated later (e.g. by accessing an AWS Dynamo DB), then you may need to add more permissions.</p>
<p>We’ll name our new role “BusStatusProcessorRole” for consistency.</p>
<p>If you make all these changes to the screen, it should look like this:</p>
<p><img src="/images/skill-create-lambda.png" alt="creating the lambda" style="width:750px;border-style:solid;border-width:5px;" /></p>
<p>Hit the Create Function button, and you’ll be taken to the next screen which shows the “ARN” for your lambda at the top right of the screen. This is a unique identifier you can use to hook it up to your Alexa skill. Write this number down. Yours will be different, but mine looks something like this:</p>
<p><code class="language-plaintext highlighter-rouge">ARN - arn:aws:lambda:us-east-1:084043463512:function:BusStatusProcessor</code></p>
<h3 id="linking-the-lambda-and-alexa-skill-together">Linking the Lambda and Alexa Skill Together</h3>
<p>Now go back into the lambda configuration and look at the “Designer” section. Tap “Alexa Skills Kit” in the “Add triggers” list at the Designer section’s top left. This opens a “Configure triggers” section at the bottom of the page. You need to set the “Skill ID” to the identifier of your skill from the developer console. Yours will be different, but mine looks like this:</p>
<p><code class="language-plaintext highlighter-rouge">amzn1.ask.skill.a780447f-537e-4b0d-8ff8-51de06655ae5</code></p>
<p>Again, <strong>paste YOUR value (not mine above) into the Skill ID blank on this screen</strong>, then tap Add and Save. The screenshot below shows what you should see before you hit Add and Save.</p>
<p><img src="/images/skill-link-to-lambda-1.png" alt="linking" style="width:750px;border-style:solid;border-width:5px;" /></p>
<p>Now we need to tell the Skill where to find our Lambda. Go back to the Amazon developer portal skill configuration page, and click the Endpoint entry section on your Alexa Skill. Choose the Lambda endpoint option, then paste in the ARN of your Lambda from the previous section. Again, paste YOUR Lambda ARN, not mine shown above. Yours will have a different number but otherwise look similar. Before you submit the form, it should look something like this:</p>
<p><img src="/images/skill-link-to-lambda-2.png" alt="linking to the lambda" style="width:750px;border-style:solid;border-width:5px;" /></p>
<h3 id="adding-code-to-the-lambda">Adding Code to the Lambda</h3>
<p>Remember that Java code we wrote? Now it’s time to add this to the Lambda. Go back to the edit lambda page on the AWS console.</p>
<p>Look under the “Function code” section. Make sure the Code entry type is “Upload a .ZIP or JAR file” and the Runtime is set to “Java 8”. In the Handler blank, enter “com.davidgyoungtech.alexa.busstatus.BusStatusSpeechletRequestStreamHandler”. This tells the Lambda which Java class inside the JAR file we are about to upload contains the entry point for execution.</p>
<p>Click the “Upload” button, and browse to the location of the busstatus-1.0-jar-with-dependencies.jar file you generated above. Hit the SAVE button at the top of the screen, and it will upload and deploy this file. Note that the JAR file is pretty big (about 7 MB) because it is bundled with the Alexa Skills Kit binary SDK, which inflates its size. This file size can slow down the code/build/upload/test cycle, so it would be nice if Amazon would someday offer an option to add this SDK automatically so we don’t have to waste bandwidth uploading it over and over again. If you know anybody at Amazon, please bug them about fixing this.</p>
<h3 id="extending-the-lambda-timeout-and-memory">Extending the Lambda Timeout and Memory</h3>
<p>By default, lambdas can only run for 3 seconds before they get terminated. In practice, this is way too short to do anything, since it takes a few seconds just to spin up the Lambda the first time, and it can be super frustrating in development to troubleshoot timeouts. Scroll down on the page to where it says “Timeout Info” and change the timeout from 3 seconds to 5 minutes. Also change the memory to 512 MB – the Alexa Skills Kit is memory hungry. Tap save to persist these changes.</p>
<h3 id="testing-the-skill">Testing the Skill</h3>
<p>We can now do a first test of our skill. On the Amazon developer console skill configuration page, click the test tab on the top bar of the screen. When the test screen opens, you’ll see a switch at the top that lets you enable and disable testing for the skill. It is disabled by default, so switch it on.</p>
<p><img src="/images/skill-first-test.png" alt="enabling testing" style="width:750px;border-style:solid;border-width:5px;" /></p>
<p>Next type in “use david’s bus status” into the blank at the top of the page (or if you are feeling ambitious, hit the microphone button and very clearly say the same thing.) If all goes well, you’ll see the skill respond with “Welcome to the bus status skill.” like in the screenshot below:</p>
<p><img src="/images/skill-first-test-result.png" alt="running the test" style="width:750px;border-style:solid;border-width:5px;" /></p>
<p>If you don’t see this, move on to the next section on debugging.</p>
<h2 id="debugging-alexa-skills">Debugging Alexa Skills</h2>
<p>If you are lucky, your test in the previous section went as expected. But if you got one little thing wrong, you’re likely to have gotten the dreaded Alexa response, “There was a problem with the requested skill’s response”. What does that mean? Unfortunately, it can mean almost anything.</p>
<p>Like in many types of programming, the easiest thing to check is the log. But with an Alexa skill, where is the log? By default, all log lines in your lambda program go to AWS CloudWatch. You can bring up the CloudWatch console in AWS by navigating from the main AWS menu, or by going to this URL: https://console.aws.amazon.com/cloudwatch</p>
<p>In the left menu, choose Logs, and you’ll see a list of “Log Groups”, one of which should be “/aws/lambda/BusStatusProcessor”. If you don’t see it, nothing was logged, which likely means your lambda was never even invoked. Go back and check all the settings to make sure your skill and your lambda are linked together properly.</p>
<p>If you do see the desired log group, click on it, and you’ll be shown a number of log files you can view. Usually you’ll want to look at the one with the newest timestamp. Click on that, and you’ll see lines like this:</p>
<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>22:09:47 START RequestId: 8976ac26-44e7-11e8-a73d-f5771e6e3049 Version: $LATEST
22:09:49 2018-04-20 22:09:49 <8976ac26-44e7-11e8-a73d-f5771e6e3049> INFO BusStatusSpeechlet:24 - onSessionStarted requestId=amzn1.echo-api.request.fb4ea6cd-206c-450c-a565-8c7e263807a5, sessionId=amzn1.echo-api.session.be934956-7328-443f-9f4c-4accc76ec7f3
22:09:49 2018-04-20 22:09:49 <8976ac26-44e7-11e8-a73d-f5771e6e3049> INFO BusStatusSpeechlet:30 - onLaunch requestId=amzn1.echo-api.request.fb4ea6cd-206c-450c-a565-8c7e263807a5, sessionId=amzn1.echo-api.session.be934956-7328-443f-9f4c-4accc76ec7f3
22:09:50 Unresolved compilation problem: Syntax error, insert ";" to complete ReturnStatement : java.lang.Error java.lang.Error: Unresolved compilation problem: Syntax error, insert ";" to complete ReturnStatement at com.davidgyoungtech.alexa.busstatus.BusStatusSpeechlet.getTallResponse(BusStatusSpeechlet.java:76) at com.davidgyoungtech.alexa.busstatus.BusStatusSpeechlet.onLaunch(BusStatusSpeechlet.
22:09:51 END RequestId: 8976ac26-44e7-11e8-a73d-f5771e6e3049
22:09:51 REPORT RequestId: 8976ac26-44e7-11e8-a73d-f5771e6e3049 Duration: 3770.78 ms Billed Duration: 3800 ms Memory Size: 512 MB Max Memory Used: 82 MB
22:09:52 START RequestId: 8c7c23f9-44e7-11e8-9dcf-dffb5b20a0ab Version: $LATEST
22:09:52 2018-04-20 22:09:52 <8c7c23f9-44e7-11e8-9dcf-dffb5b20a0ab> INFO BusStatusSpeechlet:53 - onSessionEnded requestId=amzn1.echo-api.request.97a66671-7460-42a0-9ece-3201946b7b20, sessionId=amzn1.echo-api.session.be934956-7328-443f-9f4c-4accc76ec7f3
22:09:52 END RequestId: 8c7c23f9-44e7-11e8-9dcf-dffb5b20a0ab
22:09:52 REPORT RequestId: 8c7c23f9-44e7-11e8-9dcf-dffb5b20a0ab Duration: 203.23 ms Billed Duration: 300 ms Memory Size: 512 MB Max Memory Used: 85 MB
</code></pre></div></div>
<p>In my case, I have an “unresolved compilation problem”, something that is easy to fix. Whatever problem you have, you’ll have to think hard and look for clues like any programmer must do when debugging with log files. Once again, Google is your friend.</p>
<h2 id="understanding-the-skill-lifecycle">Understanding the Skill Lifecycle</h2>
<p>Before we go further, let’s take a look at the lambda template so we can understand how lambdas work with Alexa skills. Again, don’t worry if you are not a Java programmer – the concepts we will describe are universal to programming Alexa lambdas in all languages.</p>
<p>Here’s what our boilerplate lambda looks like:</p>
<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>public class BusStatusSpeechlet implements SpeechletV2 {
private static final Logger log = LoggerFactory.getLogger(BusStatusSpeechlet.class);
@Override
public void onSessionStarted(SpeechletRequestEnvelope<SessionStartedRequest> requestEnvelope) {
log.info("onSessionStarted requestId={}, sessionId={}", requestEnvelope.getRequest().getRequestId(),
requestEnvelope.getSession().getSessionId());
}
@Override
public SpeechletResponse onLaunch(SpeechletRequestEnvelope<LaunchRequest> requestEnvelope) {
log.info("onLaunch requestId={}, sessionId={}", requestEnvelope.getRequest().getRequestId(),
requestEnvelope.getSession().getSessionId());
return getTallResponse("Welcome to the bus status skill.");
}
@Override
public SpeechletResponse onIntent(SpeechletRequestEnvelope<IntentRequest> requestEnvelope) {
IntentRequest request = requestEnvelope.getRequest();
Session session = requestEnvelope.getSession();
log.info("onIntent requestId={}, sessionId={}", request.getRequestId(),
session.getSessionId());
Intent intent = request.getIntent();
if ("BusStatusIntent".equals(intent.getName())) {
return getAskResponse("You have triggered the bus status intent. I have no answer for you yet.");
}
else {
throw new IllegalArgumentException("Unrecognized intent: " + intent.getName());
}
}
@Override
public void onSessionEnded(SpeechletRequestEnvelope<SessionEndedRequest> requestEnvelope) {
log.info("onSessionEnded requestId={}, sessionId={}", requestEnvelope.getRequest().getRequestId(),
requestEnvelope.getSession().getSessionId());
}
</code></pre></div></div>
<p>There are four methods here, each of which corresponds to an event in the Alexa lifecycle:</p>
<ul>
<li>OnLaunch - this is called when we activate our skill: “Use David’s Bus Status”</li>
<li>OnIntent - this is called when a user asks a question. “When is the next bus?”</li>
<li>OnSessionStarted - this is called when a user’s conversation begins. It is typically right before OnLaunch or OnIntent.</li>
<li>OnSessionEnded - this is called when a user session times out or is ended programmatically. It basically means that saved state is lost.</li>
</ul>
<p>Note that these lifecycle events can be triggered one after the other, automatically. If we type into our skill tester, “use david’s bus status to find the next bus to bwi”, two events fire in order: OnSessionStarted and OnIntent. Based on the code above, the system responds with “You have triggered the bus status intent. I have no answer for you yet.”</p>
<p>By far the most important event above is onIntent. This is where you’ll do most of your work in answering questions. These same event handlers apply to building Alexa skills with any language, so if you are going to build yours with JavaScript/Node, Go, or Python, the same concepts apply.</p>
<h2 id="adding-logic">Adding Logic</h2>
<p>So far our Alexa skill does not do much besides acknowledge it has been invoked. How do we make it tell us when the next bus is? Since this isn’t a tutorial on Java or web services programming, we’ll keep things very simple so as not to distract from the core subject of building Alexa Skills. A robust bus schedule app would probably call out to a web service to get the next bus arrival time for a particular bus stop. Open standards like GTFS let you access transit schedules from many municipalities. Here’s the page to access GTFS feeds for Washington, DC: https://transitfeeds.com/p/wmata/75</p>
<p>So that the code is easily understandable, we’ll simply store our bus schedule in an array and search through it for a match when the skill is invoked.</p>
<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code> String[][] dailyBusSchedule = new String[][] {
{ "BWI Airport", "07:30" },
{ "BWI Airport", "08:30" },
{ "BWI Airport", "12:30" },
{ "BWI Airport", "15:30" },
{ "BWI Airport", "20:45" },
{ "Washington Union Station", "07:10" },
{ "Washington Union Station", "08:40" },
{ "Washington Union Station", "09:10" },
{ "Washington Union Station", "10:40" },
{ "Washington Union Station", "11:10" },
{ "Washington Union Station", "12:40" },
{ "Washington Union Station", "13:10" }
};
</code></pre></div></div>
<p>As you can see from the above, this is a two-dimensional array where each row is a schedule entry and the columns are the destination and the departure time. Everything is stored as a string. Clearly we could make a much fancier data structure (or even a database), but we’re keeping things really, really simple.</p>
<h3 id="accessing-slots-from-intents">Accessing Slots from Intents</h3>
<p>Remember that “intent” is fancy Alexa lingo for a question, so we’ll be putting our code inside the OnIntent method. Remember when we defined our questions before? The placeholders for the values in the questions are called Slots. In our case, the slot we care about is the bus destination, and when we defined it we gave it the slot identifier of Destination.</p>
<p>We can get access to the slot from the intent object, which is passed in to the OnIntent method:</p>
<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>Slot slot = intent.getSlots().get("Destination");
</code></pre></div></div>
<p>If our slot is null, then that means it was not provided in the question. If it is not null, we can get the value of the slot with</p>
<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>String destination = slot.getValue();
</code></pre></div></div>
<h3 id="coding-onintent">Coding OnIntent</h3>
<p>Now that we have the destination, we can iterate through our table to find the closest match. The logic is pretty straightforward, but for those of you unfamiliar with Java, here are two explanations:</p>
<ul>
<li>The lines below are a way to get the current time as a String in hours and minutes. We do this in 24-hour time format so we can do string comparisons of the times and have the > and < operations work on a string just as if it were a number. Note that I am explicitly setting the time zone to US Eastern Time; otherwise Alexa would give answers as if the time were UTC. You may wish to change this for your time zone, or add fancy logic to adjust for the user’s time zone.
<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>SimpleDateFormat timeFormatter = new SimpleDateFormat("HH:mm");
timeFormatter.setTimeZone(TimeZone.getTimeZone("US/Eastern"));
String now = timeFormatter.format(new Date());
</code></pre></div> </div>
</li>
<li>The line below compares the scheduleDepartureTime string to the now string and returns a positive value if scheduleDepartureTime comes after now. A value > 0 basically tells us the time is in the future.</li>
</ul>
<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>scheduleDepartureTime.compareTo(now) > 0
</code></pre></div></div>
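<p>To convince yourself this trick works, here is a tiny stand-alone demo (my own, not part of the template) showing that lexicographic comparison of zero-padded 24-hour “HH:mm” strings matches chronological order:</p>

```java
public class TimeCompareDemo {

    // A departure is still upcoming if its "HH:mm" string sorts after "now".
    static boolean isAfter(String scheduleDepartureTime, String now) {
        return scheduleDepartureTime.compareTo(now) > 0;
    }

    public static void main(String[] args) {
        System.out.println(isAfter("20:45", "07:30")); // later today -> true
        System.out.println(isAfter("07:10", "12:00")); // already departed -> false
    }
}
```

<p>This is exactly why the schedule stores times as zero-padded, 24-hour strings – “9:00” without the leading zero would sort after “10:00” and break the comparison.</p>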
<p>With those items in mind, here is the simple algorithm I put together:</p>
<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code> public SpeechletResponse onIntent(SpeechletRequestEnvelope<IntentRequest> requestEnvelope) {
IntentRequest request = requestEnvelope.getRequest();
Session session = requestEnvelope.getSession();
log.info("onIntent requestId={}, sessionId={}", request.getRequestId(),
session.getSessionId());
Intent intent = request.getIntent();
if ("BusStatusIntent".equals(intent.getName())) {
Slot slot = intent.getSlots().get("Destination");
String firstDailyDepartureTime = null;
String nextDepartureTime = null;
SimpleDateFormat timeFormatter = new SimpleDateFormat("HH:mm");
timeFormatter.setTimeZone(TimeZone.getTimeZone("US/Eastern"));
String now = timeFormatter.format(new Date());
if (slot != null) {
String destination = slot.getValue();
for (String[] scheduleItem : dailyBusSchedule) {
String scheduleDestination = scheduleItem[0];
String scheduleDepartureTime = scheduleItem[1];
if (scheduleDestination.equalsIgnoreCase(destination)) {
if (firstDailyDepartureTime == null) {
firstDailyDepartureTime = scheduleDepartureTime;
}
// If nextDepartureTime is not yet set and the
// scheduleDepartureTime is in the future, use it as
// the nextDepartureTime
if (nextDepartureTime == null &&
scheduleDepartureTime.compareTo(now) > 0) {
nextDepartureTime = scheduleDepartureTime;
}
}
}
if (firstDailyDepartureTime == null) {
return getAskResponse("I don't know about departures to "+destination+". Try a different destination.");
}
else if (nextDepartureTime == null) {
return getAskResponse("The next bus for "+destination+" is tomorrow at "+firstDailyDepartureTime);
}
else {
return getAskResponse("The next bus for "+destination+" is today at "+nextDepartureTime);
}
}
return getAskResponse("Please specify a destination.");
}
else {
throw new IllegalArgumentException("Unrecognized intent: " + intent.getName());
}
}
</code></pre></div></div>
<p>In the code above, two values are calculated: nextDepartureTime and firstDailyDepartureTime. If there is no nextDepartureTime for a destination, that means there are no more buses today for that destination. If there is no firstDailyDepartureTime, that means there aren’t any buses at all for the requested destination.</p>
<p>The three combinations of these two variables being populated or not give us our three possible text responses, shown with <code class="language-plaintext highlighter-rouge">getAskResponse</code> calls. And as you’ve probably figured out, Alexa will automatically convert our text responses to speech for us!</p>
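<p>If you want to reason about those three outcomes in isolation, here is a minimal stand-alone sketch. The method name and message strings are mine, mirroring the logic above – this is not code from the template:</p>

```java
public class DepartureLogicDemo {

    // Mirrors the three outcomes: unknown destination, no more buses
    // today, or a bus later today.
    static String answer(String destination, String firstDailyDepartureTime,
                         String nextDepartureTime) {
        if (firstDailyDepartureTime == null) {
            return "I don't know about departures to " + destination + ".";
        } else if (nextDepartureTime == null) {
            return "The next bus for " + destination + " is tomorrow at " + firstDailyDepartureTime;
        } else {
            return "The next bus for " + destination + " is today at " + nextDepartureTime;
        }
    }

    public static void main(String[] args) {
        System.out.println(answer("BWI Airport", "07:30", "20:45"));
        // -> The next bus for BWI Airport is today at 20:45
        System.out.println(answer("BWI Airport", "07:30", null));
        // -> The next bus for BWI Airport is tomorrow at 07:30
        System.out.println(answer("Mars", null, null));
        // -> I don't know about departures to Mars.
    }
}
```

<p>Note that nextDepartureTime can only be non-null when firstDailyDepartureTime is also non-null, which is why checking the two nulls in this order covers every case.</p>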
<h3 id="running-the-final-test">Running the Final Test</h3>
<p>If you’ve made the above changes, you’ll need to recompile your code and upload it to the lambda in the AWS console like we did before so it will execute. If you do that, you can return to the Skill Tester and type in questions:</p>
<blockquote>
<p>Q: ask david’s bus status when is the next bus to bwi airport</p>
<p>A: The next bus for bwi airport is today at 20:45</p>
<p>Q: ask david’s bus status when is the next bus to washington union station</p>
<p>A: The next bus for Washington union station is tomorrow at 07:10</p>
</blockquote>
<p>Your answers will vary depending on the time of day you ask and what time zone you are in. Remember, the code assumes you are in US Eastern Time.</p>
<h2 id="going-forward">Going Forward</h2>
<p>This tutorial has shown you how to get a basic Alexa skill running. From this basic template, you can rework what you have for your own use case.</p>
<p>After lots of testing, you’ll certainly want to share it with your colleagues for them to try, too. Amazon lets you set up a beta test where you can invite people to use the skill on real Amazon Echo devices. Once you are happy with your product, you’ll need to submit your skill to Amazon for review, much like an iOS app get reviewed by Apple before it is allowed in the Apple App Store.</p>
<p>Before you get there, you’re sure to find plenty of idiosyncrasies with the Alexa toolkit, and unfortunately, it’s still a bit too niche of a technology to find easy help. If you’re stuck, search questions on StackOverflow.com, or post there or on Amazon’s Alexa Skills Kit developer forum here: https://forums.developer.amazon.com/topics/alexa+skills+kit.html</p>
<p>If you have issues with this tutorial, please create an issue in the sample skill’s Github repo <a href="https://github.com/davidgyoung/alexa-bus-status/issues">here</a>.</p>
<p>If you get totally stuck, tell your boss to hire me to help. I may not know the answer to your problem right away, but I’m always eager to get the job done.</p>
Native Bar Code Scanning in iOS2018-01-24T00:00:00+00:00http://davidgyoungtech.com/2018/01/24/native-bar-code-scanning-in-ios<p>Folks using the camera on iOS 11 may have discovered that it now allows you to scan QR codes without any additional apps. Just point the camera app at a QR code, and you get a pop up asking you if you want to visit the code’s URL in Safari.</p>
<p>While this is a nifty shortcut, it highlights powerful bar code scanning capabilities that have been built into iOS since version 7. Using AVFoundation, you can scan not only QR codes but quite a large variety of industry bar code formats. This is good news, because it greatly expands the use cases for this technology beyond the once-ubiquitous QR codes that proliferated early in the mobile phone era.</p>
<p>This article will show you how easy it is to add bar code scanning to your app. You can follow along below or get a head start by cloning the repo <a href="https://github.com/davidgyoung/bar-code-scanner">here</a>.</p>
<p>But first a little Q&A:</p>
<h3 id="what-kind-of-bar-codes-can-avfoundation-scan">What kind of bar codes can AVFoundation scan?</h3>
<p>Almost anything. Support includes a wide variety of bar code formats (both 1-dimensional ones like supermarket UPC codes as well as newer 2-dimensional variants) as well as QR codes. You can see a full list of the types in the <a href="https://developer.apple.com/documentation/avfoundation/avmetadataobject.objecttype">AVMetadataObject.ObjectType</a> enumeration.</p>
<p>You can scan codes in the supermarket, scan airline or train tickets, or almost anything that is designed to be machine readable.</p>
<h3 id="how-well-does-it-work">How well does it work?</h3>
<p>The ability to detect and decode bar codes is quite good, but there are some caveats:</p>
<ul>
<li>If you hold the camera at an angle relative to the bar code, iOS may not detect it.</li>
<li>In my testing, holding the phone in landscape mode (the same orientation as the bar codes I was scanning) provided faster detection.</li>
<li>The speed at which it detects partly depends on how quickly the camera on your iOS device brings the bar code image into focus. If you hold the camera too close to the bar code (e.g. an inch or less), it may never come into focus at all.</li>
<li>Lighting conditions greatly affect the success rate. Brighter light is better.</li>
</ul>
<p>With practice holding the camera and in good lighting conditions, I have found that I can typically scan a bar code in one to three seconds. Just don’t expect the speed and successful detection rate to be as high as a dedicated hardware laser scanner.</p>
<h3 id="setting-it-up">Setting it up</h3>
<p>To set this up, you use the same AVCaptureSession class that has been around since iOS 4 to take pictures and videos.</p>
<figure class="highlight"><pre><code class="language-swift" data-lang="swift"><span class="k">var</span> <span class="nv">captureSession</span> <span class="o">=</span> <span class="kt">AVCaptureSession</span><span class="p">()</span></code></pre></figure>
<p>You then use the AVCaptureDevice class to find the camera you want to use (usually the rear camera) and attach it to the captureSession like this:</p>
<figure class="highlight"><pre><code class="language-swift" data-lang="swift"><span class="k">let</span> <span class="nv">videoInput</span> <span class="o">=</span> <span class="k">try</span> <span class="kt">AVCaptureDeviceInput</span><span class="p">(</span><span class="nv">device</span><span class="p">:</span> <span class="n">captureDevice</span><span class="p">)</span>
<span class="n">captureSession</span><span class="o">.</span><span class="nf">addInput</span><span class="p">(</span><span class="n">videoInput</span><span class="p">)</span></code></pre></figure>
<p>You next add a hook to extract metadata from the video. This uses the AVCaptureMetadataOutput class. It has a delegate method you use to return the captured bar code data.</p>
<figure class="highlight"><pre><code class="language-swift" data-lang="swift"><span class="k">let</span> <span class="nv">captureMetadataOutput</span> <span class="o">=</span> <span class="kt">AVCaptureMetadataOutput</span><span class="p">()</span>
<span class="n">captureSession</span><span class="o">.</span><span class="nf">addOutput</span><span class="p">(</span><span class="n">captureMetadataOutput</span><span class="p">)</span>
<span class="n">captureMetadataOutput</span><span class="o">.</span><span class="nf">setMetadataObjectsDelegate</span><span class="p">(</span><span class="k">self</span><span class="p">,</span> <span class="nv">queue</span><span class="p">:</span> <span class="kt">DispatchQueue</span><span class="o">.</span><span class="n">main</span><span class="p">)</span></code></pre></figure>
<p>The delegate callback comes in the AVCaptureMetadataOutputObjectsDelegate
protocol. It only has one method that looks like below in Swift 4. (Caution: If using Swift 3.x the delegate method is different, so it won’t get called if defined like below.)</p>
<figure class="highlight"><pre><code class="language-swift" data-lang="swift"><span class="kd">func</span> <span class="nf">metadataOutput</span><span class="p">(</span><span class="n">_</span> <span class="nv">output</span><span class="p">:</span> <span class="kt">AVCaptureMetadataOutput</span><span class="p">,</span> <span class="n">didOutput</span> <span class="nv">metadataObjects</span><span class="p">:</span> <span class="p">[</span><span class="kt">AVMetadataObject</span><span class="p">],</span>
<span class="n">from</span> <span class="nv">connection</span><span class="p">:</span> <span class="kt">AVCaptureConnection</span><span class="p">)</span> <span class="p">{</span>
<span class="c1">// TODO: populate this method later</span>
<span class="p">}</span></code></pre></figure>
<p>You then tell the AVCaptureMetadataOutput which bar code types you want to capture. For our example, we’ll try to capture every type it knows about.</p>
<figure class="highlight"><pre><code class="language-swift" data-lang="swift"><span class="n">captureMetadataOutput</span><span class="o">.</span><span class="n">metadataObjectTypes</span> <span class="o">=</span> <span class="n">barCodeTypes</span></code></pre></figure>
<p>Finally, you start the capture session with:</p>
<figure class="highlight"><pre><code class="language-swift" data-lang="swift"><span class="n">captureSession</span><span class="o">.</span><span class="nf">startRunning</span><span class="p">()</span></code></pre></figure>
<p>The above steps are enough to get callbacks to the delegate method every time the camera points to a recognized bar code. However, a few things are missing:</p>
<ol>
<li>On iOS, you must get special permission for an app to access the camera; otherwise, the app will crash or fail to get access to the camera device.</li>
<li>When using the app, you can’t see what you are pointing at, because we aren’t displaying the camera view.</li>
<li>We don’t do anything with the results of the scans.</li>
</ol>
<p>Let’s tackle those issues one at a time, starting with the permissions issue. Without that solved, we can’t even run the app!</p>
<p>Before anything else, you must add a new key/value into the app’s Info.plist file. This entry declares that the app wants to use the camera and supplies the user with a prompt message explaining why.</p>
<figure class="highlight"><pre><code class="language-swift" data-lang="swift"><span class="o"><</span><span class="n">key</span><span class="o">></span><span class="kt">NSCameraUsageDescription</span><span class="o"></</span><span class="n">key</span><span class="o">></span>
<span class="o"><</span><span class="n">string</span><span class="o">></span><span class="kt">Camera</span> <span class="n">access</span> <span class="n">needed</span> <span class="n">to</span> <span class="n">scan</span> <span class="n">bar</span> <span class="n">codes</span><span class="o">.</</span><span class="n">string</span><span class="o">></span></code></pre></figure>
<p>Before trying to access the camera, you check to see if this permission has been granted already, and if not, ask the user for it:</p>
<figure class="highlight"><pre><code class="language-swift" data-lang="swift"><span class="k">let</span> <span class="nv">authorizationStatus</span> <span class="o">=</span> <span class="kt">AVCaptureDevice</span><span class="o">.</span><span class="nf">authorizationStatus</span><span class="p">(</span><span class="nv">for</span><span class="p">:</span> <span class="o">.</span><span class="n">video</span><span class="p">)</span>
<span class="k">if</span> <span class="n">authorizationStatus</span> <span class="o">==</span> <span class="o">.</span><span class="n">notDetermined</span> <span class="p">{</span>
<span class="c1">// permission dialog not yet presented, request authorization</span>
<span class="n">accessRequested</span> <span class="o">=</span> <span class="kc">true</span>
<span class="kt">AVCaptureDevice</span><span class="o">.</span><span class="nf">requestAccess</span><span class="p">(</span><span class="nv">for</span><span class="p">:</span> <span class="o">.</span><span class="n">video</span><span class="p">,</span>
<span class="nv">completionHandler</span><span class="p">:</span> <span class="p">{</span> <span class="p">(</span><span class="nv">granted</span><span class="p">:</span><span class="kt">Bool</span><span class="p">)</span> <span class="o">-></span> <span class="kt">Void</span> <span class="k">in</span>
<span class="k">self</span><span class="o">.</span><span class="nf">setupCapture</span><span class="p">();</span>
<span class="p">})</span>
<span class="k">return</span>
<span class="p">}</span>
<span class="k">if</span> <span class="n">authorizationStatus</span> <span class="o">==</span> <span class="o">.</span><span class="n">restricted</span> <span class="o">||</span> <span class="n">authorizationStatus</span> <span class="o">==</span> <span class="o">.</span><span class="n">denied</span> <span class="p">{</span>
<span class="n">accessDenied</span> <span class="o">=</span> <span class="kc">true</span>
<span class="p">}</span></code></pre></figure>
<p>In the code above, I set a couple of flags called accessDenied and accessRequested so we can tell later why we could not get access to the camera and present a dialog to the user explaining the problem. Since this isn’t core to this exercise, I won’t show the details here, but you can see how the dialogs are presented in the full code in “Extra credit section 1”.</p>
<p>The setupCapture method referenced above contains the full code needed to set up the bar code capture. It must include this permissions check, and it calls itself again from the requestAccess completion handler so the capture can be set up once the user grants camera access.</p>
<p><img src="/images/scanpermission.png" width="320px" /></p>
<p>With that done, we can move on to the second missing item: showing on the screen what the camera is seeing. Doing this is pretty simple. We construct an AVCaptureVideoPreviewLayer with the capture session and add it as a sublayer of our view’s layer, like this:</p>
<figure class="highlight"><pre><code class="language-swift" data-lang="swift"><span class="n">videoPreviewLayer</span> <span class="o">=</span> <span class="kt">AVCaptureVideoPreviewLayer</span><span class="p">(</span><span class="nv">session</span><span class="p">:</span> <span class="n">captureSession</span><span class="p">)</span>
<span class="n">videoPreviewLayer</span><span class="p">?</span><span class="o">.</span><span class="n">videoGravity</span> <span class="o">=</span> <span class="o">.</span><span class="n">resizeAspectFill</span>
<span class="n">videoPreviewLayer</span><span class="p">?</span><span class="o">.</span><span class="n">frame</span> <span class="o">=</span> <span class="n">view</span><span class="o">.</span><span class="n">layer</span><span class="o">.</span><span class="n">bounds</span>
<span class="n">view</span><span class="o">.</span><span class="n">layer</span><span class="o">.</span><span class="nf">addSublayer</span><span class="p">(</span><span class="n">videoPreviewLayer</span><span class="o">!</span><span class="p">)</span></code></pre></figure>
<p>With this in place, our full setupCapture method looks like this:</p>
<figure class="highlight"><pre><code class="language-swift" data-lang="swift"><span class="kd">func</span> <span class="nf">setupCapture</span><span class="p">()</span> <span class="p">{</span>
<span class="k">var</span> <span class="nv">success</span> <span class="o">=</span> <span class="kc">false</span>
<span class="k">var</span> <span class="nv">accessDenied</span> <span class="o">=</span> <span class="kc">false</span>
<span class="k">var</span> <span class="nv">accessRequested</span> <span class="o">=</span> <span class="kc">false</span>
<span class="k">if</span> <span class="k">let</span> <span class="nv">barCodeFrameView</span> <span class="o">=</span> <span class="n">barCodeFrameView</span> <span class="p">{</span>
<span class="n">barCodeFrameView</span><span class="o">.</span><span class="nf">removeFromSuperview</span><span class="p">()</span>
<span class="k">self</span><span class="o">.</span><span class="n">barCodeFrameView</span> <span class="o">=</span> <span class="kc">nil</span>
<span class="p">}</span>
<span class="k">let</span> <span class="nv">authorizationStatus</span> <span class="o">=</span> <span class="kt">AVCaptureDevice</span><span class="o">.</span><span class="nf">authorizationStatus</span><span class="p">(</span><span class="nv">for</span><span class="p">:</span> <span class="o">.</span><span class="n">video</span><span class="p">)</span>
<span class="k">if</span> <span class="n">authorizationStatus</span> <span class="o">==</span> <span class="o">.</span><span class="n">notDetermined</span> <span class="p">{</span>
<span class="c1">// permission dialog not yet presented, request authorization</span>
<span class="n">accessRequested</span> <span class="o">=</span> <span class="kc">true</span>
<span class="kt">AVCaptureDevice</span><span class="o">.</span><span class="nf">requestAccess</span><span class="p">(</span><span class="nv">for</span><span class="p">:</span> <span class="o">.</span><span class="n">video</span><span class="p">,</span>
<span class="nv">completionHandler</span><span class="p">:</span> <span class="p">{</span> <span class="p">(</span><span class="nv">granted</span><span class="p">:</span><span class="kt">Bool</span><span class="p">)</span> <span class="o">-></span> <span class="kt">Void</span> <span class="k">in</span>
<span class="k">self</span><span class="o">.</span><span class="nf">setupCapture</span><span class="p">();</span>
<span class="p">})</span>
<span class="k">return</span>
<span class="p">}</span>
<span class="k">if</span> <span class="n">authorizationStatus</span> <span class="o">==</span> <span class="o">.</span><span class="n">restricted</span> <span class="o">||</span> <span class="n">authorizationStatus</span> <span class="o">==</span> <span class="o">.</span><span class="n">denied</span> <span class="p">{</span>
<span class="n">accessDenied</span> <span class="o">=</span> <span class="kc">true</span>
<span class="p">}</span>
<span class="k">if</span> <span class="n">initialized</span> <span class="p">{</span>
<span class="n">success</span> <span class="o">=</span> <span class="kc">true</span>
<span class="p">}</span>
<span class="k">else</span> <span class="p">{</span>
<span class="k">let</span> <span class="nv">deviceDiscoverySession</span> <span class="o">=</span> <span class="kt">AVCaptureDevice</span><span class="o">.</span><span class="kt">DiscoverySession</span><span class="p">(</span><span class="nv">deviceTypes</span><span class="p">:</span>
<span class="p">[</span><span class="o">.</span><span class="n">builtInWideAngleCamera</span><span class="p">,</span>
<span class="o">.</span><span class="n">builtInTelephotoCamera</span><span class="p">,</span>
<span class="o">.</span><span class="n">builtInDualCamera</span><span class="p">],</span>
<span class="nv">mediaType</span><span class="p">:</span> <span class="kt">AVMediaType</span><span class="o">.</span><span class="n">video</span><span class="p">,</span>
<span class="nv">position</span><span class="p">:</span> <span class="o">.</span><span class="n">unspecified</span><span class="p">)</span>
<span class="k">if</span> <span class="k">let</span> <span class="nv">captureDevice</span> <span class="o">=</span> <span class="n">deviceDiscoverySession</span><span class="o">.</span><span class="n">devices</span><span class="o">.</span><span class="n">first</span> <span class="p">{</span>
<span class="k">do</span> <span class="p">{</span>
<span class="k">let</span> <span class="nv">videoInput</span> <span class="o">=</span> <span class="k">try</span> <span class="kt">AVCaptureDeviceInput</span><span class="p">(</span><span class="nv">device</span><span class="p">:</span> <span class="n">captureDevice</span><span class="p">)</span>
<span class="n">captureSession</span><span class="o">.</span><span class="nf">addInput</span><span class="p">(</span><span class="n">videoInput</span><span class="p">)</span>
<span class="n">success</span> <span class="o">=</span> <span class="kc">true</span>
<span class="p">}</span> <span class="k">catch</span> <span class="p">{</span>
<span class="kt">NSLog</span><span class="p">(</span><span class="s">"Cannot construct capture device input"</span><span class="p">)</span>
<span class="p">}</span>
<span class="p">}</span>
<span class="k">else</span> <span class="p">{</span>
<span class="kt">NSLog</span><span class="p">(</span><span class="s">"Cannot get capture device"</span><span class="p">)</span>
<span class="p">}</span>
<span class="k">if</span> <span class="n">success</span> <span class="p">{</span>
<span class="k">let</span> <span class="nv">captureMetadataOutput</span> <span class="o">=</span> <span class="kt">AVCaptureMetadataOutput</span><span class="p">()</span>
<span class="n">captureSession</span><span class="o">.</span><span class="nf">addOutput</span><span class="p">(</span><span class="n">captureMetadataOutput</span><span class="p">)</span>
<span class="k">let</span> <span class="nv">newSerialQueue</span> <span class="o">=</span> <span class="kt">DispatchQueue</span><span class="p">(</span><span class="nv">label</span><span class="p">:</span> <span class="s">"barCodeScannerQueue"</span><span class="p">)</span>
<span class="n">captureMetadataOutput</span><span class="o">.</span><span class="nf">setMetadataObjectsDelegate</span><span class="p">(</span><span class="k">self</span><span class="p">,</span> <span class="nv">queue</span><span class="p">:</span> <span class="n">newSerialQueue</span><span class="p">)</span>
<span class="n">captureMetadataOutput</span><span class="o">.</span><span class="n">metadataObjectTypes</span> <span class="o">=</span> <span class="n">barCodeTypes</span>
<span class="n">videoPreviewLayer</span> <span class="o">=</span> <span class="kt">AVCaptureVideoPreviewLayer</span><span class="p">(</span><span class="nv">session</span><span class="p">:</span> <span class="n">captureSession</span><span class="p">)</span>
<span class="n">videoPreviewLayer</span><span class="p">?</span><span class="o">.</span><span class="n">videoGravity</span> <span class="o">=</span> <span class="o">.</span><span class="n">resizeAspectFill</span>
<span class="n">videoPreviewLayer</span><span class="p">?</span><span class="o">.</span><span class="n">frame</span> <span class="o">=</span> <span class="n">view</span><span class="o">.</span><span class="n">layer</span><span class="o">.</span><span class="n">bounds</span>
<span class="n">view</span><span class="o">.</span><span class="n">layer</span><span class="o">.</span><span class="nf">addSublayer</span><span class="p">(</span><span class="n">videoPreviewLayer</span><span class="o">!</span><span class="p">)</span>
<span class="n">initialized</span> <span class="o">=</span> <span class="kc">true</span>
<span class="p">}</span>
<span class="p">}</span>
<span class="k">if</span> <span class="n">success</span> <span class="p">{</span>
<span class="n">captureSession</span><span class="o">.</span><span class="nf">startRunning</span><span class="p">()</span>
<span class="p">}</span>
<span class="c1">// ----------------------</span>
<span class="c1">// Extra credit section 1</span>
<span class="c1">// If we cannot establish a camera capture session for some reason, </span>
<span class="c1">// show a dialog to the user explaining why</span>
<span class="c1">// ----------------------</span>
<span class="k">if</span> <span class="o">!</span><span class="n">success</span> <span class="p">{</span>
<span class="c1">// Only show a dialog if we have not just asked the user for permission to use the </span>
<span class="c1">// camera. Asking permission sends its own dialog to the user</span>
<span class="k">if</span> <span class="o">!</span><span class="n">accessRequested</span> <span class="p">{</span>
<span class="c1">// Generic message if we cannot figure out why we cannot establish a camera session</span>
<span class="k">var</span> <span class="nv">message</span> <span class="o">=</span> <span class="s">"Cannot access camera to scan bar codes"</span>
<span class="cp">#if (arch(i386) || arch(x86_64)) && (!os(macOS))</span>
<span class="n">message</span> <span class="o">=</span> <span class="s">"You are running on the simulator, which does not have a camera device."</span><span class="o">+</span>
<span class="s">" Try this on a real iOS device."</span>
<span class="cp">#endif</span>
<span class="k">if</span> <span class="n">accessDenied</span> <span class="p">{</span>
<span class="n">message</span> <span class="o">=</span> <span class="s">"You have denied this app permission to access the camera."</span><span class="o">+</span>
<span class="s">" Please go to settings and enable camera access permission to"</span><span class="o">+</span>
<span class="s">" be able to scan bar codes."</span>
<span class="p">}</span>
<span class="k">let</span> <span class="nv">alertPrompt</span> <span class="o">=</span> <span class="kt">UIAlertController</span><span class="p">(</span><span class="nv">title</span><span class="p">:</span> <span class="s">"Cannot access camera"</span><span class="p">,</span>
<span class="nv">message</span><span class="p">:</span> <span class="n">message</span><span class="p">,</span> <span class="nv">preferredStyle</span><span class="p">:</span> <span class="o">.</span><span class="n">alert</span><span class="p">)</span>
<span class="k">let</span> <span class="nv">confirmAction</span> <span class="o">=</span> <span class="kt">UIAlertAction</span><span class="p">(</span><span class="nv">title</span><span class="p">:</span> <span class="s">"OK"</span><span class="p">,</span> <span class="nv">style</span><span class="p">:</span> <span class="kt">UIAlertActionStyle</span><span class="o">.</span><span class="k">default</span><span class="p">,</span>
<span class="nv">handler</span><span class="p">:</span> <span class="p">{</span> <span class="p">(</span><span class="n">action</span><span class="p">)</span> <span class="o">-></span> <span class="kt">Void</span> <span class="k">in</span>
<span class="k">self</span><span class="o">.</span><span class="n">navigationController</span><span class="p">?</span><span class="o">.</span><span class="nf">popViewController</span><span class="p">(</span><span class="nv">animated</span><span class="p">:</span> <span class="kc">true</span><span class="p">)</span>
<span class="p">})</span>
<span class="n">alertPrompt</span><span class="o">.</span><span class="nf">addAction</span><span class="p">(</span><span class="n">confirmAction</span><span class="p">)</span>
<span class="k">self</span><span class="o">.</span><span class="nf">present</span><span class="p">(</span><span class="n">alertPrompt</span><span class="p">,</span> <span class="nv">animated</span><span class="p">:</span> <span class="kc">true</span><span class="p">,</span> <span class="nv">completion</span><span class="p">:</span> <span class="p">{</span>
<span class="p">})</span>
<span class="p">}</span>
<span class="p">}</span>
<span class="p">}</span></code></pre></figure>
<p>Our code currently doesn’t do anything when it finds a bar code. The simplest solution is to show a dialog that displays the text encoded in the bar code. If the text is a URL (as is typically true with a QR code), then we can add a button to the dialog to launch it in Safari. Here’s the code that does that:</p>
<figure class="highlight"><pre><code class="language-swift" data-lang="swift"><span class="kd">func</span> <span class="nf">processBarCodeData</span><span class="p">(</span><span class="nv">metadataObjects</span><span class="p">:</span> <span class="p">[</span><span class="kt">AVMetadataObject</span><span class="p">])</span> <span class="p">{</span>
<span class="k">if</span> <span class="k">let</span> <span class="nv">metadataObject</span> <span class="o">=</span> <span class="n">metadataObjects</span><span class="o">.</span><span class="n">first</span> <span class="k">as?</span> <span class="kt">AVMetadataMachineReadableCodeObject</span> <span class="p">{</span>
<span class="k">if</span> <span class="n">barCodeTypes</span><span class="o">.</span><span class="nf">contains</span><span class="p">(</span><span class="n">metadataObject</span><span class="o">.</span><span class="n">type</span><span class="p">)</span> <span class="p">{</span>
<span class="c1">// If the found metadata is equal to the QR code metadata (or barcode) then </span>
<span class="c1">// update the status label's text and set the bounds</span>
<span class="k">let</span> <span class="nv">barCodeObject</span> <span class="o">=</span> <span class="n">videoPreviewLayer</span><span class="p">?</span><span class="o">.</span><span class="nf">transformedMetadataObject</span><span class="p">(</span><span class="nv">for</span><span class="p">:</span> <span class="n">metadataObject</span><span class="p">)</span>
<span class="c1">// Initialize Frame to highlight the Bar Code</span>
<span class="k">if</span> <span class="n">metadataObject</span><span class="o">.</span><span class="n">stringValue</span> <span class="o">!=</span> <span class="kc">nil</span> <span class="p">{</span>
<span class="n">captureSession</span><span class="o">.</span><span class="nf">stopRunning</span><span class="p">()</span>
<span class="nf">displayBarCodeResult</span><span class="p">(</span><span class="nv">code</span><span class="p">:</span> <span class="n">metadataObject</span><span class="o">.</span><span class="n">stringValue</span><span class="o">!</span><span class="p">)</span>
<span class="c1">// because there might be more bar codes detected, </span>
<span class="c1">// we return from the method early</span>
<span class="c1">// here so we do not process more than one</span>
<span class="k">return</span>
<span class="p">}</span>
<span class="p">}</span>
<span class="p">}</span>
<span class="p">}</span>
<span class="kd">func</span> <span class="nf">displayBarCodeResult</span><span class="p">(</span><span class="nv">code</span><span class="p">:</span> <span class="kt">String</span><span class="p">)</span> <span class="p">{</span>
<span class="k">let</span> <span class="nv">alertPrompt</span> <span class="o">=</span> <span class="kt">UIAlertController</span><span class="p">(</span><span class="nv">title</span><span class="p">:</span> <span class="s">"Bar code detected"</span><span class="p">,</span> <span class="nv">message</span><span class="p">:</span> <span class="n">code</span><span class="p">,</span>
<span class="nv">preferredStyle</span><span class="p">:</span> <span class="o">.</span><span class="n">alert</span><span class="p">)</span>
<span class="k">if</span> <span class="k">let</span> <span class="nv">url</span> <span class="o">=</span> <span class="kt">URL</span><span class="p">(</span><span class="nv">string</span><span class="p">:</span> <span class="n">code</span><span class="p">)</span> <span class="p">{</span>
<span class="k">let</span> <span class="nv">confirmAction</span> <span class="o">=</span> <span class="kt">UIAlertAction</span><span class="p">(</span><span class="nv">title</span><span class="p">:</span> <span class="s">"Launch URL"</span><span class="p">,</span> <span class="nv">style</span><span class="p">:</span>
<span class="kt">UIAlertActionStyle</span><span class="o">.</span><span class="k">default</span><span class="p">,</span> <span class="nv">handler</span><span class="p">:</span> <span class="p">{</span> <span class="p">(</span><span class="n">action</span><span class="p">)</span> <span class="o">-></span> <span class="kt">Void</span> <span class="k">in</span>
<span class="kt">UIApplication</span><span class="o">.</span><span class="n">shared</span><span class="o">.</span><span class="nf">open</span><span class="p">(</span><span class="n">url</span><span class="p">,</span> <span class="nv">options</span><span class="p">:</span> <span class="p">[:],</span> <span class="nv">completionHandler</span><span class="p">:</span> <span class="p">{</span> <span class="p">(</span><span class="n">result</span><span class="p">)</span> <span class="k">in</span>
<span class="k">if</span> <span class="n">result</span> <span class="p">{</span>
<span class="kt">NSLog</span><span class="p">(</span><span class="s">"opened url"</span><span class="p">)</span>
<span class="p">}</span>
<span class="k">else</span> <span class="p">{</span>
<span class="k">let</span> <span class="nv">alertPrompt</span> <span class="o">=</span> <span class="kt">UIAlertController</span><span class="p">(</span><span class="nv">title</span><span class="p">:</span> <span class="s">"Cannot open url"</span><span class="p">,</span>
<span class="nv">message</span><span class="p">:</span> <span class="kc">nil</span><span class="p">,</span> <span class="nv">preferredStyle</span><span class="p">:</span> <span class="o">.</span><span class="n">alert</span><span class="p">)</span>
<span class="k">let</span> <span class="nv">confirmAction</span> <span class="o">=</span> <span class="kt">UIAlertAction</span><span class="p">(</span><span class="nv">title</span><span class="p">:</span> <span class="s">"OK"</span><span class="p">,</span>
<span class="nv">style</span><span class="p">:</span> <span class="kt">UIAlertActionStyle</span><span class="o">.</span><span class="k">default</span><span class="p">,</span>
<span class="nv">handler</span><span class="p">:</span> <span class="p">{</span> <span class="p">(</span><span class="n">action</span><span class="p">)</span> <span class="o">-></span> <span class="kt">Void</span> <span class="k">in</span>
<span class="p">})</span>
<span class="n">alertPrompt</span><span class="o">.</span><span class="nf">addAction</span><span class="p">(</span><span class="n">confirmAction</span><span class="p">)</span>
<span class="k">self</span><span class="o">.</span><span class="nf">present</span><span class="p">(</span><span class="n">alertPrompt</span><span class="p">,</span> <span class="nv">animated</span><span class="p">:</span> <span class="kc">true</span><span class="p">,</span> <span class="nv">completion</span><span class="p">:</span> <span class="p">{</span>
<span class="k">self</span><span class="o">.</span><span class="nf">setupCapture</span><span class="p">()</span>
<span class="p">})</span>
<span class="p">}</span>
<span class="p">})</span>
<span class="p">})</span>
<span class="n">alertPrompt</span><span class="o">.</span><span class="nf">addAction</span><span class="p">(</span><span class="n">confirmAction</span><span class="p">)</span>
<span class="p">}</span>
<span class="k">let</span> <span class="nv">cancelAction</span> <span class="o">=</span> <span class="kt">UIAlertAction</span><span class="p">(</span><span class="nv">title</span><span class="p">:</span> <span class="s">"Cancel"</span><span class="p">,</span> <span class="nv">style</span><span class="p">:</span> <span class="kt">UIAlertActionStyle</span><span class="o">.</span><span class="n">cancel</span><span class="p">,</span>
<span class="nv">handler</span><span class="p">:</span> <span class="p">{</span> <span class="p">(</span><span class="n">action</span><span class="p">)</span> <span class="o">-></span> <span class="kt">Void</span> <span class="k">in</span>
<span class="k">self</span><span class="o">.</span><span class="nf">setupCapture</span><span class="p">()</span>
<span class="p">})</span>
<span class="n">alertPrompt</span><span class="o">.</span><span class="nf">addAction</span><span class="p">(</span><span class="n">cancelAction</span><span class="p">)</span>
<span class="nf">present</span><span class="p">(</span><span class="n">alertPrompt</span><span class="p">,</span> <span class="nv">animated</span><span class="p">:</span> <span class="kc">true</span><span class="p">,</span> <span class="nv">completion</span><span class="p">:</span> <span class="kc">nil</span><span class="p">)</span>
<span class="p">}</span></code></pre></figure>
<p>The code above is factored into its own methods rather than placed directly in the delegate callback. You can hook it in for Swift 3 or Swift 4 like this:</p>
<figure class="highlight"><pre><code class="language-swift" data-lang="swift"><span class="c1">// Swift 3.x callback</span>
<span class="kd">func</span> <span class="nf">captureOutput</span><span class="p">(</span><span class="n">_</span> <span class="nv">captureOutput</span><span class="p">:</span> <span class="kt">AVCaptureOutput</span><span class="o">!</span><span class="p">,</span> <span class="n">didOutputMetadataObjects</span> <span class="nv">metadataObjects</span><span class="p">:</span> <span class="p">[</span><span class="kt">Any</span><span class="p">]</span><span class="o">!</span><span class="p">,</span>
<span class="n">from</span> <span class="nv">connection</span><span class="p">:</span> <span class="kt">AVCaptureConnection</span><span class="o">!</span><span class="p">)</span> <span class="p">{</span>
<span class="nf">processBarCodeData</span><span class="p">(</span><span class="nv">metadataObjects</span><span class="p">:</span> <span class="n">metadataObjects</span> <span class="k">as!</span> <span class="p">[</span><span class="kt">AVMetadataObject</span><span class="p">])</span>
<span class="p">}</span>
<span class="c1">// Swift 4 callback</span>
<span class="kd">func</span> <span class="nf">metadataOutput</span><span class="p">(</span><span class="n">_</span> <span class="nv">output</span><span class="p">:</span> <span class="kt">AVCaptureMetadataOutput</span><span class="p">,</span> <span class="n">didOutput</span> <span class="nv">metadataObjects</span><span class="p">:</span> <span class="p">[</span><span class="kt">AVMetadataObject</span><span class="p">],</span>
<span class="n">from</span> <span class="nv">connection</span><span class="p">:</span> <span class="kt">AVCaptureConnection</span><span class="p">)</span> <span class="p">{</span>
<span class="nf">processBarCodeData</span><span class="p">(</span><span class="nv">metadataObjects</span><span class="p">:</span> <span class="n">metadataObjects</span><span class="p">)</span>
<span class="p">}</span></code></pre></figure>
<p><img src="/images/scanresult.png" width="320px" /></p>
<p>If you put all of this in a ViewController, you’ll have a fully functioning bar code scanning app. You can try this yourself with the full project on Github <a href="https://github.com/davidgyoung/bar-code-scanner">here</a>. The project includes all the code above plus a few other little features that are a bit too peripheral to the subject to discuss in detail:</p>
<ul>
<li>The app draws a crosshair overlay on top of the video to give guidance of where to aim the camera. (See extra credit section 2)</li>
<li>The app draws an outline around the scanned bar code area right after it is detected. (See extra credit section 3)</li>
</ul>
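<p>As a rough sketch of how the second feature might work (the exact implementation in the GitHub project may differ, and the method name highlight below is illustrative): the delegate hands us a metadata object in capture coordinates, transformedMetadataObject(for:) converts it to the preview layer’s coordinate space, and a bordered UIView can then be laid over that rectangle. It reuses the barCodeFrameView property that setupCapture clears at the top.</p>
<figure class="highlight"><pre><code class="language-swift" data-lang="swift">// Sketch: outline the detected bar code on top of the video preview.
// Assumes barCodeFrameView is an optional UIView property, as in setupCapture above.
func highlight(_ metadataObject: AVMetadataMachineReadableCodeObject) {
    // Convert from capture coordinates to the preview layer's coordinate space
    guard let transformed = videoPreviewLayer?.transformedMetadataObject(for: metadataObject) else { return }
    // UI work must happen on the main queue; our delegate fires on a background serial queue
    DispatchQueue.main.async {
        if self.barCodeFrameView == nil {
            let frameView = UIView()
            frameView.layer.borderColor = UIColor.green.cgColor
            frameView.layer.borderWidth = 2
            self.view.addSubview(frameView)
            self.barCodeFrameView = frameView
        }
        self.barCodeFrameView?.frame = transformed.bounds
    }
}</code></pre></figure>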
<h3 id="conclusion">Conclusion</h3>
<p>Adding a bar code scanner to an iOS app is quite simple, and the camera works surprisingly well. Now that iOS supports scanning Bluetooth beacons, NFC tags, and bar codes, you have many options for making your app responsive to the objects around you.</p>
Beacon Detection With Android 82017-08-07T00:00:00+00:00http://davidgyoungtech.com/2017/08/07/beacon-detection-with-android-8
<p>One of the most important ways apps use Bluetooth beacons is to get a wake up when the beacon first comes into range. If your app is running when this happens, this is no problem. But what if you want your app to take action when a beacon appears, perhaps hours or days after the user last launched it?</p>
<p>This has always been tricky on Android, because of the lack of operating system support for both beacons and for background detection. Before Android 8, tricky workarounds were necessary to make this happen.</p>
<p><strong>Android 8 includes changes that bring both good and bad news. The bad news is that the existing workarounds will no longer work to let you detect beacons when your app is not in the foreground. The good news is that new APIs give you new ways of doing these same detections</strong>, and for the most important situations, detections will be just as fast while using less battery.</p>
<p>The table below summarizes the way background beacon detections work on various Android versions. The top of the table describes the techniques supported by each Android version, and the bottom of the table shows how this converts to detection times.</p>
<hr />
<table>
<thead>
<tr>
<th> </th>
<th style="text-align: center">4.3-4.4.x</th>
<th style="text-align: center">5.0-7.x</th>
<th style="text-align: center">8.0</th>
</tr>
</thead>
<tbody>
<tr>
<td>Allows long-running scanning services</td>
<td style="text-align: center">YES</td>
<td style="text-align: center">YES</td>
<td style="text-align: center">NO</td>
</tr>
<tr>
<td>Supports JobScheduler scanning</td>
<td style="text-align: center">NO</td>
<td style="text-align: center">YES</td>
<td style="text-align: center">YES</td>
</tr>
<tr>
<td>Supports bluetooth scan filter</td>
<td style="text-align: center">NO</td>
<td style="text-align: center">YES</td>
<td style="text-align: center">YES</td>
</tr>
<tr>
<td>Sends bluetooth detections as intents</td>
<td style="text-align: center">NO</td>
<td style="text-align: center">NO</td>
<td style="text-align: center">YES</td>
</tr>
<tr>
<td>Detection possible after reboot</td>
<td style="text-align: center">YES</td>
<td style="text-align: center">YES</td>
<td style="text-align: center">YES</td>
</tr>
<tr>
<td>Detection possible after task manager kill</td>
<td style="text-align: center">YES</td>
<td style="text-align: center">YES</td>
<td style="text-align: center">YES</td>
</tr>
<tr>
<td>Typical secs to detect first beacon</td>
<td style="text-align: center">150*</td>
<td style="text-align: center">5</td>
<td style="text-align: center">5</td>
</tr>
<tr>
<td>Typical secs to detect second beacon</td>
<td style="text-align: center">150*</td>
<td style="text-align: center">150*</td>
<td style="text-align: center">450</td>
</tr>
<tr>
<td>Typical secs to detect beacon disappear</td>
<td style="text-align: center">150*</td>
<td style="text-align: center">150*</td>
<td style="text-align: center">450</td>
</tr>
<tr>
<td>Typical secs to detect after kill</td>
<td style="text-align: center">150*</td>
<td style="text-align: center">5</td>
<td style="text-align: center">450</td>
</tr>
<tr>
<td>Maximum secs to detect a beacon</td>
<td style="text-align: center">300*</td>
<td style="text-align: center">300*</td>
<td style="text-align: center">1500</td>
</tr>
</tbody>
</table>
<p>* Android Beacon Library with the default backgroundScanPeriod of 300 seconds. This may be adjusted to a higher or lower value, but lower detection times use proportionally more battery.</p>
<p><strong>Table 1. Technology Support and Beacon Detection Times Across Android Versions</strong></p>
<hr />
<p>You’ll notice from the table above that the typical background detection time remains pretty low at about 5 seconds on Android 8. But after that initial background detection, finding a second beacon can be much slower if the first beacon remains in the vicinity. And in the worst-case scenario, this can be up to 1500 seconds (25 minutes). To understand why this is true, we have to understand what is changing in Android 8.</p>
<h2 id="whats-changing">What’s Changing?</h2>
<p>Apps on Android 4.3-7.x used long-running background services or alarms to periodically look for beacons in the background. Android 8 implements new rules prohibiting long-running background services in order to save battery, so apps that use background scanning services to look for beacons won’t work on Android 8. The operating system kills these background services about 15 minutes after the app was last in the foreground. That means no more beacon detections after those 15 minutes when using background services to do the scanning.</p>
<p>Before Android 8, the Android Beacon Library would by default schedule a scan to happen every 300 seconds (5 minutes) in the background. This was the primary means of finding beacons on 4.x, and used as a backup on 5.0-7.x for the cases where hardware scan filters were all used up, didn’t work, or if beacons were already in the vicinity and the library just needed to check periodically for new ones. This guaranteed a detection to happen in 300 seconds, but on average would happen in half that time – 150 seconds.</p>
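The 300-second and 150-second figures above are simple arithmetic: a beacon that appears at a uniformly random moment waits a full scan period in the worst case and half a period on average. A minimal plain-Java sketch of that relationship:

```java
public class DetectionLatency {
    // Worst case: the beacon appears just after a scan finishes,
    // so it must wait a full period for the next scan.
    static double worstCaseSeconds(double scanPeriodSeconds) {
        return scanPeriodSeconds;
    }

    // Average case: the appearance time is uniform over the period,
    // so the expected wait is half the period.
    static double averageSeconds(double scanPeriodSeconds) {
        return scanPeriodSeconds / 2.0;
    }

    public static void main(String[] args) {
        // The library's default 300-second background scan period
        System.out.println(worstCaseSeconds(300)); // 300.0
        System.out.println(averageSeconds(300));   // 150.0
    }
}
```

The same arithmetic explains the 450/900-second figures for 15-minute JobScheduler intervals later in this article.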
<p>Unfortunately, Android 8 no longer lets you do this. You can’t schedule background services to keep running constantly, looking for beacons every 5 minutes. You can’t use the AlarmManager to wake up your app every 5 minutes to do the same.</p>
<h2 id="the-new-way-fast-detections">The New Way: Fast Detections</h2>
<p>Fortunately, Android 8 provides a new tool for detecting beacons in the background. New bluetooth scanning APIs let you specify a filter matching the BLE advertisement pattern you are looking for. You can then start a scan and request that your app be woken up with an Intent whenever the pattern is matched. Here’s some code that sets that up:</p>
<div class="language-java highlighter-rouge"><div class="highlight"><pre class="highlight"><code><span class="nc">ScanSettings</span> <span class="n">settings</span> <span class="o">=</span> <span class="o">(</span><span class="k">new</span> <span class="nc">ScanSettings</span><span class="o">.</span><span class="na">Builder</span><span class="o">().</span><span class="na">setScanMode</span><span class="o">(</span><span class="nc">ScanSettings</span><span class="o">.</span><span class="na">SCAN_MODE_LOW_POWER</span><span class="o">)).</span><span class="na">build</span><span class="o">();</span>
<span class="nc">List</span><span class="o"><</span><span class="nc">ScanFilter</span><span class="o">></span> <span class="n">filters</span> <span class="o">=</span> <span class="n">getScanFilters</span><span class="o">();</span> <span class="c1">// Make a scan filter matching the beacons I care about</span>
<span class="nc">BluetoothManager</span> <span class="n">bluetoothManager</span> <span class="o">=</span>
<span class="o">(</span><span class="nc">BluetoothManager</span><span class="o">)</span> <span class="n">mContext</span><span class="o">.</span><span class="na">getApplicationContext</span><span class="o">().</span><span class="na">getSystemService</span><span class="o">(</span><span class="nc">Context</span><span class="o">.</span><span class="na">BLUETOOTH_SERVICE</span><span class="o">);</span>
<span class="nc">BluetoothAdapter</span> <span class="n">bluetoothAdapter</span> <span class="o">=</span> <span class="n">bluetoothManager</span><span class="o">.</span><span class="na">getAdapter</span><span class="o">();</span>
<span class="nc">Intent</span> <span class="n">intent</span> <span class="o">=</span> <span class="k">new</span> <span class="nc">Intent</span><span class="o">(</span><span class="n">mContext</span><span class="o">,</span> <span class="nc">MyBroadcastReceiver</span><span class="o">.</span><span class="na">class</span><span class="o">);</span>
<span class="n">intent</span><span class="o">.</span><span class="na">putExtra</span><span class="o">(</span><span class="s">"o-scan"</span><span class="o">,</span> <span class="kc">true</span><span class="o">);</span>
<span class="nc">PendingIntent</span> <span class="n">pendingIntent</span> <span class="o">=</span> <span class="nc">PendingIntent</span><span class="o">.</span><span class="na">getBroadcast</span><span class="o">(</span><span class="n">mContext</span><span class="o">,</span> <span class="mi">0</span><span class="o">,</span> <span class="n">intent</span><span class="o">,</span> <span class="nc">PendingIntent</span><span class="o">.</span><span class="na">FLAG_UPDATE_CURRENT</span><span class="o">);</span>
<span class="n">bluetoothAdapter</span><span class="o">.</span><span class="na">getBluetoothLeScanner</span><span class="o">().</span><span class="na">startScan</span><span class="o">(</span><span class="n">filters</span><span class="o">,</span> <span class="n">settings</span><span class="o">,</span> <span class="n">pendingIntent</span><span class="o">);</span>
</code></pre></div></div>
<p>The above code will set an Intent to fire that will trigger a call to a class called <code class="language-plaintext highlighter-rouge">MyBroadcastReceiver</code> when a matching bluetooth device is detected. You can then fetch the scan data like this:</p>
<div class="language-java highlighter-rouge"><div class="highlight"><pre class="highlight"><code><span class="kd">public</span> <span class="kd">class</span> <span class="nc">MyBroadcastReceiver</span> <span class="kd">extends</span> <span class="nc">BroadcastReceiver</span> <span class="o">{</span>
<span class="nd">@Override</span>
<span class="kd">public</span> <span class="kt">void</span> <span class="nf">onReceive</span><span class="o">(</span><span class="nc">Context</span> <span class="n">context</span><span class="o">,</span> <span class="nc">Intent</span> <span class="n">intent</span><span class="o">)</span> <span class="o">{</span>
<span class="kt">int</span> <span class="n">bleCallbackType</span> <span class="o">=</span> <span class="n">intent</span><span class="o">.</span><span class="na">getIntExtra</span><span class="o">(</span><span class="nc">BluetoothLeScanner</span><span class="o">.</span><span class="na">EXTRA_CALLBACK_TYPE</span><span class="o">,</span> <span class="o">-</span><span class="mi">1</span><span class="o">);</span>
<span class="k">if</span> <span class="o">(</span><span class="n">bleCallbackType</span> <span class="o">!=</span> <span class="o">-</span><span class="mi">1</span><span class="o">)</span> <span class="o">{</span>
<span class="nc">Log</span><span class="o">.</span><span class="na">d</span><span class="o">(</span><span class="no">TAG</span><span class="o">,</span> <span class="s">"Passive background scan callback type: "</span><span class="o">+</span><span class="n">bleCallbackType</span><span class="o">);</span>
<span class="nc">ArrayList</span><span class="o"><</span><span class="nc">ScanResult</span><span class="o">></span> <span class="n">scanResults</span> <span class="o">=</span> <span class="n">intent</span><span class="o">.</span><span class="na">getParcelableArrayListExtra</span><span class="o">(</span>
<span class="nc">BluetoothLeScanner</span><span class="o">.</span><span class="na">EXTRA_LIST_SCAN_RESULT</span><span class="o">);</span>
<span class="c1">// Do something with your ScanResult list here.</span>
<span class="c1">// These contain the data of your matching BLE advertising packets</span>
<span class="o">}</span>
<span class="o">}</span>
<span class="o">}</span>
</code></pre></div></div>
<p>Using the above, we can detect the first new beacon that appears on Android 8 when our app isn’t running just as quickly as we could when the app was passively sitting in the background on Android 5.x-7.x waiting for a filtered scan to come in.</p>
<p>But after we see this first beacon, the background running time limits kick in. The OS will kill our app within about 15 minutes of this detection. So if we need to keep looking for a second beacon in the vicinity and it doesn’t show up for 16 minutes, our app will no longer be running to see it.</p>
<h2 id="the-new-way-periodic-detections">The New Way: Periodic Detections</h2>
<p>Fortunately, Android does give us another tool here: the JobScheduler. This is a relatively new means of performing periodic work (introduced in Android 5), even if your app is not running. A job may be scheduled to look for beacons periodically. This is important as a backup when the fast detections described above don’t work for some reason. The most common case is when another beacon has already been discovered and remains in the vicinity, and you need to know when other beacons come into and out of view.</p>
<p>This is where the JobScheduler helps us. We can set up a job to run periodically to look for the beacons around us and report them to our app.</p>
<p>But this mechanism comes with limitations. Android 8 limits periodic jobs to being scheduled at most every 15 minutes (900 seconds), meaning it may take this long to detect a new beacon or to detect that a beacon has disappeared, although the law of averages says the mean detection time will usually be half that (450 seconds). You can try to schedule a job to run more often than every 15 minutes, but the operating system will log something like this:</p>
<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>06-07 22:15:51.361 6455-6455/org.altbeacon.beaconreference W/JobInfo: Specified interval for 1 is +5m10s0ms. Clamped to +15m0s0ms
06-07 22:15:51.361 6455-6455/org.altbeacon.beaconreference W/JobInfo: Specified flex for 1 is 0. Clamped to +5m0s0ms
</code></pre></div></div>
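The clamping shown in the log above can be modeled in a few lines of plain Java. The 15-minute interval floor and 5-minute flex floor are inferred from those "Clamped to" log messages, not documented constants:

```java
public class JobClamp {
    // Floors inferred from the "Clamped to +15m0s0ms" and
    // "Clamped to +5m0s0ms" log messages above (assumptions)
    static final long MIN_PERIOD_MILLIS = 15 * 60 * 1000;
    static final long MIN_FLEX_MILLIS = 5 * 60 * 1000;

    // The OS silently raises any requested interval to the floor
    static long clampInterval(long requestedMillis) {
        return Math.max(requestedMillis, MIN_PERIOD_MILLIS);
    }

    // ...and does the same for the flex window, even if you ask for 0
    static long clampFlex(long requestedMillis) {
        return Math.max(requestedMillis, MIN_FLEX_MILLIS);
    }

    public static void main(String[] args) {
        // The log above shows a requested interval of +5m10s0ms and flex of 0
        System.out.println(clampInterval(310_000)); // 900000
        System.out.println(clampFlex(0));           // 300000
    }
}
```

Requests above the floors pass through unchanged; only shorter intervals get silently stretched.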
<p>The maximum time, however, can actually be significantly more than 900 seconds, because Android will sometimes defer periodic jobs even longer than the nominal minimum allowed interval of 900 seconds. In theory this “flex” interval is supposed to abide by the parameter set in the API. But as you can see from the second log line above, even if you request a flex of 0 seconds, the OS will refuse and apply a minimum flex, which appears to be 5 minutes on Android 8. This means your 15-minute job should run at most 20 minutes between runs. But even this doesn’t always happen. In my tests, I saw cases where two jobs set up on a nominal 15-minute periodic schedule were executed 25 minutes apart. In that case, it would have taken over 1500 seconds to detect a new beacon that appeared right after the last job finished. Check out the scan job run in the log excerpt below at 2:21 a.m. Note that it happened over 1500 seconds after the previous run.</p>
<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>06-07 22:25:51.380 6455-6455/org.altbeacon.beaconreference I/ScanJob: Running periodic scan job: instance is org.altbeacon.beacon.service.ScanJob@7188bc6
06-07 22:41:01.227 6455-6455/org.altbeacon.beaconreference I/ScanJob: Running periodic scan job: instance is org.altbeacon.beacon.service.ScanJob@382ed7b
06-07 22:55:51.373 6455-6455/org.altbeacon.beaconreference I/ScanJob: Running periodic scan job: instance is org.altbeacon.beacon.service.ScanJob@203c928
06-07 23:10:59.083 6455-6455/org.altbeacon.beaconreference I/ScanJob: Running periodic scan job: instance is org.altbeacon.beacon.service.ScanJob@dc96415
06-07 23:25:51.371 6455-6455/org.altbeacon.beaconreference I/ScanJob: Running periodic scan job: instance is org.altbeacon.beacon.service.ScanJob@68bed2e
06-07 23:40:59.142 6455-6455/org.altbeacon.beaconreference I/ScanJob: Running periodic scan job: instance is org.altbeacon.beacon.service.ScanJob@c295843
06-07 23:55:51.369 6455-6455/org.altbeacon.beaconreference I/ScanJob: Running periodic scan job: instance is org.altbeacon.beacon.service.ScanJob@cd047e4
06-08 00:10:59.082 6455-6455/org.altbeacon.beaconreference I/ScanJob: Running periodic scan job: instance is org.altbeacon.beacon.service.ScanJob@8009a61
06-08 00:25:51.368 6455-6455/org.altbeacon.beaconreference I/ScanJob: Running periodic scan job: instance is org.altbeacon.beacon.service.ScanJob@f1fa2ca
06-08 00:40:59.085 6455-6455/org.altbeacon.beaconreference I/ScanJob: Running periodic scan job: instance is org.altbeacon.beacon.service.ScanJob@88dddef
06-08 00:55:51.374 6455-6455/org.altbeacon.beaconreference I/ScanJob: Running periodic scan job: instance is org.altbeacon.beacon.service.ScanJob@eb2b360
06-08 01:10:51.670 6455-6455/org.altbeacon.beaconreference I/ScanJob: Running periodic scan job: instance is org.altbeacon.beacon.service.ScanJob@9bca225
06-08 01:25:51.383 6455-6455/org.altbeacon.beaconreference I/ScanJob: Running periodic scan job: instance is org.altbeacon.beacon.service.ScanJob@871c8fe
06-08 01:45:51.404 6455-6455/org.altbeacon.beaconreference I/ScanJob: Running periodic scan job: instance is org.altbeacon.beacon.service.ScanJob@3bf42d3
06-08 01:56:12.354 6455-6455/org.altbeacon.beaconreference I/ScanJob: Running periodic scan job: instance is org.altbeacon.beacon.service.ScanJob@c3d4e34
06-08 02:21:51.771 6455-6455/org.altbeacon.beaconreference I/ScanJob: Running periodic scan job: instance is org.altbeacon.beacon.service.ScanJob@1557571
06-08 02:37:01.861 6455-6455/org.altbeacon.beaconreference I/ScanJob: Running periodic scan job: instance is org.altbeacon.beacon.service.ScanJob@e2c879a
06-08 02:52:11.943 6455-6455/org.altbeacon.beaconreference I/ScanJob: Running periodic scan job: instance is org.altbeacon.beacon.service.ScanJob@c9f0d7f
06-08 03:07:22.041 6455-6455/org.altbeacon.beaconreference I/ScanJob: Running periodic scan job: instance is org.altbeacon.beacon.service.ScanJob@4e0cab0
06-08 03:23:12.696 6455-6455/org.altbeacon.beaconreference I/ScanJob: Running periodic scan job: instance is org.altbeacon.beacon.service.ScanJob@1139a7d
06-08 03:38:22.776 6455-6455/org.altbeacon.beaconreference I/ScanJob: Running periodic scan job: instance is org.altbeacon.beacon.service.ScanJob@e06b8f6
06-08 03:52:12.792 6455-6455/org.altbeacon.beaconreference I/ScanJob: Running periodic scan job: instance is org.altbeacon.beacon.service.ScanJob@74147eb
06-08 04:08:32.872 6455-6455/org.altbeacon.beaconreference I/ScanJob: Running periodic scan job: instance is org.altbeacon.beacon.service.ScanJob@90d9fec
06-08 04:21:12.856 6455-6455/org.altbeacon.beaconreference I/ScanJob: Running periodic scan job: instance is org.altbeacon.beacon.service.ScanJob@a4abd49
06-08 04:38:42.959 6455-6455/org.altbeacon.beaconreference I/ScanJob: Running periodic scan job: instance is org.altbeacon.beacon.service.ScanJob@741d912
06-08 04:50:12.923 6455-6455/org.altbeacon.beaconreference I/ScanJob: Running periodic scan job: instance is org.altbeacon.beacon.service.ScanJob@15bfe17
06-08 05:08:53.047 6455-6455/org.altbeacon.beaconreference I/ScanJob: Running periodic scan job: instance is org.altbeacon.beacon.service.ScanJob@fa229e8
06-08 05:19:13.050 6455-6455/org.altbeacon.beaconreference I/ScanJob: Running periodic scan job: instance is org.altbeacon.beacon.service.ScanJob@b0e49d5
06-08 05:39:03.142 6455-6455/org.altbeacon.beaconreference I/ScanJob: Running periodic scan job: instance is org.altbeacon.beacon.service.ScanJob@18823ee
06-08 05:54:13.212 6455-6455/org.altbeacon.beaconreference I/ScanJob: Running periodic scan job: instance is org.altbeacon.beacon.service.ScanJob@a72fc03
06-08 06:10:51.850 6455-6455/org.altbeacon.beaconreference I/ScanJob: Running periodic scan job: instance is org.altbeacon.beacon.service.ScanJob@3fb84a4
06-08 06:26:01.917 6455-6455/org.altbeacon.beaconreference I/ScanJob: Running periodic scan job: instance is org.altbeacon.beacon.service.ScanJob@53d6c21
06-08 06:41:11.994 6455-6455/org.altbeacon.beaconreference I/ScanJob: Running periodic scan job: instance is org.altbeacon.beacon.service.ScanJob@848958a
06-08 06:56:22.053 6455-6455/org.altbeacon.beaconreference I/ScanJob: Running periodic scan job: instance is org.altbeacon.beacon.service.ScanJob@43cdaf
06-08 07:06:32.119 6455-6455/org.altbeacon.beaconreference I/ScanJob: Running periodic scan job: instance is org.altbeacon.beacon.service.ScanJob@5318c20
06-08 07:29:12.356 6455-6455/org.altbeacon.beaconreference I/ScanJob: Running periodic scan job: instance is org.altbeacon.beacon.service.ScanJob@34f102d
06-08 07:44:22.431 6455-6455/org.altbeacon.beaconreference I/ScanJob: Running periodic scan job: instance is org.altbeacon.beacon.service.ScanJob@4d2e9e6
</code></pre></div></div>
<p>These numbers are based on a limited sample size in an overnight test of about 9 hours, so it is certainly possible that even longer periods between scans may happen on Android 8. We know it can be up to 25 minutes between scans. But it could be even more.</p>
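You can check these gaps yourself by differencing successive timestamps in the log excerpt above. A small plain-Java sketch, with the two timestamps bracketing the 02:21 run copied from the log:

```java
import java.time.Duration;
import java.time.LocalTime;

public class ScanGap {
    // Seconds elapsed between two same-day log timestamps
    static long gapSeconds(String earlier, String later) {
        return Duration.between(LocalTime.parse(earlier), LocalTime.parse(later))
                .getSeconds();
    }

    public static void main(String[] args) {
        // Timestamps copied from the ScanJob log excerpt above
        long gap = gapSeconds("01:56:12.354", "02:21:51.771");
        System.out.println(gap); // 1539: over 1500 seconds between nominally 15-minute jobs
    }
}
```

Most adjacent pairs in the log land near the expected 900 seconds; the outliers are what drive the worst-case row in Table 1.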
<p>The fact that such delays might sometimes happen with Android 8 is disappointing. But at least we understand why. On iOS, we also see similar detection delays in some cases, but the closed-source code means we have no idea why.</p>
<p>When testing, just remember that this 15-minute interval is typical, but not exact. Expect scans in these cases to happen every 10-25 minutes.</p>
<h2 id="upgrading-your-app">Upgrading Your App</h2>
<p>If you have a beacon app targeting earlier versions of Android that relies on long-running services for detections (e.g. the Android Beacon Library), you’ll need to modify these apps and submit them to the store so they’ll continue to be able to detect beacons in the background on Android 8. If you don’t, people who run your unmodified app on Android 8 will only be able to detect beacons for 15 minutes after last being in the foreground.</p>
<p>For users of the Android Beacon Library, the upgrade process is simple: just upgrade to the latest Android Beacon Library 2.12+. This version uses all of the techniques above to deliver the best detection times possible, as described in the table above, in a way that is backward compatible. The old means of scanning (and the old detection times) still apply for apps running on Android 4.x-7.x. Only apps deployed on Android 8 will use the new techniques, so the slower worst-case detection times won’t affect users on earlier operating system versions.</p>
<p>If you are using this library and have customized the background scan intervals to be more frequent than 15 minutes, be aware that on Android 8 the scans won’t happen at that frequency, and even scans set to happen every 15 minutes won’t happen at precisely that interval. So when your boss asks you why your app isn’t triggering right away, you can answer with everything you read above.</p>
Detecting Beacons With Android Things2017-06-18T00:00:00+00:00http://davidgyoungtech.com/2017/06/18/detecting-beacons-with-android-things
<p>The new Android Things platform opens up a new way to build powerful Internet of Things systems using the
ubiquitous Android platform. For those who already know Android development, this offers a very low
barrier to entry. And for those building Bluetooth LE systems, the relatively robust BlueDroid
bluetooth stack is a welcome change from the buggy, unstable and poorly-documented BlueZ bluetooth stack standard
on non-Android Linux systems.</p>
<p>For all its power, Android Things has real drawbacks. It takes over a minute to boot, which is an eternity for an embedded system. It is designed for relatively high-powered devices with 512 MB of RAM or more. And while tiny computers like the Edison board and the Raspberry Pi 3 support it, the ultra-tiny and crazy-cheap $10 Raspberry Pi Zero W is unfortunately off-limits for this platform: its ARMv6 processor is not supported by Android at all.</p>
<p>But if you can live with these boot times and can work with the Raspberry Pi 3, this article will show you how to detect beacons with the Android Things platform.</p>
<h1 id="tutorial">Tutorial</h1>
<p>This tutorial assumes you have basic Android development skills. If you haven’t developed an Android app before, you might want to go through a simple Android app tutorial first.</p>
<p>To get started detecting beacons with Android Things, you’ll need the following:</p>
<ul>
<li>Raspberry Pi 3 with Micro SD card (or other supported board)</li>
<li>USB card reader for your workstation</li>
<li>Ethernet cable(s)</li>
<li>Access to a router on the same subnet as your workstation</li>
<li>Android Studio 3 (you must have version 3+ to use the Android Things new project creation tool)</li>
<li>A mobile phone with the Locate app for iOS or Android you can use to transmit a beacon signal</li>
<li>Optional: A monitor with a HDMI cable (useful for troubleshooting boot problems)</li>
</ul>
<h2 id="step-1-download-android-things">Step 1: Download Android Things</h2>
<p>First, locate the system image for your hardware board on the Android Developers site <a href="https://developer.android.com/things/preview/download.html">here</a>.</p>
<p>Once it is downloaded, unzip it and prepare to flash the image to your board.</p>
<h2 id="step-2-flash-the-system-image">Step 2: Flash the system image</h2>
<p>The instructions for flashing the system image to the micro SD card depend on the workstation you are using. On my Mac, I use the <code class="language-plaintext highlighter-rouge">dd</code> command, but equivalent instructions exist for Windows and Linux here: <a href="https://developer.android.com/things/hardware/raspberrypi.html">https://developer.android.com/things/hardware/raspberrypi.html</a></p>
<p>Once the flash completes, remove the micro SD card from your workstation and insert it into your board.</p>
<h2 id="step-3-set-up-a-console-connection">Step 3. Set up a console connection</h2>
<p>Android Things is an embedded system, so you don’t control the device from a user interface. You control it through a console (sometimes using the Android Debug Bridge (ADB) tool or GUI tools talking over ADB like Android Studio or Android Monitor.) You can use a special serial cable to connect your Raspberry Pi to your workstation via a USB port, but if you have access to an open ethernet port on the same subnet as your workstation, this is by far the easiest way.</p>
<p>Simply use an ethernet cable to connect your Raspberry Pi to the network, and power it on. If all goes well, you will see lights flash on the Pi board. Wait about 90 seconds for it to boot up, then type the following on your workstation where you have the Android development tools installed:</p>
<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>adb connect Android.local
</code></pre></div></div>
<p>If it works, you’ll see a response like <code class="language-plaintext highlighter-rouge">connected to Android.local:5555</code>. Running this command is the equivalent of connecting a USB cable to a development Android phone, when USB debugging is enabled. The ADB tool on your workstation now has a connection to your device.</p>
<h2 id="step-4-set-up-wifi-optional">Step 4. Set up WiFi (optional)</h2>
<p>This is enough to get Android Studio working with your board, but you might also want to set up a WiFi connection for your Raspberry Pi so it doesn’t have to stay connected to the ethernet cable. If you set it up with a WiFi network on the same subnet as your workstation, you’ll be able to run the same command above without the Pi connected via an ethernet cable.</p>
<p>To do this, you first run the <code class="language-plaintext highlighter-rouge">adb shell</code> command on your workstation to get a console into the Pi. Then you run the following command to set up the wifi for your network:</p>
<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>am startservice com.google.wifisetup/.WifiSetupService -a WifiSetupService.Connect -e ssid MY_SSID_NAME -e passphrase MY_PASSPHRASE
</code></pre></div></div>
<p>You’ll of course need to replace MY_SSID_NAME and MY_PASSPHRASE with the credentials for your wifi network.</p>
<h2 id="step-5-create-a-new-android-things-app">Step 5. Create a new Android Things app</h2>
<p>Now that you have a connection to your Android Things board from your workstation, you are ready to use Android Studio to build your app. Making a new Android Things app is an awful lot like making a regular Android app. If you have Android Studio 3+, there is a tool for starting new Android Things projects by going to: File -> New -> New Project</p>
<p>Follow the screenshots like shown below.</p>
<p><img src="/images/new-project-1.png" width="640px" /></p>
<p><img src="/images/new-project-2.png" width="640px" /></p>
<p><img src="/images/new-project-3.png" width="640px" /></p>
<p><img src="/images/new-project-4.png" width="640px" /></p>
<p>Once the project template is created, take a look at the AndroidManifest.xml. You’ll see this:</p>
<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code><?xml version="1.0" encoding="utf-8"?>
<manifest package="com.davidgyoungtech.androidthingsbeaconscanner"
xmlns:android="http://schemas.android.com/apk/res/android">
<application>
<uses-library android:name="com.google.android.things"/>
<activity android:name=".MainActivity">
<intent-filter>
<action android:name="android.intent.action.MAIN"/>
<category android:name="android.intent.category.LAUNCHER"/>
</intent-filter>
<intent-filter>
<action android:name="android.intent.action.MAIN"/>
<category android:name="android.intent.category.IOT_LAUNCHER"/>
<category android:name="android.intent.category.DEFAULT"/>
</intent-filter>
</activity>
</application>
</manifest>
</code></pre></div></div>
<p>This looks like a regular Android app manifest with a special <code class="language-plaintext highlighter-rouge">&lt;intent-filter&gt;</code> added to the main activity:</p>
<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code><intent-filter>
<action android:name="android.intent.action.MAIN"/>
<category android:name="android.intent.category.IOT_LAUNCHER"/>
<category android:name="android.intent.category.DEFAULT"/>
</intent-filter>
</code></pre></div></div>
<p>This filter is what will launch our app automatically when the device boots up. Android Things apps are based on Activities just like regular Android apps. The big difference is that a user interface is optional (if a display is attached, you can show regular Android UI elements) and Activities can run at boot (and keep running forever) without user interaction.</p>
<h2 id="step-6-code-beacon-detections">Step 6. Code Beacon Detections</h2>
<p>Edit the build.gradle (Module: app) file so it looks like this:</p>
<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>dependencies {
compile 'org.altbeacon:android-beacon-library:2.9.1'
...
</code></pre></div></div>
<p>Then go to MainActivity.java and add these four lines to the end to the onCreate method:</p>
<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>BeaconManager beaconManager = BeaconManager.getInstanceForApplication(this);
beaconManager.getBeaconParsers().clear();
beaconManager.getBeaconParsers().add(new BeaconParser("iBeacon").setBeaconLayout("m:2-3=0215,i:4-19,i:20-21,i:22-23,p:24-24"));
beaconManager.bind(this);
</code></pre></div></div>
<p>These lines set the Android Beacon Library up to look for iBeacons.</p>
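That layout string is the library’s compact frame description: the m: term says bytes 2-3 of the advertisement must equal 0x0215 to match, the i: terms define the three iBeacon identifiers (UUID, major, and minor) by byte range, and p: marks the measured-power byte. The toy parser below just illustrates the syntax; it is not the library’s actual implementation:

```java
import java.util.ArrayList;
import java.util.List;

public class LayoutTerms {
    // Describe each comma-separated term of a beacon layout string
    static List<String> describe(String layout) {
        List<String> out = new ArrayList<>();
        for (String term : layout.split(",")) {
            char type = term.charAt(0);      // m, i, or p
            String spec = term.substring(2); // e.g. "2-3=0215" or "4-19"
            switch (type) {
                case 'm': out.add("match bytes " + spec); break;
                case 'i': out.add("identifier at bytes " + spec); break;
                case 'p': out.add("power at byte " + spec); break;
                default:  out.add("unknown term " + term); break;
            }
        }
        return out;
    }

    public static void main(String[] args) {
        for (String d : describe("m:2-3=0215,i:4-19,i:20-21,i:22-23,p:24-24")) {
            System.out.println(d);
        }
    }
}
```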
<p>You’ll get an error flagged on the last line, because we have not yet made our Activity implement the BeaconConsumer interface. Let’s fix that. Change the class definition to look like below. And while we’re at it, let’s add a TAG definition so we can log debug lines:</p>
<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>public class MainActivity extends Activity implements BeaconConsumer, RangeNotifier {
private static final String TAG = MainActivity.class.getSimpleName();
</code></pre></div></div>
<p>The changes to the class definition above add two interfaces to the Activity: one to connect to the Android Beacon Library, and the other to get beacon ranging callbacks from it. You’ll see errors that not all the methods of the interfaces have been implemented, so you’ll need to add the following to the body of the class:</p>
<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>@Override
public void onBeaconServiceConnect() {
Log.d(TAG, "Beacon service connected. Starting ranging.");
try {
BeaconManager beaconManager = BeaconManager.getInstanceForApplication(this);
beaconManager.startRangingBeaconsInRegion(new Region("allbeacons", null, null, null));
beaconManager.addRangeNotifier(this);
} catch (RemoteException e) {
e.printStackTrace();
}
}
@Override
public void didRangeBeaconsInRegion(Collection<Beacon> beacons, Region region) {
for (Beacon beacon: beacons) {
Log.d(TAG, "Detected beacon: "+beacon);
}
}
</code></pre></div></div>
<p>This code waits for the library to be initialized, at which point <code class="language-plaintext highlighter-rouge">onBeaconServiceConnect</code> gets called. Inside that method, it starts beacon ranging, defining a “Region” of beacons to match that has all identifiers set to null – this effectively makes it match any beacon it sees. It also sets the RangeNotifier to be this same MainActivity class. That makes it so that the <code class="language-plaintext highlighter-rouge">didRangeBeaconsInRegion</code> method below will get called once per second with a list of all beacons that are detected. And that method definition simply loops through all of the detected beacons and logs them.</p>
<p>We can now build and run this app through Android Studio. Choose Run -> Run App, and Android Studio will build your application APK package, upload it to the Android Things board, and start running it. If you don’t have a display, you won’t see anything on the device, but you’ll see log lines in the LogCat window in Android Studio. You should see something like this:</p>
<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>06-12 13:11:37.691 31695 31695 D MainActivity: Beacon service connected. Starting ranging.
</code></pre></div></div>
<h2 id="step-7-obtain-permissions">Step 7. Obtain Permissions</h2>
<p>We haven’t detected any beacons yet. And in case you have jumped the gun and turned on a beacon transmitter, you’ll notice that no beacons get detected. Why not?</p>
<p>If you’ve ever worked with Bluetooth on Android before, you might know that Android requires you to request and get permission from the user to access the device’s location in order to scan for Bluetooth beacons. It does this because beacons are often used to figure out the user’s location, so you need to declare this permission in the AndroidManifest.xml as of Android 6.0. Because the Android Things preview release is based on Android 7, the same requirement applies. What’s more, because Android classifies location as a “dangerous” permission, it must be dynamically requested from the user at runtime. But Android Things is designed to run without a user interface. So how can this work?</p>
<p>Android Things solves this by automatically granting dangerous permissions at boot time to any applications that declare the need for them in its AndroidManifest.xml. The Android Beacon Library automatically includes these in its manifest, and they get merged to your application’s manifest by Android Studio during the build. You can see the merged manifest in Android Studio 3 by bringing up AndroidManifest.xml and tapping the “Merged Manifest” tab at the bottom of the screen.</p>
<p><img src="/images/things-manifest-merged.png" width="640px" /></p>
<p>As you can see, the following permission was brought in automatically by the AndroidBeaconLibrary:</p>
<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code><uses-permission-sdk-23 android:name="android.permission.ACCESS_COARSE_LOCATION" />
</code></pre></div></div>
<p>This is enough to get us going, but it won’t work until we reboot the Android Things device after installing the app. Trying to detect bluetooth devices (or access other location APIs) will fail silently until you do this. If you write code to check if the permission has been granted, it will return false until after a reboot. And if you write code to dynamically request the permission from the user, the request will fail, because there is no user interface on which to display the prompt.</p>
<p>So long story short: just reboot your Android Things device after installing your app the first time!</p>
<p>Turn off your Android Things device, turn it back on, and wait about 90 seconds for it to boot.</p>
<h2 id="step-8-test">Step 8. Test</h2>
<p>After you have rebooted your device, you’ll need to reconnect adb again:</p>
<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>adb connect Android.local
</code></pre></div></div>
<p>Once you’ve done that, you should see your log line in the LogCat pane in Android Studio like this:</p>
<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>06-12 13:11:37.691 31695 31695 D MainActivity: Beacon service connected. Starting ranging.
</code></pre></div></div>
<p>This indicates that your app started up automatically at boot, something we configured in
the AndroidManifest.xml.</p>
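<p>For reference, launching at boot on Android Things is done by declaring your home activity with the <code class="language-plaintext highlighter-rouge">IOT_LAUNCHER</code> intent-filter category. A sketch of what that declaration typically looks like (the activity name here is just an example and may differ in your project):</p>

```xml
<activity android:name=".MainActivity">
    <!-- Standard launcher entry for running from Android Studio -->
    <intent-filter>
        <action android:name="android.intent.action.MAIN" />
        <category android:name="android.intent.category.LAUNCHER" />
    </intent-filter>
    <!-- Android Things launches this activity automatically at boot -->
    <intent-filter>
        <action android:name="android.intent.action.MAIN" />
        <category android:name="android.intent.category.IOT_LAUNCHER" />
        <category android:name="android.intent.category.DEFAULT" />
    </intent-filter>
</activity>
```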
<p>Now it’s time to turn on a beacon and see if we detect it. Go to the Locate app on your iOS or Android phone, and turn on a beacon with any identifier. The screenshot below shows the iOS version. The Android version looks different but does the same thing.</p>
<p><img style="border: 2px;" src="/images/ios-transmitter.png" width="320px" /></p>
<p>As soon as you turn on the beacon transmitter, you should see log lines in Android Studio show up like below:</p>
<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>06-12 13:12:44.671 31695 31695 D MainActivity: Detected beacon: id1: 2F234454-CF6D-4A0F-ADF2-F4911BA9FFA6 id2: 1 id3: 2 type iBeacon
</code></pre></div></div>
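<p>Each detected <code class="language-plaintext highlighter-rouge">Beacon</code> also carries an RSSI measurement and a calibrated tx power, from which the library estimates distance. The library’s real implementation uses a device-specific fitted curve, but the classic log-distance path-loss formula below gives the general idea (the <code class="language-plaintext highlighter-rouge">DistanceEstimator</code> class and parameter names are hypothetical, for illustration only):</p>

```java
public class DistanceEstimator {
    // Rough log-distance path-loss model: txPower is the calibrated RSSI
    // measured at 1 meter, and n is an environmental attenuation factor
    // (roughly 2.0 in free space, higher indoors).
    public static double estimateDistanceMeters(int rssi, int txPower, double n) {
        return Math.pow(10.0, (txPower - rssi) / (10.0 * n));
    }
}
```

<p>With a typical calibrated tx power of -59 dBm, an RSSI reading of -59 corresponds to about 1 meter, while -79 corresponds to about 10 meters under this simple model.</p>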
<p>Congratulations! You’ve just detected a beacon with Android Things.</p>
Battery-Friendly Beacon Transmission2015-11-12T00:00:00+00:00http://davidgyoungtech.com/2015/11/12/battery-friendly-beacon-transmission<p>While most beacon apps focus on detection of bluetooth beacons, some of the coolest mobile apps you can make involve making the device transmit as a beacon as well. Think about an app that lets friends know they are nearby one another. When two users walk within about 50 meters of each other, presto! The app alerts you that a friend is nearby.</p>
<p>Apple has supported beacon transmission since iOS 6, and most Android devices have done the same starting with Android 5.0. But lots of folks are wary of transmitting all the time. The conventional wisdom is that app developers need to be careful about transmitting because of its impact on battery. We all know that angry users with drained batteries will quickly uninstall your app. Is transmitting as a beacon really that bad?</p>
<p><i><a href="http://developer.radiusnetworks.com/2015/12/09/battery-friendly-beacon-transmission">Continue reading this blog post</a> on the Radius Networks website</i></p>