<?xml version="1.0" encoding="utf-8"?>
<!-- generator="Kirby" -->
<rss version="2.0" xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:wfw="http://wellformedweb.org/CommentAPI/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:atom="http://www.w3.org/2005/Atom">

  <channel>
    <title>Tag: mobile &#183; Blog &#183; Liip</title>
    <link>https://www.liip.ch/de/blog/tags/mobile</link>
    <generator>Kirby</generator>
    <lastBuildDate>Tue, 23 Oct 2018 00:00:00 +0200</lastBuildDate>
    <atom:link href="https://www.liip.ch" rel="self" type="application/rss+xml" />

        <description>Liip Blog Artikel mit dem Tag &#8220;mobile&#8221;</description>
    
        <language>de</language>
    
        <item>
      <title>Real time numbers recognition (MNIST) on an iPhone with CoreML from A to Z</title>
      <link>https://www.liip.ch/de/blog/numbers-recognition-mnist-on-an-iphone-with-coreml-from-a-to-z</link>
      <guid>https://www.liip.ch/de/blog/numbers-recognition-mnist-on-an-iphone-with-coreml-from-a-to-z</guid>
      <pubDate>Tue, 23 Oct 2018 00:00:00 +0200</pubDate>
      <description><![CDATA[<h1>Creating a CoreML model from A-Z in less than 10 Steps</h1>
<p>This is the third part of our deep learning on mobile phones series. In part one I showed you <a href="https://www.liip.ch/en/blog/poke-zoo-or-making-deep-learning-tell-oryxes-apart-from-lamas-in-a-zoo-part-1-the-idea-and-concepts">the two main tricks on how to use convolutions and pooling to train deep learning networks</a>. In part two I showed you <a href="https://www.liip.ch/en/blog/zoo-pokedex-part-2-hands-on-with-keras-and-resnet50">how to train existing deep learning networks like resnet50 to detect new objects</a>. In part three I will now show you how to train a deep learning network, how to convert it to the CoreML format and how to deploy it to your mobile phone! </p>
<p>TLDR: I will show you how to create your own iPhone app from A-Z that recognizes handwritten numbers: </p>
<figure><img src="https://liip.rokka.io/www_inarticle/812493/output.gif" alt=""></figure>
<p>Let’s get started!</p>
<h2>1. How to start</h2>
<p>To have a fully working example I thought we’d start with a toy dataset like the <a href="https://en.wikipedia.org/wiki/MNIST_database">MNIST set of handwritten digits</a> and train a deep learning network to recognize those. Once it’s working nicely on our PC, we will port it to an iPhone X using the <a href="https://developer.apple.com/documentation/coreml">CoreML standard</a>. </p>
<h2>2. Getting the data</h2>
<pre><code class="language-python"># Importing the dataset with Keras and transforming it
from keras.datasets import mnist
from keras.utils import np_utils
from keras import backend as K

def mnist_data():
    # input image dimensions
    img_rows, img_cols = 28, 28
    (X_train, Y_train), (X_test, Y_test) = mnist.load_data()

    if K.image_data_format() == 'channels_first':
        X_train = X_train.reshape(X_train.shape[0], 1, img_rows, img_cols)
        X_test = X_test.reshape(X_test.shape[0], 1, img_rows, img_cols)
        input_shape = (1, img_rows, img_cols)
    else:
        X_train = X_train.reshape(X_train.shape[0], img_rows, img_cols, 1)
        X_test = X_test.reshape(X_test.shape[0], img_rows, img_cols, 1)
        input_shape = (img_rows, img_cols, 1)

    # rescale [0,255] --&gt; [0,1]
    X_train = X_train.astype('float32')/255
    X_test = X_test.astype('float32')/255

    # transform to one hot encoding
    Y_train = np_utils.to_categorical(Y_train, 10)
    Y_test = np_utils.to_categorical(Y_test, 10)

    return (X_train, Y_train), (X_test, Y_test)

(X_train, Y_train), (X_test, Y_test) = mnist_data()</code></pre>
<h2>3. Encoding it correctly</h2>
<p>When working with image data we have to distinguish how we want to encode it. Since Keras is a high-level library that can run on multiple “backends” such as <a href="https://www.tensorflow.org">Tensorflow</a>, <a href="http://deeplearning.net/software/theano/">Theano</a> or <a href="https://www.microsoft.com/en-us/cognitive-toolkit/">CNTK</a>, we first have to find out how our backend encodes the data. Images can be encoded either in a “channels first” or in a “channels last” layout; the latter is the default in Tensorflow, the <a href="https://keras.io/backend/">default Keras backend</a>. So in our case, using Tensorflow, an input batch is a tensor of shape (batch_size, rows, cols, channels): first the batch size, then the 28 rows of the image, then the 28 columns, and finally a 1 for the number of channels, since our image data is grey-scale.  </p>
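<p>As a quick illustration of the two layouts, here is a plain numpy sketch (no Keras backend required):</p>
<pre><code class="language-python"># illustrate "channels last" vs "channels first" with plain numpy
import numpy as np

# a fake batch of 5 grey-scale 28x28 images, as mnist.load_data() returns them
X = np.zeros((5, 28, 28), dtype=np.uint8)

# channels last (Tensorflow default): (batch_size, rows, cols, channels)
X_last = X.reshape(X.shape[0], 28, 28, 1)

# channels first (e.g. Theano): (batch_size, channels, rows, cols)
X_first = X.reshape(X.shape[0], 1, 28, 28)

print(X_last.shape)   # (5, 28, 28, 1)
print(X_first.shape)  # (5, 1, 28, 28)</code></pre>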
<p>We can take a look at the first six images that we have loaded with the following snippet:</p>
<pre><code class="language-python"># plot first six training images
import matplotlib.pyplot as plt
%matplotlib inline
import matplotlib.cm as cm
import numpy as np

(X_train, y_train), (X_test, y_test) = mnist.load_data()

fig = plt.figure(figsize=(20,20))
for i in range(6):
    ax = fig.add_subplot(1, 6, i+1, xticks=[], yticks=[])
    ax.imshow(X_train[i], cmap='gray')
    ax.set_title(str(y_train[i]))</code></pre>
<figure><img src="https://liip.rokka.io/www_inarticle/7cce04/numbers.png" alt=""></figure>
<h2>4. Normalizing the data</h2>
<p>We see that there are white numbers on a black background, each thickly written right in the middle, and that the images are of quite low resolution - in our case 28 pixels x 28 pixels. </p>
<p>You may have noticed that above we rescale each image’s pixels by dividing them by 255. This results in pixel values between 0 and 1, which is quite useful for any kind of training. Before the transformation, each image’s pixel values look like this:</p>
<pre><code class="language-python"># visualize one number with pixel values
def visualize_input(img, ax):
    ax.imshow(img, cmap='gray')
    width, height = img.shape
    thresh = img.max()/2.5
    for x in range(width):
        for y in range(height):
            ax.annotate(str(round(img[x][y],2)), xy=(y,x),
                        horizontalalignment='center',
                        verticalalignment='center',
                        color='white' if img[x][y]&lt;thresh else 'black')

fig = plt.figure(figsize = (12,12)) 
ax = fig.add_subplot(111)
visualize_input(X_train[0], ax)</code></pre>
<figure><img src="https://liip.rokka.io/www_inarticle/6d0772/detail.png" alt=""></figure>
<p>As you can see, each grey pixel has a value between 0 and 255, where 255 is white and 0 is black. Notice that here <code>mnist.load_data()</code> loads the original data into X_train[0]. In our custom mnist_data() function we transform every pixel intensity into a value between 0 and 1 by calling <code>X_train = X_train.astype('float32')/255</code>. </p>
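<p>The rescaling itself is a one-liner; here it is in isolation on a few made-up pixel values:</p>
<pre><code class="language-python"># rescale pixel intensities from [0, 255] to [0, 1]
import numpy as np

pixels = np.array([0, 128, 255], dtype=np.uint8)
scaled = pixels.astype('float32') / 255
print(scaled)  # values now lie between 0 and 1</code></pre>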
<h2>5. One hot encoding</h2>
<p>Originally the data is encoded in such a way that the Y-vector contains the number value that the X-vector (pixel data) represents. So for example if the image looks like a 7, the Y-vector simply contains the number 7. For training we one hot encode these labels instead: each label becomes a vector of length 10 that is 1 at the label’s index and 0 everywhere else. We need this transformation because we want to map our output to 10 output neurons in our network that fire when the according number is recognized. </p>
<figure><img src="https://liip.rokka.io/www_inarticle/46a2ef/onehot.png" alt=""></figure>
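<p>If you want to see what np_utils.to_categorical does under the hood, the transformation can be reproduced with a few lines of numpy (the helper name here is made up):</p>
<pre><code class="language-python"># one hot encode labels by hand, equivalent to np_utils.to_categorical
import numpy as np

def to_one_hot(labels, num_classes=10):
    # each label becomes a vector with a single 1 at the label's index
    encoded = np.zeros((len(labels), num_classes))
    encoded[np.arange(len(labels)), labels] = 1
    return encoded

print(to_one_hot([7, 0, 3]))  # three rows, each with a single 1</code></pre>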
<h2>6. Modeling the network</h2>
<p>Now it is time to define a convolutional network to distinguish those numbers. Using the <a href="https://www.liip.ch/en/blog/poke-zoo-or-making-deep-learning-tell-oryxes-apart-from-lamas-in-a-zoo-part-1-the-idea-and-concepts">convolution and pooling tricks from part one of this series</a> we can model a network that will be able to distinguish numbers from each other. </p>
<pre><code class="language-python"># defining the model
from keras.models import Sequential
from keras.layers import Dense, Dropout, Flatten
from keras.layers import Conv2D, MaxPooling2D
def network():
    model = Sequential()
    input_shape = (28, 28, 1)
    num_classes = 10

    model.add(Conv2D(filters=32, kernel_size=(3, 3), padding='same', activation='relu', input_shape=input_shape))
    model.add(MaxPooling2D(pool_size=2))
    model.add(Conv2D(filters=32, kernel_size=2, padding='same', activation='relu'))
    model.add(MaxPooling2D(pool_size=(2, 2)))
    model.add(Conv2D(filters=32, kernel_size=2, padding='same', activation='relu'))
    model.add(MaxPooling2D(pool_size=(2, 2)))
    model.add(Dropout(0.3))
    model.add(Flatten())
    model.add(Dense(500, activation='relu'))
    model.add(Dropout(0.4))
    model.add(Dense(num_classes, activation='softmax'))

    # summarize the model
    # model.summary()
    return model </code></pre>
<p>So what did we do there? We started with a <a href="https://keras.io/layers/convolutional/">convolution</a> with a kernel size of 3, meaning the window is 3x3 pixels; the input shape is our 28x28 pixels. We followed this layer with a <a href="https://keras.io/layers/pooling/">max pooling layer</a>. Here the pool_size is 2, so we downscale everything by 2, and the input to the next convolutional layer is 14x14. We repeated this two more times, ending up with a 3x3 output after the last pooling layer. We then use a <a href="https://keras.io/layers/core/#dropout">dropout layer</a>, randomly setting 30% of the input units to 0 to prevent overfitting during training. Finally we flatten the layers (in our case 3x3x32 = 288 values) and connect them to a dense layer with 500 units. After this step we add another dropout layer and finally connect to our output dense layer with 10 nodes, which corresponds to our number of classes (the digits 0 to 9). </p>
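<p>The shape arithmetic from the paragraph above can be checked by hand:</p>
<pre><code class="language-python"># follow one 28x28 input through the three conv/pooling blocks
size = 28
for block in range(3):
    # 'same' padding keeps the size, pooling halves it (rounding down)
    size = size // 2

flattened = size * size * 32  # 32 filters in the last conv layer
print(size, flattened)  # 3 288</code></pre>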
<h2>7. Training the model</h2>
<pre><code class="language-python"># Training the model
import keras

model = network()
model.compile(loss='categorical_crossentropy', optimizer=keras.optimizers.Adadelta(), metrics=['accuracy'])

model.fit(X_train, Y_train, batch_size=512, epochs=6, verbose=1, validation_data=(X_test, Y_test))

score = model.evaluate(X_test, Y_test, verbose=0)

print('Test loss:', score[0])
print('Test accuracy:', score[1])</code></pre>
<p>We first compile the network by defining a loss function and an optimizer: in our case we select categorical_crossentropy, because we have multiple categories (as in the numbers 0-9). There are a number of optimizers that <a href="https://keras.io/optimizers/#usage-of-optimizers">Keras offers</a>, so feel free to try out a few, and stick with what works best for your case. I’ve found that AdaDelta (an advanced form of AdaGrad) works fine for me. </p>
<figure><img src="https://liip.rokka.io/www_inarticle/42b4b8/train.png" alt=""></figure>
<p>So after training I’ve got a model that has an accuracy of 98%, which is quite excellent given the rather simple network infrastructure. In the screenshot you can also see that in each epoch the accuracy was increasing, so everything looks good to me. We now have a model that can quite well predict the numbers 0-9 from their 28x28 pixel representation. </p>
<h2>8. Saving the model</h2>
<p>Since we want to use the model on our iPhone we have to convert it to a format that our iPhone understands. There is actually an ongoing initiative from Microsoft, Facebook and Amazon (and others) to harmonize all of the different deep learning network formats into an interchangeable open neural network exchange format that you can use on any device. It’s called <a href="https://onnx.ai">ONNX</a>. </p>
<p>Yet, as of today, Apple devices only work with the CoreML format. In order to convert our Keras model to CoreML, Apple luckily provides a very handy helper library called <a href="https://apple.github.io/coremltools/generated/coremltools.converters.keras.convert.html">coremltools</a> that we can use to get the job done. It can convert scikit-learn, Keras and XGBoost models to CoreML, thus covering quite a few everyday applications. Install it with “pip install coremltools” and you will be able to use it easily. </p>
<pre><code class="language-python">import coremltools

coreml_model = coremltools.converters.keras.convert(model,
                                                    input_names="image",
                                                    image_input_names='image',
                                                    class_labels=['0', '1', '2', '3', '4', '5', '6', '7', '8', '9']
                                                    )</code></pre>
<p>The most important parameters are class_labels, which defines the classes the model tries to predict, and input_names / image_input_names. By setting them to <code>image</code>, XCode will automatically recognize that this model takes an image as input and tries to predict something from it. Depending on your application it makes a lot of sense to study the <a href="https://apple.github.io/coremltools/generated/coremltools.converters.keras.convert.html">documentation</a>, especially when you want to make sure that the RGB channels are encoded in the right order (parameter is_bgr) or that the model correctly assumes all inputs to be values between 0 and 1 (parameter image_scale). </p>
<p>The only thing left is to add some metadata to your model. With this you help other developers greatly, since they don’t have to guess how your model works and what it expects as input. </p>
<pre><code class="language-python">#entering metadata
coreml_model.author = 'plotti'
coreml_model.license = 'MIT'
coreml_model.short_description = 'MNIST handwriting recognition with a 3 layer network'
coreml_model.input_description['image'] = '28x28 grayscaled pixel values between 0-1'
coreml_model.save('SimpleMnist.mlmodel')

print(coreml_model)</code></pre>
<h2>9. Use it to predict something</h2>
<p>After saving the model to a CoreML model we can test whether it works correctly on our machine. For this we feed it an image and see if it predicts the label correctly. You can use the MNIST training data, or you can snap a picture with your phone and transfer it to your PC to see how well the model handles real-life data. </p>
<pre><code class="language-python"># Use the CoreML model to predict something
import coremltools
import matplotlib.pyplot as plt
from PIL import Image
import numpy as np

model = coremltools.models.MLModel('SimpleMnist.mlmodel')
im = Image.fromarray((np.reshape(mnist_data()[0][0][12]*255, (28, 28))).astype(np.uint8),"L")
plt.imshow(im)
predictions = model.predict({'image': im})
print(predictions)</code></pre>
<p>It works, hooray! Now it’s time to include it in an XCode project. </p>
<h1>Porting our model to XCode in 10 Steps</h1>
<p>Let me start by saying: I am by no means an XCode or mobile developer. I have studied <a href="https://github.com/markmansur/CoreML-Vision-demo">quite a few</a> <a href="https://sriraghu.com/2017/06/15/computer-vision-in-ios-object-recognition/">super</a> <a href="https://www.raywenderlich.com/577-core-ml-and-vision-machine-learning-in-ios-11-tutorial">helpful tutorials</a>, <a href="https://www.pyimagesearch.com/2018/04/23/running-keras-models-on-ios-with-coreml/">walkthroughs</a> and <a href="https://www.youtube.com/watch?v=bOg8AZSFvOc">videos</a> on how to create a simple mobile phone app with CoreML and have used those to create my app. I can only say a big thank you and kudos to the community for being so open and helpful. </p>
<h2>1. Install XCode</h2>
<p>Now it's time to really get our hands dirty. Before you can do anything you need XCode. So download it via the <a href="https://itunes.apple.com/us/app/xcode/id497799835?mt=12">App Store</a> and install it. In case you already have it, make sure you have at least version 9. </p>
<h2>2. Create the Project</h2>
<p>Start XCode and create a single view app. Name your project accordingly; I named mine “numbers”. Select a place to save it. You can leave “create git repository on my mac” checked. </p>
<figure><img src="https://liip.rokka.io/www_inarticle/26a145/single.png" alt=""></figure>
<h2>3. Add the CoreML model</h2>
<p>We can now add the CoreML model that we created using the coremltools converter. Simply drag the model into your project directory. Make sure to drag it into the correct folder (see screenshot). You can use the option “add as Reference”; this way, whenever you update your model, you don’t have to drag it into your project again. XCode should automatically recognize your model and realize that it is a model to be used for images. </p>
<figure><img src="https://liip.rokka.io/www_inarticle/d4115c/addmodel.png" alt=""></figure>
<h2>4. Delete the view or storyboard</h2>
<p>Since we are going to use just the camera and display a label, we don’t need a fancy graphical user interface - or in other words a view layer. Since the storyboard in XCode corresponds to the view in the MVC pattern, we are going to simply delete it. In the project settings’ deployment info, make sure to clear the Main Interface too (see screenshot) by setting it to blank.</p>
<figure><img src="https://liip.rokka.io/www_inarticle/8f4709/storyboard.png" alt=""></figure>
<h2>5. Create the root view controller programmatically</h2>
<p>Instead we are going to create the root view controller programmatically by replacing <code>func application</code> in AppDelegate.swift with the following code:</p>
<pre><code class="language-swift">// create the view root controller programmatically
func application(_ application: UIApplication, didFinishLaunchingWithOptions launchOptions: [UIApplicationLaunchOptionsKey: Any]?) -&gt; Bool {
    // create the user interface window, make it visible
    window = UIWindow()
    window?.makeKeyAndVisible()

    // create the view controller and make it the root view controller
    let vc = ViewController()
    window?.rootViewController = vc

    // return true upon success
    return true
}</code></pre>
<h2>6. Build the view controller</h2>
<p>Finally it is time to build the view controller. We will use UIKit - a lib for creating buttons and labels, AVFoundation - a lib to capture the camera on the iPhone - and Vision - a lib to handle our CoreML model. The latter is especially handy if you don’t want to resize the input data yourself. </p>
<p>In the ViewController we subclass UIViewController and adopt the AVCapture delegate protocol, so we need to implement some methods later to make it functional. </p>
<p>The first thing we will do is to create a label that will tell us what the camera is seeing. By overriding the <code>viewDidLoad</code> function we will trigger the capturing of the camera and add the label to the view. </p>
<p>In the function <code>setupCaptureSession</code> we will create a capture session, grab the first available back camera and capture its output into <code>captureOutput</code> while also displaying it on the <code>previewLayer</code>. </p>
<p>In the function <code>captureOutput</code> we will finally make use of our CoreML model that we imported before. Make sure to hit Cmd+B - build, when importing it, so XCode knows it's actually there. We will use it to predict something from the image that we captured. We will then grab the first prediction from the model and display it in our label. </p>
<pre><code class="language-swift">// define the ViewController
import UIKit
import AVFoundation
import Vision

class ViewController: UIViewController, AVCaptureVideoDataOutputSampleBufferDelegate {
    // create a label to hold the predicted number and confidence
    let label: UILabel = {
        let label = UILabel()
        label.textColor = .white
        label.translatesAutoresizingMaskIntoConstraints = false
        label.text = "Label"
        label.font = label.font.withSize(40)
        return label
    }()

    override func viewDidLoad() {
        // call the parent function
        super.viewDidLoad()       
        setupCaptureSession() // establish the capture
        view.addSubview(label) // add the label
        setupLabel()
    }

    func setupCaptureSession() {
        // create a new capture session
        let captureSession = AVCaptureSession()

        // find the available cameras
        let availableDevices = AVCaptureDevice.DiscoverySession(deviceTypes: [.builtInWideAngleCamera], mediaType: AVMediaType.video, position: .back).devices

        do {
            // select the first available back camera
            if let captureDevice = availableDevices.first {
                captureSession.addInput(try AVCaptureDeviceInput(device: captureDevice))
            }
        } catch {
            // print an error if the camera is not available
            print(error.localizedDescription)
        }

        // setup the video output to the screen and add output to our capture session
        let captureOutput = AVCaptureVideoDataOutput()
        captureSession.addOutput(captureOutput)
        let previewLayer = AVCaptureVideoPreviewLayer(session: captureSession)
        previewLayer.frame = view.frame
        view.layer.addSublayer(previewLayer)

        // buffer the video and start the capture session
        captureOutput.setSampleBufferDelegate(self, queue: DispatchQueue(label: "videoQueue"))
        captureSession.startRunning()
    }

    func captureOutput(_ output: AVCaptureOutput, didOutput sampleBuffer: CMSampleBuffer, from connection: AVCaptureConnection) {
        // load our CoreML MNIST model
        guard let model = try? VNCoreMLModel(for: SimpleMnist().model) else { return }

        // run an inference with CoreML
        let request = VNCoreMLRequest(model: model) { (finishedRequest, error) in

            // grab the inference results
            guard let results = finishedRequest.results as? [VNClassificationObservation] else { return }

            // grab the highest confidence result
            guard let Observation = results.first else { return }

            // create the label text components
            let predclass = "\(Observation.identifier)"

            // set the label text
            DispatchQueue.main.async(execute: {
                self.label.text = "\(predclass) "
            })
        }

        // create a Core Video pixel buffer which is an image buffer that holds pixels in main memory
        // Applications generating frames, compressing or decompressing video, or using Core Image
        // can all make use of Core Video pixel buffers
        guard let pixelBuffer: CVPixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }

        // execute the request
        try? VNImageRequestHandler(cvPixelBuffer: pixelBuffer, options: [:]).perform([request])
    }

    func setupLabel() {
        // constrain the label in the center
        label.centerXAnchor.constraint(equalTo: view.centerXAnchor).isActive = true

        // constrain the the label to 50 pixels from the bottom
        label.bottomAnchor.constraint(equalTo: view.bottomAnchor, constant: -50).isActive = true
    }
}</code></pre>
<p>Make sure that you have changed the model part to the naming of your model. Otherwise you will get build errors. </p>
<figure><img src="https://liip.rokka.io/www_inarticle/b4364b/modeldetails.png" alt=""></figure>
<h2>7. Add a privacy message</h2>
<p>Finally, since we are going to use the camera, we need to inform the user that we are going to do so, and thus add a privacy message “Privacy - Camera Usage Description”  in the Info.plist file under Information Property List. </p>
<figure><img src="https://liip.rokka.io/www_inarticle/29ab1e/privacy.png" alt=""></figure>
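<p>In the raw Info.plist XML this corresponds to an entry like the following (the description string is of course up to you):</p>
<pre><code class="language-xml">&lt;key&gt;NSCameraUsageDescription&lt;/key&gt;
&lt;string&gt;The camera is used to recognize handwritten numbers.&lt;/string&gt;</code></pre>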
<h2>8. Add a build team</h2>
<p>In order to deploy the app on your iPhone, you will need to <a href="https://developer.apple.com/programs/enroll/">register with the Apple developer program</a>. There is no need to pay any money to do so; <a href="https://9to5mac.com/2016/03/27/how-to-create-free-apple-developer-account-sideload-apps/">you can also register without any fees</a>. Once you are registered, you can select the team (Apple calls it this way) that you signed up with in the project properties. </p>
<h2>9. Deploy on your iPhone</h2>
<p>Finally it's time to deploy the model on your iPhone. You will need to connect it via USB and unlock it. Once it's unlocked, select the destination under Product - Destination - Your iPhone. Then the only thing left is to run it on your mobile: select Product - Run (or simply hit Cmd + R) in the menu and XCode will build and deploy the project on your iPhone. </p>
<figure><img src="https://liip.rokka.io/www_inarticle/7cc4f5/destination.png" alt=""></figure>
<h2>10. Try it out</h2>
<p>After having had to jump through so many hoops, it is finally time to try out our app. If you start it for the first time it will ask for permission to use your camera (after all, we placed this info there). Then make sure to hold your iPhone sideways, because of how we trained the network: we have not used any augmentation techniques, so our model is unable to recognize numbers that are “lying on the side”. We could improve the model by applying these techniques, as I have shown in <a href="https://www.liip.ch/en/blog/zoo-pokedex-part-2-hands-on-with-keras-and-resnet50">this blog article</a>.</p>
<p>A second thing you might notice is that the app always recognizes some number, as there is no “background” class. To fix this, we could additionally train the model on some random images that we classify as a background class. This way the model would be better equipped to tell whether it is seeing a number or just some random background. </p>
<figure><img src="https://liip.rokka.io/www_inarticle/812493/output.gif" alt=""></figure>
<h2>Conclusion or the famous “so what”</h2>
<p>Obviously this is a very long blog post. Yet I wanted to get all the necessary information into one place in order to show other mobile devs how easy it is to create their own deep learning computer vision applications. In our case at Liip it will most certainly boil down to a collaboration between our <a href="https://www.liip.ch/en/work/data">data services team</a> and our mobile developers in order to get the best of both worlds. </p>
<p>In fact we are currently innovating together by creating an app that <a href="https://www.liip.ch/en/blog/zoo-pokedex-part-2-hands-on-with-keras-and-resnet50">will be able to recognize</a> <a href="https://www.liip.ch/en/blog/poke-zoo-or-making-deep-learning-tell-oryxes-apart-from-lamas-in-a-zoo-part-1-the-idea-and-concepts">animals in a zoo</a>, and we are working on another small fun game that lets two people doodle against each other: you are given a task, such as “draw an apple”, and whoever draws the apple faster in such a way that it is recognised by the deep learning model wins. </p>
<p>Beyond such fun innovation projects the possibilities are endless, but always depend on the context of the business and the users. Obviously the saying “if you have a hammer every problem looks like a nail to you” applies here too, not every app will benefit from having computer vision on board, and not all apps using computer vision are <a href="https://www.theverge.com/2017/6/26/15876006/hot-dog-app-android-silicon-valley">useful ones</a> as some of you might know from the famous Silicon Valley episode. </p>
<p>Yet there are quite a few nice examples of apps that use computer vision successfully: </p>
<ul>
<li><a href="http://leafsnap.com">Leafsnap</a> lets you distinguish different types of leaves. </li>
<li><a href="https://www.aipoly.com">Aipoly</a> helps visually impaired people to explore the world.</li>
<li><a href="http://www.snooth.com/iphone-app/">Snooth</a> gets you more infos on your wine by taking a picture of the label.</li>
<li><a href="https://www.theverge.com/2017/2/8/14549798/pinterest-lens-visual-discovery-shazam">Pinterest</a> has launched a visual search that allows you to search for pins that match the product that you captured with your phone. </li>
<li><a href="http://www.caloriemama.ai">Caloriemama</a> lets you snap a picture of your food and tells you how many calories it has. </li>
</ul>
<p>As usual, the code that you have seen in this blog post is <a href="https://github.com/plotti/mnist-to-coreml">available online</a>. Feel free to experiment with it. I am looking forward to your comments and I hope you enjoyed the journey. P.S. I would like to thank Stefanie Taepke for proofreading and for her helpful comments, which made this post more readable.</p>]]></description>
                  <enclosure url="http://liip.rokka.io/www_card_2/d6f619/p1013593.jpg" length="5538521" type="image/jpeg" />
          </item>
        <item>
      <title>Add syntactic sugar to your Android Preferences</title>
      <link>https://www.liip.ch/de/blog/syntactic-sugar-android-preferences-kotlin</link>
      <guid>https://www.liip.ch/de/blog/syntactic-sugar-android-preferences-kotlin</guid>
      <pubDate>Tue, 09 Oct 2018 00:00:00 +0200</pubDate>
      <description><![CDATA[<h2>TL;DR</h2>
<p>You can find <a href="https://github.com/liip/SweetPreferences">SweetPreferences on Github</a>.</p>
<pre><code class="language-kotlin">// Define a class that will hold the preferences
class UserPreferences(sweetPreferences: SweetPreferences) {
    // Default key is "counter"
    // Default value is "0"
    var counter: Int by sweetPreferences.delegate(0)

    // Key is hardcoded to "usernameKey"
    // Default value is "James"
    var username: String? by sweetPreferences.delegate("James", "usernameKey") 
}

// Obtain a SweetPreferences instance with default SharedPreferences
val sweetPreferences = SweetPreferences.Builder().withDefaultSharedPreferences(context).build()

// Build a UserPreferences instance
val preferences = UserPreferences(sweetPreferences)

// Use the preferences in a type-safe manner
preferences.username = "John Doe"
preferences.counter = 34</code></pre>
<h2>Kotlin magic</h2>
<p>The most important part of the library is to define properties that run code instead of just holding a value.</p>
<p>From the example above, when you do:</p>
<pre><code class="language-kotlin">val name = preferences.username</code></pre>
<p>what is really happening is:</p>
<pre><code class="language-kotlin">val name = sweetPreferences.get("username", "James", String::class)</code></pre>
<p>The <em>username</em> property is converted from a property name to a string, the <code>"James"</code> string is taken from the property definition and the <code>String</code> class is automatically inferred. </p>
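<p>The mechanism - a property whose reads and writes run code against a backing store - is not unique to Kotlin. As a rough sketch of the same idea in Python (all names here are made up), using a descriptor:</p>
<pre><code class="language-python"># a property that runs code instead of holding a value, sketched in Python
class PrefDelegate:
    def __init__(self, default):
        self.default = default

    def __set_name__(self, owner, name):
        self.key = name  # the property name becomes the storage key

    def __get__(self, obj, objtype=None):
        return obj.store.get(self.key, self.default)

    def __set__(self, obj, value):
        obj.store[self.key] = value

class UserPreferences:
    counter = PrefDelegate(0)
    username = PrefDelegate("James")

    def __init__(self):
        self.store = {}  # stands in for the real preferences backend

prefs = UserPreferences()
print(prefs.username)  # James (the default)
prefs.counter = 34
print(prefs.counter)   # 34</code></pre>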
<p>To write this simple library, we used constructs offered by Kotlin such as <a href="https://kotlinlang.org/docs/reference/inline-functions.html">Inline Functions</a>, <a href="https://kotlinlang.org/docs/reference/inline-functions.html#reified-type-parameters">Reified type parameters</a>, <a href="https://kotlinlang.org/docs/reference/delegated-properties.html">Delegated Properties</a>, <a href="https://kotlinlang.org/docs/reference/extensions.html#extension-functions">Extension Functions</a> and <a href="https://kotlinlang.org/docs/reference/lambdas.html#function-literals-with-receiver">Function literals with receiver</a>. If you are starting with Kotlin, I warmly encourage you to go check those. It's only a small part of what Kotlin has to offer to ease app development, but already allows you to create great APIs. </p>
<p>Next time you need to store preferences in your Android app, give <a href="https://github.com/liip/SweetPreferences">SweetPreferences</a> a try and share what you have built with it. We’d like to know your feedback!</p>]]></description>
                  <enclosure url="http://liip.rokka.io/www_card_2/7d1c08/blogpost-sugar.jpg" length="1447202" type="image/jpeg" />
          </item>
        <item>
      <title>Why and how we use Xamarin</title>
      <link>https://www.liip.ch/de/blog/why-how-xamarin</link>
      <guid>https://www.liip.ch/de/blog/why-how-xamarin</guid>
      <pubDate>Fri, 01 Jun 2018 00:00:00 +0200</pubDate>
      <description><![CDATA[<p>When we start a new project, we always ask ourselves if we should choose Xamarin over a full native solution. I wanted to reflect on past projects and see if it was really worth using Xamarin. </p>
<p>But how do you compare projects? I decided to use line counting. It may seem obvious or simplistic, but the number of shared lines of code easily shows how much work has been done once instead of twice. I took the two most recent Xamarin projects we worked on: <a href="https://ski.ticketcorner.ch/campaign/ski-app">Ticketcorner Ski</a> and <a href="https://www.together-in-switzerland.ch">together</a>.</p>
<p>I used the following method:</p>
<ul>
<li>Use the well-known <a href="https://github.com/AlDanial/cloc/">cloc</a> tool to count the number of lines in a project.</li>
<li>Count only C# files.
<ul>
<li>Other types such as json configuration files or API response mocks in unit tests do not matter.</li>
</ul></li>
<li>Make an exception with Android layout files.
<ul>
<li>Our iOS layout is all done in Auto Layout code and we don't use Xcode Interface Builder.</li>
<li>To have a fair comparison, I included the Android XML files in the line count.</li>
</ul></li>
<li>Do not count auto-generated files.</li>
<li>Do not count blank lines and comments.</li>
<li>Other tools like <a href="https://fastlane.tools/">Fastlane</a> are also shared, but are not taken into account here.</li>
</ul>
<p>If you want to try this with one of your own projects, here are the commands I used for the C# files:</p>
<pre><code class="language-bash">cloc --include-lang="C#" --not-match-f="(([Dd]esigner)|(AssemblyInfo))\.cs" .</code></pre>
<p>For the Android XML layouts, I used:</p>
<pre><code class="language-bash">cloc --include-lang="xml" --force-lang="xml",axml Together.Android/Resources/layout</code></pre>
<h2>Here is what I found:</h2>
<table>
<thead>
<tr>
<th style="text-align: left;">Project</th>
<th style="text-align: center;">Android</th>
<th style="text-align: center;">iOS</th>
<th style="text-align: center;">Shared</th>
</tr>
</thead>
<tbody>
<tr>
<td style="text-align: left;">Ticketcorner Ski</td>
<td style="text-align: center;">31%</td>
<td style="text-align: center;">31%</td>
<td style="text-align: center;"><strong>38%</strong></td>
</tr>
<tr>
<td style="text-align: left;">together</td>
<td style="text-align: center;">42%</td>
<td style="text-align: center;">30%</td>
<td style="text-align: center;"><strong>28%</strong></td>
</tr>
</tbody>
</table>
<p>We can see that on those projects, an average of one third of the code can be shared. I was pretty impressed to see that for Ticketcorner Ski we have the same number of lines on both platforms. I was also pleasantly surprised to see that the project <strong>shares almost 40% of its code</strong>.</p>
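<p>To make the arithmetic behind the table explicit: each column is one bucket's line count divided by the total across Android, iOS and shared code. A small sketch with hypothetical counts (not the real project numbers):</p>

```kotlin
// Hypothetical line counts (NOT the real project numbers), just to show
// how the percentages in the table are derived from cloc output.
fun sharePercentages(android: Int, ios: Int, shared: Int): Triple<Int, Int, Int> {
    val total = (android + ios + shared).toDouble()
    fun pct(lines: Int) = Math.round(lines / total * 100).toInt()
    return Triple(pct(android), pct(ios), pct(shared))
}

fun main() {
    val (a, i, s) = sharePercentages(android = 8_000, ios = 8_000, shared = 10_000)
    println("Android: $a%, iOS: $i%, shared: $s%") // Android: 31%, iOS: 31%, shared: 38%
}
```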
<p>In a mobile app, most of the unit tests target the business logic, which is exactly what is shared with Xamarin: business logic and its unit tests are only written once. Most libraries not directly related to business logic are also shared: REST client, database client, etc.</p>
<p>The code that is not shared is mostly UI and interaction code, but also platform-specific code: how to access the camera, how to handle push notifications, how to securely store user credentials according to each platform's guidelines.</p>
<p>It would not be fair to conclude that doing those projects natively would have been 30% more expensive. The shared code sometimes has to take into account that it will be used on two different platforms, and it gets more generic than it would be if written twice.</p>
<h2>So... how do you choose one or the other?</h2>
<p>My goal with this blogpost is not to start a flame war on whether Xamarin is good or bad. I have shown here that for those projects, it was the right choice to use Xamarin. I want to share a few things we think about when we have to make a decision. Note that we use Xamarin.iOS and Xamarin.Android, but don't use Xamarin.Forms.</p>
<ul>
<li>Does the application contain a lot of business logic, or is it more UI-based?
<ul>
<li>With one Xamarin project we worked on in the past year, a specific (and complex) use-case was overlooked by the client and it resulted in paying users being pretty unhappy. We were very pleased to be able to fix the problem once, and write the related unit tests once too.</li>
<li>As a counterexample, for the <a href="https://www.zoo.ch/de/apps">Zürich Zoo app</a>, most of our job was writing UX/UI code. The business logic is solely doing GET requests to a backend.</li>
</ul></li>
<li>Do you plan to use external libraries/SDKs?
<ul>
<li>Xamarin is pretty good at <a href="https://docs.microsoft.com/en-us/xamarin/android/platform/binding-java-library/binding-a-jar">using .jar files on Android</a>.</li>
<li>Native libraries on iOS <a href="https://docs.microsoft.com/en-us/xamarin/cross-platform/macios/binding/objective-sharpie/">have to be processed manually</a> and it can be tedious to do. It is also hard to use a library packaged with CocoaPods that depends on many other pods.</li>
<li>For both platforms, we encountered closed-source tools that are not that easy to convert. As an example, we could use the <a href="https://www.datatrans.ch/en/technics/payment-apis/in-app-payment-libraries">Datatrans SDK</a>, but not without some <em>trial and error</em>.</li>
<li>There are, however, other Xamarin libraries that can replace what you are used to when developing on both platforms. We replaced <a href="http://square.github.io/picasso/">Picasso</a> on Android and <a href="https://github.com/onevcat/Kingfisher">Kingfisher</a> on iOS with <a href="https://github.com/luberda-molinet/FFImageLoading">FFImageLoading</a> on Xamarin. This library has the same API on both platforms, which makes it easy to use.</li>
</ul></li>
<li>Do you plan to use platform-specific features?
<ul>
<li>Xamarin is able to provide access to every platform feature, and it works well. It is also known that they update the Xamarin SDKs as soon as new iOS/Android versions are announced.</li>
<li>For <a href="https://www.liip.ch/x/cy7o8c">Urban Connect</a> however, the most important part of the app is using <em>Bluetooth Low Energy</em> to connect to bike locks. Even if Xamarin is able to do it too, it was the right decision to remove this extra layer and code everything natively.</li>
</ul></li>
<li>Tooling, state of the platform ecosystems:
<ul>
<li>In the mobile world, things move really fast:
<ul>
<li>Microsoft pushes really hard for developers to adopt Xamarin, for example with <a href="https://appcenter.ms/">App Center</a>, the new solution to build, test, release, and monitor apps. But Visual Studio for Mac is still really buggy and slow.</li>
<li>Google added first-class support for <a href="https://www.liip.ch/en/blog/kotlin-why-i-will-never-go-back-to-java">Kotlin</a>, has an awesome IDE and pushes mobile development with platforms like Firebase or Android Jetpack.</li>
<li>Apple follows along, but still somehow fails to improve Xcode and its tooling in a meaningful manner.</li>
</ul></li>
<li><strong>Choices made one day will certainly not be valid one year later.</strong></li>
</ul></li>
<li>Personal preferences:
<ul>
<li>Inside Liip there are very divergent opinions about Xamarin. We always choose the right tool for the job. Having someone efficient and motivated about a tool is important too.</li>
</ul></li>
</ul>
<p>I hope I was able to share my view on why and how we use Xamarin here at Liip. I personally enjoy working on both Xamarin and native projects. Have a look at <a href="https://www.together-in-switzerland.ch">together</a> and <a href="https://ski.ticketcorner.ch/campaign/ski-app">Ticketcorner Ski</a> and tell us what you think!</p>
                  <enclosure url="http://liip.rokka.io/www_card_2/c03533/2000px-xamarin-logo-svg.jpg" length="48001" type="image/jpeg" />
          </item>
        <item>
      <title>One.Thing.Less goes live!</title>
      <link>https://www.liip.ch/de/blog/one-thing-less-goes-live</link>
      <guid>https://www.liip.ch/de/blog/one-thing-less-goes-live</guid>
      <pubDate>Fri, 25 May 2018 00:00:00 +0200</pubDate>
      <description><![CDATA[<h2>Making privacy and data protection actionable by anyone</h2>
<p>You surely received many companies’ emails about your data privacy during the last few weeks. That’s due to the new European law (called GDPR, for General Data Protection Regulation) that became enforceable on the 25th of May 2018.<br />
This is a major step towards a more responsible usage of personal data by companies, as you can now request information on how your data is being used, and ask to change it.</p>
<p>Nevertheless, as James Aschberger (Founder &amp; CEO of One.Thing.Less AG) noticed, many people didn’t understand how they could concretely claim their rights under this new regulation.</p>
<p><em>&quot;It’s overwhelming, where do I even start?&quot;</em></p>
<p><em>&quot;I am worried about my data, but I don’t have the time or knowledge to contact companies individually or read through all the privacy policies and terms of use they put in front of me.&quot;</em></p>
<p><em>&quot;I receive so many messages regarding data privacy, but I don’t understand all of it and don’t have time to read the fine print.&quot;</em></p>
<h2>There is an app for that now</h2>
<p>We at Liip have <a href="https://www.liip.ch/en/blog/one-thing-less-ag-unites-with-liip-to-launch-its-mobile-app">worked hard</a> during the past four months to make the One.Thing.Less dream of James happen — a free and simple way to take control over the use of your personal data.</p>
<p>You can now <a href="https://itunes.apple.com/app/one-thing-less/id1353068746">download the app on the Apple App Store</a> (and in June on Google Play for Android) and make your voice heard by companies.</p>
<figure><img src="https://liip.rokka.io/www_resize/resize-width-300/030ec3/onethingless-mobile-app-homescreen.jpg" alt="One.Thing.Less mobile app Homescreen" width="300"></figure>
<p><em>One.Thing.Less mobile app Homescreen</em></p>
<h2>Keep It Simple, Stupid (aka KISS)</h2>
<p>When James first came to us, he made it clear that the User Experience had to be simple. That was one of our main challenges: solving this complex legal problem in one click.</p>
<p>This is what we did.</p>
<figure><img src="https://liip.rokka.io/www_resize/resize-width-300/df49da/onethingless-mobile-app-ask-companies.jpg" alt="Ask companies about the use of your personal data, in one tap." width="300"></figure>
<p><em>Ask companies about the use of your personal data, in one tap.</em></p>
<p>On the above view, you can just press “Send request” to ask a company seven (simple) questions about the use of your personal data. Under GDPR, companies have to respond within 30 days.</p>
<p>Then you can request that they change how they use your personal data, still just one click away.</p>
<figure><img src="https://liip.rokka.io/www_resize/resize-width-700/526751/onethingless-mobile-app-act-preview.jpg" alt="Change the way companies use your personal data." width="700"></figure>
<p><em>Change the way companies use your personal data</em></p>
<h2>A trustful partnership</h2>
<p>We can’t stop talking about trust here at Liip, as we believe it’s one of the core elements of successful products and companies.<br />
This project was a great example of how one of our mottos “Trust over control” can lead to extraordinary results, in less than four months.</p>
<p><em>&quot;Successfully launching our platform in such a short time was only possible because we established and earned mutual trust along the entire journey together. If I had not been able to trust that each of the Liip team members delivers, we would have failed.&quot;</em> James Aschberger, Founder &amp; CEO at One.Thing.Less AG</p>
<p><em>&quot;It was a chance to craft such a meaningful product, in a trustful and solution-oriented partnership.&quot;</em> Thomas Botton, Product Owner at Liip</p>]]></description>
                  <enclosure url="http://liip.rokka.io/www_card_2/7c3560/onethingless-mobile-app-launch.jpg" length="663436" type="image/jpeg" />
          </item>
        <item>
      <title>Progressive web apps, Meteor, Azure and the Data science stack or The future of web development conference.</title>
      <link>https://www.liip.ch/de/blog/progressive-web-apps-meteor-azure-and-the-data-science-stack-or-the-future-of-web-development-conference</link>
      <guid>https://www.liip.ch/de/blog/progressive-web-apps-meteor-azure-and-the-data-science-stack-or-the-future-of-web-development-conference</guid>
      <pubDate>Wed, 09 May 2018 00:00:00 +0200</pubDate>
      <description><![CDATA[<h3>Back to the future</h3>
<p>Although the conference (hosted last week in the Crowne Plaza in Zürich) explicitly had the word future in its title, I found that the new trends often felt a bit like &quot;back to the future&quot;. Why? Because some rather old concepts like plain old SQL, &quot;offline first&quot; or pure JavaScript frameworks seem to be making a comeback in web development - but with a twist. This already brings us to the first talk. </p>
<h3>Modern single page apps with meteor</h3>
<figure><img src="https://liip.rokka.io/www_inarticle/ce3196/meteor.png" alt=""></figure>
<p>Timo Horstschaefer from <a href="https://www.ledgy.com">Ledgy</a> showed how to create modern single page apps with <a href="https://www.meteor.com">meteor.js</a>. Although every framework promises to &quot;ship more with less code&quot;, he showed that for their project Ledgy - a mobile app to allocate shares among stakeholders - they were actually able to write it in less than three months using 13'000 lines of code. Other web frameworks pair a backend written in one language (e.g. Ruby - Rails, Python - Django, etc.) with a JS-heavy frontend framework (e.g. React or Angular); Meteor does things differently by offering a tightly coupled frontend and backend, both written purely in JS. The backend is mostly a node component, and in their case it is really slim, with only 500 lines of code. It is mainly responsible for data consistency and authentication, while all the other logic simply runs in the client. Such client-heavy projects really shine when dealing with shaky Internet connections, because Meteor takes care of all the data transmission in the background and catches up on the changes once connectivity is restored. Although Meteor seemed to have had a rough patch in the community in 2015 and 2016, it is heading for a strong comeback. The framework is highly opinionated, but I personally really liked the high abstraction level, which seemed to allow the team a blazingly fast time to market. A quite favorable development is that Meteor is trying to open up beyond MongoDB as a database by offering its own GraphQL client (Apollo), which even outshines Facebook's own client, thus giving developers freedom in their choice of a database solution.</p>
<p>I highly encourage you to have a look at Timo's <a href="http://mypage.netlive.ch/demandit/files/M_D0861CC4DCEF62DFADC/dms/File/Moderne%20Single%20Page-Apps%20mit%20Meteor%20_%20Timo.pdf">presentation.</a> </p>
<h3>The data science stack</h3>
<figure><img src="https://liip.rokka.io/www_inarticle/8b4877/datastack.png" alt=""></figure>
<p>Then it was my turn to present the data science stack. I won't bother you with the contents of my talk, since I've already blogged about it in detail <a href="https://www.liip.ch/en/blog/the-data-science-stack-2018">here</a>. If you still want to have a look at the presentation, you can of course <a href="http://mypage.netlive.ch/demandit/files/M_D0861CC4DCEF62DFADC/dms/File/Liip%20Data%20Stack.pdf">download</a> it. In the talk I offered a very subjective bird's-eye view of how the data-centric perspective touches modern web standards. An interesting piece of feedback from the panel was the question whether such an overview really helps our developers create better solutions. I personally think that having such maps or collections for orientation especially helps people in junior positions expand their field of view. I think it might also help senior staff look beyond their comfort zone and overcome the saying &quot;if all you have is a hammer, every problem looks like a nail&quot; - that is, using the same set of tools for every project. Yet I think the biggest benefit might be to offer the client a really unbiased perspective on his options, of which he might have many more than some big vendors want him to believe. </p>
<h3>From data science stack to data stack</h3>
<figure><img src="https://liip.rokka.io/www_inarticle/ed727f/azure.png" alt=""></figure>
<p>Meinrad Weiss from Microsoft offered an interesting glimpse into the Azure universe, showing us the many options for storing data in the Azure cloud. Some facts were indeed surprising, for example Microsoft being unable to find two data center locations that were more than 400 miles apart in Switzerland (apparently the country is too small!), while others, like the majority of clients still operating in the SQL paradigm, were less surprising. One thing that really amazed me was their &quot;really big&quot; storage solution, meaning basically everything beyond 40(!) petabytes: the data is spread across 60(!) storage blobs that operate independently of the computational resources, which can be scaled on demand on top of the data layer. In comparison to a classical Hadoop stack, where computation and data are baked into one node, here the customer can scale up his computational power temporarily and scale it down again once his computations are finished, saving a bit of money. Regarding the bill, though, such solutions are not cheap - we are talking about an entrance price of roughly five digits per month, so not really the typical SME scenario. Have a look at the <a href="http://mypage.netlive.ch/demandit/files/M_D0861CC4DCEF62DFADC/dms/File/AzureAndData_Meinrad_Microsoft.pdf">presentation</a> if you want a quick refresher on current options for big data storage on Microsoft Azure. Another interesting insight was that while a lot of different paradigms have emerged in recent years, Microsoft has managed to include them all (e.g. Gremlin graph, Cassandra, MongoDB) in their database services, unifying their interfaces in one SQL endpoint. </p>
<h3>Offline First or progressive web apps</h3>
<figure><img src="https://liip.rokka.io/www_inarticle/7a4898/pwa.png" alt=""></figure>
<p>Nico Martin, a leading web and frontend developer from the <a href="https://sayhelloagency.com">Say Hello</a> agency, showcased how the web is coming back to mobile again. Coming back? Yes, you heard right. If you thought you had been doing mobile first for many years now, you are right to ask why it is coming back. As it turns out (according to a recent comScore report from 2017), although people are indeed using their mobiles heavily, they are spending 87% of their time inside apps and not browsing the web. Which might be surprising. On the other hand, while apps seem to dominate mobile usage, more than 50% of people don't install any new apps on their phone, simply because they are happy with the ones they have. In fact, they spend 80% of their time in their top three apps. That poses a really difficult problem for new apps: how can they get a foot in the door against such highly habitualized behavior? One potential answer might be <a href="https://developers.google.com/web/progressive-web-apps/">Progressive Web Apps</a>, an approach Google has been pushing for quite a few years already, which seeks to offer highly responsive and fast website behavior that feels almost like a native application. To pull this off, the main idea is that a so-called &quot;service worker&quot; - a piece of code that is installed on the mobile and continues running in the background - makes it possible for these web apps to, for example, send notifications to users while they are not using the website; something users otherwise know from classical native apps. Another very simple benefit is that you can install these apps on your home screen, and tapping them feels like really using an app rather than browsing a website (e.g. there is no browser address bar). Finally, the whole website can operate in offline mode too, thanks to a smart caching mechanism that lets developers decide what to store on the mobile, in contrast to what the browser cache normally does.
If you feel like trying out one of these apps, I highly recommend <a href="http://mobile.twitter.com">mobile.twitter.com</a>, where Google and Twitter sat together and tried to showcase everything that is possible with this new technology. If you are using an Android phone, these apps should work right away, but if you are using an Apple phone, make sure you are at least on iOS 11.3, which finally supports progressive web apps on Apple devices. While Apple has slightly opened the door to PWAs, I fear that their lack of support for the major features might have something to do with politics. After all, developers circumventing the App Store and interacting with their customers without an intermediary doesn’t leave much love for Apple's beloved App Store. Have a look at Martin's great <a href="https://slides.nicomartin.ch/pwa-internet-briefing.html">presentation</a>. </p>
<h3>Conclusion</h3>
<p>Overall, although the topics were quite diverse, I definitely enjoyed the conference. A big thanks goes to the organizers of the <a href="http://internet-briefing.ch">Internet Briefing series</a>, who do an amazing job of organizing these conferences in a monthly fashion. They are definitely a good way to exchange best practices and eventually learn something new. For me, it was the motivation to finally get my hands dirty with progressive web apps, knowing that you don't really need much to make them work. </p>
<p>As usual, I am happy to hear your comments on these topics and hope that you enjoyed this little summary.</p>]]></description>
                  <enclosure url="http://liip.rokka.io/www_card_2/25d9a4/abstract-art-colorful-942317.jpg" length="1981006" type="image/jpeg" />
          </item>
        <item>
      <title>One.Thing.Less AG entrusts Liip with the launch of its mobile app</title>
      <link>https://www.liip.ch/de/blog/one-thing-less-ag-unites-with-liip-to-launch-its-mobile-app</link>
      <guid>https://www.liip.ch/de/blog/one-thing-less-ag-unites-with-liip-to-launch-its-mobile-app</guid>
      <pubDate>Wed, 02 May 2018 00:00:00 +0200</pubDate>
      <description><![CDATA[<p>One.Thing.Less AG will empower and enable individuals around the world to regain control over the use of their personal data in an easy and secure way. We currently help the startup to craft its mobile app and platform so it’s ready for the launch day on the 25th of May 2018.</p>
<h2>From idea to product</h2>
<p>This product idea comes from James Aschberger, Founder &amp; CEO of One.Thing.Less AG, who saw the opportunity in the new <a href="https://en.wikipedia.org/wiki/General_Data_Protection_Regulation">GDPR regulation</a> to bring back balance in the personal data “crisis” that our world currently faces. Their mobile app will allow you — in one tap — to ask for visibility on what companies such as Facebook, Google or Starbucks have and do with your data. More importantly, once they have answered, you will be able to act and request changes in how they deal with your personal data. The goal of the product is that you are in control of the use of your personal data.<br />
I loved to hear such a pitch, and so did my mobile teammates. We hence kickstarted our collaboration back in January.</p>
<figure><img src="https://liip.rokka.io/www_inarticle/f0449e/liip-blog-otl-launch2.jpg" alt="One.Thing.Less-Liip team during a User Experience workshop"></figure>
<p><em>One.Thing.Less-Liip team during a User Experience workshop.</em></p>
<h2>Finding a strong mobile partner, with the right (startup) mindset</h2>
<p>One.Thing.Less’ challenge was their lack of an internal technical team. We solved their issue by putting together a cross-functional team tailored to their needs — composed of mobile and backend developers, as well as designers.</p>
<p>On top of this, and that’s maybe the most critical of all, we answered James’ need to find a trustworthy team with people having the right startup mindset:</p>
<p><em>“Once our idea was clear, the biggest challenge was to identify the right UX and technical team to bring it to life. Being a small startup, the chemistry needs to be right and the team mindset needs to be diverse. After our first meeting with Liip I had a strong feeling about the team and their know-how. It was after the second meeting that we were totally convinced that they not only have the technical expertise but also the right spirit and determination to bring this idea into a tangible product.”</em> — James Aschberger, Founder &amp; CEO of One.Thing.Less AG</p>
<p>We at Liip are all entrepreneurs, and the way we are <a href="https://www.liip.ch/en/blog/self-organization">self-organized</a> allows us to act as such on a daily basis. This reassured and convinced the CEO of One.Thing.Less that Liip was the right choice.<br />
We have been working together for only three months, but we already feel like one unique team, with one common goal — to launch a product that will impact lives positively.</p>
<figure><img src="https://liip.rokka.io/www_inarticle/3d11d3/one-thing-less-mobile-development.jpg" alt="Development of the One.Thing.Less mobile app and platform"></figure>
<p><em>Development of the One.Thing.Less mobile app and platform.</em></p>
<h2>Data, public or private, should be controlled by those it belongs to</h2>
<p>If you have been following us for a while, you surely know that the data topic is part of our DNA. From <a href="https://www.liip.ch/en/work/api">API</a>s to allow interoperability of IT systems, to <a href="https://www.liip.ch/en/work/open-data">Open Data</a> to give transparency back to citizens, we have always been involved in such domains and will remain so for the long run. This collaboration with One.Thing.Less confirms it.</p>
<p>We can’t wait to put this product into your hands on the 25th of May. Stay tuned and sign up for a launch notification at <a href="https://www.onethingless.com">www.onethingless.com</a></p>]]></description>
                  <enclosure url="http://liip.rokka.io/www_card_2/bad153/liip-blog-otl-launch1.jpg" length="353028" type="image/jpeg" />
          </item>
        <item>
      <title>Meet Kotlin &#8212; or why I will never go back to Java!</title>
      <link>https://www.liip.ch/de/blog/kotlin-why-i-will-never-go-back-to-java</link>
      <guid>https://www.liip.ch/de/blog/kotlin-why-i-will-never-go-back-to-java</guid>
      <pubDate>Fri, 09 Mar 2018 00:00:00 +0100</pubDate>
      <description><![CDATA[<p>When Google and JetBrains <a href="https://blog.jetbrains.com/kotlin/2017/05/kotlin-on-android-now-official/">announced</a> first-class support for Kotlin on Android last year, I could not wait to use it on our next project. Java is an OK language, but when you are used to Swift on iOS and C# on Xamarin, it's sometimes hard to go back to the limited Java that Android has to offer.</p>
<p>Within this past year, we successfully shipped two applications using Kotlin exclusively, with another one to follow soon. We decided to also use Kotlin for previous Java apps that we keep updating.</p>
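<p>To give a flavour of what makes going back to Java hard, here are a few toy snippets of my own (illustrative only, not taken from our apps or from the talk): data classes, null safety and extension functions.</p>

```kotlin
// A value type with equals/hashCode/toString/copy generated for free,
// where Java needs dozens of lines of boilerplate.
data class User(val name: String, val email: String? = null)

// Null safety: email is nullable, so the compiler forces us to handle null.
fun domainOf(user: User): String = user.email?.substringAfter('@') ?: "unknown"

// Extension function: add behavior to an existing type without inheritance.
fun String.initials(): String =
    split(" ").mapNotNull { it.firstOrNull()?.uppercaseChar() }.joinToString("")

fun main() {
    println(domainOf(User("James", "james@example.com"))) // example.com
    println(domainOf(User("Q")))                          // unknown
    println("James Bond".initials())                      // JB
}
```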
<p>I took my chance when the <a href="https://www.meetup.com/Mobile-Romandie-Beer">Mobile Romandie Beer</a> meetup was looking for speakers. I knew that I had to show others how easy and fun this language is. </p>
<p>It turned out great. We had people from various backgrounds: from people just curious about the language, through UX/UI designers, iOS developers and Java developers, to people already using Kotlin in production.</p>
<p>You can find my slides below:</p>
<script async class="speakerdeck-embed" data-id="1504188547254400bb81fd9f30f2e701" data-ratio="1.77777777777778" src="//speakerdeck.com/assets/embed.js"></script>
<p>I would like to share a few links that helped me learn about Kotlin:</p>
<ul>
<li><a href="https://try.kotlinlang.org">Kotlin Koans</a>: a step by step tutorial that directly executes your code in the browser</li>
<li><a href="https://developer.android.com/kotlin/">Kotlin and Android</a>: the official Android page to get started with Kotlin on Android</li>
<li><a href="https://github.com/android/android-ktx">Android KTX</a>: a useful library to help with Android development, released by Google</li>
</ul>
<p>See you at the <a href="https://www.meetup.com/Mobile-Romandie-Beer/events/246118655/">next meetup</a>!</p>]]></description>
                  <enclosure url="http://liip.rokka.io/www_card_2/cb1695/meetup.jpg" length="366348" type="image/jpeg" />
          </item>
        <item>
      <title>Urban Connect Mobile App Golive (And The Challenges We Faced)</title>
      <link>https://www.liip.ch/de/blog/urban-connect-mobile-app-golive-and-the-challenges-we-faced</link>
      <guid>https://www.liip.ch/de/blog/urban-connect-mobile-app-golive-and-the-challenges-we-faced</guid>
      <pubDate>Thu, 01 Feb 2018 00:00:00 +0100</pubDate>
      <description><![CDATA[<p>Back in October, we kickstarted the relaunch of the Urban Connect bicycle fleet solution for their corporate clients (amongst which are Google and Avaloq). We provided the startup with User Experience, design, and mobile development services. The goal was to launch a brand new iOS and Android mobile app, as well as a new back-office and API. This new solution will enable the startup to grow its activity based on a robust and scalable digital platform.</p>
<p>The go-live happened successfully last week (yeah!), and while we prepare our celebration lunch, I thought it might be interesting to share the challenges we faced during this purposeful project, and how we overcame them.</p>
<h2>Make The User Use The App The Least Possible!</h2>
<p>This product development was particular because we soon realized that for once, we were crafting a mobile app that should be used the least possible. That’s for a goal when all what we hear nowadays is about user retention and engagement.<br />
The point is that Urban Connect service is about providing a bike solution with a smart lock that can be opened with your smartphone (via Bluetooth technology). People want to use it to book a bike and go from a point A to a point B, with the least friction possible — including with our mobile app software.</p>
<p>This key discovery was possible thanks to our UX workshops, which focus on the user's problems first. Concretely, it means that we went to the Google Zürich office to observe and listen to how real users were using the solution, and which problems they had. That's when we understood that users wanted to use the app as little as possible, and wanted it to work automagically without getting their smartphone out of their pocket.<br />
Only afterwards did we start to draw the first wireframes and iterate on prototypes with Urban Connect and its clients, to be sure that what we were going to build answered the real issues.<br />
And finally, we developed and applied the user interface design layers.</p>
<p>This resulted in one of the Google users stating:</p>
<p><em><strong>“Wow, that’s like a Tesla!<br />
I never use the key, but it’s always there with me, and it just works.”</strong></em></p>
<p>Again, we looked at the problems first, not at the design aspects. It may sound simple, but it still isn't as mainstream an approach as one could think.</p>
<h2>On Third-Party Smart Locks, Fancy Android Phones, and MVP</h2>
<p>Most of the struggles we had to overcome were linked to the Bluetooth connectivity between the smart locks from Noke and certain Chinese Android devices.</p>
<p>The issues we faced with the Noke smart lock solution stem from the fact that the company is still a startup, and as such, there are still some quirks to their product, such as outdated documentation or hardware reliability issues.<br />
Nevertheless, our solution was not to enter into a blame game with them and Urban Connect, but rather to contact them directly to fix the problems, one at a time. We must say that they were really responsive and helpful, so much so that each of our main blockers was removed in less than a few days.<br />
But that's something to take into account in the planning and investment margin when you do estimations — thankfully we did!</p>
<figure><img src="https://liip.rokka.io/www_inarticle/294682/urban-connect-mobile-app-development.jpg" alt="Tests of our Urban Connect Mobile App Developments"></figure>
<p><em>Testing Session of our Urban Connect Mobile App.</em></p>
<p>As for the fancy Android phones, it is the same story one hears everywhere: Android device fragmentation sucks. We were prepared to face such issues, and bought many different Android devices to be sure to have good coverage.<br />
Nevertheless, there are always corner cases, such as a fancy Chinese phone with a specific Android ROM, one that is obviously not sold anymore, just to make things easier.<br />
We overcame this issue thanks to a simple tool: communication. We did not play the blame game, and got in contact with the end user to figure out what the problem was on their device, so that we could understand it better.</p>
<p>Although communication is a good ally, it is most effective when you start using it as early as possible in a project.<br />
As we use the Minimum Viable Product approach to develop our products, we could focus on these critical points from day one, face them, and find solutions, instead of building the entire thing and getting stuck at go-live with unsolvable issues.</p>
<h2>Trustful Partnership as a Key to Success</h2>
<p>On top of the usual UX and technical challenges that we face in every project, one key to the success of this product launch was the people involved, and the trustful collaboration between Urban Connect and us.<br />
We faced many different challenges, from unexpected events to sweating over planning deadlines (we made it on time, rest assured). And every single time, Judith and her team trusted us, as we trusted them, all along the way. Without this trust, we could have failed at many points during the project, but we both chose to remain in a solution-oriented mindset and focus on the product launch goal. It proved to work, once again.</p>
<figure><img src="https://liip.rokka.io/www_inarticle/e3e56a/urban-connect-trustful-partnership.jpg" alt="A Trustful Partnership as a Key to a Product Success"></figure>
<p><em>A Trustful Partnership as the Key to a Product Success.</em></p>
<h2>What’s next for Urban Connect?</h2>
<p>Now that we have happily pressed the “Go Live!” button, we are first going to celebrate it properly over a nice lunch in Zürich. It was intense, but we love to work on such purposeful products, which will help cities become healthier thanks to less pollution and more physical activity for their users.</p>
<p>Then, we’ll have plenty of features to work on in 2018, including innovative ones like integrating IoT modules to the platform. We’ll make sure to share such new cool stuff with you.</p>
<p>In case you want to learn more about the Urban Connect e-bike fleet service, feel free to contact Judith for more information and a demo of our solution.</p>
                  <enclosure url="http://liip.rokka.io/www_card_2/3facca/urban-connect-golive.jpg" length="532536" type="image/jpeg" />
          </item>
        <item>
      <title>Going on a ride with Urban Connect</title>
      <link>https://www.liip.ch/de/blog/going-on-a-ride-with-urban-connect</link>
      <guid>https://www.liip.ch/de/blog/going-on-a-ride-with-urban-connect</guid>
      <pubDate>Fri, 03 Nov 2017 00:00:00 +0100</pubDate>
      <description><![CDATA[<p>Urban Connect provides complete bicycle fleet solutions for corporate clients. For example Google and Avaloq trust Urban Connect to provide fleets of e-bikes for their employees who need to move around the city. In this mobility environment, the user experience and stability of the software solution are crucial.</p>
<h2>A successful startup</h2>
<p>Judith Häberli founded Urban Connect in 2013. As CEO, her first concern was to test the market as soon as possible with a real product. For this reason, the team developed an initial minimum viable product. About a year later, the startup found its market and matured into an established enterprise in the mobility market. Today, the proof of concept is achieved. Optimizing the user experience, improving the system’s stability, and making the software easier to evolve are the next challenges.</p>
<figure><img src="https://liip.rokka.io/www_inarticle/8579514aad66211748d3ef4a44758363aa98c579/urbanconnectteam.jpg" alt="Urban Connect Team"></figure>
<p><em>On Urban Connect office’s terrace in Zürich for our partnership kick-off (from left to right): Judith (UC), Noé &amp; Darja (Liip), Luting (UC), Jonas &amp; Thomas (Liip) and Robert (UC).</em></p>
<h2>Towards a robust and scalable software base</h2>
<p>In order to ensure the next maturity level of the company, Urban Connect asked us to partner on the development of their product: new iOS and Android mobile applications connected to a hardware device, a new Fleet Management System, and a new API. The deadline is tight, in order to keep their customers satisfied. At Liip, we will put a strong emphasis on a flawless user journey, backed-up by a robust technical implementation. It will provide Urban Connect with the basis needed to grow during the upcoming years.</p>
<h2>Our commitment to delight Urban Connect users</h2>
<p>Our UX process will support Judith and her team in crafting a new solution that is designed around the users. We will then develop it iteratively in order to receive user feedback as early as possible.<br />
We are excited to start this new partnership, as it is exactly what we love. Indeed, we like to be fully involved from the start, and to work in an entrepreneurial spirit.</p>
                  <enclosure url="http://liip.rokka.io/www_card_2/781ba753ae1422dc05b795e45929c25a1f4c62c5/mig-2301.jpg" length="708254" type="image/jpeg" />
          </item>
        <item>
      <title>Houston: a mobile app to replace a radio communication system</title>
      <link>https://www.liip.ch/de/blog/houston-replaces-radio-system</link>
      <guid>https://www.liip.ch/de/blog/houston-replaces-radio-system</guid>
      <pubDate>Mon, 07 Aug 2017 00:00:00 +0200</pubDate>
      <description><![CDATA[<p><em>Bring your company radio system to the 21st century using VoIP and mobile applications to improve communication quality while reducing costs.</em></p>
<p>With the project <a href="https://www.liip.ch/en/blog/tpf-communication-system-for-buses">Houston</a>, we took on the challenge of replacing the old radio network of the <a href="http://www.tpf.ch/">Transports Publics Fribourgeois</a> (TPF), a Swiss public transportation company, with a system that uses the existing data network and runs on mobile applications. This solution removed the burden of maintaining a dedicated radio network. It also improved both the overall quality of the communication and the availability of the system.</p>
<h1>Initial situation: communication based on radio system</h1>
<p>For decades, employees of the Transports Publics Fribourgeois (TPF) have been using standard radio to communicate with each other. The radio system is meant to cover the needs of the users. It is spread over more than 200 buses, 30 team leaders, and the operation center. There are three types of users, with specific needs:</p>
<ul>
<li>The <strong>operators</strong>  – working in the operation center – use the radio to speak to a specific bus driver, or to broadcast messages to all or some of the running buses.</li>
<li>The <strong>team leaders</strong>  are dispatched at different locations. They use the radio to manage daily events – such as the replacement of a driver – or to inform many drivers of a change in the network – for example in case of an accident.</li>
<li>The <strong>bus drivers</strong>  use the bus radio as their main means of communication while driving. They can call other buses, the team leaders, or the operation center.</li>
</ul>
<figure><img src="https://liip.rokka.io/www_inarticle/801dcf37076028e232418077319ced469d88632b/140520-mc-09993-1024x768.jpg" alt="Logo TPF"></figure>
<h1>The project: replace the old radio system</h1>
<p>The radio system used by the employees of the TPF had some drawbacks:</p>
<ul>
<li>The poor quality of the communication,</li>
<li>The low coverage of the network (about 60% of the territory, as it is too hard to cover the whole area when buses drive all over the countryside),</li>
<li>The high operating and maintenance costs of the radio network (antennas had to be installed in strategic locations, all over the canton).</li>
</ul>
<p>The goal of the project Houston was to improve the former system by replacing the radio communication system. </p>
<p>The challenge was to provide at least all existing features, and possibly add new ones enabled by the geolocation of the buses. The possibility to call all buses currently in a given region was one of the newly requested features. </p>
<center><figure><img src="https://liip.rokka.io/www_inarticle/39d9ca1ad306d89abca5901d222e43c56989a76d/screenshot-2016-09-08-15-19-23-169x300.jpg" alt="Screenshot_Houston"></figure>

Screenshot of the app Houston</center>
<h1>The solution: a mobile application</h1>
<p>A mobile app would allow the system to use modern technologies. With a mobile app, we also expected to drastically reduce operating costs: it is no longer necessary to install and maintain radio antennas all over the Fribourg region.</p>
<p>We decided to create a mobile app because apps are easy to develop in comparison with dedicated embedded devices. The cost of 200 smartphones is more than justified by the flexibility and lower development cost that this solution offers.</p>
<figure><img src="https://liip.rokka.io/www_inarticle/86cc0e95d35b72d91b969d4c2a3fac405faac780/160827-mc-0003-1024x683.jpg" alt="Public busses TPF"></figure>
<h1>We started by building trust</h1>
<p>The way we worked hand in hand with the client had a huge impact on the success of the project. From the beginning, we asked all stakeholders and end users to explain to us how they worked with the former system. We asked what the strengths and weaknesses of the system were. We asked them how they would like to use it.</p>
<p>As a result, we had a good idea of what the main features to implement were. Most importantly, we also gained the trust of the end users. </p>
<p>This level of trust was a good basis for developing the project. We did not hide problems; we exposed them directly, to find the best possible solution together. This collaboration mindset helped a lot during the development of the project.</p>
<p><strong>Additional Information</strong> </p>
<p><em>Best of Swiss App 2016: Houston wins Silver in the «Enterprise» category and Gold in the «Innovation» category </em>(<a href="https://www.liip.ch/en/blog/best-of-swiss-apps-2016--liip-takes-three-silver-and-one-gold">read here</a>)</p>
<p><em>HOUSTON: The first VoIP Communication System for Swiss Public Transportation </em>(<a href="https://www.liip.ch/en/blog/tpf-communication-system-for-buses">read here</a>)</p>
<p><em>Meilleur du Web 2016: HOUSTON awarded in the category UX and Technology </em>(<a href="https://www.liip.ch/en/blog/liip-gets-4-awards-at-the-meilleur-du-web">read here</a>)</p>]]></description>
                  <enclosure url="http://liip.rokka.io/www_card_2/71c574424c8c1bcaa796b2c89c70fa6c4ae94fbb/160827-mc-0003.jpg" length="99853" type="image/jpeg" />
          </item>
    
  </channel>
</rss>
