Your Own Image Picker With Flutter Channels

In this tutorial you will learn how to use Flutter Channels to communicate with platform code and create an image picker for both Android and iOS.



  • Dart 2, Android 4.4, Android Studio 3

Flutter? So you’ve decided to go down the Dart side, eh? Great! Imagine you want to make something more complex than UI and logic. Something that will help you develop that buy-and-sell app that you are making. Wouldn’t it be great if you could allow the user to select photos from their own gallery in your app? Is this possible? Without the use of plugins? (Spoiler alert: Yes!)

In this tutorial, you will:

  • Learn to use Flutter’s Platform Channels.
  • Learn how to access photos from Android and iOS to use them within your Flutter components.
  • Build an image picker in Flutter using Platform Channels.

Don’t change the channel, this article will be right back after these important messages!

Note: This tutorial assumes that you’re already familiar with the basics of Flutter development. If you are new to Flutter, read through our Getting Started With Flutter tutorial. Other prerequisites include knowledge of using Android Studio with Flutter. Using Xcode is optional, but you must have it installed.

Getting Started

You can download the project files by clicking on the Download Materials button at the top or bottom of the tutorial. Then, open the project up in Android Studio 3.4 or greater. You should be using a recent version of Flutter, 1.5 or above. Be sure to get Flutter dependencies for the project if prompted to do so by Android Studio. You’ll start working with the project after a bit of theory.

Notably, you’ll find the grid user interface for the image picker already provided for you in the starter project.

Flutter and Platform-Specific Code

Flutter provides a lot of functionality on its own. You can create a very pleasing interface together with cool animations. Yet, some functionality might not be available directly in Flutter because it’s platform specific. For example, you might want to use some sensors from your device or access a specific feature on Android or iOS.

In that case, you might first try to find a Flutter plugin/package provided by Google or some other third party. If you find something, you’re in luck. But you might want to understand how they created the plugin. If you don’t find anything, you will have to develop your own solution. You can make your own solutions using Flutter’s Platform Channels.

Platform Channels

Platform Channels allow communication between the Flutter layer of your app and the platform layer, that is, the iOS or Android layer. You use channels by sending messages to and from each end. For example, you can send messages to the channel from Flutter; then, the platform layer reacts to the messages by passing back a result. See the image below for reference.

Method Channels

As shown above, the arrows are two-way. Thus, you can pass messages from the platform layer, too. Similarly, the Flutter layer will react and send back a result. Also, you access the channel asynchronously. Hence, you can perform long operations with channels without slowing down your UI.

The MethodChannel below is the component used to enable this communication from the Flutter side:

const MethodChannel(
  String name, [
  MethodCodec codec = const StandardMethodCodec(),
])

To start, you provide a name for the method channel. You can also provide an optional codec that will be used to encode and decode the messages. StandardMethodCodec is the default, and JSONMethodCodec is the other built-in method codec. (The specialized codecs — BinaryCodec, StringCodec and JSONMessageCodec — are message codecs, used with the basic message channels you’ll see below.) You can also make your own codec.
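As a sketch, declaring a method channel that uses the JSON codec instead of the default could look like this. The channel name `samples/json` is only an example:

```dart
import 'package:flutter/services.dart';

// A method channel that encodes method calls as JSON rather than
// using the default StandardMethodCodec.
const jsonChannel = MethodChannel('samples/json', JSONMethodCodec());
```

Both sides of the channel must agree on the codec, so the platform side would need to use the matching JSON method codec as well.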

Then, you choose how to invoke the method, using one of the three invocation options below:

Future<T> invokeMethod<T>(
  String method, [
  dynamic arguments
])

Future<List<T>> invokeListMethod<T>(
  String method, [
  dynamic arguments
])

Future<Map<K, V>> invokeMapMethod<K, V>(
  String method, [
  dynamic arguments
])

For example, you can use invokeMethod to pass a message through the channel with a method name and optional arguments.

You also call setMethodCallHandler to provide a way to handle the incoming messages. Usually, you call this method on the receiving side of the channel. But since you can do two-way communication, you can in fact call this on either side.
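As a minimal sketch, handling calls that arrive from the platform side in Dart might look like the following. The channel and method names here are illustrative:

```dart
import 'package:flutter/services.dart';

const _channel = MethodChannel('/gallery');

void listenForPlatformCalls() {
  // Handle calls arriving from the platform side of the channel.
  _channel.setMethodCallHandler((call) async {
    switch (call.method) {
      case 'galleryChanged':
        // React to the platform notifying Flutter of a change,
        // then return a result back through the channel.
        return 'ok';
      default:
        throw MissingPluginException('Unhandled method ${call.method}');
    }
  });
}
```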


There is another platform channel provided by Flutter, namely, the BasicMessageChannel, and it looks like this:

const BasicMessageChannel<T>(
  String name,
  MessageCodec<T> codec
)

As the name implies, this is a more basic option for asynchronously passing messages using a custom codec.

In contrast to method channels in which you pass a method and parameters, with the BasicMessageChannel, you simply call send with your desired message to pass to the platform. Note that you need to specify the codec for this channel; there is no default. Thus, the message can be anything as long as the channel can decode it:

Future<T> send(T message) async {
  return codec.decodeMessage(await BinaryMessages.send(name, codec.encodeMessage(message)));
}

This is how sending a message through a BasicMessageChannel works internally. Specifically, the channel’s codec encodes the message. Since the codec is customizable, you can pass almost anything here.
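For example, a string-based message channel could be set up and used like this. The channel name and message are illustrative, and the platform side would need a matching handler:

```dart
import 'package:flutter/services.dart';

// A basic message channel that passes plain strings; StringCodec
// handles encoding and decoding on both ends.
const messages = BasicMessageChannel<String>('samples/messages', StringCodec());

Future<void> greetPlatform() async {
  // send() returns a Future that completes with the decoded reply.
  final String? reply = await messages.send('hello from Flutter');
  print(reply);
}
```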

Writing Your Platform-Specific Code

After all that theory, you are now ready to build something! In the rest of the tutorial, you will:

  • Learn how to read photos from Android.
  • Replicate the above for iOS.
  • Open a channel from Dart.
  • Prepare the data that you will be passing through the channel.
  • On the receiving side, bring back that data into something you can use.

At that point, you will be able to see the image picker working.

For a challenge, you will get to know the concept of Flutter plugins. Plugins give you the ability to make this image picker into sharable code. Now, you should start by taking some photos, then go on to the next section.

Getting Images From the Gallery

The concepts of this platform-specific section are simple. First, you need to request access to photos. Then, you need to write two methods: one returns the total number of images; the other returns the image data, including some metadata, for the image at a given index. Time to start!

Accessing the Gallery in Android

First, open the starter project in Android Studio. Build and run the app with the target device set to Android. You should see something similar to the below.

Getting the Total Image Count on Android

Open the Kotlin file MainActivity.kt, which you can find within the android/app/src/main/kotlin path. In the onCreate method, notice that there is already a request for permissions. This is to read images located in your external storage. Next, put the following inside getGalleryImageCount(), replacing the existing content:

val uri = MediaStore.Images.Media.EXTERNAL_CONTENT_URI
// 1
val cursor = contentResolver.query(uri, columns, null, null, null)
// 2
return cursor?.count ?: 0

This returns the total number of images that are in your external storage. Going over each line:

  1. Here, you open a cursor using the columns provided. The cursor is pointed at the media content in your external storage.
  2. You return the total number of items in that cursor. You are also handling the case of the cursor being null. In that case, the count will be zero.

So now you return the total number of images in a method. Next, you should fill in the details for getting image data.

Getting Image Data on Android

Next, update dataForGalleryItem() to the following, being sure to add the imports shown:


import android.graphics.Bitmap
import android.provider.MediaStore
import java.io.ByteArrayOutputStream

private fun dataForGalleryItem(index: Int, completion: (ByteArray, String, Int, String)
  -> Unit) {
  val uri = MediaStore.Images.Media.EXTERNAL_CONTENT_URI
  val orderBy = MediaStore.Images.Media.DATE_TAKEN
  // 1
  val cursor = contentResolver.query(uri, columns, null, null, "$orderBy DESC")
  cursor?.apply {
    // 2
    moveToPosition(index)
    // 3
    val idIndex = getColumnIndexOrThrow(MediaStore.Images.Media._ID)
    val createdIndex = getColumnIndexOrThrow(MediaStore.Images.Media.DATE_ADDED)
    val latitudeIndex = getColumnIndexOrThrow(MediaStore.Images.Media.LATITUDE)
    val longitudeIndex = getColumnIndexOrThrow(MediaStore.Images.Media.LONGITUDE)
    val id = getString(idIndex)
    // 4
    val bmp = MediaStore.Images.Thumbnails.getThumbnail(contentResolver, id.toLong(), MediaStore.Images.Thumbnails.MINI_KIND, null)
    val stream = ByteArrayOutputStream()
    bmp.compress(Bitmap.CompressFormat.JPEG, 90, stream)
    val data = stream.toByteArray()
    // 5
    val created = getInt(createdIndex)
    val latitude = getDouble(latitudeIndex)
    val longitude = getDouble(longitudeIndex)
    // 6
    completion(data, id, created, "$latitude, $longitude")
  }
}

Going over each in turn:

  1. Here, you open a cursor again with the columns from earlier. You also request sorted items descending with the date taken.
  2. This moves the cursor to that index.
  3. Then, you get each column item’s column index and the image id.
  4. This block of code gets a thumbnail of the image and converts the bitmap to a byte array.
  5. Here, you get the corresponding data for each column item. Depending on the data type of that column, use an appropriate method.
  6. Finally, hand all the data that you got to a completion function.

Build and run the project, but make sure you have photos in your gallery; otherwise, you will get an error. Then, check the Run panel to see something similar to the following:

2019-04-28 20:30:09.695 9033-9033/com.raywenderlich.imagepickerflutter I/System.out: number of items 2
2019-04-28 20:30:10.669 9033-9033/com.raywenderlich.imagepickerflutter I/System.out: first item [B@86487e3 12 1556476198 0.0, 0.0

This proves that you have the total count of your images. This also shows that you were able to get the first image, along with the timestamp and coordinates.

If you get an error saying something like the following, add a photo to your gallery, e.g., take a selfie:

android.database.CursorIndexOutOfBoundsException: Index 0 requested, with a size of 0

You can also modify the code so that it doesn’t try to fetch an image when there are none.

The Android Kotlin code is set up, so now you are ready to move on to the iOS-specific code.

Getting the List of Images on iOS

You can still use Android Studio here to edit the project Swift code, but make sure that you are targeting an iOS simulator or device. See the following screenshot to confirm that you are setting it properly:

Build and run the project while targeting an iOS simulator or device. You should see something like the image below:

Similar to the previous section, you will need to make methods to return the total count and image data.

Getting the Total Image Count on iOS

Still in Android Studio, or in Xcode if you prefer, open AppDelegate.swift, which you can find in ios/Runner. Then add the following code, replacing the placeholder for getGalleryImageCount():

import Photos

func getGalleryImageCount() -> Int {
  // 1
  let fetchOptions = PHFetchOptions()
  fetchOptions.includeHiddenAssets = true

  // 2
  let collection: PHFetchResult = PHAsset.fetchAssets(with: fetchOptions)
  // 3
  return collection.count
}

Here, you are doing the following:

  1. Declare fetchOptions that includes hidden assets. This parameter is useful to show the images from the simulator.
  2. Query fetch assets using the options from the previous step.
  3. Return the total count of assets.

Next, you need to get the image data for a given index.

Getting Image Data on iOS

Like for Android, replace the placeholder for dataForGalleryItem(), this time using the following Swift code:

func dataForGalleryItem(index: Int, completion: @escaping (Data?, String, Int, String) -> Void) {
  // 1
  let fetchOptions = PHFetchOptions()
  fetchOptions.includeHiddenAssets = true
  let collection: PHFetchResult = PHAsset.fetchAssets(with: fetchOptions)
  if (index >= collection.count) {
    completion(nil, "", 0, "")
    return
  }

  // 2
  let asset = collection.object(at: index)

  // 3
  let options = PHImageRequestOptions()
  options.deliveryMode = .fastFormat
  options.isSynchronous = true
  let imageSize = CGSize(width: 250,
                         height: 250)

  // 4
  let imageManager = PHCachingImageManager()
  imageManager.requestImage(for: asset, targetSize: imageSize, contentMode: .aspectFit, options: options) { (image, info) in
    // 5
    if let image = image {
      // 6
      let data = UIImageJPEGRepresentation(image, 0.9)
      completion(data,
                 asset.localIdentifier,
                 Int(asset.creationDate?.timeIntervalSince1970 ?? 0),
                 "\(asset.location ?? CLLocation())")
    } else {
      completion(nil, "", 0, "")
    }
  }
}

Going over each one by one:

  1. You open a query for the assets similar to the previous method. You also stop getting data if the index is beyond the number of images.
  2. You get the asset at the given index.
  3. You declare the request options to get the image synchronously, using fast (lower-quality) delivery and the target image size declared just below.
  4. You request the image.
  5. If the image exists, process it, otherwise return nil.
  6. Convert the UIImage into data and pass it to the completion closure. Also, include the image identifier, creation date and location.

Once you’ve made these changes, build and run your project. You may need to restart the app after accepting the prompt to allow access to photos. You should then see the same interface, with output similar to the below in your logs. Make sure you are looking at the Run panel of Android Studio or the console in Xcode:

image count: 9
first data: 41540 bytes 106E99A1-4F6A-45A2-B320-B0AD4A8E8473/L0/001 1299975445 <+38.03744450,-122.80317833> +/- 0.00m (speed 0.00 mps / course 0.00) @ 1/1/01, 1:00:00 AM Central European Standard Time

This shows you have successfully read the image count and image data. Hooray!

Now you can finally move to setting up the platform channels so that you can pass all this image information to Flutter.

Setting Up Platform Channels

To communicate between platform code and Flutter, you need to open a channel on both sides. Here, you will use a method channel.

Platform Channels in Flutter

In Android Studio, open lib/MultiGallerySelectPage.dart. Add this code inside the _MultiGallerySelectPageState class:

// import should be on top
import 'package:flutter/services.dart';

// This should be inside the state class
final _channel = MethodChannel("/gallery");

This opens a channel with the specified name, /gallery, from Flutter. Then you should try to invoke a method in the channel so that you can see the real image count from your gallery. Add the following code in initState():

_channel.invokeMethod<int>("getItemCount").then((count) => setState(() {
  _numberOfItems = count;
}));

This calls a getItemCount method in the channel. When it receives the response, it will set the variable _numberOfItems using setState. With setState, the Flutter UI is redrawn/refreshed.

You should then open a channel with the same name in Android and iOS.

Platform Channels in Android

For Android, open MainActivity.kt. Then add the following just after the call to GeneratedPluginRegistrant.registerWith(this) within the onCreate function, and be sure to add the import shown to the top of the file:

import io.flutter.plugin.common.MethodChannel

// 1
val channel = MethodChannel(flutterView, "/gallery")

// 2
channel.setMethodCallHandler { call, result ->
  when (call.method) {
    // 3
    "getItemCount" -> result.success(getGalleryImageCount())
    else -> println("unhandled")
  }
}

Here, you did the following:

  1. Open a channel with the same name that you previously used (/gallery).
  2. Set a handler for the channel.
  3. If the method is named getItemCount, then provide the image count to the success function.

You will now do the same thing for iOS.

Platform Channels in iOS

Open AppDelegate.swift and add the following just after GeneratedPluginRegistrant.register(with: self):

// 1
guard let controller = window?.rootViewController as? FlutterViewController else {
  fatalError("rootViewController is not type FlutterViewController")
}

// 2
let channel = FlutterMethodChannel(name: "/gallery", binaryMessenger: controller)

// 3
channel.setMethodCallHandler { (call, result) in
  switch (call.method) {
  // 4
    case "getItemCount": result(self.getGalleryImageCount())
    default: result(FlutterError(code: "0", message: nil, details: nil))
  }
}

Similar to Android, you do the following:

  1. Get the view controller instance so that you can open a channel.
  2. Open a channel with the same name that you previously used (/gallery).
  3. Set a handler for the channel.
  4. If the method is named getItemCount, then provide the image count to the result function. Also, there is a default switch handler for completeness.

At this point, you should be able to see the actual count of images in each platform. Build and run the project and you should see this result in the app.

Encoding the Gallery Data for Flutter

You already are providing the total image count to Flutter. Now it’s time to provide the actual image data.

In Android Studio, open MainActivity.kt. Then, insert the following inside the when (call.method) { block. It can come before or after getItemCount:

// 1
"getItem" -> {
  // 2
  val index = (call.arguments as? Int) ?: 0
  // 3
  dataForGalleryItem(index) { data, id, created, location ->
    // 4
    result.success(mapOf<String, Any>(
        "data" to data,
        "id" to id,
        "created" to created,
        "location" to location
    ))
  }
}

Going over each, in turn:

  1. Here, you add a case to when. You perform this case when the call method is getItem.
  2. Parse the call arguments. You are assuming that the argument is an integer.
  3. Get the associated data for this index. Call the dataForGalleryItem method that you made earlier.
  4. Provide a map containing the keys and associated values to the success function.

Next, you will do the same for iOS. Open AppDelegate.swift and insert these lines into the switch (call.method) section. It can be placed before or after the case for getItemCount:

// 1
case "getItem":
  // 2
  let index = call.arguments as? Int ?? 0
  // 3
  self.dataForGalleryItem(index: index, completion: { (data, id, created, location) in
    // 4
    result([
        "data": data ?? Data(),
        "id": id,
        "created": created,
        "location": location
    ])
  })

You may have noticed that this is very similar to the previous code, except this time for iOS and Swift. The description for each number is the same.

There! You have encoded the data for the channel! :]

This section has no visual difference from the previous one. Build and run just to make sure the project compiles. You will see the result after the next section.

Decoding the Gallery Data in Flutter

You have already prepared the getItem handler in the platforms. Now, you can call it from Flutter.

First, open MultiGallerySelectPage.dart and update the code in the _getItem() placeholder to the following:

// 1
if (_itemCache[index] != null) {
  return _itemCache[index];
} else {
  // 2
  var channelResponse = await _channel.invokeMethod("getItem", index);
  // 3
  var item = Map<String, dynamic>.from(channelResponse);

  // 4
  var galleryImage = GalleryImage(
      bytes: item['data'],
      id: item['id'],
      dateCreated: item['created'],
      location: item['location']);

  // 5
  _itemCache[index] = galleryImage;

  // 6
  return galleryImage;
}

This achieves the following:

  1. Check _itemCache for an entry at the same index. Return it from the cache if it exists.
  2. If not, invoke the getItem method on the channel, passing the index as an argument.
  3. Convert the unstructured response into a map. You know the format of the data you will receive from the previous section. dynamic is used for the value type because the values have different types: bytes, strings and an integer.
  4. Put each value from the Map into a GalleryImage.
  5. Put the image data into the cache.
  6. Finally, return the GalleryImage instance.

At this point, you should be able to see the images from your gallery. Build and run the project on both Android and iOS. You should see your images, and you should be able to select multiple images.

It’s a good thing our image picker can select multiple images — it’s hard to pick just one of those cat memes!

Challenge: Turn Your Project Into a Plugin

Now, you have a working image picker. You might wonder how to share your awesome code with the community. There is a better way to package this code that includes the platform-specific parts: Flutter Plugins.

You might actually be using plugins already. An example is the URL launcher plugin. It allows you to open URLs from Flutter. This plugin has platform code inside. Another example is the battery plugin. This obviously needs platform code to read the battery level.

To convert your project into a plugin, you need to make a Flutter interface that handles the channels. The user of the plugin only needs to know about this interface. How it works inside with channels should be abstracted.
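For example, a minimal sketch of such an interface for this project could look like the following. The class name GalleryPicker is hypothetical; the point is that the channel details stay hidden from the plugin’s users:

```dart
import 'package:flutter/services.dart';

// Hypothetical public API of a gallery plugin. Callers never see
// the underlying MethodChannel.
class GalleryPicker {
  static const _channel = MethodChannel('/gallery');

  static Future<int?> getItemCount() =>
      _channel.invokeMethod<int>('getItemCount');

  static Future<Map<String, dynamic>> getItem(int index) async {
    final response = await _channel.invokeMethod('getItem', index);
    return Map<String, dynamic>.from(response);
  }
}
```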

You can read more about plugins here.

Where to Go From Here?

You can download the completed project files by clicking on the Download Materials button at the top or bottom of the tutorial.

You may notice that, if you do a fresh install of the final project on Android, you will not be able to see images the first time you run the app. This issue is outside the scope of this tutorial. However, it might be a nice challenge to tackle next.

If you’re interested, check out the official documentation for platform channels, here.

There is also a good guide on how to make effective plugins here.

We hope you enjoyed this tutorial! If you have any questions or comments, please join the forum discussion below.
