In this codelab, you'll learn how to quickly enable a mobile app for Android TV using the Leanback library. At the end of the codelab you can expect to have a UX-compliant single APK for mobile devices and Android TV.
To start off, let's learn a little about Android TV. What is Android TV and how is it different? At its core it is Android, so most of what you've learned developing your mobile app can be reused. The key differences are input and the presentation of information.
Android TV is designed for the 10-foot experience. Instead of using a touchscreen, users navigate your app with a controller while sitting far away from the screen. Instead of swiping the notification bar down, notifications are displayed as the top row of cards. And the screen is always filled with rich visual content.
In an effort to simplify integration for developers we created the Leanback library. Leanback has extendable fragments to allow you to quickly and easily create rich animated experiences. The core fragments we'll be working with are:
- BrowseFragment - Browse through a video library
- DetailsFragment - Display the details of a specific video
- PlaybackOverlayFragment - Control video playback
These fragments use the Model View Presenter pattern. You'll bind your data model to the view using presenter classes.
There's a lot of ground to cover, so let's get started!
This codelab uses Android Studio, an IDE for developing Android apps.
If you don't have it installed yet, please download and install it.
The first thing we need to do is get the mobile app that we'll build on. Open up your Terminal and run:
git clone https://github.com/googlecodelabs/android-tv-leanback.git
Open Android Studio and click File > Open from the menu bar, or Open an Existing Android Studio Project from the splash screen, and select the recently cloned folder.
checkpoint_0 is the base app that we'll be building upon.
You will be adding code from each step to checkpoint_0.
Each of the following checkpoints can be used as a reference point to check your work or for help if you run into issues. The checkpoint number corresponds to the codelab step minus 1 (checkpoints are 0-indexed, steps are 1-indexed).
A brief overview of each of the components:
- MainActivity - Video browser
- PlayerActivity - Video player
- SlidingTabLayout, SlidingTabStrip, VideoItemFragment - UI for video browser
- Video - Object storing video info
- VideoContentProvider, AbstractVideoItemProvider, VideoItemContract, VideoDataManager - Mock local video database
Let's run it on a phone.
Here's what it should look like:
Now let's see how it looks on Android TV.
First we need to connect to the Android TV device. To do that, you can use a male-to-male USB cable or adb connect.
Let's enable developer mode if it's not yet enabled.
If you have a USB cable, connect your Android TV device directly to your machine. You should now be ready to run the app. You can test if it worked by running the following on your development machine:
adb devices
If your device is listed, you're now in business and can skip to running the app! If that doesn't work, try the steps below.
Find the IP of the device.
Once you have the IP of the device, you can connect to it using adb connect in a terminal.
adb connect [ip address]:4321
Great, we're now connected!
Let's run the app on Android TV. In Android Studio, select checkpoint_0 and click run again. This time the Android TV device should appear in the list of running devices. Select it and click OK.
Now the mobile app is running on Android TV. It could use some TV UI love, right? In the next few steps we'll cover adding an Android TV UI on top of the existing data sources and video player.
In this step you've learned about:
Let's start creating the video browsing experience.
In this step we'll put up the framework for the video browser fragment. The key concepts to take away in this step are:
- BrowseFragment
Let's get started.
Open the build.gradle (checkpoint_0) file.
Under dependencies, notice that the following dependencies exist:
compile 'com.android.support:appcompat-v7:23.3.0'
compile 'com.android.support:leanback-v17:23.3.0'
compile 'com.android.support:recyclerview-v7:23.3.0'
There are a few things to note here. The Leanback libraries that target API version 23 are backwards compatible to API version 17. For apps that need to support earlier versions of Android, make sure that any code path using a library with a higher minSdk than your app's does not run on devices below that library's minSdk. An example of such a runtime guard is sketched below.
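For example, if your app's minSdkVersion were below 17, you might guard the entry point into the Leanback UI with a runtime version check (a minimal sketch, not part of the checkpoint code):
// Only launch the Leanback UI on devices new enough for the Leanback library (API 17+).
if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.JELLY_BEAN_MR1) {
    startActivity(new Intent(this, LeanbackActivity.class));
}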
Next, let's create a browsing activity.
In the Android Studio project explorer, create a new package under com.android.example.leanback by right-clicking the folder and clicking New -> Package. Name the new package fastlane.
Under fastlane, right-click to create a new Blank Activity called LeanbackActivity, and click Finish.
Once the class is created, delete the menu/menu_leanback.xml resource since it won't be used. In the LeanbackActivity class, delete the onCreateOptionsMenu and onOptionsItemSelected methods. Also set the LeanbackActivity class to extend Activity instead of ActionBarActivity.
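After these changes, LeanbackActivity should be reduced to roughly the following (a sketch; the generated layout file is activity_leanback, which we'll edit later):
public class LeanbackActivity extends Activity {

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_leanback);
    }
}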
Open up AndroidManifest.xml.
First, declare that we want to use Leanback. As a child of the manifest element, add the following line:
<uses-feature android:name="android.software.leanback"
android:required="false" />
Add an intent filter to the LeanbackActivity
tag. android.intent.category.LEANBACK_LAUNCHER
tells Android TV to launch LeanbackActivity
when the application is run.
<activity
android:name=".fastlane.LeanbackActivity"
android:label="@string/title_activity_player"
android:theme="@style/AppTheme">
<intent-filter>
<action android:name="android.intent.action.MAIN" />
<category android:name="android.intent.category.LEANBACK_LAUNCHER" />
</intent-filter>
</activity>
We are also adding a theme to the activity. In the next step we will create a values-television directory and create values specific to television.
Since certain features are not available on TV, you need to mark their requirement as optional. If you use any of the following features, you'll need to add android:required="false" to its uses-feature declaration.
Hardware | Android feature descriptor |
Touchscreen | android.hardware.touchscreen |
Telephony | android.hardware.telephony |
Camera | android.hardware.camera |
Near Field Communications (NFC) | android.hardware.nfc |
GPS | android.hardware.location.gps |
Microphone | android.hardware.microphone |
<uses-feature android:name="android.hardware.touchscreen"
android:required="false" />
We want to create some values that are specifically for Android TV.
→ Right-click on the res directory and create a new Android resource directory.
→ Choose values as the resource type, add the UI Mode qualifier, and select Television.
→ Right-click on the res directory and create a new values resource file under the newly created values-television directory. Name the resource file styles.xml.
→ Add the following styles, which inherit from the Leanback theme, to customize the look and feel.
<style name="AppTheme" parent="Theme.Leanback">
<item name="colorPrimary">@color/primary</item>
<item name="colorAccent">@color/accent</item>
<item name="colorPrimaryDark">@color/primary_dark</item>
<item name="imageCardViewStyle">@style/MyImageCardViewStyle</item>
<item name="headerStyle">@style/MyHeaderStyle</item>
<item name="rowHeaderStyle">@style/MyHeaderStyle</item>
</style>
<style name="MyImageCardViewStyle" parent="@style/Widget.Leanback.ImageCardViewStyle">
<item name="cardType">infoUnderWithExtra</item>
<item name="infoAreaBackground">@color/primary_dark</item>
</style>
<style name="MyHeaderStyle" parent="@style/Widget.Leanback.Header" >
<item name="android:textAppearance">@style/MyHeaderStyle.MyHeaderText</item>
</style>
<style name="MyHeaderStyle.MyHeaderText" parent="TextAppearance.Leanback.Header">
<item name="android:textSize">@dimen/lb_browse_header_text_size</item>
<item name="android:textColor">@color/accent</item>
<item name="android:textAllCaps">true</item>
<item name="android:textStyle">bold</item>
</style>
We've included Leanback libraries, and now Android TV will launch into the correct activity. Let's create the video browser.
We'll leverage the Leanback BrowseFragment
. The BrowseFragment
class in the Leanback library allows you to create a primary layout for browsing categories and rows of media items with a minimum amount of code.
The first step is to create a class that extends BrowseFragment
.
→ Under fastlane create a new empty class called LeanbackBrowseFragment
that extends BrowseFragment
.
Next let's fill out the class a little bit.
→ To the class, add a private member ArrayObjectAdapter
. We'll get into details about ArrayObjectAdapter
in the next step.
private ArrayObjectAdapter mRowsAdapter;
→ We'll also add a helper function to initialize the fragment. In it we instantiate mRowsAdapter, set it as the adapter for the fragment, then set our brand color and the badge drawable which appears in the top right of the browse view.
public void init() {
mRowsAdapter = new ArrayObjectAdapter(new ListRowPresenter());
setAdapter(mRowsAdapter);
setBrandColor(ContextCompat.getColor(getContext(), R.color.primary));
setBadgeDrawable(ContextCompat.getDrawable(getContext(), R.drawable.filmi));
}
→ Override the onViewCreated
method and call init
.
@Override
public void onViewCreated(View view, Bundle savedInstanceState) {
super.onViewCreated(view, savedInstanceState);
init();
}
Alright, onto the final step of this section, adding this fragment to the activity.
Open up activity_leanback.xml under layout and delete everything. The only thing we'll display in this activity is our LeanbackBrowseFragment
fragment. So add the following:
<FrameLayout xmlns:android="http://schemas.android.com/apk/res/android"
android:id="@+id/main_frame"
android:layout_width="match_parent"
android:layout_height="match_parent">
<fragment
android:name="com.android.example.leanback.fastlane.LeanbackBrowseFragment"
android:id="@+id/browse_fragment"
android:layout_width="match_parent"
android:layout_height="match_parent" />
</FrameLayout>
Congrats! The framework for the browse fragment is in place. The checkpoint_1 directory contains all the changes described above in case you got stuck or want to compare notes.
In this step you've learned about:
- BrowseFragment
Let's finish the browse fragment.
In this step we'll learn about how the Leanback BrowseFragment
works and put some content into it.
Let's get started.
First, let's cover how the BrowseFragment
works. The BrowseFragment
basically renders rows of data that you provide.
Think of each row as two pieces, a HeaderItem
which defines the category and an array of objects represented by the ListRow
class which defines the content.
The ArrayObjectAdapter
is an array of the defined ListRows
that aggregates the rows for the BrowseFragment
view.
We can store any sort of View
in ListRows
, but in our app we'll use the Leanback ImageCardView
. The zoom and additional detail effects are automatically handled by the Leanback library.
To tie your video data and the ImageCardView
together, we use a Presenter
. The Presenter
defines which elements of the view are populated from which elements of the model.
Lastly we have the ViewHolder
which is a container for the created view.
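Conceptually, a row is assembled like this (an illustrative sketch, not checkpoint code; in our app the row contents will come from the video database via a CursorObjectAdapter):
// One category label plus one adapter of content objects makes up a row.
HeaderItem header = new HeaderItem(0, "Featured");
// The presenter decides how each object in the row is rendered (here, as ImageCardViews).
ArrayObjectAdapter rowContents = new ArrayObjectAdapter(new CardPresenter());
rowContents.add(video); // 'video' is a placeholder Video object
// A ListRow combines the header and the contents; mRowsAdapter aggregates all rows.
mRowsAdapter.add(new ListRow(header, rowContents));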
Let's put all of these concepts together to create the video browsing experience.
We need to create a presenter to tie our Video
model to the ImageCardView
.
→ Under fastlane create a new class called CardPresenter
that extends Presenter
.
→ Define class variables to store the desired ImageCardView
height and width and the application context.
private static int CARD_WIDTH = 200;
private static int CARD_HEIGHT = 200;
private static Context mContext;
→ Implement the currently-empty abstract methods onCreateViewHolder
, onBindViewHolder
, and onUnbindViewHolder
in the CardPresenter
class.
@Override
public ViewHolder onCreateViewHolder(ViewGroup viewGroup) {
return null;
}
@Override
public void onBindViewHolder(Presenter.ViewHolder viewHolder, Object o) {
}
@Override
public void onUnbindViewHolder(Presenter.ViewHolder viewHolder) {
}
We're leveraging Picasso, an open source library that simplifies image loading, caching, and resizing.
The base sample app already uses Picasso so this should be done for you. But if you want to add it in a separate app, in your build.gradle file add the following dependency.
compile 'com.squareup.picasso:picasso:2.3.4'
→ Create an inner static class PicassoImageCardViewTarget
implementing com.squareup.picasso.Target
and implement the methods: onBitmapLoaded
, onBitmapFailed,
and onPrepareLoad
.
static class PicassoImageCardViewTarget implements Target {
@Override
public void onBitmapLoaded(Bitmap bitmap, Picasso.LoadedFrom from) {
}
@Override
public void onBitmapFailed(Drawable errorDrawable) {
}
@Override
public void onPrepareLoad(Drawable placeHolderDrawable) {
}
}
→ Add a variable to store the ImageCardView
we'll draw into once the bitmap is loaded.
private ImageCardView mImageCardView;
→ Create a constructor with the target ImageCardView
as the parameter and store it as the instance's mImageCardView
.
public PicassoImageCardViewTarget(ImageCardView mImageCardView) {
this.mImageCardView = mImageCardView;
}
→ In onBitmapLoaded
, create a new Drawable
from the bitmap and set it as the main image for the ImageCardView
.
Drawable bitmapDrawable = new BitmapDrawable(mContext.getResources(), bitmap);
mImageCardView.setMainImage(bitmapDrawable);
→ In onBitmapFailed
, set the ImageCardView
image to the error default.
mImageCardView.setMainImage(errorDrawable);
We'll use a ViewHolder
to store all of the data associated with the view.
→ As a child of CardPresenter
, create an inner static class that extends Presenter.ViewHolder
and create the default constructor.
static class ViewHolder extends Presenter.ViewHolder {
public ViewHolder(View view) {
super(view);
}
}
→ Define class variables to store the ImageCardView
, Drawable
, and PicassoImageCardViewTarget
.
private ImageCardView mCardView;
private Drawable mDefaultCardImage;
private PicassoImageCardViewTarget mImageCardViewTarget;
→ In the constructor, cast the view parameter as an ImageCardView
and store it in mCardView
. Instantiate a new PicassoImageCardViewTarget
passing the cardView as the target parameter. Finally, get the default card image from resources.
public ViewHolder(View view) {
super(view);
mCardView = (ImageCardView) view;
mImageCardViewTarget = new PicassoImageCardViewTarget(mCardView);
mDefaultCardImage = mContext
.getResources()
.getDrawable(R.drawable.filmi);
}
→ Add a getter for mCardView
.
public ImageCardView getCardView() {
return mCardView;
}
→ Create a function that loads the image from a String URL.
protected void updateCardViewImage(String url) {
Picasso.with(mContext)
.load(url)
.resize(CARD_WIDTH * 2, CARD_HEIGHT * 2)
.centerCrop()
.error(mDefaultCardImage)
.into(mImageCardViewTarget);
}
Now let's create the ImageCardView
and bind it with some data from the model.
onCreateViewHolder
is called to create a new view. In it we'll handle the logic of storing the context, and creating a new ImageCardView
.
@Override
public ViewHolder onCreateViewHolder(ViewGroup viewGroup) {
Log.d("onCreateViewHolder", "creating viewholder");
mContext = viewGroup.getContext();
ImageCardView cardView = new ImageCardView(mContext);
cardView.setFocusable(true);
cardView.setFocusableInTouchMode(true);
((TextView)cardView.findViewById(R.id.content_text)).setTextColor(Color.LTGRAY);
return new ViewHolder(cardView);
}
We set the cardView
Focusable
and FocusableInTouchMode
to true
to enable it to be selected when browsing through the rows of content. It's important to remember to set these fields to true when implementing Android TV for your app.
Finally we set the TextColor
of the ImageCardView
to light gray.
We define the data binding logic in onBindViewHolder
. We can cast the Object
that's being passed in as our Video
data, then set the title text, subtext / content text, and image dimensions. Finally we tell it to load the image with a thumbnail URL.
@Override
public void onBindViewHolder(Presenter.ViewHolder viewHolder, Object o) {
Video video = (Video) o;
((ViewHolder) viewHolder).mCardView.setTitleText(video.getTitle());
((ViewHolder) viewHolder).mCardView.setContentText(video.getDescription());
((ViewHolder) viewHolder).mCardView.setMainImageDimensions(CARD_WIDTH, CARD_HEIGHT);
((ViewHolder) viewHolder).updateCardViewImage(video.getThumbUrl());
}
Make sure to include the Video
class:
import com.android.example.leanback.data.Video;
And our CardPresenter
is complete. Let's fill out some ListRows
with our video content.
In the LeanbackBrowseFragment
let's create some sample categories. Here we're defining them as constants, but in a real app you would probably pull them from your database.
private static final String[] HEADERS = new String[]{
"Featured", "Popular", "Editor's choice"
};
In init
after we've set the badge drawable, we'll loop through the categories and create a row of content for each one.
In each row, we'll create an ObjectAdapter
to define how to render the content that we'll pull from our database. We'll load the videos, create a header, and finally instantiate a ListRow
with the header and video data and add it to mRowsAdapter
.
public void init() {
mRowsAdapter = new ArrayObjectAdapter(new ListRowPresenter());
setAdapter(mRowsAdapter);
setBrandColor(ContextCompat.getColor(getContext(), R.color.primary));
setBadgeDrawable(ContextCompat.getDrawable(getContext(), R.drawable.filmi));
for (int position = 0; position < HEADERS.length; position++) {
ObjectAdapter rowContents = new CursorObjectAdapter((new SinglePresenterSelector(new CardPresenter())));
VideoDataManager manager = new VideoDataManager(getActivity(),
getLoaderManager(),
VideoItemContract.VideoItem.buildDirUri(),
rowContents);
manager.startDataLoading();
HeaderItem headerItem = new HeaderItem(position, HEADERS[position]);
mRowsAdapter.add(new ListRow(headerItem, manager.getItemList()));
}
}
Here, we'll update our VideoDataManager
to manage the cursor for our ObjectAdapter
.
In data/VideoDataManager
add an ObjectAdapter
to the class VideoDataManager
.
private ObjectAdapter mItemList;
Next, add an ObjectAdapter as the fourth parameter of the VideoDataManager constructor and store it as mItemList.
public VideoDataManager(Context mContext, LoaderManager mLoaderManager, Uri mRowUri, ObjectAdapter rowContents) {
mItemList = rowContents;
....
}
In VideoDataManager
set the LOADER_ID
to a random integer and replace the video instantiation with setting the mapper for mItemList
.
LOADER_ID = Double.valueOf(Math.random() * Integer.MAX_VALUE).intValue();
((CursorObjectAdapter)mItemList).setMapper(mMapper);
Create a getter for the ObjectAdapter.
.
public ObjectAdapter getItemList() {
return mItemList;
}
Update VideoItemMapper
to extend CursorMapper
.
public static class VideoItemMapper extends CursorMapper {
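Your existing VideoItemMapper already contains the column-binding logic; for orientation, the CursorMapper contract it now satisfies looks roughly like this (the column name and Video setter below are illustrative, not the checkpoint code):
public static class VideoItemMapper extends CursorMapper {
    private int mTitleIndex;

    @Override
    public void bindColumns(Cursor cursor) {
        // Called whenever the cursor changes: cache the column indices here.
        mTitleIndex = cursor.getColumnIndex("title"); // illustrative column name
    }

    @Override
    public Video bind(Cursor cursor) {
        // Called per row: build and return the model object for that row.
        Video video = new Video();
        video.setTitle(cursor.getString(mTitleIndex)); // illustrative setter
        return video;
    }
}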
Update onLoadFinished
to set the cursor for mItemList
.
public void onLoadFinished(Loader<Cursor> cursorLoader, Cursor cursor) {
if (mItemList instanceof CursorObjectAdapter) {
((CursorObjectAdapter) mItemList).swapCursor(cursor);
}
}
Fill in onLoaderReset to set the Cursor to null.
public void onLoaderReset(Loader<Cursor> cursorLoader) {
if (mItemList instanceof CursorObjectAdapter) {
((CursorObjectAdapter) mItemList).swapCursor(null);
}
}
Congrats! You've completed this step. Try running the app on Android TV.
You can modify the default Activity that's launched from Android Studio: click Edit Configurations, and under Activity change the radio button to launch LeanbackActivity. You should see a screen similar to the one below.
In this step you've learned about:
- BrowseFragment and how you can populate it with your videos
Let's create the video details activity.
The Leanback library includes classes for displaying additional information about a media item, such as a description or reviews, and for taking action on that item, such as purchasing it or playing its content. This lesson discusses how to create a presenter class for media item details, and how to extend the DetailsFragment
class to implement a details view for a media item when it is selected by a user.
In this step, you'll learn about:
- Building a details Activity and fragment
First, let's cover some concepts about how the DetailsFragment works. It functions very similarly to the BrowseFragment.
The classes controlling DetailsOverviewRow
and Additional Row are defined in a ClassPresenterSelector
allowing for flexibility.
The FullWidthDetailsOverviewRowPresenter
contains the video image, information and available actions. The Additional Row can be used to add related content or other controls.
FullWidthDetailsOverviewRowPresenter
handles binding the video information to the DetailsOverviewRow
.
The Additional Row can be populated with a ListRow
just like the BrowseFragment
.
Alright, let's create a video detail view.
First, create a new Blank Activity to handle the detail fragment.
→ Create an additional Activity VideoDetailsActivity
under fastlane
. Name the layout file activity_leanback_details
.
→ Just like when you created the LeanbackActivity
, you'll need to delete the menu resources and menu related methods and set it to extend Activity
instead of ActionBarActivity
.
→ Under layout, open activity_leanback_details
. Replace the default layout with the following code. Here, we're specifying that the Activity
consists of a single fragment, VideoDetailsFragment
.
<?xml version="1.0" encoding="utf-8"?>
<fragment xmlns:android="http://schemas.android.com/apk/res/android"
android:name="com.android.example.leanback.fastlane.VideoDetailsFragment"
android:id="@+id/details_fragment"
android:layout_width="match_parent"
android:layout_height="match_parent"
/>
Don't worry about the errors, we'll work on creating the Fragment
in an upcoming step.
Now that we have the activity framework, let's add the Leanback style to the activity declaration in the manifest.
We need to update the style of this Activity
to Leanback. We can re-use the AppTheme
we defined previously which inherits from Leanback.
<activity android:name=".fastlane.VideoDetailsActivity"
android:label="@string/title_activity_player"
android:theme="@style/AppTheme"
android:exported="true">
</activity>
Now that the activity is styled correctly, we need to create the VideoDetailsFragment
that we reference in the layout.
Under fastlane create a new class called VideoDetailsFragment
extending DetailsFragment
.
We'll define a few class variables to store the Video
information and some constants for image sizes and action ids.
private Video selectedVideo;
private static final int DETAIL_THUMB_WIDTH = 274;
private static final int DETAIL_THUMB_HEIGHT = 274;
private static final int ACTION_PLAY = 1;
private static final int ACTION_WATCH_LATER = 2;
In VideoDetailsFragment, we need to do several things:
- A DetailsOverviewRow to display video details
- A ListRow for recommended items
Let's start by getting the selected video from the intent. We'll override onCreate and get the video from the intent.
@Override
public void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
selectedVideo = (Video) getActivity()
.getIntent()
.getSerializableExtra(Video.INTENT_EXTRA_VIDEO);
}
Before we get the image and create the DetailsOverviewRow
we need to define a Presenter
to bind the data. The Leanback framework provides the AbstractDetailsDescriptionPresenter
class for this purpose, a nearly complete implementation of the presenter for media item details.
→ Under fastlane
create a new class DetailsDescriptionPresenter
extending AbstractDetailsDescriptionPresenter
.
→ Override onBindDescription
. Cast the item as a Video
object then get and set the Title, Subtitle and Body.
@Override
protected void onBindDescription(ViewHolder viewHolder, Object o) {
Video video = (Video) o;
if (video != null) {
Log.d("Presenter", String.format("%s, %s, %s", video.getTitle(), video.getThumbUrl(), video.getDescription()));
viewHolder.getTitle().setText(video.getTitle());
viewHolder.getSubtitle().setText(String.format(mContext.getString(R.string.rating), video.getRating()));
viewHolder.getBody().setText(video.getDescription());
}
}
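Note that onBindDescription above references mContext, which the new class doesn't have yet. Since we'll construct the presenter with a Context later on (new DetailsDescriptionPresenter(getContext())), add a field and a matching constructor along these lines:
private Context mContext;

public DetailsDescriptionPresenter(Context context) {
    mContext = context;
}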
In order not to block the main UI thread, we create an AsyncTask to load the thumbnail bitmap. In VideoDetailsFragment, create a DetailRowBuilderTask class extending AsyncTask with Video, Integer, and DetailsOverviewRow as the parameter, progress, and result types respectively.
private class DetailRowBuilderTask extends AsyncTask<Video, Integer, DetailsOverviewRow> {
@Override
protected DetailsOverviewRow doInBackground(Video... videos) {
DetailsOverviewRow row = new DetailsOverviewRow(videos[0]);
Bitmap poster = null;
try {
// the Picasso library helps us dealing with images
poster = Picasso.with(getActivity())
.load(videos[0].getThumbUrl())
.resize(dpToPx(DETAIL_THUMB_WIDTH, getActivity().getApplicationContext()),
dpToPx(DETAIL_THUMB_HEIGHT, getActivity().getApplicationContext()))
.centerCrop()
.get();
} catch (IOException e) {
e.printStackTrace();
}
row.setImageBitmap(getActivity(), poster);
SparseArrayObjectAdapter adapter = new SparseArrayObjectAdapter();
adapter.set(ACTION_PLAY, new Action(ACTION_PLAY, getResources().getString(R.string.action_play)));
adapter.set(ACTION_WATCH_LATER, new Action(ACTION_WATCH_LATER, getResources().getString(R.string.action_watch_later)));
row.setActionsAdapter(adapter);
return row;
}
}
Here we're instantiating a new DetailsOverviewRow
passing in the current Video
as the main item for the details page. We create a holder Bitmap
variable to load the thumbnail. We use Picasso to load and resize the image and store in poster
.
Next we set poster
as the bitmap.
Finally, we specify the actions by creating a SparseArrayObjectAdapter
to hold our actions, and setting this as the DetailsOverviewRow
's action adapter. In this app, we have ACTION_PLAY
and ACTION_WATCH_LATER
, but you could define a purchase or rent action.
In strings.xml
define the text for action_play
and action_watch_later
.
<string name="action_play">PLAY</string>
<string name="action_watch_later">WATCH LATER</string>
To convert the thumbnail dimensions from dp to pixels, create a utility function.
public static int dpToPx(int dp, Context ctx) {
float density = ctx.getResources().getDisplayMetrics().density;
return Math.round((float) dp * density);
}
Now that the image has loaded, we can create the rest of the details fragment.
The Picasso library loads and resizes the image off the UI thread. After it has completed, we create the presenters in the onPostExecute method of the AsyncTask and set the adapter of the DetailsFragment.
→ Override onPostExecute
.
@Override
protected void onPostExecute(DetailsOverviewRow detailRow) {
→ Instantiate a new ClassPresenterSelector
. This object allows you to define the presenters for each portion of DetailFragment
.
ClassPresenterSelector ps = new ClassPresenterSelector();
→ Instantiate a new FullWidthDetailsOverviewRowPresenter
passing in a new Instance of DetailsDescriptionPresenter
as a parameter.
FullWidthDetailsOverviewRowPresenter detailsPresenter = new FullWidthDetailsOverviewRowPresenter(new DetailsDescriptionPresenter(getContext()));
→ Add a custom background color and set the presenter's initial state programmatically.
detailsPresenter.setBackgroundColor(ContextCompat.getColor(getContext(), R.color.primary));
detailsPresenter.setInitialState(FullWidthDetailsOverviewRowPresenter.STATE_FULL);
→ Add an OnActionClickedListener by creating a new OnActionClickedListener and implementing onActionClicked. In onActionClicked, check the action id. If the action is ACTION_PLAY, fire an Intent to start PlayerActivity, passing along the video details. Otherwise, show a toast displaying a String describing the action.
detailsPresenter.setOnActionClickedListener(new OnActionClickedListener() {
@Override
public void onActionClicked(Action action) {
if (action.getId() == ACTION_PLAY) {
Intent intent = new Intent(getActivity(), PlayerActivity.class);
intent.putExtra(Video.INTENT_EXTRA_VIDEO, selectedVideo);
startActivity(intent);
} else {
Toast.makeText(getActivity(), action.toString(), Toast.LENGTH_SHORT).show();
}
}
});
→ Add the FullWidthDetailsOverviewRowPresenter
to ClassPresenterSelector
.
ps.addClassPresenter(DetailsOverviewRow.class, detailsPresenter);
→ Instantiate a new ArrayObjectAdapter
passing in the ClassPresenterSelector
. Then, add the DetailRow
to the ArrayObjectAdapter
.
ArrayObjectAdapter adapter = new ArrayObjectAdapter(ps);
adapter.add(detailRow);
→ Finally, set the adapter.
setAdapter(adapter);
When we're done, the onPostExecute
method should look like the following:
@Override
protected void onPostExecute(DetailsOverviewRow detailRow) {
ClassPresenterSelector ps = new ClassPresenterSelector();
FullWidthDetailsOverviewRowPresenter detailsPresenter = new FullWidthDetailsOverviewRowPresenter(new DetailsDescriptionPresenter(getContext()));
// Add some style
detailsPresenter.setBackgroundColor(ContextCompat.getColor(getContext(), R.color.primary));
detailsPresenter.setInitialState(FullWidthDetailsOverviewRowPresenter.STATE_FULL);
// we listen to two different actions: play and show
detailsPresenter.setOnActionClickedListener(new OnActionClickedListener() {
@Override
public void onActionClicked(Action action) {
if (action.getId() == ACTION_PLAY) {
Intent intent = new Intent(getActivity(), PlayerActivity.class);
intent.putExtra(Video.INTENT_EXTRA_VIDEO, selectedVideo);
startActivity(intent);
} else {
Toast.makeText(getActivity(), action.toString(), Toast.LENGTH_SHORT).show();
}
}
});
ps.addClassPresenter(DetailsOverviewRow.class, detailsPresenter);
ArrayObjectAdapter adapter = new ArrayObjectAdapter(ps);
adapter.add(detailRow);
// finally we set the adapter of the DetailsFragment
setAdapter(adapter);
}
As an additional step, add a row of related videos below the detail card. To do this we only have to add another presenter and adapter to the onPostExecute method of the AsyncTask.
First, let's add an additional presenter:
ps.addClassPresenter(ListRow.class, new ListRowPresenter());
Accordingly, the ArrayObjectAdapter
requires an additional ListRow
to be added. We create a new ListRow
to which we pass a HeaderItem
and a CursorObjectAdapter
in the constructor just like we did for the BrowseFragment
.
String[] subcategories = {
"You may also like"
};
CursorObjectAdapter rowAdapter = new CursorObjectAdapter(
new SinglePresenterSelector(new CardPresenter()));
VideoDataManager manager = new VideoDataManager(getActivity(),getLoaderManager(),
VideoItemContract.VideoItem.buildDirUri(),rowAdapter);
manager.startDataLoading();
HeaderItem header = new HeaderItem(0, subcategories[0]);
adapter.add(new ListRow(header, rowAdapter));
Now we instantiate and execute the DetailRowBuilderTask
in the onCreate
method of the VideoDetailsFragment
.
new DetailRowBuilderTask().execute(selectedVideo);
Now that we've completed the DetailsFragment
, we need to modify the browse fragment to send an intent to the details view when a user clicks on a media item. In order to enable this behavior, add an OnItemViewClickedListener
object to LeanbackBrowseFragment
that fires an intent to start VideoDetailsActivity
.
In init
set the onItemViewClickedListener
. Here we're using a helper function to generate the onItemViewClickedListener
.
public class LeanbackBrowseFragment extends BrowseFragment {
...
public void init() {
...
setOnItemViewClickedListener(getDefaultItemViewClickedListener());
...
}
...
}
Create the helper function getDefaultItemViewClickedListener, which returns a new OnItemViewClickedListener.
private OnItemViewClickedListener getDefaultItemViewClickedListener() {
return new OnItemViewClickedListener() {
@Override
public void onItemClicked(Presenter.ViewHolder viewHolder, Object o,
RowPresenter.ViewHolder viewHolder2, Row row) {
Intent intent = new Intent(getActivity(), VideoDetailsActivity.class);
intent.putExtra(Video.INTENT_EXTRA_VIDEO, (Serializable)o);
startActivity(intent);
}
};
}
As we now pass the Video object in an Intent extra, we use the key Video.INTENT_EXTRA_VIDEO for it. This works because the Video class implements Serializable; if you adapt this to your own model class, it will need to be Serializable (or Parcelable) as well.
Congrats, you've finished this step! Compile, run and watch how you can play videos now!
In this step you've learned about:
- DetailsFragment
Next, let's create recommendations that display on the home screen.
When interacting with TVs, users generally prefer to give minimal input before watching content. An ideal scenario for many TV users is: sit down, turn on, and watch. The fewest steps to get users to the content they enjoy is generally the path they prefer.
Content recommendations appear as the first row of the TV launch screen after the first use of the device. Contributing recommendations from your app's content catalog can help bring users back to your app.
In this step, you'll learn how to create recommendations and provide them to the Android framework so your app content can be easily discovered and enjoyed by users.
In this step, you'll learn about:
Content recommendations are created with background processing. In order for your application to contribute recommendations, create a service that periodically adds listings from your app's catalog to the system's list of recommendations.
→ Under fastlane
create a new class RecommendationsService
extending IntentService
.
public class RecommendationsService extends IntentService
→ We'll define a few constants for rendering and tagging, and declare a NotificationManager.
private static final String TAG = "RecommendationsService";
private static final int MAX_RECOMMENDATIONS = 3;
public static final String EXTRA_BACKGROUND_IMAGE_URL = "background_image_url";
private static final int DETAIL_THUMB_WIDTH = 274;
private static final int DETAIL_THUMB_HEIGHT = 274;
private NotificationManager mNotificationManager;
→ Create the default constructor.
public RecommendationsService() {
super("RecommendationsService");
}
→ Next, override the onHandleIntent function. As an example recommendation service, we'll use the same video selections as the browse fragment. We acquire a ContentProviderClient, then create a Cursor from it.
@Override
protected void onHandleIntent(Intent intent) {
ContentProviderClient client = getContentResolver()
.acquireContentProviderClient(VideoItemContract.VideoItem.buildDirUri());
try {
Cursor cursor = client.query(VideoItemContract.VideoItem.buildDirUri(),
VideoDataManager.PROJECTION,
null,
null,
VideoItemContract.VideoItem.DEFAULT_SORT);
→ Instantiate the VideoItemMapper that we defined in VideoDataManager and bind it to the cursor with bindColumns.
VideoDataManager.VideoItemMapper mapper = new VideoDataManager.VideoItemMapper();
mapper.bindColumns(cursor);
→ Instantiate a NotificationManager
.
mNotificationManager = (NotificationManager) getApplicationContext()
.getSystemService(Context.NOTIFICATION_SERVICE);
→ Create a counter so that we create at most MAX_RECOMMENDATIONS recommendations.
int count = 1;
→ Loop through the cursor until we're out of recommendations or we've hit our max, and create pending intents for each. The pending intents will direct the user to the details view of the video.
while (cursor.moveToNext() && count <= MAX_RECOMMENDATIONS) {
Video video = mapper.bind(cursor);
PendingIntent pendingIntent = buildPendingIntent(video);
Bundle extras = new Bundle();
extras.putString(EXTRA_BACKGROUND_IMAGE_URL, video.getThumbUrl());
count++;
}
→ Create a utility function to create the PendingIntent
from Video
.
private PendingIntent buildPendingIntent(Video video) {
Intent detailsIntent = new Intent(this, PlayerActivity.class);
detailsIntent.putExtra(Video.INTENT_EXTRA_VIDEO, video);
TaskStackBuilder stackBuilder = TaskStackBuilder.create(this);
stackBuilder.addParentStack(VideoDetailsActivity.class);
stackBuilder.addNextIntent(detailsIntent);
// Ensure unique PendingIntents, otherwise all recommendations end up with the same
// PendingIntent
detailsIntent.setAction(Long.toString(video.getId()));
PendingIntent intent = stackBuilder.getPendingIntent(0, PendingIntent.FLAG_UPDATE_CURRENT);
return intent;
}
→ Finally, close the cursor and catch potential errors. You should end up with something similar to the code below.
@Override
protected void onHandleIntent(Intent intent) {
ContentProviderClient client = getContentResolver().acquireContentProviderClient(VideoItemContract.VideoItem.buildDirUri());
try {
Cursor cursor = client.query(VideoItemContract.VideoItem.buildDirUri(), VideoDataManager.PROJECTION, null, null, VideoItemContract.VideoItem.DEFAULT_SORT);
VideoDataManager.VideoItemMapper mapper = new VideoDataManager.VideoItemMapper();
mapper.bindColumns(cursor);
mNotificationManager = (NotificationManager) getApplicationContext()
.getSystemService(Context.NOTIFICATION_SERVICE);
int count = 1;
while (cursor.moveToNext() && count <= MAX_RECOMMENDATIONS) {
Video video = mapper.bind(cursor);
PendingIntent pendingIntent = buildPendingIntent(video);
Bundle extras = new Bundle();
extras.putString(EXTRA_BACKGROUND_IMAGE_URL, video.getThumbUrl());
count++;
}
cursor.close();
} catch (RemoteException re) {
} catch (IOException re) {
} finally {
mNotificationManager = null;
}
}
Once the recommended videos are loaded, the service must create recommendations and pass them to the Android framework. The framework receives the recommendations as Notification objects that use a specific template and are marked with a specific category.
The following code example demonstrates how to get an instance of the NotificationManager
, build a recommendation, and pass it to the manager. This code needs to be added in the while loop after the PendingIntent
has been created.
Bitmap image = Picasso.with(getApplicationContext())
.load(video.getThumbUrl())
.resize(VideoDetailsFragment.dpToPx(DETAIL_THUMB_WIDTH, getApplicationContext()), VideoDetailsFragment.dpToPx(DETAIL_THUMB_WIDTH, getApplicationContext()))
.get();
Notification notification = new NotificationCompat.BigPictureStyle(
new NotificationCompat.Builder(getApplicationContext())
.setContentTitle(video.getTitle())
.setContentText(video.getDescription())
.setPriority(4)
.setLocalOnly(true)
.setOngoing(true)
.setColor(getApplicationContext().getResources().getColor(R.color.primary))
.setCategory(Notification.CATEGORY_RECOMMENDATION)
.setLargeIcon(image)
.setSmallIcon(R.drawable.ic_stat_f)
.setContentIntent(pendingIntent)
.setExtras(extras))
.build();
mNotificationManager.notify(count, notification);
In order for this service to be recognized by the system and run, register it using your app manifest. The following code snippet illustrates how to declare this class as a service:
<manifest ... >
<application ... >
...
<service android:name="com.android.example.leanback.fastlane.RecommendationsService"
android:enabled="true" android:exported="true"/>
</application>
</manifest>
Your app's recommendation service must run periodically in order to keep the recommendations current. To run your service, create a class that sets up a timer and invokes the service at regular intervals. The following code example extends the BroadcastReceiver class to start periodic execution of the recommendation service every half hour:
public class BootCompleteReceiver extends BroadcastReceiver {
private static final long INITIAL_DELAY = 5000;
public BootCompleteReceiver() {
}
@Override
public void onReceive(Context context, Intent intent) {
if (intent.getAction().endsWith(Intent.ACTION_BOOT_COMPLETED)) {
scheduleRecommendationUpdate(context);
}
}
private void scheduleRecommendationUpdate(Context context) {
AlarmManager alarmManager = (AlarmManager) context.getSystemService(Context.ALARM_SERVICE);
Intent recommendationIntent = new Intent(context, RecommendationsService.class);
PendingIntent alarmIntent = PendingIntent.getService(context, 0, recommendationIntent, 0);
alarmManager.setInexactRepeating(AlarmManager.ELAPSED_REALTIME_WAKEUP,
INITIAL_DELAY,
AlarmManager.INTERVAL_HALF_HOUR,
alarmIntent);
}
}
This implementation of the BroadcastReceiver
class must run after startup of the TV device where it is installed. To accomplish this, register this class in your app manifest with an intent filter that listens for the completion of the device boot process. The following code demonstrates how to add this configuration to the manifest.
<manifest ... >
<application ... >
<receiver android:name="com.android.example.leanback.fastlane.BootCompleteReceiver"
android:enabled="true"
android:exported="false">
<intent-filter>
<action android:name="android.intent.action.BOOT_COMPLETED"/>
</intent-filter>
</receiver>
</application>
</manifest>
Important: Receiving a boot completed notification requires that your app requests the RECEIVE_BOOT_COMPLETED
permission. For more information, see ACTION_BOOT_COMPLETED
.
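That means adding the permission to your manifest alongside the receiver declaration:
<uses-permission android:name="android.permission.RECEIVE_BOOT_COMPLETED" />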
Congrats, you've completed adding recommendations for your app.
Try running it, and you should start seeing recommendations after 30 minutes. If you want to see them immediately, start the service through adb.
adb shell am startservice com.android.example.leanback/.fastlane.RecommendationsService
In this step you've learned about:
Next up: adding polish with animations and transitions.
We want to specify the background image as the user browses from item to item in the BrowseFragment.
The Leanback library supports developers in creating immersive TV experiences. This includes using large pictures in the background to improve the experience. In this step you will learn how to use the BackgroundManager to change the background of the user interface according to the video selected in the UI. The BackgroundManager supports background image continuity between multiple Activities. It should be noted that the BackgroundManager holds references to potentially large bitmap Drawables. Call release to drop these references when the Activity is not visible (a minimal sketch follows the list below).
In this step, you'll learn about:
- The BackgroundManager of the Leanback library
- BackgroundHelper
- The Picasso Target interface
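As a minimal sketch of that release call (not part of the checkpoint code), you could drop the background references when the fragment's activity stops:
@Override
public void onStop() {
    super.onStop();
    // Let go of the potentially large background bitmap while the Activity is not visible.
    BackgroundManager.getInstance(getActivity()).release();
}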
Under fastlane
create a class BackgroundHelper
which we are going to extend step by step to add functionality to change the background image for our TV activities.
public class BackgroundHelper {
private static long BACKGROUND_UPDATE_DELAY = 200;
private final Handler mHandler = new Handler();
private Activity mActivity;
private DisplayMetrics mMetrics;
private Timer mBackgroundTimer;
private String mBackgroundURL;
private Drawable mDefaultBackground;
private Target mBackgroundTarget;
public BackgroundHelper(Activity mActivity) {
this.mActivity = mActivity;
}
public void setBackgroundUrl(String backgroundUrl) {
this.mBackgroundURL = backgroundUrl;
}
}
The Target interface of the Picasso library acts as a listener for the end of image loading and manipulation, which happens off the UI thread. We need a custom implementation of Target that sets the resulting bitmap as the background image. Note that it is important to have a proper implementation of hashCode and equals for every Target implementation.
static class PicassoBackgroundManagerTarget implements Target {
BackgroundManager mBackgroundManager;
public PicassoBackgroundManagerTarget(BackgroundManager backgroundManager) {
this.mBackgroundManager = backgroundManager;
}
@Override
public void onBitmapLoaded(Bitmap bitmap, Picasso.LoadedFrom loadedFrom) {
this.mBackgroundManager.setBitmap(bitmap);
}
@Override
public void onBitmapFailed(Drawable drawable) {
this.mBackgroundManager.setDrawable(drawable);
}
@Override
public void onPrepareLoad(Drawable drawable) {
// Do nothing, default_background manager has its own transitions
}
@Override
public boolean equals(Object o) {
if (this == o) {
return true;
}
if (o == null || getClass() != o.getClass()) {
return false;
}
PicassoBackgroundManagerTarget that = (PicassoBackgroundManagerTarget) o;
return mBackgroundManager.equals(that.mBackgroundManager);
}
@Override
public int hashCode() {
return mBackgroundManager.hashCode();
}
}
We attach the Window of the current activity to the BackgroundManager and instantiate the PicassoBackgroundManagerTarget. We set a default background drawable and get the metrics that describe the size and density of the display.
public void prepareBackgroundManager() {
BackgroundManager backgroundManager = BackgroundManager.getInstance(mActivity);
backgroundManager.attach(mActivity.getWindow());
mBackgroundTarget = new PicassoBackgroundManagerTarget(backgroundManager);
mDefaultBackground = ContextCompat.getDrawable(mActivity, R.drawable.default_background);
mMetrics = new DisplayMetrics();
mActivity.getWindowManager().getDefaultDisplay().getMetrics(mMetrics);
}
We are using Picasso to load and manipulate the image. Once done, the PicassoBackgroundManagerTarget instance applies the loaded image to the UI. The method also cancels any running background timer so that only one timer is active at a time.
protected void updateBackground(String url) {
Picasso.with(mActivity)
.load(url)
.resize(mMetrics.widthPixels, mMetrics.heightPixels)
.centerCrop()
.transform(BlurTransform.getInstance(mActivity))
.error(mDefaultBackground)
.into(mBackgroundTarget);
if (null != mBackgroundTimer) {
mBackgroundTimer.cancel();
}
}
This is a subclass of TimerTask
to be used to delay updating the background.
private class UpdateBackgroundTask extends TimerTask {
@Override
public void run() {
mHandler.post(new Runnable() {
@Override
public void run() {
if (mBackgroundURL != null) {
updateBackground(mBackgroundURL);
}
}
});
}
}
This method uses UpdateBackgroundTask to schedule a Timer that updates the background after a short delay.
public void startBackgroundTimer() {
if (null != mBackgroundTimer) {
mBackgroundTimer.cancel();
}
mBackgroundTimer = new Timer();
mBackgroundTimer.schedule(new UpdateBackgroundTask(), BACKGROUND_UPDATE_DELAY);
}
This is an implementation of the interface com.squareup.picasso.Transformation
which we use to blur the image to be set as background. We start with auto-generated dummy implementations of the required methods transform
and key
.
public class BlurTransform implements Transformation {
@Override
public Bitmap transform(Bitmap source) {
return null;
}
@Override
public String key() {
return null;
}
}
We want the BlurTransform to exist only once, so we make it a singleton and instantiate a RenderScript in the private constructor, which takes a Context as its single argument.
RenderScript rs;
static BlurTransform blurTransform;
protected BlurTransform() {
// Exists only to defeat instantiation.
}
private BlurTransform(Context context) {
super();
rs = RenderScript.create(context);
}
public static BlurTransform getInstance(Context context) {
if (blurTransform == null) {
blurTransform = new BlurTransform(context);
}
return blurTransform;
}
The meat of this class is the transform method, which does the actual blurring of the image.
@Override
public Bitmap transform(Bitmap bitmap) {
// Create another bitmap that will hold the results of the filter.
Bitmap blurredBitmap = Bitmap.createBitmap(bitmap);
// Allocate memory for Renderscript to work with
Allocation input = Allocation.createFromBitmap(rs, bitmap, Allocation.MipmapControl.MIPMAP_FULL, Allocation.USAGE_SHARED);
Allocation output = Allocation.createTyped(rs, input.getType());
// Load up an instance of the specific script that we want to use.
ScriptIntrinsicBlur script = ScriptIntrinsicBlur.create(rs, Element.U8_4(rs));
script.setInput(input);
// Set the blur radius
script.setRadius(20);
// Start the ScriptIntrinsicBlur
script.forEach(output);
// Copy the output to the blurred bitmap
output.copyTo(blurredBitmap);
bitmap.recycle();
return blurredBitmap;
}
@Override
public String key() {
return "blur";
}
We complete the calls to the Picasso library in the updateBackground method by adding the transform call after centerCrop, as already shown in the updateBackground code above:
.transform(BlurTransform.getInstance(mActivity))
These classes are not used anywhere yet. We'll wire them up first in the LeanbackBrowseFragment. Add a member variable bgHelper:
private BackgroundHelper bgHelper;
and instantiate it at the end of the init method of the LeanbackBrowseFragment
:
bgHelper = new BackgroundHelper(getActivity());
bgHelper.prepareBackgroundManager();
The BackgroundHelper should be updated each time the user selects an item view. Create a factory method getDefaultItemSelectedListener which does that:
protected OnItemViewSelectedListener getDefaultItemSelectedListener() {
return new OnItemViewSelectedListener() {
public void onItemSelected(Presenter.ViewHolder itemViewHolder, Object item,
RowPresenter.ViewHolder rowViewHolder, Row row) {
if (item instanceof Video) {
bgHelper.setBackgroundUrl(((Video) item).getThumbUrl());
bgHelper.startBackgroundTimer();
}
}
};
}
Now we just register the listener in the init method.
setOnItemViewSelectedListener(getDefaultItemSelectedListener());
The background should also be set when the details of a video are shown, so we apply this in the VideoDetailsFragment as well. It's simpler here because we don't need a listener; we just add it to the onCreate() method of the fragment.
Add the member variable:
BackgroundHelper bgHelper;
and use it in the onCreate
method.
bgHelper = new BackgroundHelper(getActivity());
bgHelper.prepareBackgroundManager();
bgHelper.updateBackground(selectedVideo.getThumbUrl());
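Putting the pieces from this and the earlier steps together, the fragment's onCreate now looks roughly like this:
@Override
public void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);
    selectedVideo = (Video) getActivity()
            .getIntent()
            .getSerializableExtra(Video.INTENT_EXTRA_VIDEO);
    bgHelper = new BackgroundHelper(getActivity());
    bgHelper.prepareBackgroundManager();
    bgHelper.updateBackground(selectedVideo.getThumbUrl());
    new DetailRowBuilderTask().execute(selectedVideo);
}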
Congrats, you've completed the codelab!
We have some bonus content on leveraging the PlaybackOverlayFragment to easily add playback controls. An example of adding a PlaybackOverlayFragment to an existing player and wiring up the appropriate playback controls can be found in checkpoint_6. The PlaybackOverlayFragment also lets you help users find related content in your app without stopping playback.
If you're looking for sample code for additional features, check out our full sample. It includes sample code for the search fragment, grid fragment and others.