I have some code:

    class WorkEngineTask extends AsyncTask<Void, RecognitionResult, Void> {
        @Override
        protected Void doInBackground(Void... unused) {
            while (true) {
                try {
                    frameReady.acquire();  // waiting for the frame

                    if (!processing) {
                        break;
                    }
                    Camera.Size size = camera.getParameters().getPreviewSize();
                    RecognitionResult result;
                    switch (mAngle) {
                        case 0:
                            result = session.ProcessYUVSnapshot(mData, size.width, size.height, ImageOrientation.Landscape);
                            break;
                        case 180:
                            result = session.ProcessYUVSnapshot(mData, size.width, size.height, ImageOrientation.InvertedLandscape);
                            break;
                        case 270:
                            result = session.ProcessYUVSnapshot(mData, size.width, size.height, ImageOrientation.InvertedPortrait);
                            break;
                        default:
                            result = session.ProcessYUVSnapshot(mData, size.width, size.height, ImageOrientation.Portrait);
                    }
                    publishProgress(result);  // show current result
                } catch (Exception e) {
                    // exceptions are silently swallowed here
                }
            }
            return null;
        }

        @Override
        protected void onProgressUpdate(RecognitionResult... res) {
            RecognitionResult result = res[0];
        }
    }


I need to convert it to RxJava.
I am trying to write the code myself. Here is what I have so far:

    public void workEngine() {
        Observable.create((ObservableOnSubscribe<RecognitionResult>) emitter -> {
            while (true) {
                try {
                    frameReady.acquire();  // waiting for the frame

                    if (!processing) {
                        break;
                    }

                    Camera.Size size = camera.getParameters().getPreviewSize();
                    RecognitionResult result;

                    switch (mAngle) {
                        case 0:
                            result = session.ProcessYUVSnapshot(mData, size.width, size.height, ImageOrientation.Landscape);
                            break;
                        case 180:
                            result = session.ProcessYUVSnapshot(mData, size.width, size.height, ImageOrientation.InvertedLandscape);
                            break;
                        case 270:
                            result = session.ProcessYUVSnapshot(mData, size.width, size.height, ImageOrientation.InvertedPortrait);
                            break;
                        default:
                            result = session.ProcessYUVSnapshot(mData, size.width, size.height, ImageOrientation.Portrait);
                    }
                    emitter.onNext(result);
                } catch (Exception e) {
                    String message = "Error while processing frame: " + e.toString();
                    Log.d("smartid", message);
                    callback.error(message);
                }
            }
        }).observeOn(Schedulers.io())
          .subscribeOn(AndroidSchedulers.mainThread())
          .subscribe(new Observer<RecognitionResult>() {
              @Override
              public void onSubscribe(Disposable d) {
              }

              @Override
              public void onNext(RecognitionResult recognitionResult) {
                  draw.showResult(recognitionResult);
                  draw.invalidate();
                  callback.recognized(recognitionResult);
                  frameWaiting.release();
              }

              @Override
              public void onComplete() {
              }

              @Override
              public void onError(Throwable e) {
                  e.printStackTrace();
              }
          });
    }


But this doesn't work for me. I'm new to Android; I have never used AsyncTask and don't want to start using it now. I have tried many things, but nothing has helped, and I don't understand what I'm doing wrong. Please help me fix this.

Best answer

Whatever you were doing with the result in the AsyncTask should happen in onNext(); that is where each observed result arrives. onComplete() is called once there is nothing left to emit, so cleanup, such as the release call, belongs there. If you want something like logging when an error occurs, do it in onError().

    @Override
    public void onNext(RecognitionResult recognitionResult) {
        // This doesn't match what you did in the AsyncTask above
        draw.showResult(recognitionResult);
        draw.invalidate();
        callback.recognized(recognitionResult);
    }

    @Override
    public void onComplete() {
        frameWaiting.release();
    }
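To make the frameReady/frameWaiting pair easier to reason about, here is a self-contained plain-Java sketch of the two-semaphore handshake the question's loop relies on: the worker blocks on frameReady until a frame arrives, and releases frameWaiting once it is done so the producer may deliver the next one. The class and field names here are hypothetical illustrations (no Android or RxJava types involved); only the Semaphore pattern itself is taken from the code above.

```java
import java.util.concurrent.Semaphore;
import java.util.concurrent.TimeUnit;

public class FrameHandshake {
    // frameReady: signaled by the producer when a new frame is available
    private static final Semaphore frameReady = new Semaphore(0);
    // frameWaiting: signaled by the worker when it can take another frame
    // (starts with 1 permit so the very first frame may be delivered)
    private static final Semaphore frameWaiting = new Semaphore(1);
    private static volatile boolean processing = true;

    public static void main(String[] args) throws InterruptedException {
        Thread worker = new Thread(() -> {
            while (true) {
                try {
                    frameReady.acquire();      // wait for the next frame
                    if (!processing) {
                        break;                 // flag is checked after wake-up, as in the question
                    }
                    System.out.println("processed a frame");
                    frameWaiting.release();    // tell the producer we can take another
                } catch (InterruptedException e) {
                    break;
                }
            }
        });
        worker.start();

        // "camera callback" side: deliver three frames, respecting the handshake
        for (int i = 0; i < 3; i++) {
            frameWaiting.acquire();   // don't overwrite a frame still being processed
            frameReady.release();     // hand the frame to the worker
        }

        // shut down: clear the flag, then wake the worker one last time so it can exit
        frameWaiting.acquire();
        processing = false;
        frameReady.release();
        worker.join(TimeUnit.SECONDS.toMillis(5));
        System.out.println("worker stopped: " + !worker.isAlive());
    }
}
```

The same rhythm underlies the Observable.create loop above: acquire before processing, release only after the result has been handed off, and wake the worker one final time after clearing the processing flag, otherwise the loop stays blocked in acquire() forever.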
