I am trying to add a visual waveform to an already-working app that records sound and then plays it back. The record/playback code works fine, but when I try to add the com.pheelicks.app visualizer, record my own voice, and then play it back, pressing my own play button crashes the app.
In his visualizer app he supplies his own mp3 sound file, played through MediaPlayer. Since I also play my recorded sound through MediaPlayer, I figured it would be easy to adapt the code to include his visualizer part. But something is going wrong.
Strangely, his code runs perfectly on my Android phone (Samsung Galaxy 5), and I can see the visualizer for the music.
I have tried researching similar questions but found no answer. I tried this and this, which are similar errors, but I found no solution there either.
The error seems to come from his code, in the Bitmap part of VisualizerView.java, where getWidth() and getHeight() are called on the Canvas in onDraw(). I also tried logging those values, but they did not show up in my LogCat. I am using Android Studio. Thanks for your help!
EDIT:
Ah, I see my Log statement was not working because it was placed after the crash point. I moved it before the crash point:
if (mCanvasBitmap == null) {
    mCanvasBitmap = Bitmap.createBitmap(canvas.getWidth(),
            canvas.getHeight(), Config.ARGB_8888);
}
and this is what I got from LogCat:
Value of getWidth is 924 and value of getHeight is 0
So the question is: why is the height zero?
VisualizerView.java
package org.azurespot.waveform;

/**
 * Created by mizu on 2/2/15.
 */

import android.content.Context;
import android.graphics.Bitmap;
import android.graphics.Bitmap.Config;
import android.graphics.Canvas;
import android.graphics.Color;
import android.graphics.Matrix;
import android.graphics.Paint;
import android.graphics.PorterDuff.Mode;
import android.graphics.PorterDuffXfermode;
import android.graphics.Rect;
import android.media.MediaPlayer;
import android.media.audiofx.Visualizer;
import android.util.AttributeSet;
import android.view.View;

import java.util.HashSet;
import java.util.Set;

import static android.util.Log.d;

/**
 * A class that draws visualizations of data received from a
 * {@link Visualizer.OnDataCaptureListener#onWaveFormDataCapture } and
 * {@link Visualizer.OnDataCaptureListener#onFftDataCapture }
 */
public class VisualizerView extends View {
    private static final String TAG = "VisualizerView";

    private byte[] mBytes;
    private byte[] mFFTBytes;
    private Rect mRect = new Rect();
    private Visualizer mVisualizer;
    private Set<Renderer> mRenderers;
    private Paint mFlashPaint = new Paint();
    private Paint mFadePaint = new Paint();

    public VisualizerView(Context context, AttributeSet attrs, int defStyle) {
        super(context, attrs);
        init();
    }

    public VisualizerView(Context context, AttributeSet attrs) {
        this(context, attrs, 0);
    }

    public VisualizerView(Context context) {
        this(context, null, 0);
    }

    private void init() {
        mBytes = null;
        mFFTBytes = null;
        mFlashPaint.setColor(Color.argb(122, 255, 255, 255));
        mFadePaint.setColor(Color.argb(238, 255, 255, 255)); // Adjust alpha to change how quickly the image fades
        mFadePaint.setXfermode(new PorterDuffXfermode(Mode.MULTIPLY));
        mRenderers = new HashSet<Renderer>();
    }

    /**
     * Links the visualizer to a player
     *
     * @param player - MediaPlayer instance to link to
     */
    public void link(MediaPlayer player) {
        if (player == null) {
            throw new NullPointerException("Cannot link to null MediaPlayer");
        }

        // Create the Visualizer object and attach it to our media player.
        mVisualizer = new Visualizer(player.getAudioSessionId());
        mVisualizer.setEnabled(false);
        mVisualizer.setCaptureSize(Visualizer.getCaptureSizeRange()[1]);

        // Pass through Visualizer data to VisualizerView
        Visualizer.OnDataCaptureListener captureListener = new Visualizer.OnDataCaptureListener() {
            @Override
            public void onWaveFormDataCapture(Visualizer visualizer, byte[] bytes,
                                              int samplingRate) {
                updateVisualizer(bytes);
            }

            @Override
            public void onFftDataCapture(Visualizer visualizer, byte[] bytes,
                                         int samplingRate) {
                updateVisualizerFFT(bytes);
            }
        };

        mVisualizer.setDataCaptureListener(captureListener,
                Visualizer.getMaxCaptureRate() / 2, true, true);

        // Enable Visualizer and disable it when we're done with the stream
        mVisualizer.setEnabled(true);
        player.setOnCompletionListener(new MediaPlayer.OnCompletionListener() {
            @Override
            public void onCompletion(MediaPlayer mediaPlayer) {
                mVisualizer.setEnabled(false);
            }
        });
    }

    public void addRenderer(Renderer renderer) {
        if (renderer != null) {
            mRenderers.add(renderer);
        }
    }

    public void clearRenderers() {
        mRenderers.clear();
    }

    /**
     * Call to release the resources used by VisualizerView. Like with the
     * MediaPlayer it is good practice to call this method
     */
    public void release() {
        mVisualizer.release();
    }

    /**
     * Pass data to the visualizer. Typically this will be obtained from the
     * Android Visualizer.OnDataCaptureListener call back. See
     * {@link Visualizer.OnDataCaptureListener#onWaveFormDataCapture }
     *
     * @param bytes
     */
    public void updateVisualizer(byte[] bytes) {
        mBytes = bytes;
        invalidate();
    }

    /**
     * Pass FFT data to the visualizer. Typically this will be obtained from the
     * Android Visualizer.OnDataCaptureListener call back. See
     * {@link Visualizer.OnDataCaptureListener#onFftDataCapture }
     *
     * @param bytes
     */
    public void updateVisualizerFFT(byte[] bytes) {
        mFFTBytes = bytes;
        invalidate();
    }

    boolean mFlash = false;

    /**
     * Call this to make the visualizer flash. Useful for flashing at the start
     * of a song/loop etc...
     */
    public void flash() {
        mFlash = true;
        invalidate();
    }

    Bitmap mCanvasBitmap;
    Canvas mCanvas;

    @Override
    protected void onDraw(Canvas canvas) {
        super.onDraw(canvas);

        // Create canvas once we're ready to draw
        mRect.set(0, 0, getWidth(), getHeight());

        if (mCanvasBitmap == null) {
            mCanvasBitmap = Bitmap.createBitmap(canvas.getWidth(),
                    canvas.getHeight(), Config.ARGB_8888);
        }
        d("DEBUG", "Value of getWidth is " + canvas.getWidth()
                + " and value of getHeight is " + canvas.getHeight());
        if (mCanvas == null) {
            mCanvas = new Canvas(mCanvasBitmap);
        }

        if (mBytes != null) {
            // Render all audio renderers
            AudioData audioData = new AudioData(mBytes);
            for (Renderer r : mRenderers) {
                r.render(mCanvas, audioData, mRect);
            }
        }

        if (mFFTBytes != null) {
            // Render all FFT renderers
            FFTData fftData = new FFTData(mFFTBytes);
            for (Renderer r : mRenderers) {
                r.render(mCanvas, fftData, mRect);
            }
        }

        // Fade out old contents
        mCanvas.drawPaint(mFadePaint);

        if (mFlash) {
            mFlash = false;
            mCanvas.drawPaint(mFlashPaint);
        }

        canvas.drawBitmap(mCanvasBitmap, new Matrix(), null);
    }
}
LogCat
02-02 22:31:16.699 19125-19125/? E/AndroidRuntime﹕ FATAL EXCEPTION: main
Process: org.azurespot, PID: 19125
java.lang.IllegalArgumentException: width and height must be > 0
at android.graphics.Bitmap.createBitmap(Bitmap.java:922)
at android.graphics.Bitmap.createBitmap(Bitmap.java:901)
at android.graphics.Bitmap.createBitmap(Bitmap.java:868)
at org.azurespot.waveform.VisualizerView.onDraw(VisualizerView.java:176)
at android.view.View.draw(View.java:15393)
at android.view.View.getDisplayList(View.java:14287)
at android.view.View.getDisplayList(View.java:14329)
at android.view.ViewGroup.dispatchGetDisplayList(ViewGroup.java:3284)
at android.view.View.getDisplayList(View.java:14224)
at android.view.View.getDisplayList(View.java:14329)
at android.view.ViewGroup.dispatchGetDisplayList(ViewGroup.java:3284)
at android.view.View.getDisplayList(View.java:14224)
at android.view.View.getDisplayList(View.java:14329)
at android.view.ViewGroup.dispatchGetDisplayList(ViewGroup.java:3284)
at android.view.View.getDisplayList(View.java:14224)
at android.view.View.getDisplayList(View.java:14329)
at android.view.ViewGroup.dispatchGetDisplayList(ViewGroup.java:3284)
at android.view.View.getDisplayList(View.java:14224)
at android.view.View.getDisplayList(View.java:14329)
at android.view.ViewGroup.dispatchGetDisplayList(ViewGroup.java:3284)
at android.view.View.getDisplayList(View.java:14224)
at android.view.View.getDisplayList(View.java:14329)
at android.view.HardwareRenderer$GlRenderer.buildDisplayList(HardwareRenderer.java:1576)
at android.view.HardwareRenderer$GlRenderer.draw(HardwareRenderer.java:1455)
at android.view.ViewRootImpl.draw(ViewRootImpl.java:2754)
at android.view.ViewRootImpl.performDraw(ViewRootImpl.java:2620)
at android.view.ViewRootImpl.performTraversals(ViewRootImpl.java:2188)
at android.view.ViewRootImpl.doTraversal(ViewRootImpl.java:1249)
at android.view.ViewRootImpl$TraversalRunnable.run(ViewRootImpl.java:6585)
at android.view.Choreographer$CallbackRecord.run(Choreographer.java:803)
at android.view.Choreographer.doCallbacks(Choreographer.java:603)
at android.view.Choreographer.doFrame(Choreographer.java:573)
at android.view.Choreographer$FrameDisplayEventReceiver.run(Choreographer.java:789)
at android.os.Handler.handleCallback(Handler.java:733)
at android.os.Handler.dispatchMessage(Handler.java:95)
at android.os.Looper.loop(Looper.java:136)
at android.app.ActivityThread.main(ActivityThread.java:5579)
at java.lang.reflect.Method.invokeNative(Native Method)
at java.lang.reflect.Method.invoke(Method.java:515)
at com.android.internal.os.ZygoteInit$MethodAndArgsCaller.run(ZygoteInit.java:1268)
at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:1084)
at dalvik.system.NativeStart.main(Native Method)
Best answer
It turns out that the FrameLayout widget containing my VisualizerView in my xml was the culprit! The code came from com.pheelicks.app, so I suspected nothing, and I had seen a FrameLayout display at that size, 0dp, before (in a fragment). But then, by chance, I decided to make it bigger, and the visualizer did appear. These little things are amazing. Below is the widget from my xml; I changed the height to 200dp (from 0dp), and that fixed it.
activity_make_sounds.xml
<FrameLayout
    android:layout_width="fill_parent"
    android:layout_height="200dp"
    android:layout_margin="10dp"
    android:background="#000" >

    <org.azurespot.waveform.VisualizerView
        android:id="@+id/visualizerView"
        android:layout_width="fill_parent"
        android:layout_height="fill_parent" >
    </org.azurespot.waveform.VisualizerView>
</FrameLayout>
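Independent of the layout fix, the crash itself can be avoided defensively: skip the bitmap allocation while the view still reports a zero dimension, since onDraw() can run before the view has been given a usable size, and Bitmap.createBitmap() throws IllegalArgumentException unless both dimensions are > 0. A minimal sketch of that guard logic, using a hypothetical helper class (not part of the original code):

```java
// Hypothetical helper isolating the size check that onDraw() would perform
// before calling Bitmap.createBitmap().
public class BitmapSizeGuard {

    // True only when it is safe to allocate a bitmap of this size:
    // Bitmap.createBitmap() requires both width and height to be > 0.
    public static boolean canCreateBitmap(int width, int height) {
        return width > 0 && height > 0;
    }

    public static void main(String[] args) {
        // The values from the question's LogCat: width 924, height 0.
        System.out.println(canCreateBitmap(924, 0));   // the crashing first pass
        System.out.println(canCreateBitmap(924, 200)); // safe after the 200dp fix
    }
}
```

In onDraw() the allocation would then be wrapped as `if (mCanvasBitmap == null && canCreateBitmap(canvas.getWidth(), canvas.getHeight())) { ... }`, so a zero-height pass simply draws nothing instead of throwing. Moving the allocation into onSizeChanged(), which reports the new width and height directly, is another common way to avoid sizing the bitmap during a draw pass.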