How do you play a long AudioClip?

Posted: 2012-02-27 18:24:42


Question:

I wrote a simple class to play audio files in a simple game. It works fine for small sounds such as gunshots or explosions, but when I try to use it for the background music I get this error: "Failed to allocate clip data: Requested buffer too large." I assume that means the file is too big, but how do I get around that? Source:

import java.io.File;
import javax.sound.sampled.AudioInputStream;
import javax.sound.sampled.AudioSystem;
import javax.sound.sampled.Clip;

public class Sound {

    private Clip clip;

    public Sound(String filepath) {
        System.out.println(filepath);
        File file = new File(filepath);
        try {
            clip = AudioSystem.getClip();
            AudioInputStream inputStream = AudioSystem.getAudioInputStream(file);
            clip.open(inputStream);
        } catch (Exception e) {
            System.err.println(e.getMessage());
        }
    }

    public void play() {
        System.out.println("play");
        if (clip.isActive()) {
            clip.stop();
        }
        clip.setFramePosition(0);
        clip.start();
    }

    public void stop() {
        clip.stop();
    }

    public void loop() {
        if (!clip.isActive()) {
            clip.setFramePosition(0);
            clip.loop(Clip.LOOP_CONTINUOUSLY);
        } else {
            System.out.println("ALREADY PLAYING");
        }
    }

    public boolean getActive() {
        return clip.isActive();
    }
}

Comments:

A SourceDataLine is generally used for longer sounds or background music. I'm curious why you chose not to use it.

Answer 1:

Use a BigClip. It is a class I put together for playing MP3s of 12-18 minutes (or more) in length.

It needs mp3plugin.jar on the runtime classpath to actually load MP3-format sound, but that is beside the point. The point is:

    A BigClip will load the sound file into as much memory as the JVM allows before an OutOfMemoryError.
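
Not part of the answer itself, but a back-of-the-envelope check (with made-up figures) of why a whole track strains an ordinary Clip: uncompressed PCM needs sample rate × frame size bytes per second, all of which must sit on the heap at once, so a longer track may also call for a larger maximum heap (for example via the JVM's standard -Xmx option).

// Illustrative arithmetic only; the figures below are assumptions, not from the answer.
int seconds = 180;             // a hypothetical 3 minute background-music track
float sampleRate = 44100f;     // CD quality
int frameSize = 2 * (16 / 8);  // stereo, 16-bit samples -> 4 bytes per frame
long bytes = (long) (seconds * sampleRate * frameSize);
System.out.println(bytes / (1024 * 1024) + " MB"); // roughly 30 MB held in memory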

import java.awt.Component;
import javax.swing.*;
import javax.sound.sampled.*;
import java.io.*;
import java.util.logging.*;
import java.util.Arrays;

import java.net.URL;
import javax.swing.JOptionPane;

class BigClipExample {

    public static void main(String[] args) throws Exception {
        URL url = new URL("http://pscode.org/media/leftright.wav");
        BigClip clip = new BigClip();
        AudioInputStream ais = AudioSystem.getAudioInputStream(url);
        clip.open(ais);
        clip.start();
        JOptionPane.showMessageDialog(null, "BigClip.start()");
        clip.loop(4);
        JOptionPane.showMessageDialog(null, "BigClip.loop(4)");
        clip.setFastForward(true);
        clip.loop(8);
        // the looping/FF combo. reveals a bug..
        // there is a slight 'click' in the sound that should not be audible
        JOptionPane.showMessageDialog(null, "Are you on speed?");
    }
}

/** An implementation of the javax.sound.sampled.Clip that is designed
to handle Clips of arbitrary size, limited only by the amount of memory
available to the app.    It uses the post 1.4 thread behaviour (daemon thread)
that will stop the sound running after the main has exited.
<ul>
<li>2012-02-29 - Reworked play/loop to fix several bugs.
<li>2009-09-01 - Fixed bug that had clip ..clipped at the end, by calling drain() (before
calling stop()) on the dataline after the play loop was complete. Improvement to frame
and microsecond position determination.
<li>2009-08-17 - added convenience constructor that accepts a Clip. Changed the private
convertFrameToM..seconds methods from 'micro' to 'milli' to reflect that they were dealing
with units of 1000/th of a second.
<li>2009-08-14 - got rid of flush() after the sound loop, as it was cutting off tracks just
before the end, and was found to be not needed for the fast-forward/rewind functionality it
was introduced to support.
<li>2009-08-11 - First binary release.
</ul>
N.B. Remove @Override notation and logging to use in 1.3+
@since 1.5
@version 2012-02-29
@author Andrew Thompson 
@author Alejandro Garcia */
class BigClip implements Clip, LineListener {

    /** The DataLine used by this Clip. */
    private SourceDataLine dataLine;

    /** The raw bytes of the audio data. */
    private byte[] audioData;

    /** The stream wrapper for the audioData. */
    private ByteArrayInputStream inputStream;

    /** Loop count set by the calling code. */
    private int loopCount = 1;
    /** Internal count of how many loops to go. */
    private int countDown = 1;
    /** The start of a loop point. Defaults to 0. */
    private int loopPointStart;
    /** The end of a loop point. Defaults to the end of the Clip. */
    private int loopPointEnd;

    /** Stores the current frame position of the clip. */
    private int framePosition;

    /** Thread used to run() sound. */
    private Thread thread;
    /** Whether the sound is currently playing or active. */
    private boolean active;
    /** Stores the last time bytes were dumped to the audio stream. */
    private long timelastPositionSet;

    private int bufferUpdateFactor = 2;

    /** The parent Component for the loading progress dialog. */
    Component parent = null;

    /** Used for reporting messages. */
    private Logger logger = Logger.getAnonymousLogger();

    /** Default constructor for a BigClip. Does nothing. Information from the
    AudioInputStream passed in open() will be used to get an appropriate SourceDataLine. */
    public BigClip() {}

    /** There are a number of AudioSystem methods that will return a configured Clip. This
    convenience constructor allows us to obtain a SourceDataLine for the BigClip that uses
    the same AudioFormat as the original Clip.
    @param clip Clip The Clip used to configure the BigClip. */
    public BigClip(Clip clip) throws LineUnavailableException {
        dataLine = AudioSystem.getSourceDataLine( clip.getFormat() );
    }

    /** Provides the entire audio buffer of this clip.
    @return audioData byte[] The bytes of the audio data that is loaded in this Clip. */
    public byte[] getAudioData() {
        return audioData;
    }

    /** Sets a parent component to act as owner of a "Loading track.." progress dialog.
    If null, there will be no progress shown. */
    public void setParentComponent(Component parent) {
        this.parent = parent;
    }

    /** Converts a frame count to a duration in milliseconds. */
    private long convertFramesToMilliseconds(int frames) {
        return (frames/(long)dataLine.getFormat().getSampleRate())*1000;
    }

    /** Converts a duration in milliseconds to a frame count. */
    private int convertMillisecondsToFrames(long milliseconds) {
        return (int)(milliseconds/dataLine.getFormat().getSampleRate());
    }

    @Override
    public void update(LineEvent le) {
        logger.log(Level.FINEST, "update: " + le );
    }

    @Override
    public void loop(int count) {
        logger.log(Level.FINEST, "loop(" + count + ") - framePosition: " + framePosition);
        loopCount = count;
        countDown = count;
        active = true;
        inputStream.reset();

        start();
    }

    @Override
    public void setLoopPoints(int start, int end) {
        if (
            start<0 ||
            start>audioData.length-1 ||
            end<0 ||
            end>audioData.length
            ) {
            throw new IllegalArgumentException(
                "Loop points '" +
                start +
                "' and '" +
                end +
                "' cannot be set for buffer of size " +
                audioData.length);
        }
        if (start>end) {
            throw new IllegalArgumentException(
                "End position " +
                end +
                " precedes start position " + start);
        }

        loopPointStart = start;
        framePosition = loopPointStart;
        loopPointEnd = end;
    }

    @Override
    public void setMicrosecondPosition(long milliseconds) {
        framePosition = convertMillisecondsToFrames(milliseconds);
    }

    @Override
    public long getMicrosecondPosition() {
        return convertFramesToMilliseconds(getFramePosition());
    }

    @Override
    public long getMicrosecondLength() {
        return convertFramesToMilliseconds(getFrameLength());
    }

    @Override
    public void setFramePosition(int frames) {
        framePosition = frames;
        int offset = framePosition*format.getFrameSize();
        try {
            inputStream.reset();
            inputStream.read(new byte[offset]);
        } catch(Exception e) {
            e.printStackTrace();
        }
    }

    @Override
    public int getFramePosition() {
        long timeSinceLastPositionSet = System.currentTimeMillis() - timelastPositionSet;
        int size = dataLine.getBufferSize()*(format.getChannels()/2)/bufferUpdateFactor;
        int framesSinceLast = (int)((timeSinceLastPositionSet/1000f)*
            dataLine.getFormat().getFrameRate());
        int framesRemainingTillTime = size - framesSinceLast;
        return framePosition
            - framesRemainingTillTime;
    }

    @Override
    public int getFrameLength() {
        return audioData.length/format.getFrameSize();
    }

    AudioFormat format;

    @Override
    public void open(AudioInputStream stream) throws
        IOException,
        LineUnavailableException {

        AudioInputStream is1;
        format = stream.getFormat();

        if (format.getEncoding()!=AudioFormat.Encoding.PCM_SIGNED) {
            is1 = AudioSystem.getAudioInputStream(
                AudioFormat.Encoding.PCM_SIGNED, stream );
        } else {
            is1 = stream;
        }
        format = is1.getFormat();
        InputStream is2;
        if (parent!=null) {
            ProgressMonitorInputStream pmis = new ProgressMonitorInputStream(
                parent,
                "Loading track..",
                is1);
            pmis.getProgressMonitor().setMillisToPopup(0);
            is2 = pmis;
        } else {
            is2 = is1;
        }

        byte[] buf = new byte[ 1 << 16 ]; // 64 KB read buffer ('2^16' would be bitwise XOR in Java)
        int totalRead = 0;
        int numRead = 0;
        ByteArrayOutputStream baos = new ByteArrayOutputStream();
        numRead = is2.read( buf );
        while (numRead>-1) {
            baos.write( buf, 0, numRead );
            numRead = is2.read( buf, 0, buf.length );
            totalRead += numRead;
        }
        is2.close();
        audioData = baos.toByteArray();
        AudioFormat afTemp;
        if (format.getChannels()<2) {
            afTemp = new AudioFormat(
                format.getEncoding(),
                format.getSampleRate(),
                format.getSampleSizeInBits(),
                2,
                format.getSampleSizeInBits()*2/8, // calculate frame size
                format.getFrameRate(),
                format.isBigEndian()
                );
        } else {
            afTemp = format;
        }

        setLoopPoints(0,audioData.length);
        dataLine = AudioSystem.getSourceDataLine(afTemp);
        dataLine.open();
        inputStream = new ByteArrayInputStream( audioData );
    }

    @Override
    public void open(AudioFormat format,
        byte[] data,
        int offset,
        int bufferSize)
        throws LineUnavailableException {
        byte[] input = new byte[bufferSize];
        for (int ii=0; ii<input.length; ii++) {
            input[ii] = data[offset+ii];
        }
        ByteArrayInputStream inputStream = new ByteArrayInputStream(input);
        try {
            AudioInputStream ais1 = AudioSystem.getAudioInputStream(inputStream);
            AudioInputStream ais2 = AudioSystem.getAudioInputStream(format, ais1);
            open(ais2);
        } catch( UnsupportedAudioFileException uafe ) {
            throw new IllegalArgumentException(uafe);
        } catch( IOException ioe ) {
            throw new IllegalArgumentException(ioe);
        }
        // TODO - throw IAE for invalid frame size, format.
    }

    @Override
    public float getLevel() {
        return dataLine.getLevel();
    }

    @Override
    public long getLongFramePosition() {
        return dataLine.getLongFramePosition()*2/format.getChannels();
    }

    @Override
    public int available() {
        return dataLine.available();
    }

    @Override
    public int getBufferSize() {
        return dataLine.getBufferSize();
    }

    @Override
    public AudioFormat getFormat() {
        return format;
    }

    @Override
    public boolean isActive() {
        return dataLine.isActive();
    }

    @Override
    public boolean isRunning() {
        return dataLine.isRunning();
    }

    @Override
    public boolean isOpen() {
        return dataLine.isOpen();
    }

    @Override
    public void stop() {
        logger.log(Level.FINEST, "BigClip.stop()");
        active = false;
        // why did I have this commented out?
        dataLine.stop();
        if (thread!=null) {
            try {
                active = false;
                thread.join();
            } catch(InterruptedException wakeAndContinue) {
            }
        }
    }

    public byte[] convertMonoToStereo(byte[] data, int bytesRead) {
        byte[] tempData = new byte[bytesRead*2];
        if (format.getSampleSizeInBits()==8) {
            for(int ii=0; ii<bytesRead; ii++) {
                byte b = data[ii];
                tempData[ii*2] = b;
                tempData[ii*2+1] = b;
            }
        } else {
            for(int ii=0; ii<bytesRead-1; ii+=2) {
                //byte b2 = is2.read();
                byte b1 = data[ii];
                byte b2 = data[ii+1];
                tempData[ii*2] = b1;
                tempData[ii*2+1] = b2;
                tempData[ii*2+2] = b1;
                tempData[ii*2+3] = b2;
            }
        }
        return tempData;
    }

    boolean fastForward;
    boolean fastRewind;

    public void setFastForward(boolean fastForward) {
        logger.log(Level.FINEST, "FastForward " + fastForward);
        this.fastForward = fastForward;
        fastRewind = false;
        flush();
    }

    public boolean getFastForward() {
        return fastForward;
    }

    public void setFastRewind(boolean fastRewind) {
        logger.log(Level.FINEST, "FastRewind " + fastRewind);
        this.fastRewind = fastRewind;
        fastForward = false;
        flush();
    }

    public boolean getFastRewind() {
        return fastRewind;
    }

    /** TODO - fix bug in LOOP_CONTINUOUSLY */
    @Override
    public void start() {
        Runnable r = new Runnable() {
            public void run() {
                try {
                    /* Should these open()/close() calls be here, or explicitly
                    called by user program? The JavaDocs for line suggest that
                    Clip should throw an IllegalArgumentException, so we'll
                    stick with that and call it explicitly. */
                    dataLine.open();

                    dataLine.start();

                    active = true;

                    int bytesRead = 0;
                    int frameSize = dataLine.getFormat().getFrameSize();
                    int bufSize = dataLine.getBufferSize();
                    boolean startOrMove = true;
                    byte[] data = new byte[bufSize];
                    int offset = framePosition*frameSize;
                    int totalBytes = offset;
                    bytesRead = inputStream.read(new byte[offset], 0, offset);
                    logger.log(Level.FINE, "bytesRead " + bytesRead );
                    bytesRead = inputStream.read(data,0,data.length);

                    logger.log(Level.FINE, "loopCount " + loopCount );
                    logger.log(Level.FINE, "countDown " + countDown );
                    logger.log(Level.FINE, "bytesRead " + bytesRead );

                    while (bytesRead != -1 &&
                        (loopCount==Clip.LOOP_CONTINUOUSLY ||
                        countDown>0) &&
                        active ) {
                        logger.log(Level.FINEST,
                            "BigClip.start() loop " + framePosition );
                        totalBytes += bytesRead;
                        int framesRead;
                        byte[] tempData;
                        if (format.getChannels()<2) {
                            tempData = convertMonoToStereo(data, bytesRead);
                            framesRead = bytesRead/
                                format.getFrameSize();
                            bytesRead*=2;
                        } else {
                            framesRead = bytesRead/
                                dataLine.getFormat().getFrameSize();
                            tempData = Arrays.copyOfRange(data, 0, bytesRead);
                        }
                        framePosition += framesRead;
                        if (framePosition>=loopPointEnd) {
                            framePosition = loopPointStart;
                            inputStream.reset();
                            countDown--;
                            logger.log(Level.FINEST,
                                "Loop Count: " + countDown );
                        }
                        timelastPositionSet = System.currentTimeMillis();
                        byte[] newData;
                        if (fastForward) {
                            newData = getEveryNthFrame(tempData, 2);
                        } else if (fastRewind) {
                            byte[] temp = getEveryNthFrame(tempData, 2);
                            newData = reverseFrames(temp);
                            inputStream.reset();
                            totalBytes -= 2*bytesRead;
                            framePosition -= 2*framesRead;
                            if (totalBytes<0) {
                                setFastRewind(false);
                                totalBytes = 0;
                            }
                            inputStream.skip(totalBytes);
                            logger.log(Level.FINE, "totalBytes " + totalBytes);
                        } else {
                            newData = tempData;
                        }
                        dataLine.write(newData, 0, newData.length);
                        if (startOrMove) {
                            data = new byte[bufSize/
                                bufferUpdateFactor];
                            startOrMove = false;
                        }
                        bytesRead = inputStream.read(data,0,data.length);
                        if (bytesRead<0 && countDown-->1) {
                            inputStream.read(new byte[offset], 0, offset);
                            logger.log(Level.FINE, "loopCount " + loopCount );
                            logger.log(Level.FINE, "countDown " + countDown );
                            inputStream.reset();
                            bytesRead = inputStream.read(data,0,data.length);
                        }
                    }
                    logger.log(Level.FINEST,
                        "BigClip.start() loop ENDED" + framePosition );
                    active = false;
                    countDown = 1;
                    framePosition = 0;
                    inputStream.reset();
                    dataLine.drain();
                    dataLine.stop();
                    /* should these open()/close() be here, or explicitly
                    called by user program? */
                    dataLine.close();
                } catch (LineUnavailableException lue) {
                    logger.log( Level.SEVERE,
                        "No sound line available!", lue );
                    if (parent!=null) {
                        JOptionPane.showMessageDialog(
                            parent,
                            "Clear the sound lines to proceed",
                            "No audio lines available!",
                            JOptionPane.ERROR_MESSAGE);
                    }
                }
            }
        };
        thread = new Thread(r);
        // makes thread behaviour compatible with JavaSound post 1.4
        thread.setDaemon(true);
        thread.start();
    }

    /** Assume the frame size is 4. */
    public byte[] reverseFrames(byte[] data) {
        byte[] reversed = new byte[data.length];
        byte[] frame = new byte[4];

        for (int ii=0; ii<data.length/4; ii++) {
            int first = (data.length)-((ii+1)*4)+0;
            int last = (data.length)-((ii+1)*4)+3;
            frame[0] = data[first];
            frame[1] = data[(data.length)-((ii+1)*4)+1];
            frame[2] = data[(data.length)-((ii+1)*4)+2];
            frame[3] = data[last];

            reversed[ii*4+0] = frame[0];
            reversed[ii*4+1] = frame[1];
            reversed[ii*4+2] = frame[2];
            reversed[ii*4+3] = frame[3];
            if (ii<5 || ii>(data.length/4)-5) {
                logger.log(Level.FINER, "From \t" + first + " \tlast " + last );
                logger.log(Level.FINER, "To \t" + ((ii*4)+0) + " \tlast " + ((ii*4)+3) );
            }
        }

/*
        for (int ii=0; ii<data.length; ii++) {
            reversed[ii] = data[data.length-1-ii];
        }
*/

        return reversed;
    }

    /** Assume the frame size is 4. */
    public byte[] getEveryNthFrame(byte[] data, int skip) {
        int length = data.length/skip;
        length = (length/4)*4;
        logger.log(Level.FINEST, "length " + data.length + " \t" + length);
        byte[] b = new byte[length];
        //byte[] frame = new byte[4];
        for (int ii=0; ii<b.length/4; ii++) {
            b[ii*4+0] = data[ii*skip*4+0];
            b[ii*4+1] = data[ii*skip*4+1];
            b[ii*4+2] = data[ii*skip*4+2];
            b[ii*4+3] = data[ii*skip*4+3];
        }
        return b;
    }

    @Override
    public void flush() {
        dataLine.flush();
    }

    @Override
    public void drain() {
        dataLine.drain();
    }

    @Override
    public void removeLineListener(LineListener listener) {
        dataLine.removeLineListener(listener);
    }

    @Override
    public void addLineListener(LineListener listener) {
        dataLine.addLineListener(listener);
    }

    @Override
    public Control getControl(Control.Type control) {
        return dataLine.getControl(control);
    }

    @Override
    public Control[] getControls() {
        if (dataLine==null) {
            return new Control[0];
        } else {
            return dataLine.getControls();
        }
    }

    @Override
    public boolean isControlSupported(Control.Type control) {
        return dataLine.isControlSupported(control);
    }

    @Override
    public void close() {
        dataLine.close();
    }

    @Override
    public void open() throws LineUnavailableException {
        throw new IllegalArgumentException("illegal call to open() in interface Clip");
    }

    @Override
    public Line.Info getLineInfo() {
        return dataLine.getLineInfo();
    }

    /** Determines the single largest sample size of all channels of the current clip.
    This can be handy for determining a fraction to scale visual representations.
    @return Double between 0 & 1 representing the maximum signal level of any channel. */
    public double getLargestSampleSize() {

        int largest = 0;
        int current;

        boolean signed = (format.getEncoding()==AudioFormat.Encoding.PCM_SIGNED);
        int bitDepth = format.getSampleSizeInBits();
        boolean bigEndian = format.isBigEndian();

        int samples = audioData.length*8/bitDepth;

        if (signed) {
            if (bitDepth/8==2) {
                if (bigEndian) {
                    for (int cc = 0; cc < samples; cc++) {
                        current = (audioData[cc*2]*256 + (audioData[cc*2+1] & 0xFF));
                        if (Math.abs(current)>largest) {
                            largest = Math.abs(current);
                        }
                    }
                } else {
                    for (int cc = 0; cc < samples; cc++) {
                        current = (audioData[cc*2+1]*256 + (audioData[cc*2] & 0xFF));
                        if (Math.abs(current)>largest) {
                            largest = Math.abs(current);
                        }
                    }
                }
            } else {
                for (int cc = 0; cc < samples; cc++) {
                    current = (audioData[cc] & 0xFF);
                    if (Math.abs(current)>largest) {
                        largest = Math.abs(current);
                    }
                }
            }
        } else {
            if (bitDepth/8==2) {
                if (bigEndian) {
                    for (int cc = 0; cc < samples; cc++) {
                        current = (audioData[cc*2]*256 + (audioData[cc*2+1] - 0x80));
                        if (Math.abs(current)>largest) {
                            largest = Math.abs(current);
                        }
                    }
                } else {
                    for (int cc = 0; cc < samples; cc++) {
                        current = (audioData[cc*2+1]*256 + (audioData[cc*2] - 0x80));
                        if (Math.abs(current)>largest) {
                            largest = Math.abs(current);
                        }
                    }
                }
            } else {
                for (int cc = 0; cc < samples; cc++) {
                    if ( audioData[cc]>0 ) {
                        current = (audioData[cc] - 0x80);
                        if (Math.abs(current)>largest) {
                            largest = Math.abs(current);
                        }
                    } else {
                        current = (audioData[cc] + 0x80);
                        if (Math.abs(current)>largest) {
                            largest = Math.abs(current);
                        }
                    }
                }
            }
        }

        // audioData
        logger.log(Level.FINEST, "Max signal level: " + (double)largest/(Math.pow(2, bitDepth-1)));
        return (double)largest/(Math.pow(2, bitDepth-1));
    }
}
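
Since BigClip implements the Clip interface, one way to adopt it in the Sound class from the question is simply to construct a BigClip instead of calling AudioSystem.getClip(). This is only a sketch, assuming the BigClip source above is compiled alongside it; the play/stop/loop methods from the question are unchanged because they only use methods declared on Clip.

import java.io.File;
import javax.sound.sampled.AudioInputStream;
import javax.sound.sampled.AudioSystem;
import javax.sound.sampled.Clip;

public class Sound {

    private Clip clip;

    public Sound(String filepath) {
        try {
            // Use the memory-backed BigClip in place of the system Clip.
            BigClip bigClip = new BigClip();
            AudioInputStream inputStream = AudioSystem.getAudioInputStream(new File(filepath));
            bigClip.open(inputStream);
            clip = bigClip;
        } catch (Exception e) {
            System.err.println(e.getMessage());
        }
    }

    // play(), stop(), loop() and getActive() stay exactly as in the question,
    // since they only call methods declared on the Clip interface.
}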

Comments:

Hoping the VM has enough memory to load a large file is not particularly wise, and it isn't necessary either. Buffers were invented for a reason...

"Hoping is not particularly wise." Which is why I don't hope. I take precautions, and declare a spare byte[] buffer that can be cleared in an emergency. An OOME is easy to recover from if you know how, and the use case allows it.

Didn't mean to step on your toes. After noticing your reputation and profile, I'm sure your BigClip is safe :)

I hope that was a joke. I'm 11 years into Java and still learning, still making mistakes. When I stop making mistakes it will get boring, and I'll find something else to amuse myself with. BTW, I was about to poke at your answer and challenge you to make it loop (as in the OP's code) for a non-repositionable InputStream (some Clip implementations fill before they start). JavaSound, and the Clip interface especially, is a quirky beast. ;)

@user1150769 Honestly, it has been a while since I needed to write code against it (though I am listening to music right now, using the class). Hmm.. I put a default constructor in there, so BigClip clip = new BigClip() should do the trick. Any deeper questions & I might have to dig into the DukeBox code to see how I did it!

"Do I have to compile it into a jar, or can I just use it as a class in my project?" Whichever works best for you. If there were a pre-built jar I'd say go for it. In practice, I just include the source in the project at hand.
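
A minimal sketch of the "spare buffer" precaution mentioned above; this is my own illustration with made-up names and sizes, not the commenter's actual code: reserve some heap up front and release it if loading a big track fails, so the application can fall back to streaming instead of dying.

// Hypothetical helper, for illustration only; ReservedLoader and the 2 MB figure are invented.
class ReservedLoader {

    // Emergency headroom that can be dropped if the heap runs out.
    private static byte[] reserve = new byte[2 * 1024 * 1024];

    static BigClip tryLoad(javax.sound.sampled.AudioInputStream ais) {
        try {
            BigClip clip = new BigClip();
            clip.open(ais);
            return clip;
        } catch (OutOfMemoryError oome) {
            reserve = null;   // give the spare buffer back to the heap
            System.gc();
            return null;      // caller can fall back to a SourceDataLine stream
        } catch (Exception e) {
            return null;
        }
    }
}

Answer 2: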

Google brought me here: http://docs.oracle.com/javase/tutorial/sound/sampled-overview.html After going through the first three sections I was able to put this together:

    import javax.sound.sampled.*;
    import javax.sound.*;
    import java.io.*;

    public class Playme {

        Playme(String filename) {

            int total, totalToRead, numBytesRead, numBytesToRead;
            byte[] buffer;
            boolean         stopped;
            AudioFormat     wav;
            TargetDataLine  line;
            SourceDataLine  lineIn;
            DataLine.Info   info;
            File            file;
            FileInputStream fis;

            //AudioFormat(float sampleRate, int sampleSizeInBits,
            //int channels, boolean signed, boolean bigEndian)
            wav = new AudioFormat(44100, 16, 2, true, false);
            info = new DataLine.Info(SourceDataLine.class, wav);

            buffer = new byte[1024*333];
            numBytesToRead = 1024*333;
            total = 0;
            stopped = false;

            if (!AudioSystem.isLineSupported(info)) {
                System.out.print("no support for " + wav.toString() );
            }
            try {
                // Obtain and open the line.
                lineIn = (SourceDataLine) AudioSystem.getLine(info);
                lineIn.open(wav);
                lineIn.start();
                fis = new FileInputStream(file = new File(filename));
                totalToRead = fis.available();

                while (total < totalToRead && !stopped) {
                    numBytesRead = fis.read(buffer, 0, numBytesToRead);
                    if (numBytesRead == -1) break;
                    total += numBytesRead;
                    lineIn.write(buffer, 0, numBytesRead);
                }

            } catch (LineUnavailableException ex) {
                ex.printStackTrace();
            } catch (FileNotFoundException nofile) {
                nofile.printStackTrace();
            } catch (IOException io) {
                io.printStackTrace();
            }
        }

        public static void main(String[] argv) {
            Playme mb_745 = new Playme(argv[0]);
            //Playme mb_745 = new Playme("/R/tmp/tmp/audiodump.wav");
        }
    }

Note that there may be a bug in:

numBytesToRead = 1024*333;

because the javadoc for SourceDataLine.write says:

The number of bytes to write must represent 
an integral number of sample frames, such that:
[ bytes written ] % [frame size in bytes ] == 0

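One way to guard against that (a sketch of a possible adjustment to the Playme constructor above, not part of the original answer) is to round the read size down to a whole number of frames. For this 16-bit stereo format the frame size is 4 bytes, so 1024*333 happens to be aligned already, but the rounding protects other formats:

    // Hypothetical adjustment: keep reads frame-aligned so every write() satisfies
    // [ bytes written ] % [ frame size in bytes ] == 0
    int frameSize = wav.getFrameSize();                     // 4 bytes for 16-bit stereo
    numBytesToRead = (1024 * 333 / frameSize) * frameSize;
    buffer = new byte[numBytesToRead];
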
The information that went into new AudioFormat(44100, 16, 2, true, false) came from:

$ file /R/tmp/tmp/audiodump.wav 
/R/tmp/tmp/audiodump.wav: RIFF (little-endian) data, 
WAVE audio, Microsoft PCM, 16 bit, stereo 44100 Hz

So now I can hear this huge 745 MB wav by running:

javac Playme.java &&  java Playme /R/tmp/tmp/audiodump.wav 

Hope you find this useful, and good luck!

Comments:

When I first read the OP's question I saw Clip and concluded that they actually needed a Clip. I'm not so sure anymore. If they don't, this is definitely the way to go. Perhaps the OP can clarify why they thought they needed a Clip to begin with. If it is purely to loop the whole track, it should be trivial to work from this example, and it needs far less memory overhead than using a BigClip.

The reason I use a Clip, as you said, is that looping is simple and it's easy to replay small sounds, but if there's a better way then I'll try that.
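
For what it's worth, looping a whole track with the streaming approach is mostly a matter of reopening the file and replaying it when the end is reached. A rough sketch along the lines of the Playme code above, with hypothetical names and assuming the same java.io and javax.sound.sampled imports (like Playme, it also writes the WAV header bytes each pass, which decoding via an AudioInputStream would avoid):

    // Hypothetical sketch: loop a WAV by streaming it repeatedly to an open, started SourceDataLine.
    void loopForever(String filename, SourceDataLine lineIn, int chunkSize) throws IOException {
        byte[] buffer = new byte[chunkSize];
        while (true) {                                      // or test a 'stopped' flag here
            try (FileInputStream fis = new FileInputStream(filename)) {
                int numBytesRead;
                while ((numBytesRead = fis.read(buffer, 0, buffer.length)) != -1) {
                    lineIn.write(buffer, 0, numBytesRead);  // blocks while the line plays
                }
            }
        }
    }

Answer 3: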

I fixed a couple of bugs in the suggested BigClip code: the frame-to-microsecond conversion (and the reverse) was wrong, so I changed the private convertFrameToM..seconds methods back from "milli" to "micro". Now getMicrosecondPosition(), setMicrosecondPosition() and getMicrosecondLength() work correctly.

.......................
    /** Converts a frame count to a duration in microseconds. */
    private long convertFramesToMicrosecond(int frames) {
        return (long)(frames / dataLine.getFormat().getSampleRate() * 1000000);
    }

    /** Converts a duration in microseconds to a frame count. */
    private int convertMicrosecondToFrames(long microsecond) {
        return (int) (microsecond / 1000000.0 * dataLine.getFormat().getSampleRate());
    }
.......................
    @Override
    public void setMicrosecondPosition(long microsecond) {
        framePosition = convertMicrosecondToFrames(microsecond);
    }

    @Override
    public long getMicrosecondPosition() {
        return convertFramesToMicrosecond(getFramePosition());
    }

    @Override
    public long getMicrosecondLength() {
        return convertFramesToMicrosecond(getFrameLength());
    }
.......................
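
As a quick sanity check with an assumed 44100 Hz format: convertFramesToMicrosecond(44100) = 44100 / 44100 × 1,000,000 = 1,000,000 µs, and convertMicrosecondToFrames(1000000) = 1,000,000 / 1,000,000 × 44100 = 44100 frames, so the two methods are now inverses of each other; the original millisecond-based versions were off by a factor of 1000 and divided by the sample rate where they should have multiplied.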

Comments:
