Increasing SpiderMonkey message buffer

I have the bytes of an image I'm trying to broadcast to my clients from a server. I did some tests, and the biggest message I could send had a byte size of 32759. The way I'm thinking of going about this would send 38 parts per image, and then I'd have to reassemble them on the client. I'd like to be able to send fewer parts, or the whole image in one message.

[java]
/*
 * To change this template, choose Tools | Templates
 * and open the template in the editor.
 */
package com.benkyou.webcam;

import com.benkyou.common.WebcamMessage;
import com.benkyou.server.WebcamServer;
import com.jme3.network.serializing.Serializer;
import com.jme3.texture.Image.Format;
import java.awt.image.BufferedImage;
import java.io.IOException;
import java.net.MalformedURLException;
import java.net.URL;
import java.nio.ByteBuffer;
import javax.imageio.ImageIO;
import jme3tools.converters.ImageToAwt;

/**
 *
 * @author Student
 */
public class JPGStreamReader implements Runnable {

    private String imgLocation;
    private URL imgURL;
    private BufferedImage awtImage;
    private ByteBuffer byteBuffer;
    final public Format jmeformat = Format.RGBA8;
    public static byte[] imageBytes;
    private ByteBuffer copiedBuffer;
    private WebcamServer webcamServer;

    public JPGStreamReader(String imgLocation, WebcamServer webcamServer) {
        Serializer.registerClass(WebcamMessage.class);
        imageBytes = null;
        this.webcamServer = webcamServer;
        this.imgLocation = imgLocation;
        byteBuffer = ByteBuffer.allocateDirect(640 * 480 * 4);
        try {
            imgURL = new URL(imgLocation);
        } catch (MalformedURLException e) {
        }
    }

    public String getImageLocation() {
        return imgLocation;
    }

    ByteBuffer readBuffer = ByteBuffer.allocateDirect(640 * 480 * 4);

    public void run() {
        while (!Thread.currentThread().isInterrupted()) {
            try {
                byteBuffer.clear();
                awtImage = ImageIO.read(imgURL);
                ImageToAwt.convert(awtImage, jmeformat, byteBuffer);
                copiedBuffer = byteBuffer.duplicate();
                copiedBuffer.clear();
                copiedBuffer.position(0);
                imageBytes = new byte[copiedBuffer.remaining()];
                copiedBuffer.get(imageBytes);

                int innerCount = 0;
                int overflow = 32759;
                byte temp[] = new byte[overflow];
                System.out.println("Image bytes size :" + imageBytes.length);
                int numberOfParts = 0;
                for (int i = 0; i < imageBytes.length; i++) {
                    temp[innerCount++] = imageBytes[i];
                    if (i % overflow == 1) {
                        ++numberOfParts;
                        webcamServer.getServer().broadcast(new WebcamMessage(temp.clone()));
                        innerCount = 0;
                    }
                }
                System.out.println("Full message sent… Number of parts = " + numberOfParts);
            } catch (Exception e) {
                e.printStackTrace();
            }
        }
    }

    public String getUrl() {
        return imgLocation;
    }

    public static void main(String args[]) {
    }
}
[/java]



And the output I get is:

[java]
Image bytes size :1228800
Full message sent… Number of parts = 38
Image bytes size :1228800
Full message sent… Number of parts = 38
Image bytes size :1228800
Full message sent… Number of parts = 38
Image bytes size :1228800
Full message sent… Number of parts = 38
Image bytes size :1228800
Full message sent… Number of parts = 38
Image bytes size :1228800
Full message sent… Number of parts = 38
Image bytes size :1228800
Full message sent… Number of parts = 38
Image bytes size :1228800
Full message sent… Number of parts = 38
Image bytes size :1228800
Full message sent… Number of parts = 38
[/java]

If I increase the image byte size any more, I get a BufferOverflowException.



    Thank you in advance for your input

    -Allman

Because of the way the serializer is architected, we can’t really increase the buffer size as that would impact even small messages. It’s a deficiency of the ByteBuffer design.



You will be happier splitting it up in the long run anyway, I think. Otherwise, your channel is tied up with no other messages being able to go through. At least this way, other messages can still squeeze in.
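As an illustration of the split-and-reassemble approach, here is a minimal sketch in plain Java. The `ImageChunker` name and the 32,000-byte part size are made up for this example (chosen to stay under the ~32 KB limit observed above); in practice the part index and total count would travel inside the `WebcamMessage` so the client knows when a frame is complete.

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical helper: split a frame's bytes into SpiderMonkey-sized
// parts on the server and join them back together on the client.
public class ImageChunker {

    public static final int MAX_PART = 32000; // stay under the ~32 KB message limit

    // Server side: split data into parts of at most MAX_PART bytes.
    public static List<byte[]> split(byte[] data) {
        List<byte[]> parts = new ArrayList<>();
        for (int off = 0; off < data.length; off += MAX_PART) {
            int len = Math.min(MAX_PART, data.length - off);
            byte[] part = new byte[len];
            System.arraycopy(data, off, part, 0, len);
            parts.add(part);
        }
        return parts;
    }

    // Client side: concatenate the received parts back into the frame.
    public static byte[] join(List<byte[]> parts) {
        int total = 0;
        for (byte[] p : parts) total += p.length;
        byte[] out = new byte[total];
        int off = 0;
        for (byte[] p : parts) {
            System.arraycopy(p, 0, out, off, p.length);
            off += p.length;
        }
        return out;
    }
}
```

For the 1,228,800-byte frames above this yields ceil(1228800 / 32000) = 39 parts, and unlike the loop in the original post it also ships the final partial chunk.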



Also, you could compress the image which would save a lot of messages.
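For example, the frame could be JPEG-encoded with the stock `javax.imageio` API before broadcasting, rather than sending the raw RGBA pixels. This is just a sketch (the `FrameCompressor` name is invented): a 640x480 frame is 1,228,800 bytes raw but typically only tens of KB as JPEG, which would fit in a single message.

```java
import java.awt.image.BufferedImage;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import javax.imageio.ImageIO;

// Sketch: JPEG-encode a frame server-side so it fits in one message.
public class FrameCompressor {

    public static byte[] toJpeg(BufferedImage frame) throws IOException {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        ImageIO.write(frame, "jpg", out); // stdlib JPEG encoder
        return out.toByteArray();
    }
}
```

The client then decodes with `ImageIO.read` before converting to a jME texture, trading a little CPU on each end for far less bandwidth.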

Thanks for the speedy reply pspeed :slight_smile:

New to all this server business.

I planned on having this webcam stream-reading server running on its own server, separate from the game server, and having the game client connect to both. Will this bog down the game server channels on the client?

Thanks

-Allman

No. They will operate independently.



But on the other hand, there is only so much bandwidth to go around. Trying to shove uncompressed 1.2 meg images over TCP is going to really saturate your pipe, I think. How often are you trying to send them?

I'm hoping to get a good frame rate of 25 fps, but since each frame is 37 messages, that's about 925 messages a second. The image is from a webcam, so I think it's decently pre-compressed for me… pulling a single image off the webcam is only 24 KB. I think my bottleneck is how I'm reading the byte array from the byte buffer; I would expect it to be various sizes with a stream, but it's not.

I was hoping to do the image conversion server-side so that the image I send out to the clients is pre-converted, though it's starting to look like it would be more efficient to just forward the compressed image from the webcam to the clients and convert it over there.

Yeah, I was just going by:

ByteBuffer readBuffer = ByteBuffer.allocateDirect((640*480*4));



That’s a huge amount of data to be sending in one chunk… and is definitely uncompressed in that form. It’s especially bad when you get 24kb per frame originally… which will easily fit in one SM message. Are you streaming from the camera or capturing single frames?



In fact, depending on what you are doing with them, if you are just capturing and sending single frames then you might be able to get away with sending each frame “unreliable” which will be faster but you might miss frames. Especially if they are large (UDP reliability goes down if packets are bigger than the MTU of the pipe.)



If you still want to get the raw 640x480 frames then you could do some basic compression on them before sending. Something would be better than nothing.
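A minimal "something is better than nothing" sketch using the JDK's built-in DEFLATE support (`java.util.zip`); the `RawFrameCodec` name is invented for this example. Raw video frames usually compress well because neighboring pixels repeat, and `Deflater.BEST_SPEED` keeps the per-frame CPU cost low.

```java
import java.io.ByteArrayOutputStream;
import java.util.zip.Deflater;
import java.util.zip.Inflater;

// Sketch: DEFLATE the raw frame bytes before sending, inflate on the client.
public class RawFrameCodec {

    public static byte[] deflate(byte[] raw) {
        Deflater d = new Deflater(Deflater.BEST_SPEED);
        d.setInput(raw);
        d.finish();
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        byte[] buf = new byte[8192];
        while (!d.finished()) {
            out.write(buf, 0, d.deflate(buf));
        }
        d.end();
        return out.toByteArray();
    }

    // rawLength must be known to the client (e.g. sent in the message header).
    public static byte[] inflate(byte[] packed, int rawLength) throws Exception {
        Inflater inf = new Inflater();
        inf.setInput(packed);
        byte[] raw = new byte[rawLength];
        int n = 0;
        while (n < rawLength && !inf.finished()) {
            n += inf.inflate(raw, n, rawLength - n);
        }
        inf.end();
        return raw;
    }
}
```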

It's a single frame, generated by the webcam server, that I grab and do all my conversions on.

I found a tidbit on how to read a Motion JPEG stream, and I "think" I'm able to pull off a full image, though I haven't been very successful in converting the data back into an image…

[java]
package com.benkyou.webcam;

import com.jme3.texture.Image;
import com.jme3.texture.Image.Format;
import java.io.BufferedReader;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStreamReader;
import java.net.URL;
import java.nio.ByteBuffer;
import java.util.Iterator;

public class Test {

    private ByteBuffer convertByteArrayBuff;
    ByteArrayOutputStream buffer = new ByteArrayOutputStream();
    final public Format jmeformat = Format.RGBA8;

    /**
     * @param args
     */
    public static void main(String[] args) {
        Test mp = new Test("some webcam url");
    }

    int count = 0;

    public Test(String mjpeg_url) {
        convertByteArrayBuff = ByteBuffer.allocateDirect(5000);
        int imageCount = 0;

        try {
            URL url = new URL(mjpeg_url);
            BufferedReader in = new BufferedReader(new InputStreamReader(url.openStream()));
            String inputLine;
            int lineCount = 0;
            boolean lineCountStart = false;
            boolean saveImage = false;
            ByteArrayOutputStream byteOutputStream = new ByteArrayOutputStream();

            while ((inputLine = in.readLine()) != null) {
                // Should be checking just for "--" probably
                if (inputLine.lastIndexOf("--myboundary") > -1) {
                    buffer.flush();
                    convertByteArrayBuff.clear();
                    convertByteArrayBuff.wrap(buffer.toByteArray());
                    byte[] temp = buffer.toByteArray().clone();
                    System.out.println(temp.length + "Array Lenght");
                    //for (int i = 0; i < temp.length; i++) {
                    //    System.out.print(temp[i]);
                    //}
                    buffer.reset();

                    // Got an image boundary, stop last image
                    // Start counting lines to get past:
                    // Content-Type: image/jpeg
                    // Content-Length: 22517

                    saveImage = false;
                    lineCountStart = true;

                    System.out.println("Got a new boundary");
                    System.out.println(inputLine);
                } else if (lineCountStart) {
                    lineCount++;
                    if (lineCount >= 2) {
                        lineCount = 0;
                        lineCountStart = false;
                        imageCount++;
                        saveImage = true;
                        System.out.println("Starting a new image");
                    }
                } else if (saveImage) {
                    byte[] temp = inputLine.getBytes("UTF-8");
                    buffer.write(temp);
                } else {
                    System.out.println("What's this:");
                    System.out.println(inputLine);
                }
            }
            in.close();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}
[/java]

[java]
--myboundary
Starting a new image
48111Array Lenght
Got a new boundary
--myboundary
Starting a new image
48184Array Lenght
Got a new boundary
--myboundary
Starting a new image
48054Array Lenght
Got a new boundary
--myboundary
Starting a new image
48056Array Lenght
Got a new boundary
--myboundary
Starting a new image
48054Array Lenght
Got a new boundary
[/java]
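One likely reason the captured data won't convert back into an image: the code above pulls the JPEG payload through a `BufferedReader`/`InputStreamReader`, which applies charset decoding and discards line terminators, both of which corrupt binary data. The image bytes need to be read from the raw `InputStream` byte-for-byte. Once intact JPEG bytes are in hand, decoding is a single `ImageIO` call; a sketch (the `JpegDecoder` name is invented):

```java
import java.awt.image.BufferedImage;
import java.io.ByteArrayInputStream;
import java.io.IOException;
import javax.imageio.ImageIO;

// Hypothetical decode step, assuming the JPEG bytes arrived intact
// (read from the raw InputStream, NOT through a Reader: charset
// decoding and readLine() both mangle binary data).
public class JpegDecoder {

    public static BufferedImage decode(byte[] jpegBytes) throws IOException {
        // ImageIO detects the format from the stream contents.
        return ImageIO.read(new ByteArrayInputStream(jpegBytes));
    }
}
```

The resulting `BufferedImage` can then be fed to `ImageToAwt.convert` exactly as in the first listing.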