DataBuffer resizing


This topic contains 5 replies, has 3 voices, and was last updated by  Hezkore 1 year, 7 months ago.

    #10617

    Hezkore
    Participant

    I’m trying to resize a DataBuffer, but its Length doesn’t seem to update properly.
    In this example I’m resizing it to 8, but Length still returns 512.
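
    A minimal sketch of the kind of test described here, assuming std’s DataBuffer and its Resize method (the 512 and 8 sizes are taken from the post):

        Namespace myapp

        #Import "<std>"

        Using std..

        Function Main()
            ' Allocate a 512 byte buffer, then try to shrink it to 8 bytes.
            Local buffer:=New DataBuffer( 512 )
            Print buffer.Length    ' prints 512, as expected

            buffer.Resize( 8 )
            Print buffer.Length    ' expected 8, but still prints 512
        End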

    #10621

    TomToad
    Participant

    Here is the code in the source.

    As you can see, the buffer is being resized properly. The problem is that the _length field is not being updated, so it is reporting the wrong size (and could cause an error if you read or write outside the true length).
    There should be a _length = length in there somewhere.
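
    Until that one-line fix lands, a possible user-side workaround (a sketch, assuming DataBuffer’s PeekByte/PokeByte behave as usual) is to copy into a freshly allocated buffer of the wanted size, so Length always reports correctly:

        #Import "<std>"

        Using std..

        ' Hypothetical helper, not part of std: returns a new buffer of the
        ' requested size containing whatever fits from the old one.
        Function Resized:DataBuffer( src:DataBuffer,length:Int )
            Local dst:=New DataBuffer( length )
            Local n:=Min( src.Length,length )
            For Local i:=0 Until n
                dst.PokeByte( i,src.PeekByte( i ) )
            Next
            Return dst
        End

        Function Main()
            Local buffer:=New DataBuffer( 512 )
            buffer=Resized( buffer,8 )
            Print buffer.Length    ' 8
        End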

    Edit: I’ve just submitted an issue on GitHub.

    #10633

    Hezkore
    Participant

    Thanks for submitting the issue on GitHub.

    I have loads of issues with DataBuffer.
    For example, when using ‘server.ReceiveFrom’ with network sockets (as seen in ‘bananas/echoserver_udp’),

    I get this error:

    Though I guess that could be a network socket issue and not a DataBuffer issue.
    But it seems like whenever I use the DataBuffer, I run into some weird problem heh.

    #10636

    TomToad
    Participant

    Is it possible that more data is received between lines 2 and 3, causing server.ReceiveFrom to read in more data than the buffer was originally allocated for?
    Maybe try something like this

    #10640

    Mark Sibly
    Keymaster

    Danger! Use ‘buffer.Data’ to get a pointer to the data within a buffer, NOT ‘Varptr buffer’. ‘Varptr buffer’ returns the address of the buffer object. The above code will completely destroy the contents of the ‘buffer’ reference.
    ‘Varptr object’ should really be illegal, and is likely to become so one day.
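
    A small sketch of the pattern Mark describes, assuming the ReceiveFrom/SendTo signatures used by bananas/echoserver_udp; the function name, buffer size and socket setup are placeholders:

        #Import "<std>"

        Using std..

        ' Assumes 'server' is an already bound UDP Socket, as set up in the
        ' bananas/echoserver_udp example (setup omitted here).
        Function EchoOnce( server:Socket )
            Local buffer:=New DataBuffer( 1024 )    ' arbitrary size
            Local address:=New SocketAddress

            ' Pass the pointer to the buffer's data, NOT 'Varptr buffer',
            ' which would point at the local 'buffer' variable instead.
            Local n:=server.ReceiveFrom( buffer.Data,buffer.Length,address )

            If n>0
                server.SendTo( buffer.Data,n,address )
            Endif
        End
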
    #10661

    Hezkore
    Participant

    Okay good to know Mark!
    Resize issue is still a problem though :/
