
Show Posts

This section allows you to view all posts made by this member. Note that you can only see posts made in areas you currently have access to.


Topics - Marukyu

1
Audio / sf::Music fails to loop after seeking
« on: April 19, 2019, 07:55:46 pm »
Seeking an sf::Music object to certain playback offsets causes it to stop looping at the end, even though setLoop(true) has been called on it.

The following example reproduces this issue on Linux Mint 18.3 (the example requires testsound.ogg, found in the attachments, to be present in the working directory, though any stereo Ogg sound should work):
#include <SFML/Audio/Music.hpp>
#include <SFML/System/Sleep.hpp>
#include <SFML/System/Time.hpp>
#include <iostream>
#include <fstream>
#include <memory>
#include <string>
#include <vector>

std::vector<char> readFile(const std::string & fileName)
{
        // Read the whole file into a byte buffer (binary mode, so the data is not altered)
        std::ifstream f(fileName, std::ios::binary);
        std::vector<char> data;
        f.seekg(0, std::ios::end);
        data.resize(f.tellg());
        f.seekg(0, std::ios::beg);
        f.read(data.data(), data.size());
        return data;
}

std::unique_ptr<sf::Music> play(const std::vector<char> & buffer, sf::Time offset)
{
        std::unique_ptr<sf::Music> music = std::make_unique<sf::Music>();
        music->openFromMemory(buffer.data(), buffer.size());

        music->setLoop(true);
        music->setVolume(20);
        music->setPitch(1);
        music->play();

        if (offset != sf::Time::Zero)
        {
                music->setPlayingOffset(offset);
                std::cout << "Playing at " << offset.asMicroseconds() << " us" << std::endl;
        }
        else
        {
                std::cout << "Playing from start" << std::endl;
        }

        return music;
}

int main()
{
        auto buffer = readFile("testsound.ogg");

        std::unique_ptr<sf::Music> music;

        music = play(buffer, sf::Time::Zero);       // Playing from the start: sound loops correctly
        sf::sleep(sf::seconds(5));
        music = play(buffer, sf::microseconds(20)); // Issue occurs here: sound does not loop
        sf::sleep(sf::seconds(5));
        music = play(buffer, sf::microseconds(25)); // No issue here, sound loops properly
        sf::sleep(sf::seconds(5));

        return 0;
}
 

Playing the example sound from the start or seeking to 25 microseconds loops the sound just fine, but seeking to 20 microseconds causes the sound to stop at the end instead.

Through debugging, I found that in the erroneous case, when reaching the end of the sound, SoundStream::fillAndPushBuffer exceeds the retry limit and requests a stop.

This is due to neither of the two conditions in Music::onLoop being met: the current sample position is just one sample short of the end of the music, but no further full multi-channel sample can be read, so the loop condition never gets a chance to trigger.

This playback offset misalignment is caused by InputSoundFile::seek(Uint64), which accepts offsets that land between full multi-channel samples. I fixed the issue locally by rounding the offset down to a whole multiple of the channel count (integer-dividing and re-multiplying), changing the body of InputSoundFile::seek(Uint64) to the following:
void InputSoundFile::seek(Uint64 sampleOffset)
{
    if (m_reader && m_channelCount != 0)
    {
        // The reader handles an overrun gracefully, but we
        // pre-check to keep our known position consistent
        m_sampleOffset = std::min(sampleOffset / m_channelCount * m_channelCount, m_sampleCount);
        m_reader->seek(m_sampleOffset);
    }
}
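
For reference, here is roughly how the misalignment plays out numerically, assuming testsound.ogg is a 44100 Hz stereo file (which I believe it is, though I haven't re-verified the header) and assuming I'm reading the time-based seek's conversion (seconds * sample rate * channel count, then truncation) correctly:

#include <cstdint>
#include <iostream>

int main()
{
    const unsigned int sampleRate = 44100; // assumed sample rate of testsound.ogg
    const unsigned int channelCount = 2;   // assumed stereo

    // Same conversion as the time-based seek: seconds * rate * channels, then truncation
    std::uint64_t offset20us = static_cast<std::uint64_t>(0.000020 * sampleRate * channelCount);
    std::uint64_t offset25us = static_cast<std::uint64_t>(0.000025 * sampleRate * channelCount);

    std::cout << offset20us << std::endl; // 1: odd, i.e. between multi-channel samples -> looping breaks
    std::cout << offset25us << std::endl; // 2: multiple of channelCount, i.e. aligned -> looping works
    return 0;
}

With the rounding in the fix above, that stray offset of 1 would snap back down to 0, which would explain why looping works again for me after the change.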

I haven't familiarized myself too deeply with SFML's sound streaming classes and audio streaming in general, so I might be completely mistaken about most or all of what I've written above. It could also be a completely unrelated issue instead, as the phenomenon has proven rather difficult to reproduce reliably. :)

2
Feature requests / Public extract() function for sf::Packet
« on: April 26, 2012, 12:54:49 pm »
Hello,

I have recently switched my networking code from a TCP/UDP mix to UDP only. The protocol I am designing for my project has to be as lean as possible, since ~20 packets per second will be transferred between server and client. I use strings quite frequently in the protocol, but when I checked how SFML handles them, I saw that it stores 4 bytes for the string length in the packet, even though UDP's maximum packet size is slightly less than Uint16's maximum value. Because I send several strings per packet (between 2 and around 10), the overhead compared to 2-byte length markers adds up to a few hundred bytes per second (e.g. 20 packets/s × 10 strings × 2 extra bytes = 400 bytes/s), and when doing networking, every byte of bandwidth counts.

I thought I would simply write a "ShortString" class that works just like a string but has its maximum length capped at UDP's maximum packet size, and that, when appended to a packet via the bitshift operator overloads, stores its length as an sf::Uint16 rather than an sf::Uint32. Appending the string works through the public append() function, but I can't find a way to extract the string efficiently. The internal method for extracting a string relies on a private class variable (the read position), so the only way to do it would be to extract the string char by char, which is obviously inefficient and slow. So, I would like to propose a public function for sf::Packet that takes a length in bytes, returns a const void * or const char * pointing to "getData() + m_readPos", and then moves the read position forward by that amount. The signature could be "const char * sf::Packet::extract(std::size_t lengthInBytes)". This would effectively give developers more control over extracting custom data types from packets, just as append() already makes it possible to insert any kind of data.

tl;dr: something like this:
const char * sf::Packet::extract(std::size_t lengthInBytes)
{
    if (!checkSize(lengthInBytes))
        return NULL;

    // getData() returns const void*, so cast before doing pointer arithmetic
    const char * ret = static_cast<const char *>(getData()) + m_readPos;
    m_readPos += lengthInBytes;
    return ret;
}
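
To illustrate how I would use this, here is a rough sketch of the ShortString operators I have in mind. ShortString itself is just a hypothetical wrapper for this post, and extract() is the proposed function, so this does not compile against current SFML:

#include <SFML/Network/Packet.hpp>
#include <string>

// Hypothetical wrapper for strings whose length always fits into 16 bits
struct ShortString
{
    std::string value; // capped elsewhere so it fits into a single UDP packet
};

sf::Packet & operator <<(sf::Packet & packet, const ShortString & str)
{
    // 2-byte length marker instead of the 4 bytes sf::Packet uses for std::string
    packet << static_cast<sf::Uint16>(str.value.size());
    packet.append(str.value.data(), str.value.size());
    return packet;
}

sf::Packet & operator >>(sf::Packet & packet, ShortString & str)
{
    sf::Uint16 length = 0;
    packet >> length;

    // extract() is the proposed function: return a pointer to the current read
    // position and advance the read position by 'length' bytes
    if (const char * data = packet.extract(length))
        str.value.assign(data, length);
    return packet;
}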

3
Hello,

first off, my apologies for bothering you with pretty much the same issue as last time. I have been trying to compile my project for Windows, and everything works flawlessly except for the client's TCP socket. Using a code setup similar to the one described in the previous thread about a related uncertainty, the non-blocking TcpSocket first returns NotReady on connect() (as expected), but any subsequent calls to connect() return Error. The server can see the client connecting, but as soon as connect() is called client-side a second time, the client seems to disconnect again. The following code example, which resembles the one linked to before, reproduces the issue for me: connecting to a valid server with a non-blocking TCP socket prints "Socket not ready" once and then enters an infinite loop of "Socket error":

Code:
#include <SFML/Network.hpp>
#include <iostream>

int main()
{
    sf::TcpSocket netTCP;
    netTCP.setBlocking(false);
    while(true)
    {
        switch (netTCP.connect("localhost", 8989, sf::Time::Zero))
        {
        case sf::Socket::Done:
            std::cout << "Connection successful" << std::endl;
            return 0;    // success.
        case sf::Socket::Disconnected:
            std::cout << "Connection refused" << std::endl;
            break;
        case sf::Socket::Error:
            std::cout << "Socket error" << std::endl;
            break;
        case sf::Socket::NotReady:
            std::cout << "Socket not ready" << std::endl;
            break;
        default:
            std::cout << "Unknown Error" << std::endl;
            break;
        }
    }
}

Firewall exceptions and stuff are set up correctly, and the server can see the client connecting for a short moment, so it is probably a client-side problem. The same code works flawlessly on Linux.

Any help or information on what I might be doing wrong here would be greatly appreciated.

4
Network / How to deal with non-blocking TcpSocket returning NotReady?
« on: February 24, 2012, 07:04:29 pm »
Hello,

right now, I am writing the networking code for my project and trying to figure out how to work with a non-blocking TCP socket for the client. Whenever I try to connect to my debug server, TcpSocket::Connect() returns Socket::NotReady, but the server can already see the connection and both server and client can send/receive packets. If I call TcpSocket::Connect() again, it returns Socket::Done. The different return values for the different socket functions don't seem to be too well documented, and checking the Unix implementation source code didn't help me much either since there seem to be various causes for Socket::NotReady.

I am using CrunchBang Linux and a fairly recent SFML 2.0 (shortly after the time API change was committed). Here is some example code that should reproduce the issue, provided it can connect to a TCP server running at the specified address and port.

Code:
#include <SFML/Network.hpp>
#include <iostream>

int main()
{
    sf::TcpSocket NetTCP;
    NetTCP.SetBlocking(false);
    switch (NetTCP.Connect("localhost", 8989, sf::Time::Zero))
    {
    case sf::Socket::Done:
        std::cout << "Connection successful" << std::endl;
        break;
    case sf::Socket::Disconnected:
        std::cout << "Connection refused" << std::endl;
        break;
    case sf::Socket::Error:
        std::cout << "Socket error" << std::endl;
        break;
    case sf::Socket::NotReady:
        std::cout << "Socket not ready" << std::endl;
        break;
    default:
        std::cout << "Unknown Error" << std::endl;
        break;
    }
}


Commenting out the "SetBlocking(false)" line prints "Connection successful". Changing the timeout does not seem to have any effect.
Should I just ignore the return value of Connect() if it's NotReady? If this is just a way for the socket to say, "I am connecting, wait until I am ready", I would like to know how to poll the socket status during that time, since sockets don't appear to have any kind of "IsReady()" function.
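To make the question a bit more concrete, here is roughly the kind of polling loop I have in mind, based on the behaviour described above where a second Connect() call eventually returns Done. This is only a sketch of what I mean; I don't know whether calling Connect() repeatedly like this is actually the intended usage:

Code:
#include <SFML/Network.hpp>
#include <iostream>

int main()
{
    sf::TcpSocket NetTCP;
    NetTCP.SetBlocking(false);

    // The first call starts the connection attempt and (for me) returns NotReady
    sf::Socket::Status Status = NetTCP.Connect("localhost", 8989, sf::Time::Zero);

    // Poll by calling Connect() again until the socket stops reporting NotReady
    // (a real program would do other work or sleep between attempts)
    while (Status == sf::Socket::NotReady)
        Status = NetTCP.Connect("localhost", 8989, sf::Time::Zero);

    if (Status == sf::Socket::Done)
        std::cout << "Connection successful" << std::endl;
    else
        std::cout << "Connection failed" << std::endl;

    return 0;
}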
I tried searching the forums, and found someone who had a similar issue, but apparently it was related to the fact that they were using a SocketSelector, while this client-side code is only for a single TCP socket.

As always, any help or support is greatly appreciated. ^^

5
Graphics / Antialiasing via ContextSettings not working on Linux
« on: January 03, 2012, 10:01:14 pm »
Hi,

the project I am working on uses vertex arrays and shapes a lot, but they aren't drawn with antialiasing, which makes them look quite ugly when they aren't made of axis-aligned lines. When I set ContextSettings' AntialiasingLevel to 2 or 4 and create a RenderWindow with that, GetSettings().AntialiasingLevel returns 0. I'm using CrunchBang Linux 10 and a rather recent version of SFML2. glxinfo and experience with various games on Linux tell me that my NVIDIA Linux drivers support antialiasing/multisampling up to level 4.
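For reference, the relevant setup looks roughly like this (a minimal sketch; names are from the SFML2 revision I'm currently on, so they may differ slightly from the latest sources):

Code:
#include <SFML/Graphics.hpp>
#include <iostream>

int main()
{
    sf::ContextSettings Settings;
    Settings.AntialiasingLevel = 4;   // also tried 2

    sf::RenderWindow App(sf::VideoMode(800, 600), "AA test",
                         sf::Style::Default, Settings);

    // I would expect 2 or 4 here, but this prints 0 for me
    std::cout << App.GetSettings().AntialiasingLevel << std::endl;

    return 0;
}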
I have found something that might be related to the problem I'm having, but the thread is already over half a year old and the issue doesn't seem to have been solved yet: http://www.sfml-dev.org/forum/viewtopic.php?t=4462
In that thread, Laurent described the situation as complicated, but the ability to enforce antialiasing through some function would still be nice to have. That said, I have no idea how GLX, X11, contexts and the like work, as I have never worked with anything below SFML, so I might be underestimating the complexity of implementing something like this.

Any kind of help with or solution to this problem would be greatly appreciated.

6
Graphics / [SFML2] RenderTexture messes with Texture it should draw
« on: September 09, 2011, 07:27:29 pm »
Hello,

while working on my game I ran into a really strange issue: I'm using a standard map container to dynamically store 80x80 large blocks of the level. Another map container contains a texture-sprite pair which is pre-rendered when the associated block is loaded.
This has worked fine so far, but now I'm having problems with lighting. Light is calculated and drawn when a block loads: a 'global' RenderTexture is cleared, the lighting sprites are drawn onto it, and the result is then copied into a Texture in a Sprite/Texture std::map. However, the RenderTexture seems to modify the Texture that the lighting sprites use: they are turned into garbage, or into parts of the actual level, but most of them do not display at all and the RenderTexture remains black. I wrote a minimal example program that reproduces the error; it uses the same method that I use in my game, just simplified:

Code:
#include <SFML/Graphics.hpp>
#include <cstdlib>   // EXIT_SUCCESS / EXIT_FAILURE
#include <map>
#include <utility>

int main() {
    sf::RenderWindow App(sf::VideoMode(800, 600), "SFML window");

    sf::Texture Tex;
    if (!Tex.LoadFromFile("test.png"))     // this can be any image file smaller than 100x100.
        return EXIT_FAILURE;
    sf::Sprite TestSpr(Tex);

    sf::RenderTexture RTex;
    RTex.Create(100,100);

    std::map<std::pair<int,int>,std::pair<sf::Texture,sf::Sprite> > DrawMap;   // 2d-matrix of drawn sprites.
    App.SetFramerateLimit(70);

    for (int x=0; x<8; x++)
        for (int y=0; y<6; y++) {
            const std::pair<int,int> cur = std::make_pair(x,y);     // current entry in map.
            // Tex.LoadFromFile("test.png");
            RTex.Clear();           // clear rendertexture.
            RTex.Draw(TestSpr);     // draw test sprite.
            RTex.Display();         // update texture.
            DrawMap[cur].first.LoadFromImage(RTex.GetTexture().CopyToImage()); // copy rendertexture to texture in drawmap.
            DrawMap[cur].second.SetTexture(DrawMap[cur].first);     // define sprite.
            DrawMap[cur].second.SetPosition(x*100,y*100);           // position sprite.
        }

    while (App.IsOpened()) {
        sf::Event Event;
        while (App.PollEvent(Event)) {
            if (Event.Type == sf::Event::Closed) App.Close();
        }

        App.Clear();
        for (int x=0; x<8; x++)
            for (int y=0; y<6; y++)
                App.Draw(DrawMap[std::make_pair(x,y)].second);      // render map.
        App.Display();
    }
    return EXIT_SUCCESS;
}


Change "test.png" to any image smaller than 100x100 px. The result I am getting is that the texture gets smaller and smaller every time it is drawn. However, if you uncomment the line "Tex.LoadFromFile("test.png");", the image is drawn correctly every time. For my experiments, I used an 80x80 PNG image with an alpha channel, but it seems to happen with most files.

I am using CrunchBang Linux, the latest version of SFML2 (updated an hour ago) and Code::Blocks with GCC 4.6; my video card is an NVIDIA GeForce GTX 560 Ti with driver version 280.13.

Any help with this strange issue would be greatly appreciated.
