Now I don't know what to do.
Everything is rendered into an sf::RenderTexture. Before rendering the water I call:
mRenderTarget->setView(myCamera);
// Calculate polygon
sf::IntRect rect = sf::IntRect(leftTop.x, leftTop.y, rightBottom.x, rightBottom.y);
polygon.setTexture(&map);
polygon.setTextureRect(rect);
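One thing worth double-checking here: sf::IntRect's constructor takes (left, top, width, height), not two corner points, so if rightBottom really is the bottom-right corner, the rect above ends up too large. A minimal sketch of building the rect from two corners, with plain structs standing in for the SFML types (hypothetical names, not the SFML API):

```cpp
// Hypothetical stand-ins for sf::Vector2i and sf::IntRect.
struct Vec2i { int x, y; };
struct RectI { int left, top, width, height; };

// sf::IntRect expects (left, top, width, height), so the size must be
// derived from the two corners rather than passing the far corner directly.
RectI rectFromCorners(Vec2i leftTop, Vec2i rightBottom) {
    return { leftTop.x, leftTop.y,
             rightBottom.x - leftTop.x,
             rightBottom.y - leftTop.y };
}
```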
When I print leftTop and rightBottom, they are always the same (as expected). Before the camera moves everything is fine and the texture coordinates are correct. Now I try to move the camera somewhere else:
myCamera->move(-4, 0);
mRenderTarget->setView(myCamera);
Sure enough, leftTop and rightBottom now become invalid (I don't convert their coordinates to account for the view).
So my first attempt is:
leftTop = mRenderTarget->convertCoords(sf::Vector2i(leftTop.x, leftTop.y), *myCamera);
rightBottom = mRenderTarget->convertCoords(sf::Vector2i(rightBottom.x, rightBottom.y), *myCamera);
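For what it's worth, convertCoords (renamed mapPixelToCoords in SFML 2.1) goes from pixel coordinates to world coordinates. If leftTop and rightBottom are already world coordinates, that applies the view offset in the wrong direction, and the inverse mapping (mapCoordsToPixel in newer SFML) might be the one needed. For an unrotated, unzoomed view, both mappings are just an offset by the view's top-left corner (center minus half the size); a sketch with plain structs (hypothetical names, not the SFML API):

```cpp
// Hypothetical stand-ins for sf::Vector2f and sf::View.
struct Vec2f { float x, y; };
struct View { Vec2f center, size; };

// World-space position of the view's top-left corner.
static Vec2f viewTopLeft(const View& v) {
    return { v.center.x - v.size.x / 2, v.center.y - v.size.y / 2 };
}

// convertCoords / mapPixelToCoords direction: pixel -> world.
Vec2f pixelToWorld(Vec2f pixel, const View& v) {
    Vec2f tl = viewTopLeft(v);
    return { pixel.x + tl.x, pixel.y + tl.y };
}

// Inverse direction (mapCoordsToPixel in SFML >= 2.1): world -> pixel.
Vec2f worldToPixel(Vec2f world, const View& v) {
    Vec2f tl = viewTopLeft(v);
    return { world.x - tl.x, world.y - tl.y };
}
```

With the view centered so its top-left is (0, 0), both mappings are the identity, which is why everything looks right until the camera moves; after myCamera->move(-4, 0) the top-left becomes (-4, 0) and the two directions shift the point opposite ways.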
Now they change as the camera moves, but the coordinates are still incorrect and I see the same problems as in the video I uploaded before.
At the end of the render cycle I do:
// mApp is sf::RenderWindow
mRenderTarget->display();
mApp->draw(sf::Sprite(mRenderTarget->getTexture()));
mApp->display();