Ardour and triple monitors

Does anyone use Ardour with more than 2 monitors? And if so, does it run fine?

I'm trying to work out if it's just my setup or if there's some kind of hiccup somewhere.

It's possibly just my machine. I'm running a GeForce GT 610 PCIe card with 1024 MB RAM. This powers 2 displays and runs Ardour fine; GUI speed is as expected on my old machine.

But if I turn on a 3rd screen using the onboard NVIDIA GeForce 6100 with 256 MB shared RAM, then while I can use 3 monitors fine for other things, like having multiple browsers open or even playing 2 different videos on 2 screens while doing stuff in a browser on the 3rd screen,

Ardour's GUI becomes so sluggish it's unusable. LED meters refresh as low as 1 fps and the GUI takes a few seconds to respond. Bring the mixer up on another display and it's unusable.

It's probably just too much for this kind of setup; running 3 displays this way isn't ideal. Both GPUs are using the NVIDIA driver, but I read somewhere that it ends up rendering in software or something.
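One way to check (just a rough diagnostic, assuming glxinfo is installed and the two GPUs are driving separate X screens, which may not match your setup) is to see what renderer each screen reports:

DISPLAY=:0.0 glxinfo | grep "OpenGL renderer"
DISPLAY=:0.1 glxinfo | grep "OpenGL renderer"

If the second screen reports something like "Software Rasterizer" or "llvmpipe" instead of the GeForce chip, it's falling back to software. It's only a proxy, since Ardour's GUI is 2D rather than OpenGL, but it usually tells you whether the driver is accelerating that screen at all.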

No one has any comments?

No, only using two monitors :slight_smile:, not enough space on my desk…

It's probably just too much for this kind of setup.

I think you’ve already found the problem.

Isn’t all the graphics stuff being rewritten at the moment? Maybe it’ll get better. Or maybe a better graphics card is needed? I have a two-monitor setup that struggles when I have lots of tracks plus plugins open with complicated interfaces such as the Calf multiband compressor, which I think is due to my weak graphics card (needed something without a fan).

Complicated interfaces? I wouldn't think interfaces should slow things down. If they're programmed correctly they should use very few resources.

Anyway, I'm waiting for XRandR 1.4, as that supports multiple video cards and may handle this much better. Or maybe not, and I'll just get a 2nd GT 610.
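For what it's worth, the XRandR 1.4 provider commands are supposed to look roughly like this (a sketch only; the provider numbers are placeholders taken from the --listproviders output, and whether the NVIDIA binary driver exposes providers at all is another question):

# list the render/output providers X knows about
xrandr --listproviders

# make provider 1's outputs (e.g. the onboard chip) display content
# rendered by provider 0 (the main card)
xrandr --setprovideroutputsource 1 0

# the extra outputs then show up in the normal output list
xrandr --auto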

@christophski, what graphics card are you using, and what driver? I know I had issues with GUI speed in Ardour due to a bug in the NVIDIA driver. I had to use some settings to sort it out.

nvidia-settings -a InitialPixmapPlacement=0 -a GlyphCache=0

seemed to sort things out when I was using my onboard GPU. InitialPixmapPlacement can, I think, be set between 0 and 2, so you can try changing it to 1 or 2 and see if it makes a difference.
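Note those settings don't survive a restart of X on their own; a minimal way to make them stick (assuming your desktop runs ~/.xprofile or an equivalent autostart script at login) is to re-run the command there:

# in ~/.xprofile or your desktop's autostart script
nvidia-settings -a InitialPixmapPlacement=0 -a GlyphCache=0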

There was also another one, something about setting an environment variable for buggy gradients. There's a thread somewhere here about it.