Burngate wrote: Isn't that why an R-2R DAC is to be preferred? And if so, why wasn't it chosen?

Two (and a half) reasons:
james-at-lo-tech wrote: I've released a 24-bit VGA board today - see here.

Gert van Loo wrote: With the 6 bits per channel, the colours in the current design already show 'banding' (in some places the colour jumps are clearly visible).

The banding, I think, is because of the GPIO output impedance; in my testing it can be reduced somewhat by selecting 16mA drive (after boot, since the overlay can't). Tabulating all possible values, discontinuities can be seen assuming 32R output impedance. Reducing the resistor values by 16R with 16mA drive selected pretty much gets rid of it. But 888 isn't the main feature of my VGA board; rather ESD protection, buffering, and noise immunity - and the CE mark.
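The tabulation described above is easy to reproduce. This is a sketch, not the actual VGA666 schematic: it assumes a 6-bit binary-weighted resistor DAC per colour channel (500R on the MSB, doubling per bit - illustrative values, not the board's real ones) driving a 75R VGA termination, with the Pi's GPIO source impedance in series with every bit resistor.

```python
# Model one colour channel: six GPIO pins, each through a binary-weighted
# resistor plus the GPIO's own output impedance, into a 75R VGA load.
# Resistor values and impedance figures are assumptions for illustration.

R_LOAD = 75.0   # VGA input termination, ohms
VDD = 3.3       # GPIO high level, volts

def dac_out(code, z_out, r_trim=0.0):
    """Output voltage for one 6-bit code, via Millman's theorem.

    z_out  - GPIO output impedance in series with every bit resistor
    r_trim - amount subtracted from each nominal resistor value
    """
    rs = [500.0 * 2 ** (5 - k) - r_trim for k in range(6)]  # k=0 is the LSB
    g_total = 1.0 / R_LOAD + sum(1.0 / (r + z_out) for r in rs)
    g_high = sum(1.0 / (rs[k] + z_out) for k in range(6) if code >> k & 1)
    return VDD * g_high / g_total

def min_step(z_out, r_trim=0.0):
    """Smallest step between consecutive codes; negative = non-monotonic."""
    v = [dac_out(c, z_out, r_trim) for c in range(64)]
    return min(b - a for a, b in zip(v, v[1:]))

# With ~32R source impedance the ramp is non-monotonic: the 0b011111 ->
# 0b100000 transition steps *down*, which shows up as banding.
print(min_step(32.0))
# 16mA drive (~16R assumed) plus resistors trimmed down by 16R restores
# the exact binary weighting, so every step is positive again.
print(min_step(16.0, 16.0))
```

The point of the trim is that resistor-minus-16R plus 16R of source impedance lands back on the nominal binary-weighted value, so the ladder's ratios are exact again.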
Gert van Loo wrote: Wikipedia to the rescue (yes, I am too lazy to calculate it myself):

The R–2R ladder operates as a string of current dividers, whose output accuracy is solely dependent on how well each resistor is matched to the others. Small inaccuracies in the MSB resistors can entirely overwhelm the contribution of the LSB resistors. This may result in non-monotonic behavior at major crossings, such as from 01111 to 10000.
I would also like a dual display via vga and hdmi.

I have no idea, honest. I just gave the hardware to Dom and he did something in software.
Gert van Loo wrote: I have no idea, honest. I just gave the hardware to Dom and he did something in software.

So do you mean that if I use the VGA666 there's no possible way to make HDMI work, even if I don't use them simultaneously?
But then I just typed 'dual display' in the search box and apart from this entry
another hit came up: viewtopic.php?f=28&t=154067&p=1101328&h ... y#p1025113
PiGraham wrote: Any news on software / firmware to support dual display (HDMI + VGA)?

I'm also having this problem. Have you been able to solve it? I can only use VGA, not HDMI.
Hope this helps...

Dom wrote: Currently the linux framebuffer (console / X) can only run on a single display and can't be switched without a reboot. The display used can be configured from config.txt.
The other display can only be driven through dispmanx. omxplayer is one example that can output to either display.
Some games and emulators use dispmanx for better performance so they would be possible.
But in general dual display use is very limited.
Proper linux dual display support will only come with the experimental arm side graphics driver.
In theory once https://github.com/raspberrypi/linux/pull/1813 is merged then the official display will be supported as a standard linux/X framebuffer,
and in theory whatever X multi-monitor support exists will work with the LCD display and HDMI.
Currently I've not got the PR working with the official display, but presumably Eric has and we'll work out the magic soon.
tvjon wrote:If anyone using Gert's adapter would like to use both hdmi & vga simultaneously, I've found this to work ok:
My relevant settings in config.txt:
enable_dpi_lcd=1
display_default_lcd=0
Open a terminal (the visible screen being HDMI, since display_default_lcd is set to 0 above).
Wow, thanks for this info!

I was wondering if I could also auto-run a Python script which generates an image via omxplayer? I need to show my image during boot-up, so it will automatically load my .py code. I've been able to do that via HDMI and the VGA666 separately, but I want to show my image without editing my config.txt every time I switch outputs.

Thanks,
John
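Not an authoritative answer, but since omxplayer takes a --display argument per invocation, a boot-time script can pick the output at run time instead of via config.txt. A minimal Python sketch: display 4 (forced LCD, i.e. the DPI/VGA666 output) is the value used later in this thread, while 2 for HDMI is an assumption based on the dispmanx display IDs, and the script name and media path are hypothetical.

```python
# Sketch of a boot-time launcher that selects the output at run time,
# so config.txt never needs editing per output. Display numbers are
# assumptions (4 = forced LCD/DPI as used in this thread, 2 = HDMI).
import subprocess

DISPLAYS = {"vga": "4", "hdmi": "2"}  # assumed dispmanx display numbers

def build_cmd(path, output):
    """Assemble the omxplayer invocation for the requested output."""
    return ["omxplayer", "--display", DISPLAYS[output], path]

def show(path, output="hdmi"):
    """Play the file on the chosen output; returns omxplayer's exit code."""
    return subprocess.call(build_cmd(path, output))

# Invoked at boot from e.g. /etc/rc.local (script name is hypothetical):
#   python3 show_splash.py /home/pi/splash.mp4 vga
```

Whether an image is best shown by piping it through omxplayer or by another tool is a separate question; the point is only that the output choice can live in the command line rather than in config.txt.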
$ omxplayer --display 4 pi2-clkpwm.webm.mp4
Obviously choose your own video file.

Below is the terminal output for the (poor-quality) attached jpeg:

$ omxplayer --display 4 pi2-clkpwm.webm.mp4
Invalid framerate 1000, using forced 25fps and just trust timestamps
Video codec omx-vp8 width 1280 height 800 profile -99 fps 25.000000
Subtitle count: 0, state: off, index: 1, delay: 0
V:PortSettingsChanged: 1280x800@25.00 interlace:0 deinterlace:0 anaglyph:0 par:1.00 layer:0
V:PortSettingsChanged: 1280x800@25.00 interlace:0 deinterlace:0 anaglyph:0 par:1.00 display:4 layer:0
have a nice day
I actually have the DPI setting at 1024*768, but have since tried another old VGA monitor capable of 1280*1024, and it seems happy to play high-res files.
VGA666 is currently displaying a BBC HD recording & cpu usage hasn't risen beyond 10%.
CostasVav wrote: Is the A+ too slow to drive the VGA signal?

All models of Pi (other than the early ones with 26 pins) are fully capable of driving the VGA666 adaptor without any effort. It is all done in hardware. I've used it on a Pi Zero and a B+ (same speed as an A+) with excellent results.
CostasVav wrote: So I plugged the Gert666 into a new RPi 3B and the image quality was much better than the RPi A+. Nothing else was changed. It still wasn't good enough for me (still a slight noise level), so I switched to an HDMI-to-VGA adapter instead for best quality. There must be something about the A+ giving off a little more interference than a 3B.

james-at-lo-tech wrote: The cheap adapters appearing from the hobby PoC board typically pay little (if any) attention to EMC in their implementation, either in terms of radiated emissions or susceptibility to them. They also provide no ESD protection.