Minicom problem with AVR?


6 posts
by MrGreg » Sun Nov 11, 2012 11:58 pm
I have followed "Gordon's guide" for the Arduino IDE and everything works well except the serial output in Minicom.
Or perhaps it is just not working as I expected?
For example, with the "Analog In, Out Serial" sketch I often get
a seemingly corrupt output/display, with the CPU at approx 25% (bar graph):


ÉɁõÕÍÙ%����ÕÑÁÕÑՁõÅÍÍ5)͕����ÍͽÉɁõÕÍÙ%����ÕÑÁÕÑՁõÅÍÍ5)͕����ÍͽÉɁõÕÍÙ%��Í

Or, usually after a Ctrl-A X and then a restart of Minicom, I get a stream of apparently correct data,
e.g.

sensor = 173 output = 43

but as a very fast stream that drives the CPU flat out.

So, any ideas what is causing the glitch/corruption and how to sort it?
And:
Is there a way to slow the data stream when it does work in Minicom, to drop the CPU load?

I have tried a purge and reinstall of Minicom.

(Raspbian Oct 28, updated)
Posts: 46
Joined: Sun Jun 10, 2012 7:25 pm
by terrycarlin » Wed Nov 14, 2012 7:14 pm
Looks like this might be caused by having the incorrect speed set.
You need to set minicom to use 9600 baud.
Start up minicom, then enter ^A P (that's Ctrl-A followed by P)
and set the baud to 9600.
If it ain't broke, take it apart and see how it works.
User avatar
Posts: 70
Joined: Thu Jun 14, 2012 10:42 pm
by Wendo » Thu Nov 15, 2012 4:43 am
I can confirm this happens even with the right speed set. I saw the same thing when testing my setup, and it has happened on occasion while doing additional programming of the AVR (talking to the ADC and DAC over the AVR's secondary SPI).
Posts: 142
Joined: Sun Jun 10, 2012 8:27 pm
by MrGreg » Fri Nov 16, 2012 12:16 am
Yes, I can also confirm that I was using 9600.

I could probably slow the data flow from the AVR to reduce the CPU load on the Pi, but...
The main problem, the corrupt data display in Minicom, is a bit of a pain; currently it's about 50/50 odds of getting a readable result.
Thanks for the input, chaps; reassuring to know it's not just me!
If anyone else is having a similar problem, and/or has a fix, please share

Cheers
Posts: 46
Joined: Sun Jun 10, 2012 7:25 pm
by MrGreg » Mon Nov 19, 2012 10:54 pm
Well, a fudge/workaround...

Slowing the data stream (the Serial writes) from the AVR to, say, one every 200 ms reduces the incidence of the corrupt display and reduces the CPU load.
To reduce the CPU load further: if the terminal screen is shrunk to its minimum height (80x2), the CPU load drops to circa 10% (still quite a bit?!).
The test case was the AnalogInOutSerial example with a 200 ms delay in the loop.

It's a bit of a kludge, and it treats the symptom rather than the cause, but it gets around the problem for the most part for my needs at the moment

Perhaps someone with the skills could do a proper job
Posts: 46
Joined: Sun Jun 10, 2012 7:25 pm
by k4gbb » Tue Nov 20, 2012 12:57 am
Minicom is a fickle beast.
I use it to test Packet Radio node controllers.
If you have not saved modified defaults,
start minicom with -s and set the defaults.
Once you have set the defaults, save them to the .dfl file.

You might also try removing (blanking) the modem initialization strings.
Using ANSI terminal emulation instead of VT100 also makes a difference.
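For reference, a saved defaults file along those lines might look something like the fragment below. The device path /dev/ttyAMA0 (the Pi's UART) is an assumption, and the safest route is to set everything inside minicom -s and let it write the file for you rather than editing it by hand; blank minit/mreset lines correspond to the blanked modem strings suggested above.

```
# Machine-generated file - use "minicom -s" to change parameters.
pu port        /dev/ttyAMA0
pu baudrate    9600
pu bits        8
pu parity      N
pu stopbits    1
pu minit
pu mreset
```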
The Grass may be greener on the other side of the fence, but it still has to be mowed.
User avatar
Posts: 50
Joined: Sun Aug 12, 2012 5:33 am
Location: Dunnellon, FL USA - EL88tx