sam_p_lay wrote: Well the default terminal in Enlightenment 17 (Bodhi's WM) is called Terminology. Nothing happened when I rebooted though.
Hm ... I am using Raspbian with manual X startup via "startx". Here, it works.
sam_p_lay wrote: And wouldn't this still boot into Enlightenment and just fire up a terminal?
No, global default behavior is usually specified in "/etc/X11/xinit/xinitrc". This includes window manager startup. If you create "~/.xinitrc", this should override the global default.
sam_p_lay wrote: What's the difference, by the way, between .bashrc and .xinitrc? They both seem to be like DOS's autoexec.bat but run when your GUI loads?
As the names suggest: ".bashrc" is executed whenever a bash shell is started; ".xinitrc" is executed during X startup. So only ".xinitrc" runs when the GUI loads.
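For illustration, a minimal "~/.xinitrc" might look like the sketch below. The program names are assumptions - substitute whatever window manager and terminal your distribution actually ships (check what "/etc/X11/xinit/xinitrc" launches):

```shell
# ~/.xinitrc - read by startx when the X session begins (not by login shells).
# "enlightenment_start" and "lxterminal" are assumed names; adjust as needed.
lxterminal &                 # start a terminal in the background
exec enlightenment_start     # exec replaces this script with the WM,
                             # so the X session ends when the WM exits
```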
sam_p_lay wrote: I didn't realise C/C++ are faster than Java.
C/C++ are compiled to native machine code, whereas Java is compiled to bytecode, which has to be interpreted.
sam_p_lay wrote: I was equally up for learning either, but Apple's C ('Objective-C') is some kind of Apple implementation of C?
Objective-C and C++ are both object-oriented extensions of C; they just follow different object-orientation philosophies.
sam_p_lay wrote: I figured Java will be a more transferable skill to have. Plus I don't really want to get involved with Apple or make them any money.
C and C++ are not affiliated with Apple.
sam_p_lay wrote: And I had heard about X being a bit of a dinosaur now - read it's being replaced by a display server called Wayland? I didn't know what the shortcomings of X were, but performance is always a good reason.
X is still quite good in the use cases it was originally intended for.
ghans wrote: Midori isn't very compatible, Netsurf is supposed to be very fast and can work without X, only with the framebuffer.
I am using Chromium at 1280x720 resolution with all the auto-completion/prefetch features turned off. Overclocking is set to 900 MHz. That gives me sufficient performance on simple websites without advertisements, but AJAX-heavy sites and pages overloaded with advertisements are still no fun.
sam_p_lay wrote: Thanks, this looks interesting! Wonder if Midori would work with it? I don't know much about the command line either, just basic navigation and file management. Really need to get my head around <, >, | and how to use grep effectively. Doing everything from the command line I think would certainly be a good way to learn, as long as I can multitask. Even if I don't get a solution out of this, I'm still glad I posted - it's been really educational.
Machine code = lowest programmable level of a CPU. 0's and 1's.
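Regarding the <, > and | mentioned above: a few throwaway one-liners make them concrete (the file name here is made up):

```shell
#!/bin/sh
# Demonstration of shell redirection and pipes with grep.
printf 'apple\nbanana\ncherry\n' > fruit.txt   # ">" sends stdout to a file
grep an < fruit.txt                            # "<" feeds the file to stdin; prints "banana"
grep -c an < fruit.txt                         # -c counts matching lines; prints "1"
cat fruit.txt | grep -v an                     # "|" pipes output onward; -v inverts the match
rm fruit.txt
```

The general pattern: > and < connect a command to files, while | connects one command's output directly to another's input, which is what makes grep so useful in combination with other tools.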
sam_p_lay wrote: What's the difference between bytecode and machine code? Machine code is 0s and 1s, right?
On the most primitive level, every digital device can only process 0s and 1s, so bytecode is stored on your SD card as 0s and 1s as well. The difference is that machine code can be executed directly by the CPU, because it is written in the "native language of the CPU". Bytecode cannot be executed directly by the CPU. It is easier to translate into machine code than Java source code is, but it still has to be translated on the fly during execution.
sam_p_lay wrote: I'd be making Apple money through app sales. That's not a huge deal to be honest, more the fact that Objective-C would be a much less transferable skill than C++ or Java. I'd have nothing against taking on a bit of extra challenge with C++, but if Android dev work is with Java then that's probably my best bet.
IMO C++ is not necessarily an extra challenge compared with Java.
Code:
sudo apt-get install twm

Code:
twm & lxterminal
jamesh wrote: Machine code = lowest programmable level of a CPU. 0's and 1's
Although it is a common phrase, I do not find it helpful to define machine code as consisting of 0s and 1s, because source code and bytecode are represented as sequences of 0s and 1s, too. 0 and 1 are the (only) characters of the digital alphabet. Thus, everything must be represented as sequences of 0s and 1s on digital devices.
sam_p_lay wrote: C sounds like a better way to do things then performance-wise, is that the reason it's more popular?
I would not say that C is more popular than C++ or Java; it depends on the context. The history of the C programming language is strongly connected with the history of the UNIX operating system. Even today, almost all UNIX kernels are written in C, so you can be pretty sure that every UNIX derivative comes with a powerful C compiler. For this reason, C is still the most widespread language in the UNIX world.
MAA1612 wrote:
jamesh wrote: Machine code = lowest programmable level of a CPU. 0's and 1's
Although it is a common phrase, I do not find it helpful to define machine code as consisting of 0s and 1s, because source code and bytecode are represented as sequences of 0s and 1s, too.
I used the 1s and 0s because that's what the OP referred to.
Machine code can be executed directly by the CPU, because it is the native language of the CPU. Higher-level language source code and bytecode have to be translated before or during execution, because they are not the native language of the CPU.
Assembly language can be compared to a transliteration of Chinese words with Latin characters, i.e. it uses the same words as machine code, but written differently, so that they are easier for us to read and write.
sam_p_lay wrote: I tried to output the error using startx > error.txt but just got an empty file. Guess that doesn't work for everything? Or I did it wrong?
I'd recommend reading the Wikipedia article on standard streams. In short: ">" redirects only standard output, while error messages usually go to standard error, which you redirect with "2>".
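Since startx can't easily be re-run just to test this, here is the same idea with a command that is guaranteed to fail (the directory name is made up):

```shell
#!/bin/sh
# Error messages go to standard error (fd 2); ">" only captures standard output (fd 1).
ls /no/such/dir > out.txt 2> err.txt || true   # out.txt stays empty, err.txt gets the message
cat err.txt                                    # shows the error text
ls /no/such/dir > all.txt 2>&1 || true         # "2>&1" merges stderr into the same file
# The equivalent for the original problem would be:  startx 2> error.txt
rm -f out.txt err.txt all.txt
```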