Unable to understand compilers' main optimizations


Here is an example where the compiler pre-calculates a multiplication by constant folding at compile time:

martinus's code:

  int x = 0;
  for (int i = 0; i < 100 * 1000 * 1000 * 1000; ++i) {
      x += x + x + x + x + x;
  }
  System.out.println(x);

And its code after the compiler applies constant folding at compile time:

  int x = 0;
  for (int i = 0; i < 100000000000; ++i) {
      x += x + x + x + x + x;
  }
  System.out.println(x);
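As an aside, the folded constant in this example is worth checking. A small, self-contained Java sketch (the class name `FoldDemo` is mine, not from the question) showing that `javac` really does fold the literal product at compile time, and that the product overflows `int`:

```java
public class FoldDemo {
    public static void main(String[] args) {
        // javac evaluates this constant expression at compile time and
        // stores the folded result directly in the bytecode (constant
        // folding). Note the product overflows int: 100,000,000,000
        // wraps modulo 2^32 to 1,215,752,192.
        int folded = 100 * 1000 * 1000 * 1000;
        System.out.println(folded); // prints 1215752192
    }
}
```

Incidentally, the plain literal `100000000000` in the "after folding" snippet would not even compile as Java source without an `L` suffix, since it is too large for `int`; the folded form only exists inside the compiler.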

This optimization technique seems trivial to me. I suspect it may be one of the techniques Sun has recently deprecated.

I am interested in two types of optimizations made by compilers:

  1. Optimizations that have been dropped from today's compilers, such as run-time optimizations in Java's compiler
  2. Optimizations that are used by the majority of today's compilers

Please give a separate answer for each optimization technique.

Which techniques were used in the 90s (1), and which are used today (2)?
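For category (2), one transformation that essentially every modern compiler performs is strength reduction: replacing an expensive operation with a cheaper equivalent. A hand-written Java sketch of the before/after shapes (the class and method names are mine, for illustration only):

```java
public class StrengthReductionDemo {
    // Source as a programmer might write it: a multiply on every iteration.
    static long sumOfMultiples(int n) {
        long total = 0;
        for (int i = 0; i < n; i++) {
            total += (long) i * 8;   // multiply by a power of two
        }
        return total;
    }

    // What strength reduction produces: the multiply becomes a shift,
    // which is cheaper on most hardware. The result is identical.
    static long sumOfMultiplesReduced(int n) {
        long total = 0;
        for (int i = 0; i < n; i++) {
            total += (long) i << 3;  // i * 8 rewritten as i << 3
        }
        return total;
    }

    public static void main(String[] args) {
        System.out.println(sumOfMultiples(1000));        // 3996000
        System.out.println(sumOfMultiplesReduced(1000)); // 3996000
    }
}
```

In practice you would write the multiply and let the compiler (or, in Java, the JIT) do this rewrite; the point is only to show the kind of mechanical substitution such optimizations make.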

Compiler textbooks should be a very good resource.

If this is unclear, please ignore it, but I am asking about low-level optimizations, the kind that only compilers can do.

