How Did WWII Change America and the World?

Name
Institution
Course
Instructor
July 15, 2010
How did WWII change America and the World?
      World War II remains the bloodiest and deadliest conflict in world history. It involved the most extensive battle lines, the largest armed forces, and the most destructive weapons the world had ever seen (Martel 245). The global conflict that came to be labelled World War II emanated from the Great Depression of the 1930s, a crisis that undermined, destabilized, and weakened economies, governments, and nations around the world. In Germany, for instance, Hitler's rise occurred partly because he claimed to be capable of transforming a then-destabilized Germany into an autonomous economic and military power that could control its destiny not only in Europe but in the world as well (Martel 240). This paper therefore explores how this global conflict changed America and the entire world.
      For the United States, World War II constituted the most significant fiscal event of the twentieth century (Higgs 53). The war's consequences for the United States were diverse and far-reaching. It decisively ended the Great Depression, and the federal government emerged from the war as a powerful economic actor, capable of regulating fiscal activity and of partly steering the world economy through spending and consumption. What is more, the wartime economic boom spurred a number of social trends. First was the job creation that paralleled the growth of industrial production, thereby achieving what Roosevelt's New Deal programs had been unable to accomplish (Rindfuss, Morgan & Swicegood 86). According to Higgs (58), in 1940 there were more than eight million unemployed Americans; by 1941, however, unemployment had become a thing of the past. There were in fact labor shortages in numerous industries. Consequently, federal inspectors ignored laws deterring the employment of women and children (Martel 250). With little or no public outcry, several high...