After the "war to end all wars" was over and the peace settlements were signed, the United States looked forward to peace and prosperity. The 1920s were characterized by new freedoms for women, a stock market that made investors rich, and a culture of flaunted excess. America never joined the League of Nations and pursued an isolationist foreign policy. This changed after the fall of France in 1940, when the United States began to rearm. The Japanese attack on Pearl Harbor drew the United States into World War II, and the world was changed forever.