
Is It Important To Keep Teaching American History In Our Schools?

Here’s the Scoop

Understanding America’s past is crucial for its future. The content and method of teaching history have become focal points in educational discussions.

What do you think? Let us know by participating in our poll, or join the discussion in the comment section below!


6 Comments

  1. Don

    October 11, 2023 at 7:58 am

    As long as it’s the REAL American history.

  2. Donald Cook

    October 11, 2023 at 8:50 am

    Don’t forget to tell how bad life is and was in Africa, don’t forget to tell how some Blacks carried out some of the biggest massacres. Don’t forget how well some Blacks were treated after their debt was worked off and they became part of the families that had paid the fare for them to come to America.

  3. Bill

    October 11, 2023 at 9:00 am

    It’s very important to know the true and real history of our country, as stated above by Don. We have people in this country today who are trying to rewrite our history to suit their own purposes, which are not for the good of this nation.

  4. Denise

    October 11, 2023 at 9:22 am

    Absolutely it needs to be the real history – we need to teach the parts that were left out of my history books as a teen. Luckily my college professors were more accurate. It’s important to know it all, even if some of it is unpleasant for us.

  5. Dan Heartsill

    October 11, 2023 at 7:33 pm

    We need to learn from history so we don’t make the same mistakes that were made in the past. I don’t think the present government leaders have studied past history, judging by some of the things they have done over the past couple of years.

  6. Joseph Basler

    October 13, 2023 at 1:04 pm

    If you have no idea of your history, you have no allegiance to your country.
