In this assignment you will write a program that computes how far an object falls in a given amount of time. For example, a group of teenagers jumps off a cliff, knowing it takes 2 seconds to fall before they land in the lake below. One of the teens claims that the cliff is 150 feet high, which would be a fatal jumping height. Let’s calculate how far they really fell. You will need the following equation:
d = 0.5 × a × t²
Variable   Meaning                        Value
a          Acceleration (m/s²)            9.81
t          Time (in seconds)
d          Distance traveled (in meters)
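Using the cliff example from the intro (t = 2 seconds) and the values above, the formula works out to:

```
d = 0.5 × 9.81 × 2² = 19.62 meters
19.62 meters × 3.28 ft/m ≈ 64.4 feet
```

So the cliff is nowhere near the claimed 150 feet.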
Your program will ask the user for the amount of time, in seconds, that an object fell. You should read the value in as a double.
Your program should output a statement containing the distance the object has fallen in that amount of time in both meters and feet. Note that 1 meter = 3.28 feet. See the sample below.
You didn't say which programming language you're using. If it's Java, you'll use a Scanner to read t, the time in seconds, as a double. Then apply the formula to get the distance in meters, and multiply by 3.28 to convert to feet.
So, which part do you not understand, and what have you tried? This is a very basic program to write.
– a = acceleration due to gravity, in meters per second squared: 9.81 on Earth
– t = time (in seconds)
– d = distance traveled = 0.5 * a * (t * t)
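If Java is the language, the whole thing fits in one small class. Here is a minimal sketch under that assumption (the class name, prompt text, and output wording are my own choices, not from the assignment):

```java
import java.util.Scanner;

public class FallingDistance {
    public static void main(String[] args) {
        final double ACCELERATION = 9.81;   // m/s^2, gravity on Earth
        final double METERS_TO_FEET = 3.28; // conversion factor given in the assignment

        // Read the fall time in seconds as a double
        Scanner input = new Scanner(System.in);
        System.out.print("Enter the fall time in seconds: ");
        double t = input.nextDouble();

        // d = 0.5 * a * t^2, then convert meters to feet
        double meters = 0.5 * ACCELERATION * t * t;
        double feet = meters * METERS_TO_FEET;

        System.out.printf("The object fell %.2f meters (%.2f feet).%n", meters, feet);
    }
}
```

For the 2-second cliff jump this prints a distance of 19.62 meters, about 64.35 feet.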