Orders of magnitude (time)

10⁻¹² second: picosecond (ps), one trillionth of one second
  1 ps: mean lifetime of a bottom quark; light travels 0.3 millimeters (mm)
  1 ps: typical lifetime of a transition state
  4 ps: time to execute one machine cycle by an IBM silicon-germanium transistor
  109 ps: period of the photon corresponding to the hyperfine transition of the ground state of cesium-133, and one 9,192,631,770th of one second by definition
  114.6 ps: time for the fastest overclocked processor as of 2014 to execute one machine cycle[9]
10⁻⁹ second: nanosecond (ns), one billionth of one second
  1 ns: time to execute one machine cycle by a 1 GHz microprocessor
  1 ns: light travels 30 cm (12 in)
10⁻⁶ second: microsecond (µs), one millionth of one second
  1 µs: time to execute one machine cycle by an Intel 80186 microprocessor
  2.2 µs: lifetime of a muon
  4–16 µs: time to execute one machine cycle by a 1960s minicomputer
10⁻³ second: millisecond (ms), one thousandth of one second
  1 ms: time for a neuron in the human brain to fire one impulse and return to rest[10]
  4–8 ms: typical seek time for a computer hard disk
10⁻² second: centisecond (cs), one hundredth of one second
  1–2 cs (0.01–0.02 s): human reflex response to visual stimuli
  1.6667 cs: period of a frame at a frame rate of 60 Hz
  2 cs: cycle time for European 50 Hz AC electricity
10⁻¹ second: decisecond (ds), one tenth of one second
  1–4 ds (0.1–0.4 s): blink of an eye[1]
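Several of the entries above are simply the period of a repeating event, T = 1/f: the cesium-133 photon period, the 1 GHz machine cycle, the 60 Hz frame, and the 50 Hz mains cycle. Below is a minimal Python sketch of that arithmetic; the ~8.7 GHz clock figure is back-computed here from the 114.6 ps entry and is not stated in the table.

```python
# Period T = 1 / f, printed in the units used by the table above.

def period_seconds(frequency_hz: float) -> float:
    """Return the period (in seconds) of an oscillation at the given frequency."""
    return 1.0 / frequency_hz

# Cesium-133 hyperfine transition: one second is defined as 9,192,631,770 periods,
# so a single period is about 108.8 ps (rounded to ~109 ps in the table).
cs133 = period_seconds(9_192_631_770)
print(f"Cs-133 hyperfine period: {cs133 * 1e12:.1f} ps")              # ~108.8 ps

# Light travels about 0.3 mm in 1 ps (speed of light 299,792,458 m/s).
print(f"Light in 1 ps:           {299_792_458 * 1e-12 * 1000:.2f} mm")  # ~0.30 mm

# 1 GHz microprocessor: one machine cycle per clock tick -> 1 ns.
print(f"1 GHz cycle time:        {period_seconds(1e9) * 1e9:.1f} ns")   # 1.0 ns

# A 114.6 ps machine cycle corresponds to a clock of roughly 8.7 GHz.
print(f"114.6 ps implies:        {1.0 / 114.6e-12 / 1e9:.2f} GHz")      # ~8.73 GHz

# 60 Hz frame rate -> 1.6667 cs per frame; 50 Hz mains -> 2 cs per cycle.
print(f"60 Hz frame period:      {period_seconds(60) * 100:.4f} cs")    # 1.6667 cs
print(f"50 Hz mains cycle:       {period_seconds(50) * 100:.1f} cs")    # 2.0 cs
```

Running this prints 108.8 ps, 0.30 mm, 1.0 ns, 8.73 GHz, 1.6667 cs, and 2.0 cs, matching the rounded values in the table.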
Multiples          Unit        Symbol
6×10¹ seconds      1 minute    min
6×10¹ minutes      1 hour      h (hr)
2.4×10¹ hours      1 day       d
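These multiples chain together (60 s in a minute, 60 min in an hour, 24 h in a day), so a duration given in seconds can be split into days, hours, minutes, and seconds by repeated integer division. A small illustrative sketch follows; the 90,061-second input is an arbitrary example, not a value from the table.

```python
def split_duration(total_seconds: int) -> tuple[int, int, int, int]:
    """Split a duration in seconds into (days, hours, minutes, seconds)
    using the multiples from the table: 60 s/min, 60 min/h, 24 h/d."""
    minutes, seconds = divmod(total_seconds, 60)
    hours, minutes = divmod(minutes, 60)
    days, hours = divmod(hours, 24)
    return days, hours, minutes, seconds

# Example: 90,061 s = 86,400 + 3,600 + 60 + 1 s = 1 day, 1 hour, 1 minute, 1 second.
print(split_duration(90_061))  # (1, 1, 1, 1)
```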





