Suppose the maximum safe average intensity of microwaves for human exposure is taken to be 1.00 W/m2. If a radar unit leaks 10.0 W of microwaves (other than those sent by its antenna) uniformly in all directions, how far away must you be to be exposed to an average intensity considered to be safe? Assume that the power spreads uniformly over the area of a sphere with no complications from absorption or reflection.

Answer:

You must be at least r = 0.892 m away to be exposed to an average intensity considered safe.

Explanation:

The maximum safe average intensity of microwaves for human exposure is taken to be 1.00 W/m^2.

The power leaked by the radar is P = 10.0 W, spread uniformly in all directions.

The intensity at a distance r is given by

[tex]I= \frac{P}{4\pi r^2}[/tex]

for safe exposure, I = 1.00 W/m^2

and P= 10 W

[tex]r=\sqrt{\frac{P}{4\pi I} }[/tex]

[tex]r=\sqrt{\frac{10}{4\pi \times 1.00} }[/tex]

on calculating we get

r = 0.892 m

So you must be at least r = 0.892 m away to be exposed to an average intensity considered safe.

The distance at which the average intensity is considered safe is 0.892 m.

Intensity of the wave

The intensity of a wave radiating uniformly from a source is given by the following formula;

I = P/A

where;

  • P is the power transmitted
  • A is the area of the surrounding

I = P/4πr²

r² = P/4πI

r² = (10) / (4π x 1)

r² = 0.796

r = √0.796

r = 0.892 m

Thus, the distance at which the average intensity is considered safe is 0.892 m.
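The calculation above can be sketched in a few lines of Python (the function name `safe_distance` is just illustrative, not from the problem):

```python
import math

def safe_distance(power_w, safe_intensity_w_m2):
    """Distance r at which power spread uniformly over a sphere,
    I = P / (4*pi*r^2), drops to the given safe intensity."""
    return math.sqrt(power_w / (4 * math.pi * safe_intensity_w_m2))

# P = 10.0 W leaked, safe limit I = 1.00 W/m^2
r = safe_distance(10.0, 1.00)
print(f"r = {r:.3f} m")  # r = 0.892 m
```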

Learn more about wave intensity here: https://brainly.com/question/17062836
