The dew point is not any more meaningful than relative humidity as a measure of humidity. Relative humidity is, by definition, scaled from 0-100%: at 100% RH the evaporation and condensation rates are equal, so there is no net evaporation. That holds regardless of temperature; whether it is 100% RH at 90F or 100% RH at 75F, little to no water will evaporate from your skin and clothes. The dew point alone does not tell you what the humidity is. The closer the dew point is to the dry-bulb temperature, the higher the relative humidity: if the dew point is 65F and the air temperature is 75F, it is much more humid than if the dew point were 65F and the air temperature were 90F.
So, you do need to know what the air temperature is to calculate humidity from the dew point.
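To make that concrete, here is a small sketch of the calculation. It uses the Magnus approximation for saturation vapor pressure, which is my choice of formula here, not something from the post; any similar approximation would show the same effect.

```python
import math

def saturation_vapor_pressure(temp_c):
    """Saturation vapor pressure over water in hPa (Magnus approximation)."""
    return 6.112 * math.exp(17.62 * temp_c / (243.12 + temp_c))

def relative_humidity(temp_f, dew_point_f):
    """RH (%) from air temperature and dew point, both in Fahrenheit."""
    t_c = (temp_f - 32) * 5 / 9
    td_c = (dew_point_f - 32) * 5 / 9
    # Actual vapor pressure is the saturation pressure at the dew point.
    return 100 * saturation_vapor_pressure(td_c) / saturation_vapor_pressure(t_c)

# Same 65F dew point, two different air temperatures:
print(round(relative_humidity(75, 65)))  # about 71% RH
print(round(relative_humidity(90, 65)))  # about 44% RH
```

Same dew point, very different relative humidity, which is exactly why the air temperature is needed.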
Absolute humidity is the amount of water in the air. Relative humidity is the percentage of the maximum equilibrium concentration of water in the air (at a given temperature). The dew point is the temperature at which water vapor in the air will begin to condense (if the air temperature drops to the dew point, RH reaches 100%).
You know that if RH is 60%, then it’s pretty humid and your sweating isn’t going to be very effective.
If you know the dew point is 60F, then that’s all you really know.
You would want to know the air temperature before going out for your run anyway. Personally, I would prefer to know RH, since it tells you the potential for evaporation on a nice 0-100 scale regardless of temperature. If you tell me the dew point, I just have to guess at the maximum equilibrium concentration of water in air at the current temperature.
Bonus points: there is an air temperature where RH is 60% and the dew point is 60F. What is that temperature?
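If you want to check your answer numerically, here is a sketch that bisects for that temperature. It again assumes the Magnus approximation for saturation vapor pressure (my choice, not from the post), so the result is approximate.

```python
import math

def saturation_vapor_pressure(temp_c):
    # Magnus approximation, hPa over water (an assumed formula, not from the post)
    return 6.112 * math.exp(17.62 * temp_c / (243.12 + temp_c))

def rh_percent(temp_f, dew_point_f):
    f2c = lambda f: (f - 32) * 5 / 9
    return 100 * saturation_vapor_pressure(f2c(dew_point_f)) / \
           saturation_vapor_pressure(f2c(temp_f))

# RH falls as air temperature rises at a fixed dew point,
# so bracket the 60% crossing and bisect.
lo, hi = 60.0, 120.0
for _ in range(60):
    mid = (lo + hi) / 2
    if rh_percent(mid, 60) > 60:
        lo = mid
    else:
        hi = mid
print(round(mid, 1))  # roughly 75F with this approximation
```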