A correlation coefficient (r) is an index of how related two variables are; e.g., how strongly two measurements track each other across a population. r can vary between -1 and 1; an r of 0 means no linear correlation.
The best way to understand the concept is to think of a simple x, y scattergram. A straight line of dots going up (from left to right) would indicate a correlation of 1, but if the line of dots declined as you go along the x axis, then you would have a correlation coefficient of -1. An r of 0 would be just a shotgun blast of data points on your x, y graph. If there is some relatedness, you'll be able to see a pattern. So if you see a linear trend in the data points, you might have an r of 0.7, which (roughly speaking) means that as x goes up one standard deviation, y tends to go up about 0.7 standard deviations, with some variability along the way (the higher the r value, the less variability as you approach true linearity).
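You can see this with a quick sketch in Python using NumPy (the data here are made up for illustration): y is built from x plus random noise, and more noise means a weaker correlation.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical data: y tracks x, with noise added on top.
x = rng.normal(size=500)
y = x + rng.normal(scale=0.7, size=500)  # the noise weakens the correlation

# Pearson correlation coefficient between x and y
r = np.corrcoef(x, y)[0, 1]
print(r)  # something well below 1, since the points scatter around the line

# A perfectly straight line of dots gives r = 1 exactly
print(np.corrcoef(x, 2 * x + 3)[0, 1])
```

If you shrink the noise (`scale=0.1`), r climbs toward 1; crank it up and the scattergram turns into the shotgun blast with r near 0.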
Your r's of 0.34 and 0.36 indicate somewhat less relatedness (a weak correlation). Squaring r gives the proportion of variability accounted for, so an r of 0.34 explains only about 12% of the variance; other factors are likely explaining the rest of the variability between the populations of interest.
[disclaimer, I love stats, but never taught it, so maybe a math teacher could do better]