Edit:
I want to add that this is largely speculation; we really don't know what will happen. I just assume that Earth will share a similar fate to Mars. Mars once had a magnetic field protecting its atmosphere, but as the planet cooled off, the field disappeared. Mars's historic magnetic field is an area of contentious research.
As gerrit pointed out, Venus has an atmosphere without a magnetic field, so this is clearly speculation. Perhaps an expert can shed light on this question: (How long) would Earth's atmosphere last without a global magnetic field?
So, from a climate perspective, internal heat generation is not important. See also this post on skepticalscience.
However, we might lose our atmosphere, which would have inconvenient consequences. An ice age would be the least of our worries. A subsequent question would be: (How long) would Earth's atmosphere last without a global magnetic field? That is a different question and I'm not sure if we really know the answer.
If you run the two experiments a few times and compare results, you should find that it's practically impossible to detect any difference. The results can be extrapolated to estimate effects on the Earth, and a plausible conclusion is that it won't make a meaningful difference.
The simple reason is that our current geothermal efforts (as well as any currently projected future efforts) are so vanishingly small when compared to the size of the Earth that it has less effect than we can measure.
Now, that doesn't mean that some radical future change in technology won't change things. But no one can answer on those terms except to assert that we could make a significant difference if we could advance technology far enough.
Suppose we have a uniform sphere the size of the Earth. Call it 10^21 cubic meters.
Suppose this sphere is made of rock that is four times more dense than water. Water weighs 1000 kg per cubic meter.
Of course the Earth is not uniform; it is made up of rocks that are less dense and metals which are more dense. We're doing some rough math here.
And let's suppose that the interior of our planet is of uniform temperature, say, 5000 Kelvin.
Again, of course the Earth is not uniformly hot throughout. Again, we're doing rough math here, just to get an idea of the order of magnitude involved.
Let's suppose that our ball of rock is not producing new heat. Of course the Earth is producing new heat inside it, for instance, from radioactive elements in the core. But let's suppose that it is not.
And let's suppose that our ball of rock has a specific heat capacity of 0.8 joules per kilogram-kelvin. The specific heat capacity, roughly speaking, tells us how much energy is stored in some amount of a substance at a particular temperature. So multiply that out:
(10^21 cubic meters) x
(4000 kg per cubic meter) x
(5000 kelvin) x
(0.8 joules per kilogram-kelvin) =
1.6 x 10^28 joules
We're just looking for an order of magnitude here. Our ball of rock has roughly 10^28 joules of thermal energy.
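If you want to check the arithmetic, here's the same back-of-the-envelope calculation as a quick Python sketch (all figures are the rough assumptions above, not measured values):

```python
# Back-of-the-envelope thermal energy of a uniform Earth-sized ball of rock.
# All inputs are the rough assumptions from above, not measured values.

volume = 1e21          # m^3, roughly Earth's volume
density = 4000         # kg/m^3, assumed uniform "rock" density
temperature = 5000     # K, assumed uniform interior temperature
specific_heat = 0.8    # J/(kg*K), assumed specific heat capacity

mass = volume * density                               # kg
thermal_energy = mass * specific_heat * temperature   # joules

print(f"Thermal energy: {thermal_energy:.1e} J")  # ~1.6e28 J
```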
Now let's suppose that we extract some amount of those joules. Total energy consumption of humanity from all sources -- nuclear, gas, etc -- is about 10^18 joules per year. If we got 100% of that from our ball of hot rock, it would cool it off by one ten-billionth of its total heat every year.
That's making the worst possible assumptions: of course we do not get anywhere close to all our power from geothermal, the energy we do extract was just going to be wasted into the atmosphere eventually anyway, the Earth does make its own heat, and so on. We could get our total power needs met by geothermal energy for trillions of years without worrying about cooling the core.
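Continuing the sketch, the fraction of that heat removed per year under the worst-case assumption (all of humanity's energy drawn from the ball of rock) comes out like so:

```python
# Worst case: humanity draws all of its energy from the hot ball of rock.
thermal_energy = 1.6e28     # J, from the estimate above
yearly_consumption = 1e18   # J/year, rough total human energy use as stated above

fraction_per_year = yearly_consumption / thermal_energy
print(f"Fraction of total heat removed per year: {fraction_per_year:.0e}")
# ~6e-11, i.e. on the order of one ten-billionth per year
```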
How does it retain its heat?
The same way any other ball of rock retains its heat. Heat, like all forms of energy, is retained indefinitely until something acts to remove it. I'm not clear on what question you're actually asking here.
This answer to the question 'Why has Earth's core not become solid?' over on Physics seems to claim the answer is no.
The core is heated by radioactive decays of Uranium-238, Uranium-235, Thorium-232, and Potassium-40, all of which have half-lives of greater than 700 million years (up to about 14 billion years for Thorium).
The core isn't hot just because of remnant heat left over from formation; the heat energy in the core is continually renewed by radioactive processes.
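To see why those half-lives matter, here's a small sketch (assuming simple exponential decay, with approximate published half-lives) of how much of each isotope would remain after the Earth's ~4.5-billion-year history:

```python
# Fraction of each radioactive isotope remaining after t years,
# assuming simple exponential decay: fraction = 0.5 ** (t / half_life).
# Half-lives below are approximate published values.
half_lives = {
    "U-238": 4.5e9,
    "U-235": 7.0e8,
    "Th-232": 1.4e10,
    "K-40": 1.25e9,
}

age_of_earth = 4.5e9  # years

for isotope, half_life in half_lives.items():
    remaining = 0.5 ** (age_of_earth / half_life)
    print(f"{isotope}: {remaining:.1%} remaining")
# Th-232 (~80%) and U-238 (~50%) are still mostly present,
# so radiogenic heating continues today.
```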
If so, would it result in an ice age?
This energy from the core must already be continually dissipated up through the mantle, through the crust, into the atmosphere and eventually into space (or else the planet would be heating up).
All we could possibly do is speed up the dissipation of this energy through the crust; any energy we extract would have reached the surface eventually anyway.
As others have pointed out, geothermal energy is a tiny fraction of what heats our atmosphere; most of that heat comes from the sun.
Even if this were not the case, causing an ice age would require us to have near-complete control over geothermal release through artificial means. We would have to extract enough energy, over a long enough period, from deep enough in the Earth that there was no longer significant natural heat dispersion through the crust. Then we would have to stop and bottle up our artificial extraction so that the heat had no means of escape other than rising through the crust in the natural way. Even then, the heat present in the atmosphere would dissipate into space far more quickly than new heat would rise through the crust.
I imagine that both intervening to that degree of control and then suddenly relinquishing it would have significant effects aside from climate change: earthquakes, volcanic eruptions, disruption of continental drift...
If not, how does it retain its heat?
I hope it is clear that it doesn't.
[1] https://en.wikipedia.org/wiki/Earth%27s_internal_heat_budget
As far as I can tell from that wiki page, the current heat in the Earth is ~50% radiogenic and ~50% leftover from formation. The internal heat budget works out as:

geothermal power consumption + 47 TW transferred from the mantle to the crust and beyond [1] - 20 TW generated from radioactive decay = core cooling rate

Core cooling rate without geothermal: 0 + 47 TW - 20 TW = 27 TW
The world consumed 22,000 TWh in 2017 [2]. That means an average power consumption of 2.5 TW. If all of that were geothermal, we'd be increasing the cooling rate of the Earth's core by about 10%.
[2] https://yearbook.enerdata.net/electricity/electricity-domestic-consumption-data.html
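A quick sketch of that arithmetic, using the figures quoted above:

```python
# Core cooling rate from the rough heat budget above.
heat_out = 47        # TW, transferred from the mantle to the crust and beyond [1]
radiogenic = 20      # TW, generated from radioactive decay
geothermal_use = 0   # TW, current large-scale geothermal extraction, taken as ~0

cooling_rate = geothermal_use + heat_out - radiogenic
print(f"Core cooling rate: {cooling_rate} TW")  # 27 TW

# World electricity consumption in 2017, per [2].
consumption_twh = 22_000                    # TWh/year
avg_power_tw = consumption_twh / (365 * 24) # TW, ~2.5 TW average

print(f"Average power consumption: {avg_power_tw:.1f} TW")
print(f"Extra cooling if all geothermal: {avg_power_tw / cooling_rate:.0%}")
# ~9%, i.e. about 10%
```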
So based on that, how quickly does each TW of geothermal power cool the Earth? I looked at the most abundant elements of the Earth by mass and found that the weighted-average specific heat capacity is about 1000 J/kg/°C. To get a ballpark idea of the impact 1 TW would make, I'll use that number and an average internal temperature of 3000 °C. To calculate the Earth's thermal energy, I'll use Q = mcΔT and consider a thermal window between 0 °C and 3000 °C. The difference in the Earth's thermal energy between those two points is on the order of 1.8x10^31 J.
In one decade, a 1TW source generates 3.2x10^20 J. In order to have a 1% impact on the average internal temperature of the planet (30 degrees C for our window of analysis), a 1TW source would have to work full time until the sun consumed the earth in 5 billion years.
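Here's the same ballpark in code form (the mass, heat capacity, and temperature window are the rough assumptions from above):

```python
# Ballpark impact of a continuous 1 TW geothermal draw on the Earth's heat,
# using Q = m * c * dT with the rough assumptions above.
earth_mass = 6e24       # kg, approximate mass of the Earth
heat_capacity = 1000    # J/(kg*degC), weighted-average estimate from above
delta_t = 3000          # degC, thermal window between 0 C and 3000 C

thermal_energy = earth_mass * heat_capacity * delta_t  # ~1.8e31 J
print(f"Thermal energy in the window: {thermal_energy:.1e} J")

seconds_per_decade = 10 * 365 * 24 * 3600
energy_per_decade = 1e12 * seconds_per_decade  # J from a 1 TW source, ~3.2e20 J

# Decades needed for a 1% change in the window (a 30 degC drop).
decades = 0.01 * thermal_energy / energy_per_decade
print(f"Years for a 1% impact: {decades * 10:.1e}")
# ~5.7e9 years, roughly until the sun consumes the Earth
```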
I think this is awesome! I wanted to see how awesome though.
What about the fact that humans seem to double their power demand every decade or so? I threw together a quick spreadsheet table and simulated it century by century to see how long it took to get measurable effects on the earth's temperature as our power demands go up through time.
It turns out that if we were to convert all our power generation to geothermal today, and double our total global geothermal power generation every decade, we would get 8,400 years of clean energy before cooling the earth's core by 1%!
We would have to make a change soon after though, because if we continued on like that, we'd totally deplete the earth's warmth in centuries. By that time though, we might even have powerful enough technology to reheat the earth artificially.
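The unchecked-doubling endgame is easy to sanity-check with a short loop. The starting power and thermal figures are the rough estimates from above, and the doubling cadence is an assumption on my part; this is a toy tally, not the original spreadsheet:

```python
# Decade-by-decade tally of cumulative geothermal extraction under
# unchecked doubling. Starting power and thermal energy are the rough
# figures from above; this is a toy model, not the author's spreadsheet.
thermal_energy = 1.8e31   # J, thermal window estimate from above
power_tw = 2.5            # TW, today's total demand, all assumed geothermal
seconds_per_decade = 10 * 365 * 24 * 3600

extracted = 0.0
decade = 0
while extracted < thermal_energy:
    extracted += power_tw * 1e12 * seconds_per_decade
    power_tw *= 2  # demand doubles every decade
    decade += 1

print(f"Warmth fully depleted after ~{decade * 10} years of doubling")
# A few centuries, consistent with the point above.
```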