In response to @Michael's comment:
The amount of energy emitted at night depends on how much the ground receives during the day, its water content, the length and type of grass and other vegetation, how many "past weeks" there have been, and what the temperature of the ground was when the relevant period began.
During a sunny day the ground is said to receive approximately 1 kilowatt per square meter.
Assume grass of 10cm-50cm (about a half foot to a couple of feet). There have been thousands of past weeks. Since we need to get picky, let's say the temperature has been a constant 10C (50F) every hour of every night and 15C (60F) every hour of every day for the last thousand years, except for the 1-2 hour day/night transitions where the temperature adjusts linearly back and forth, if that makes the exercise easier. The average annual rainfall of this location is 100cm (40in), and rain was coming regularly at that rate until 4 days ago. In the last 4 days it has rained once, yesterday afternoon, dropping about 1cm (half an inch), and the water table is generally about 5 feet below ground level.
The dirt is completely covered and appears as you would expect if you Googled an image of a grassy meadow; for example, the picture I've added at the bottom of the question.
In my opinion, however, the best answer would not just say "The ground emits 10 watts per square meter under those exact conditions!" (even that would be useful though) but would instead say "The type of grass actually makes a significant difference that can throw off even a ballpark estimate, as the difference between grass type A and grass type B produces a whopping half-order-of-magnitude difference in final results due to the way their outer material insulates them! And the difference between wet grass and dry grass is actually 2 orders of magnitude! If I were to simply pick a random, reasonable scenario of ABC it would be within the order of magnitude of 0.1 to 1 watts per square meter, and that's on the high end."
For anything else that matters, feel free to simply pick a reasonable value and mention (even if only in a few words) why it matters and whether your pick produces a high, low, or mid-range estimate.
Diurnal temperature variation is the difference between the highest daytime temperature and the coldest nighttime temperature, whereas you're looking for averages. Using that full number won't give a correct answer, but using perhaps half of it might be in the right range, and that's over land. The diurnal temperature variation over water is much lower.
So, for a 288 Kelvin Earth with, say, a 12 degree variation day to night, that's 294 K / 282 K, a ratio of 1.0425. Raised to the 4th power (Stefan-Boltzmann), that means about 18% more heat is radiated from land during the day than at night. Adjust the temperature if you think mine is off. That 18% applies to the 29% of the surface area that is land.
For oceans, the other 71% of the surface area, the temperature variation is smaller, say 2.5 degrees: 289.25 K / 286.75 K, a ratio of 1.0087. To the 4th power, that means the oceans radiate about 3.5% more energy during the day than at night.
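If it helps to check the arithmetic, here's a minimal Python sketch of those two ratios, assuming the 288 K mean temperature and the 12 K / 2.5 K day/night swings used above:

```python
# Minimal sketch of the T^4 (Stefan-Boltzmann) ratio arithmetic above.
# Assumed: 288 K mean temperature, 12 K land swing, 2.5 K ocean swing.
def day_night_excess(mean_temp_k, swing_k):
    """Fractional excess of daytime over nighttime radiation for a given swing."""
    day = mean_temp_k + swing_k / 2
    night = mean_temp_k - swing_k / 2
    return (day / night) ** 4 - 1

land = day_night_excess(288.0, 12.0)   # ~0.18  -> ~18% more radiated by day over land
ocean = day_night_excess(288.0, 2.5)   # ~0.035 -> ~3.5% more radiated by day over ocean
print(f"land: {land:.1%}, ocean: {ocean:.1%}")
```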
(18% x 29%) + (3.5% x 71%) = about 7.7%
So if Earth's global average surface radiation is about 398.2 W/m^2, then the day/night numbers work out to about a 7.7% variation between them, or +/- 15.5 W/m^2: about 413.7 W/m^2 during the day and about 382.7 W/m^2 at night. That's a pretty good estimate, I think. I'm open to a better estimate if someone can think of a way to do it.
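And the area-weighted combination, again as a rough sketch using the 29%/71% land/ocean split and the 398.2 W/m^2 global average assumed above:

```python
# Area-weighted day/night radiation split.
# Assumed inputs from the estimate above: ~18% land excess, ~3.5% ocean excess,
# 29% land / 71% ocean, and 398.2 W/m^2 global-average surface radiation.
land_excess = 0.18
ocean_excess = 0.035
combined = 0.29 * land_excess + 0.71 * ocean_excess   # ~0.077 (7.7%)

mean_flux = 398.2                        # W/m^2, global average
half_spread = mean_flux * combined / 2   # ~15.5 W/m^2
day_flux = mean_flux + half_spread       # ~413.7 W/m^2
night_flux = mean_flux - half_spread     # ~382.7 W/m^2
print(f"variation: {combined:.1%}, day: {day_flux:.1f} W/m^2, night: {night_flux:.1f} W/m^2")
```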