There are specifications for Ethernet cables: Cat 5, Cat 5e, Cat 6, etc.
Among the parameters these standards specify or imply are:
- The maximum resistance per meter (i.e. Ohms/meter)
- a minimum value for maximum current load
- a minimum value for maximum working voltage
PoE can deliver relatively high power (up to 71 W at the device for 802.3bt Type 4) through an Ethernet cable by
- increasing the voltage on each pair
- using multiple pairs in the cable
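To see why both tricks help, here is a rough back-of-the-envelope sketch. The 52 V, 90 W and loop-resistance figures are illustrative assumptions, not values taken from the standard; the point is only that conductor loss scales with current squared, so higher voltage and more pairs both cut the heat dissipated in the cable:

```python
def conductor_loss_w(power_w, voltage_v, pairs, loop_ohms_per_pair):
    """I^2*R heat dissipated in the cable when power_w is injected at
    voltage_v, with the current split evenly across `pairs` pairs."""
    current_per_pair = (power_w / voltage_v) / pairs
    return pairs * current_per_pair ** 2 * loop_ohms_per_pair

# Assumed figures: 90 W injected at 52 V, ~12.5 ohm loop resistance
# per pair for the run (hypothetical, not from the spec).
loss_2_pairs = conductor_loss_w(90, 52, 2, 12.5)  # two-pair delivery
loss_4_pairs = conductor_loss_w(90, 52, 4, 12.5)  # four-pair delivery
loss_low_v   = conductor_loss_w(90, 24, 4, 12.5)  # same power at lower voltage

print(loss_2_pairs, loss_4_pairs, loss_low_v)
```

With the same injected power, spreading the current over four pairs instead of two halves the cable loss, and dropping the supply voltage makes the loss dramatically worse - which is exactly why 802.3bt uses all four pairs at around 52 V.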
Many non-compliant cables are made from CCA (Copper Clad Aluminium), because cladding aluminium wire in copper is significantly cheaper than making the wire from pure copper. The resistance of CCA is simply too high to achieve the maximum cable runs specified in the standards.
This higher resistance means that more power is lost in the cable, so the cable gets warm. If the cable is in a tight coil - such as wound on a spool like the one in your picture - the heat may build up faster than it can be radiated away (i.e. the power losses heat the cable faster than the surrounding air can cool it). Eventually the cable can get so warm that the insulation melts and the cable fails, causing a short circuit between two or more wires. This may trigger a cascade of failures (your PoE injector may also fail - especially if it too is not fully standards compliant).
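The heating argument is just Joule's law. As a rough sketch (the 0.6 A current, the 0.19 ohm/m loop resistance, and the 1.5x CCA-vs-copper resistance ratio are all assumptions for illustration; real cables vary):

```python
def cable_heat_w(current_a, loop_ohms_per_m, length_m):
    """Total I^2*R heat dissipated along the cable run, in watts."""
    return current_a ** 2 * loop_ohms_per_m * length_m

# Assumed figures: 0.6 A per pair, 0.19 ohm/m loop resistance for a
# copper pair, CCA at ~1.5x the resistance of copper (hypothetical).
copper_heat = cable_heat_w(0.6, 0.19, 50)
cca_heat    = cable_heat_w(0.6, 0.19 * 1.5, 50)

print(copper_heat, cca_heat)
```

At the same current the CCA run dissipates proportionally more heat than the copper one - and all of that extra heat has to escape from the same spool.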
What this boils down to is:
 If an Ethernet cable is properly compliant, then yes, it is suitable for PoE at all loads.
 Standards specify the ability to work under worst-case conditions; as a result, your cable may well work for you now, but may not always work.
Of course, beware of what you are being sold... many things claim to be something they are not.
Grr, this is why I hate 'cheap' cables - but then the price being charged doesn't always reflect whether a cable really is a 'cheap' one.