Other than being two fellow countries of the English-speaking world that share cultural and historical links.
Politically, the USA is an independent secular republic that fought to free itself from the British Empire. It is now fully independent and should have the confidence to stand on its own feet and have an equal relationship with all countries, showing favour to none. The same applies to England: it too should have the confidence to stand on its own feet and be truly independent, while co-operating where necessary.
Some countries today have still not freed themselves from the British Empire as the USA did in 1783.
Historically, the alliance between the US and England has saved the world; in my opinion, it has also harmed it. Maybe there will always be some kind of special relationship because of the good this alliance has done. That does not mean we should not be wary of it, or that either country should slavishly follow what the other says.
To show favour to an England that is now part of Europe is to compromise the integrity of a Europe that should show unity wherever co-operation is necessary.
-----
Why doesn't the USA play the field and have a "special relationship" with another country like France or Spain?