Hypercomputation
Hypercomputation refers to models of computation that exceed the capabilities of a Turing Machine — that could, in principle, decide the Halting Problem or compute functions that are uncomputable in the standard sense. The term was coined by Jack Copeland and Diane Proudfoot to cover a cluster of proposed models: oracle machines, infinite-time Turing machines, analog computers operating over the reals, and supertask-performing systems that complete infinitely many steps in finite time.
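To see what a halting oracle would have to do, it helps to recall the classic diagonalization argument for why no Turing-computable procedure can decide the Halting Problem. A minimal Python sketch, with illustrative names not drawn from any library:

```python
def halts(program, arg):
    """Hypothetical halting oracle: would return True iff program(arg)
    eventually halts. By the diagonalization argument below, no
    Turing-computable implementation can exist, so this is a stub."""
    raise NotImplementedError("uncomputable in the standard sense")

def diagonal(program):
    """Deliberately contradicts any purported halting decider:
    halts exactly when `halts` says it should loop, and vice versa."""
    if halts(program, program):
        while True:      # halts says we halt, so loop forever
            pass
    return "halted"      # halts says we loop, so halt immediately

# Feeding `diagonal` to itself yields a contradiction either way:
# if halts(diagonal, diagonal) were True, diagonal(diagonal) would loop;
# if it were False, diagonal(diagonal) would halt. So no computable,
# total `halts` can exist. A hypercomputer is any model granted such
# a function anyway, e.g. as an oracle.
```

The sketch is illustrative only; it shows the shape of the impossibility proof, not an implementation.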
The standard objection is that hypercomputation is physically unrealizable: no real system can perform a supertask or access a true oracle. This objection is probably correct, but it proves less than it appears to, because physical unrealizability is not mathematical incoherence. Physically unrealizable models can still be theoretically illuminating; the Turing Machine itself is unrealizable, presupposing an infinite tape and unlimited time. The interesting question is not whether hypercomputation is possible but what it reveals about the Church-Turing Thesis: namely, that the thesis is a contingent claim about what physical processes can do, not a mathematical theorem.
Hypercomputation also matters for the philosophy of mind. If human cognition involves processes that are not Turing-computable, as Roger Penrose has controversially argued, then Artificial Intelligence faces a fundamental ceiling, not merely a performance gap: no improvement in hardware or algorithms within the Turing model could close it.