– AI has the potential to be used for smart contract auditing and cybersecurity in the crypto industry.
– However, current AI models such as GPT-4 are not yet proficient at this task and may misclassify high-risk tokens as low-risk.
– OpenAI may have limited the capabilities of its bot to avoid being held responsible for vulnerabilities or exploits.
– While AI can assist in writing smart contracts, the code it generates may contain logic bugs that create exploitable vulnerabilities.
– The training data for models like GPT-4 is too general; as a result, they are better suited to general server exploitation than to auditing smart contracts.
– Efforts are underway to train AI models with specific data on smart contract exploits and hacks.
– Projects such as WizardCoder and Near are working on AI models that can audit smart contracts accurately and identify vulnerabilities.
– AI does not yet match human auditors in this field, but it can speed up the auditing process and make it more comprehensive.
– Smart contract exploits often involve rare edge cases that current AI models struggle to predict accurately.
– It will likely take a few more years for AI to reach the level of human auditors in smart contract auditing.
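To make the "rare edge case" point above concrete, here is a minimal, hypothetical sketch (in Python rather than Solidity, purely for readability; the token, its fee parameters, and all names are invented for illustration) of a fee-on-transfer bug that pattern-matching tools can miss: the fee is computed with integer division, so sufficiently small transfers round the fee down to zero, letting anyone drain value fee-free in small chunks.

```python
class FeeToken:
    """Hypothetical token charging a 0.3% fee (30 basis points) on transfers."""

    FEE_BPS = 30  # 0.3% fee, expressed in basis points

    def __init__(self, supply):
        self.balances = {"owner": supply}
        self.collected_fees = 0

    def transfer(self, sender, recipient, amount):
        if self.balances.get(sender, 0) < amount:
            return False
        # Edge case: integer division rounds the fee down to zero for any
        # amount below 34, so splitting a transfer into small chunks
        # bypasses the fee entirely -- an issue only visible on rare
        # small-value inputs, not in typical test transfers.
        fee = amount * self.FEE_BPS // 10_000
        self.balances[sender] -= amount
        self.balances[recipient] = self.balances.get(recipient, 0) + amount - fee
        self.collected_fees += fee
        return True
```

A large transfer pays the expected fee, while a small one pays nothing, which is exactly the kind of boundary behavior a human auditor probes for and a general-purpose model tends to overlook.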