Yes, you can. But going from a useful, good coding practice that makes further development easy to a hardcoded, poor coding practice that makes further development hard does not sound like a good idea. Any reason why you would recommend such a move?
It's because sometimes you do exactly what the industry wants. You don't fuss over good coding practice or best practices! Only nerds do that, and they never do more than just code! The industry needs people who will do the task given to them, and not more than that. Your time is limited and you can't live forever, so instead of chasing best practices it's better to finish the task and move on. Besides, this program is not going to win you a Nobel prize, so you might as well call it useless. Knowing how to program and how to write algorithms is the essence of programming. You code for a purpose... not to give boring lectures to your friends or make lame 3/2=1 jokes that make no sense whatsoever.

In case you didn't notice, all software ships with bugs, and fixes come later (sometimes from users), especially now that we live on the open source highway. You write the code and just distribute it. Others develop your code into something better, but you still get to say you were the one who made it. There has never been just a single version of most of the software on the market, because people with ideas know it's not worth wasting more time making it perfect. They find bugs later and send patches. That's how the web works. And now cloud computing is gaining pace; patching and updates will become part of the information highway.

So, to keep growing, figure out how things work and describe them as abstractions rather than trying to make something absolute and perfect! If you stick to chasing perfection, you'll never know what comes next.
Imperfection is beauty, madness is genius!!