Well, I want to make an RPG (in the future it will become an MMORPG). Anyway...
I'll use C# and I haven't decided yet which graphics API I'll use (DirectX, OpenGL, XNA, ...), but this isn't relevant to my problem (though an opinion would be welcome).

My doubt is: how do I set up (calculate) my attack rate and hit chance — that is, a general formula that calculates them based on STR and DEF? Should I use something like Prolog to develop that system and then interface it with C# + a graphics API?

Could someone point me in the right direction? I mean examples, links, tutorials, manuals — whatever is essential for my study/development.

One thing: don't say "use a free engine", because the fun of programming is developing your own things. By the way, is it worth analysing an engine's source code to understand how things work?

Please, a reply with some useful stuff would be great :)

I expect to *hear* from someone who can point me in the best direction.


Your question is unclear, so I'll answer all the questions I can discern from your post...

1. What graphics API should you use?

OGRE3D. It is platform independent (basically) and awesome.

2. How should you calculate attack rate / chance?

That depends... how much of a difference do you want STR and DEF to make? In other words, should a character with a STR of 15 have a good chance of hitting a baddie with a DEF of 20? Should a character with a STR of 35 ever miss a baddie with a DEF of 30?

The first thing I would decide is your "limits". For example, let's say that you want to set the maximum chance of a character hitting a baddie at 98% (saying there's always a CHANCE the character will miss), and you want to set the minimum at 2% (saying, likewise, that there's always a CHANCE any character, no matter how weak, can HIT a baddie... doesn't say anything about how much damage he might do).

The next thing you've got to do is define how much the difference between STR and DEF actually matters. Let's say, for example's sake, you want someone with a STR rating 10 points below his target's DEF rating to have the minimum chance of hitting, and you want someone with a STR rating equal to that of his opponent to have the maximum chance of hitting. That means that for each point of difference between them, 10% of hit chance is gained or lost. Let's look at an example...

Attacker has a STR rating of 37
Opponent has a DEF rating of 40

First thing we do is take the Opponent's DEF rating minus 10, which gives us the minimum STR rating for a hit we've decided on of 30. The next thing we do is look at the Attacker's STR rating. It is 37, which is 7 points above our minimum. That gives us a potential hit percentage of 70%.

AT_STR = Attacker Strength
OP_DEF = Opponent Defense
POINT_DIFF = Our maximum decided point differential (in this case 10)

ATTACK_PERCENT = (AT_STR - (OP_DEF - POINT_DIFF)) / POINT_DIFF
Now let's say that your Attacker's STR is only 27. What you'd do in this case is test the above formula to make sure your result is greater than or equal to the lower limit we established earlier (2%). If it's not, then set ATTACK_PERCENT to .02 (2%) and you're done. The same thing would apply to a situation where the Attacker's STR is 47 (above the Opponent's DEF rating). You'd test to see if ATTACK_PERCENT has gone over our established max (.98, or 98%) and if so, set it back to .98 (98%).

Of course, this is a simplistic view of the problem, but it should get you started in the right direction. It also allows you to fine-tune your system as you go, since all you have to do to change sensitivity is change the POINT_DIFF value.
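The whole scheme above can be sketched in a few lines. This is just my own illustration of the linear-ramp-plus-clamp idea (the names are mine, not from any engine), in Python since that's one of the scripting languages mentioned below:

```python
# Hit-chance sketch: linear ramp from (DEF - POINT_DIFF) to DEF,
# clamped into the [MIN_CHANCE, MAX_CHANCE] band described above.

MIN_CHANCE = 0.02   # even the weakest attacker can get lucky
MAX_CHANCE = 0.98   # even the strongest attacker can miss
POINT_DIFF = 10     # STR/DEF spread that spans min..max chance

def attack_percent(at_str, op_def):
    # STR at (DEF - POINT_DIFF) -> 0%, STR equal to DEF -> 100%,
    # then clamp so the result never leaves the allowed band.
    raw = (at_str - (op_def - POINT_DIFF)) / POINT_DIFF
    return max(MIN_CHANCE, min(MAX_CHANCE, raw))

print(attack_percent(37, 40))  # 0.7 -- the 70% from the worked example
print(attack_percent(27, 40))  # clamped up to 0.02
print(attack_percent(47, 40))  # clamped down to 0.98
```

To tune the feel of combat you only touch the three constants at the top, which is exactly the fine-tuning knob described above.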

3. What should you use to implement your logic?

I'd use Lua, Python, or Perl. I'm not familiar with Prolog, so I can't really speak to it. If you want complete control and you're into problem-solving, try integrating Lua, Perl, or Python into your project for AI and NPC control.
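The payoff of embedding a scripting language is that combat rules live outside the compiled game, so designers can tweak them without a rebuild. In a C# project you'd host a scripting runtime (e.g. NLua or IronPython); this plain-Python sketch of mine just shows the separation, with the "scripted" rules loaded from a string at runtime:

```python
# Sketch: combat rules kept as a script that the engine loads at runtime.
# In a real game this string would come from a file the designers edit.

combat_script = """
def hit_chance(at_str, op_def, point_diff=10):
    raw = (at_str - (op_def - point_diff)) / point_diff
    return max(0.02, min(0.98, raw))
"""

rules = {}
exec(combat_script, rules)          # load the scripted rules into a namespace

print(rules["hit_chance"](37, 40))  # 0.7 -- same formula, now data-driven
```

The engine only needs to agree with the script on a function name and signature; everything else about the formula can change freely.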