Does anyone have a good beginner resource guide to regular expressions in Python? I'm used to them in Java. For example, in Java I have some regular expressions that look like this:

PUNC_MATCH = "[\\d\\p{Punct}]+"
PUNC_PREFIX = "^" + PUNC_MATCH
PUNC_SUFFIX = PUNC_MATCH + "$"

Basically it's supposed to match digits and punctuation at the beginning or end of a word.
But in Python I can't seem to find a reference for how to do this. I don't think it's the same as it is in Java, is it?

All 8 Replies

The Dive Into Python book has a great explanation, tutorial, and exercises for the re module.

NOTE: It appears that diveintopython.org/ is down?! I don't know what's going on over there, but Google will help you find copies of the book elsewhere, if you're so inclined.

According to the docs, strip is supposed to remove all leading and trailing occurrences of the given characters. So if I have t = 'abc123', then shouldn't t.strip('[a-z]') remove all of the leading characters a-z? It just removes the 'a'. Doing t.replace('[a-z]', '') doesn't do anything either.
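
Here's exactly what I'm seeing (my guess is that strip() isn't treating '[a-z]' as a pattern at all, just as a plain set of characters, but I'm not sure):

t = 'abc123'

# only the leading 'a' comes off, not 'abc'
print(t.strip('[a-z]'))        # bc123

# nothing changes; presumably it looks for the literal text '[a-z]'
print(t.replace('[a-z]', ''))  # abc123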

Ah, I see, I have to use re and sub().
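
For what it's worth, here's a rough sketch of what seems to work for my original Java patterns. I'm substituting string.punctuation for Java's \p{Punct}, since as far as I can tell the standard re module doesn't support that shorthand, so this only covers ASCII punctuation:

import re
import string

# character class of digits plus ASCII punctuation,
# roughly what the Java "[\\d\\p{Punct}]+" matched
PUNC_MATCH = r'[\d' + re.escape(string.punctuation) + r']+'
PUNC_PREFIX = re.compile('^' + PUNC_MATCH)
PUNC_SUFFIX = re.compile(PUNC_MATCH + '$')

word = '...hello!!'
word = PUNC_PREFIX.sub('', word)  # strip leading digits/punctuation
word = PUNC_SUFFIX.sub('', word)  # strip trailing digits/punctuation
print(word)  # hello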

Just wanted to say that I found that www.diveintopython.org was not down, so you can use that if you want.

That's odd. Yesterday it was definitely down. Even Down for everyone or just me gave me:

Huh? doesn't look like a site on the interwho.

Strange...

Well, no matter, I can still get to it. I just tried again, also checked Down for everyone or just me, and got:

It's just you. www.diveintopython.org is up.

So yeah, sounds quite strange...

Here is an example of the re module's sub():

import re

# replace all ;,. characters with _
p = re.compile(r'[;,.]')
s = 'hi;your,face.is;on,fire'
print(p.sub('_', s))  # hi_your_face_is_on_fire

or:

import re

s = 'abc123'

p = re.compile("[a-zA-Z]")
# subbing with an empty string "" amounts to stripping those characters out
print(p.sub("", s))  # 123
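
One thing worth noting (just a quick sketch of my own): subbing with "" removes matches anywhere in the string, not only at the ends the way strip() does. If you only want the leading and trailing runs gone, like in the original question, you can anchor the pattern:

import re

s = '12ab3cd45'

# removes every digit, wherever it appears
print(re.sub('[0-9]', '', s))            # abcd

# anchored version only touches the ends; the inner '3' survives
print(re.sub('^[0-9]+|[0-9]+$', '', s))  # ab3cd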