I'm making a Card class and need to make sure that the suit is valid. When I construct a Card with the suit set to 'Jordan', it accepts that suit and prints 'Ace of Jordan'. Instead it should fall back to the default, Spades. What am I doing wrong in my code that it will not set the default suit? TIA guys

class Card:

    def __init__(self, value, suit):
        validSuit(suit)
        self.suit = suit
        self.face = value
        if self.face >= 1 and self.face <= 13:
            if self.face == 1:
                self.face = 'Ace'
            elif self.face == 11:
                self.face = 'Jack'
            elif self.face == 12:
                self.face = 'Queen'
            elif self.face == 13:
                self.face = 'King'
        else:
            self.face = 'Ace'
            
    def getSuit(self):
        return self.suit

    def getFaceValue(self):
        return self.face
        
    def __str__(self): 
        return str(self.face) + ' of ' + str(self.suit)
    
def validSuit(suit):
    if suit != 'Clubs' or suit != 'Spades' or suit != 'Hearts' or suit != 'Diamonds':
        suit = 'Spades'
        return suit
    else:
        return suit

And here is the driver:

from card import *
def main():

    card = Card(14, 'Jordan')
    print(card)
    # prints 'Ace of Jordan'

main()


You are passing 'Jordan' as the suit, and you are not using the return value from validSuit (which you should name valid_suit).


Also, validSuit() should be

def validSuit(suit):
    if suit in ('Clubs', 'Spades', 'Hearts', 'Diamonds'):
        return suit
    else:
        return 'Spades'

Thanks for the cleanup there. When I changed it to that I still get the same results, which means the issue is probably in my constructor, like Tony said, and I can't figure that out. It's probably something simple.

Where should the returned value go? Also, your coding style goes against the Python principle of not silently correcting mistakes, but raising exceptions for them instead.
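For example, the check could raise an exception instead of silently substituting a value (a sketch, not necessarily what the assignment allows):

```python
class Card:
    VALID_SUITS = ('Clubs', 'Spades', 'Hearts', 'Diamonds')

    def __init__(self, value, suit):
        if suit not in self.VALID_SUITS:
            # Fail loudly instead of quietly replacing the suit
            raise ValueError('invalid suit: {!r}'.format(suit))
        self.suit = suit
        self.face = value
```

Then Card(14, 'Jordan') raises ValueError and the bad input cannot slip through unnoticed.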

I thought it should go to the constructor right after the call to validSuit(suit).

Your problem simplified:

def fun(anything):
    return 'Not stupid'

a = 'Stupid'
fun(a)
print(a)
a = fun(a)
print(a)

Not following. I see what's going on there, but I don't see what's going on with my code. I've tried setting something equal to suit and tried setting suit equal to something. Can't get it.

You should either set self.suit to the return value or, better, make it a method in the class definition and drop the return:

def validSuit(suit):
    self.suit = suit if suit in ('Clubs','Spades','Hearts','Diamonds') else 'Spades'

And I repeat that I do not like the way it silently stores a different value than the one asked for. It makes programs difficult to debug and read.

If this behaviour is not a must, have you considered making suit a keyword parameter with the default 'Spades'? Then cards constructed without a suit are Spades.
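A minimal sketch of that idea (only the relevant part of __init__):

```python
class Card:
    def __init__(self, value, suit='Spades'):
        # Callers that omit the suit get the default, Spades
        self.suit = suit
        self.face = value
```

Card(1) then gives a Spades card without any separate validation call.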

When I set self.suit = suit it still would not work; that's in the original code. When I use your code it doesn't work because of the self in self.suit ---> NameError: global name 'suit' is not defined. I then put 'self' in the parameters for validSuit and it returned: AttributeError: 'Card' object has no attribute 'suit'.

validSuit should be inside the class and take a self parameter.

def validSuit(self,suit):
    self.suit = suit if suit in ('Clubs','Spades','Hearts','Diamonds') else 'Spades'

Right. If you'll double-check my post above, you'll see that when I do this I get an AttributeError.

:)


Edit: correction. I didn't have it inside the class when I was attempting this; that was one of the first things I tried (hours and hours ago), but I took it back outside the class because of the error below.

Traceback (most recent call last):
  File "/Users/harryluthi/Documents/CSCI/Programs/cardDriver.py", line 8, in <module>
    main()
  File "/Users/harryluthi/Documents/CSCI/Programs/cardDriver.py", line 5, in main
    card = Card(0, 'Jordan')
  File "/Users/harryluthi/Documents/CSCI/Programs/card.py", line 8, in __init__
    validSuit(self, suit)
NameError: global name 'validSuit' is not defined

For me it works:

class Card:

    def __init__(self, value, suit):
        self.validSuit(suit)
        self.face = value
        if self.face >= 1 and self.face <= 13:
            if self.face == 1:
                self.face = 'Ace'
            elif self.face == 11:
                self.face = 'Jack'
            elif self.face == 12:
                self.face = 'Queen'
            elif self.face == 13:
                self.face = 'King'
        else:
            self.face = 'Ace'

    def validSuit(self,suit):
        self.suit = suit if suit in ('Clubs','Spades','Hearts','Diamonds') else 'Spades'

    def getSuit(self):
        return self.suit

    def getFaceValue(self):
        return self.face

    def __str__(self):
        return str(self.face) + ' of ' + str(self.suit)

def main():

    card = Card(14, 'Jordan')
    print(card)
    # prints 'Ace of Spades'

main()

Oh I see what I was doing wrong. It was in the call to validSuit(). Thanks for posting your code, I immediately knew.

Incorrect:

validSuit(suit)

Correct (the way you have it):

self.validSuit(suit)

To keep things orthogonal, you should make a similar function, validFace, out of the face-checking code at the end of __init__. Better naming would use lowercase with underscores (unless you must conform to another coding standard that differs from the Python recommendation).
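A sketch of how that refactor might look (valid_face is a hypothetical name; the out-of-range fallback to Ace mirrors the original __init__):

```python
class Card:
    FACE_NAMES = {1: 'Ace', 11: 'Jack', 12: 'Queen', 13: 'King'}

    def __init__(self, value, suit):
        self.valid_suit(suit)
        self.valid_face(value)

    def valid_suit(self, suit):
        self.suit = suit if suit in ('Clubs', 'Spades', 'Hearts', 'Diamonds') else 'Spades'

    def valid_face(self, value):
        if not 1 <= value <= 13:
            value = 1  # out-of-range values fall back to Ace, as before
        self.face = self.FACE_NAMES.get(value, value)

    def __str__(self):
        return str(self.face) + ' of ' + str(self.suit)
```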

Good idea. My course here at the College of Charleston has taught us to use "lower camel case", so it has stuck with me. I take it underscores are the conventional way of doing it?

I have another problem. I have started a Deck class. When I test it, the shell window yells at me saying that there is no module deck to import. I've done the exact same procedure as I did with the Card class earlier (which is currently working 100%). Why won't deck do the same? I can probably test it by writing main in (it's commented out below), but I want it to work correctly. Here is the error message, and thank you very much for all of your help; I'll use the rating system as designed.

Error code:

Traceback (most recent call last):
  File "/Users/harryluthi/Documents/CSCI/Programs/testDeck.py", line 6, in <module>
    from deck import *
ImportError: No module named deck
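That ImportError usually means Python cannot find deck.py on its module search path; the file needs to sit in the same directory as testDeck.py (or somewhere on sys.path). You can see where Python looks with:

```python
import sys

# Python searches these directories, in order, when importing deck
print(sys.path)
```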


from random import *
from card import *

class Deck:

    def __init__(self):
        self.cards = []
        for suit in range(4):
            for value in range(1, 14):
                self.cards.append(Card(value, suit))

    def printCards(self):
        for card in self.cards:
            print(card)

    def shuffle(self):
        numCards = len(self.cards)
        for x in range(numCards):
            y = randrange(x, numCards)
            [self.cards[x], self.cards[y]] = [self.cards[y], self.cards[x]]


##def main():
##
##    deck = Deck()
##    showCard = str(deck.printCards())
##
##
##main()
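As a side note, the standard library can already do the shuffle in place; a minimal sketch (plain tuples stand in for Card objects to keep it self-contained):

```python
import random

class Deck:
    def __init__(self):
        # 4 suits x 13 values = 52 cards, as (value, suit) tuples
        self.cards = [(value, suit)
                      for suit in range(4)
                      for value in range(1, 14)]

    def shuffle(self):
        random.shuffle(self.cards)  # in-place Fisher-Yates shuffle
```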

deck tester:

from deck import *

def main():

    deck = Deck()
    showCard = str(deck.printCards())


main()

Could you stop using

from module import *

or is it also a requirement of the class? It is generally considered a bad idea, leading to namespace pollution.
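Explicit imports keep every name's origin visible, e.g. (using random as a stand-in; the same applies to your card module):

```python
# Instead of pulling every name into the current namespace:
#     from random import *
# import the module, or name exactly what you need:
import random
from random import randrange

x = random.randrange(0, 5)   # module-qualified: origin is obvious
y = randrange(0, 5)          # single explicit name, still traceable
```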

By the way, have you and the lecturer read this must-read: http://dirtsimple.org/2004/12/python-is-not-java.html

That's how we're instructed to import. I just quickly read through the article, pretty interesting. Oddly enough, about an hour ago I commented out my getters and IIRC the code worked just fine, but I put them back in so that I do not get a deduction in points.
