Hi guys,

I am trying to find a solid optimization technique for my problem.

I have two tables that grow quite large:

CREATE TABLE  `mydb`.`xm_artist` (
  `artist_id` int(10) unsigned NOT NULL auto_increment,
  `artist_name` varchar(255) NOT NULL,
  PRIMARY KEY  (`artist_id`),
  UNIQUE KEY `index_artist_name` USING BTREE (`artist_name`),
  FULLTEXT KEY `fulltext_artist_name` (`artist_name`)
);

CREATE TABLE  `mydb`.`xm_artist_tags` (
  `artist_id` int(10) unsigned NOT NULL,
  `similar_artist_id` int(10) unsigned NOT NULL,
  `relevancy` smallint(5) unsigned NOT NULL,
  PRIMARY KEY  USING BTREE (`artist_id`,`similar_artist_id`),
  KEY `index_similar_artist_id` (`similar_artist_id`)
);

I have about 5,000 records in xm_artist and about 13,400 in xm_artist_tags.

I expect they will double or quadruple over time.
Is there a way to optimize the model/indexes further?
What about the basic join queries; can they be optimized further?
So far query speed is acceptable, but it would be nice to have the queries run faster.

Does anyone have any ideas?


'Large' is relative, as is 'optimize'. There are plenty of MySQL databases handling millions of records and transactions. You can use EXPLAIN to see what your queries are actually doing. Why not wait until you notice a severe increase in query time, then see if anyone here can help?

Depending on your queries, adding indexes can drastically improve speed.
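For example, assuming a typical similar-artist lookup (a guess at what your join queries look like; the artist_id value 42 is just a placeholder), you can prefix the query with EXPLAIN:

-- Hypothetical query: find artists similar to artist 42, most relevant first
EXPLAIN
SELECT a.artist_name, t.relevancy
FROM xm_artist_tags AS t
JOIN xm_artist AS a ON a.artist_id = t.similar_artist_id
WHERE t.artist_id = 42
ORDER BY t.relevancy DESC;

In the output, check that the key column shows an index being used (here the composite primary key on (artist_id, similar_artist_id) should cover the WHERE clause, and the join should be eq_ref on xm_artist's primary key). If Extra shows "Using filesort", that's the ORDER BY on relevancy, since relevancy isn't part of any index.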
