From what I can tell, the SQL file is imported each time the playbook is run (i.e., the task is marked as changed every time I run the playbook).
Is this expected behavior?
Does anyone have a recommended way of using the import feature so that it only runs once?
It looks like the mysql_db module will blindly run the import with the specified target file every time one is given, so yes, this is currently expected behavior.
What you would probably need to do is run an action before you call mysql_db that checks whether the database already exists and registers a variable, so that you only create the DB when you need to. This could be something along the lines of:
- name: check if DB exists
  # db_user / db_password / db_host are placeholders for your own connection details
  shell: mysql -u {{ db_user }} -p{{ db_password }} -h {{ db_host }} -e 'show databases;' | grep {{ db_name }}
  register: dbstatus
If you run that with -vvv you'll see all the facts available in the dbstatus variable, and one of them can be used in a 'when' clause for your mysql_db call.
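For example, a follow-up task could look something like this (just a sketch; the target path and variable names are illustrative, and it assumes the check above registered dbstatus):

# Note: the check task also needs ignore_errors: yes (or a failed_when override),
# because the shell module treats grep's exit code 1 as a failure when nothing matches.
- name: import the schema only if the database was not found
  mysql_db: name={{ db_name }} state=import target=/path/to/schema.sql
  when: dbstatus.rc != 0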
I was thinking more of checking for the existence of the tables, as the database might exist but be empty …
I guess the way to go would be to query the “information_schema” for the existence of the tables and register the result …
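Something along these lines should do it (a sketch only; the connection variables and target path are placeholders, and -N just suppresses the column header so the output is a bare number):

- name: check whether the database already contains tables
  shell: mysql -u {{ db_user }} -p{{ db_password }} -N -e "SELECT COUNT(*) FROM information_schema.tables WHERE table_schema = '{{ db_name }}';"
  register: table_count
  changed_when: false

- name: import the schema only when the database has no tables
  mysql_db: name={{ db_name }} state=import target=/path/to/schema.sql
  when: table_count.stdout|int == 0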
Coming into this discussion a little late, but here’s an example to illustrate what’s being suggested:
Play:
- name: Ensure example database exists.
  mysql_db: name=exampledb state=present
  notify: import example database
Handler:
- name: import example database
  mysql_db: name=exampledb state=import target=/path/to/example_schema.sql
This works great, and avoids errors I was getting when I tried doing the import straightaway; the import only seems to work if the database already exists.
I have pretty much the same thing in my playbook. It's a bit nasty though, IMO. Plus, shell or command always reports changed, which makes it useless for notifying handlers, so I register the check result instead; if the value comes back true, I import in the next task.
A conditional_import (taking the place of import) option on the mysql_db module would be cool. A few things it could do:
1. Verify there are no tables in the database.
2. If there are tables, run SELECT * FROM table LIMIT 1 against each one and keep a count. If any of them return a row, the database is populated, so no import; otherwise, import. I don't know how slow this would be if you have, say, 300+ tables, but regardless it would be useful.
An alternative to 2 could be something like has_rows=dbname.tablename: if tablename has any rows, the database is not considered empty and no import will happen. This could be required for such a conditional_import flag.
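In the meantime, the has_rows idea can be approximated with a pair of tasks like this (only a sketch; the connection variables, the marker table, and the target path are placeholders):

- name: check whether the marker table already has rows
  shell: mysql -u {{ db_user }} -p{{ db_password }} -N -e "SELECT EXISTS(SELECT 1 FROM {{ db_name }}.{{ table_name }} LIMIT 1);"
  register: has_rows
  # the query errors out if the table does not exist yet, which is fine for this check
  failed_when: false
  changed_when: false

- name: import only when the marker table is missing or empty
  mysql_db: name={{ db_name }} state=import target=/path/to/example_schema.sql
  when: has_rows.stdout != "1"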
I know many other people may not be importing databases often (it is a bit of a one-off task between migrations, usually). But it is something to keep in mind, at least with something like a Vagrant VM provisioned with Ansible. Being idempotent, you would want the user to be confident that typing vagrant provision will not lose the data they have worked on and need in their test database.
- name: check if DB exists
  # db_user / db_password / db_host are placeholders for your own connection details
  shell: mysql -u {{ db_user }} -p{{ db_password }} -h {{ db_host }} -e 'show databases;' | grep {{ db_name }}
  register: dbstatus
  changed_when: dbstatus.rc == 1 and not dbstatus.stdout
Or something like that. Grep returns 1 if no lines match, and there won't be any standard output if no lines match.
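One wrinkle: by default the shell module also treats grep's exit code 1 as a task failure, so the check would abort the play when the database is missing. A failed_when override along these lines (again just a sketch, with placeholder connection variables) keeps the check from failing while still reporting changed when the database is absent:

- name: check if DB exists
  shell: mysql -u {{ db_user }} -p{{ db_password }} -h {{ db_host }} -e 'show databases;' | grep {{ db_name }}
  register: dbstatus
  # grep exits 1 when nothing matches; only treat exit codes above 1 as real failures
  failed_when: dbstatus.rc > 1
  changed_when: dbstatus.rc == 1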
This might be a good opportunity to create a MySQL facts module that returns information about a server, from which you can use logic to decide how to implement tasks. Just a thought.
“This might be a good opportunity to create a MySQL facts module that returns information about a server,”
While you're free to do that, I'm not sure we want this – we're trying to control over-proliferation of fact modules where possible, and here the register approach seems pretty easy.
If it returns a giant ton of other things someone needs, that's one thing – but if it just sounds useful, I'd suggest dragging your feet a bit.
I've wanted to create a mysql_query module that would probably help with a lot of things like this. It could execute an arbitrary query and then you could register the results. It would make it easier than having to manipulate mysql command-line stdout, since you could use structured data.
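Purely as a sketch of what using such a module might look like (it doesn't exist yet, so the module name, its parameters, and the shape of the returned data are all hypothetical):

- name: count the tables in the target database
  # hypothetical module and parameters
  mysql_query:
    query: "SELECT COUNT(*) FROM information_schema.tables WHERE table_schema = '{{ db_name }}'"
    login_user: "{{ db_user }}"
    login_password: "{{ db_password }}"
  register: result

- name: import the schema only when the database is empty
  mysql_db: name={{ db_name }} state=import target=/path/to/schema.sql
  # hypothetical return structure
  when: result.query_result[0][0]|int == 0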
Nope, I’m good - I’m not super interested in MySQL so I’m not personally going to take part in writing this module. Might be useful and/or fun for someone to tackle.
Hi, first of all I want to thank you all for this thread; it helped me solve a problem I was struggling with. This is my first post to the group, so I also want to say hello.
I have a slightly different solution to this problem, as I was running several tasks in a row, all using with_items on the same array, and so could not use a handler.
I am building a multi-WordPress installation playbook and solved the problem like this:
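Roughly, the tasks follow this pattern (a sketch only; the dictionary layout, variable names, and paths are illustrative):

- name: create a database for each site
  mysql_db: name={{ item.name }} state=present
  with_items: wordpress_sites
  register: db_created

- name: import the schema for each database that was just created
  mysql_db: name={{ item.item.name }} state=import target=/path/to/{{ item.item.name }}.sql
  with_items: db_created.results
  # item.item is the original element from wordpress_sites
  when: item.changed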
As you can see, because there are multiple databases being created by looping over a dictionary, I could not see a way to call a handler and know which element of the dictionary had triggered it.
The interesting part is that, when looping over the result of the create task, you must use {{ item.item.name }} to get at the value.
I am fairly new to Ansible, so feel free to advise on a better way of doing this if it's not best practice.