+ [[ -n '' ]]
+ echo 13318
+ exit 0
+ setsid /usr/local/bin/nova-compute --config-file /etc/nova/nova.conf
No handlers could be found for logger "oslo_config.cfg"
2015-08-07 16:55:01.449 13318 DEBUG nova.servicegroup.api [-] ServiceGroup driver defined as an instance of db __init__ /opt/stack/new/nova/nova/servicegroup/api.py:68
2015-08-07 16:55:01.820 13318 DEBUG nova.servicegroup.api [-] ServiceGroup driver defined as an instance of db __init__ /opt/stack/new/nova/nova/servicegroup/api.py:68
2015-08-07 16:55:01.980 13318 DEBUG nova.servicegroup.api [-] ServiceGroup driver defined as an instance of db __init__ /opt/stack/new/nova/nova/servicegroup/api.py:68
2015-08-07 16:55:01.982 13318 DEBUG nova.servicegroup.api [-] ServiceGroup driver defined as an instance of db __init__ /opt/stack/new/nova/nova/servicegroup/api.py:68
2015-08-07 16:55:01.986 13318 INFO nova.virt.driver [-] Loading compute driver 'xenapi.XenAPIDriver'
2015-08-07 16:55:02.105 13318 INFO oslo_service.periodic_task [-] Skipping periodic task _periodic_update_dns because its interval is negative
/usr/local/lib/python2.7/dist-packages/pkg_resources/__init__.py:203: RuntimeWarning: You have iterated over the result of pkg_resources.parse_version. This is a legacy behavior which is inconsistent with the new version class introduced in setuptools 8.0. In most cases, conversion to a tuple is unnecessary. For comparison of versions, sort the Version instances directly. If you have another use case requiring the tuple, please file a bug with the setuptools project describing that need.
  stacklevel=1,
2015-08-07 16:55:02.815 13318 DEBUG nova.servicegroup.api [-] ServiceGroup driver defined as an instance of db __init__ /opt/stack/new/nova/nova/servicegroup/api.py:68
2015-08-07 16:55:02.817 13318 DEBUG nova.servicegroup.api [-] ServiceGroup driver defined as an instance of db __init__ /opt/stack/new/nova/nova/servicegroup/api.py:68
2015-08-07 16:55:02.821 13318 DEBUG nova.virt.xenapi.vmops [-] Importing image upload handler: nova.virt.xenapi.image.glance.GlanceStore __init__ /opt/stack/new/nova/nova/virt/xenapi/vmops.py:166
2015-08-07 16:55:02.844 INFO oslo_messaging._drivers.impl_rabbit [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] Connecting to AMQP server on 192.168.33.1:5672
2015-08-07 16:55:02.865 INFO oslo_messaging._drivers.impl_rabbit [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] Connected to AMQP server on 192.168.33.1:5672
2015-08-07 16:55:02.874 INFO oslo_messaging._drivers.impl_rabbit [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] Connecting to AMQP server on 192.168.33.1:5672
2015-08-07 16:55:02.894 INFO oslo_messaging._drivers.impl_rabbit [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] Connected to AMQP server on 192.168.33.1:5672
2015-08-07 16:55:02.946 DEBUG oslo_concurrency.lockutils [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] Acquired semaphore "singleton_lock" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:197
2015-08-07 16:55:02.947 DEBUG oslo_concurrency.lockutils [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] Releasing semaphore "singleton_lock" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:210
2015-08-07 16:55:02.947 DEBUG oslo_concurrency.lockutils [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] Acquired semaphore "singleton_lock" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:197
2015-08-07 16:55:02.948 DEBUG oslo_concurrency.lockutils [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] Releasing semaphore "singleton_lock" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:210
2015-08-07 16:55:02.949 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] Full set of CONF: _wait_for_exit_or_signal /usr/local/lib/python2.7/dist-packages/oslo_service/service.py:251
2015-08-07 16:55:02.950 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] ******************************************************************************** log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2212
2015-08-07 16:55:02.950 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] Configuration options gathered from: log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2213
2015-08-07 16:55:02.950 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] command line args: ['--config-file', '/etc/nova/nova.conf'] log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2214
2015-08-07 16:55:02.951 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] config files: ['/etc/nova/nova.conf'] log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2215
2015-08-07 16:55:02.951 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] ================================================================================ log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2216
2015-08-07 16:55:02.952 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] allow_resize_to_same_host = True log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225
2015-08-07 16:55:02.952 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] allow_same_net_traffic = True log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225
2015-08-07 16:55:02.952 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] api_paste_config = /etc/nova/api-paste.ini log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225
2015-08-07 16:55:02.953 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] api_rate_limit = False log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225
2015-08-07 16:55:02.953 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] auth_strategy = keystone log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225
2015-08-07 16:55:02.954 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] auto_assign_floating_ip = False log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225
2015-08-07 16:55:02.954 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] backdoor_port = None log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225
2015-08-07 16:55:02.954 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] bandwidth_poll_interval = 600 log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225
2015-08-07 16:55:02.955 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] bindir = /usr/local/bin log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225
2015-08-07 16:55:02.955 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None]
block_device_allocate_retries = 60 log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:02.955 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] block_device_allocate_retries_interval = 3 log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:02.956 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] boot_script_template = /opt/stack/new/nova/nova/cloudpipe/bootscript.template log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:02.956 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] ca_file = cacert.pem log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:02.957 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] ca_path = /opt/stack/data/nova/CA log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:02.957 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] cert_manager = nova.cert.manager.CertManager log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:02.958 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] client_socket_timeout = 900 log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:02.958 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] cnt_vpn_clients = 0 log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:02.959 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] compute_available_monitors = None log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:02.959 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] compute_driver = xenapi.XenAPIDriver log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:02.959 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] compute_manager = nova.compute.manager.ComputeManager log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:02.960 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] compute_monitors = [] log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:02.960 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] compute_resources = ['vcpu'] log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:02.961 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] compute_stats_class = nova.compute.stats.Stats log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:02.961 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] compute_topic = compute log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:02.961 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] config_dir = None log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:02.962 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] config_drive_format = iso9660 log_opt_values 
/usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:02.962 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:02.962 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] config_file = ['/etc/nova/nova.conf'] log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:02.963 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] console_host = devstack log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:02.963 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] console_manager = nova.console.manager.ConsoleProxyManager log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:02.964 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] console_topic = console log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:02.964 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] consoleauth_manager = nova.consoleauth.manager.ConsoleAuthManager log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:02.964 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] consoleauth_topic = consoleauth log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:02.965 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] control_exchange = nova log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:02.965 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] create_unique_mac_address_attempts = 5 log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:02.966 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] crl_file = crl.pem log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:02.966 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] db_driver = nova.db log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:02.966 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] debug = True log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:02.967 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] default_access_ip_network_name = None log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:02.967 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] default_availability_zone = nova log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:02.967 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] default_ephemeral_format = None log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:02.968 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] default_flavor = m1.small log_opt_values 
/usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:02.968 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] default_floating_pool = public log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:02.969 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] default_log_levels = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'glanceclient=WARN'] log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:02.969 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] default_notification_level = INFO log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:02.969 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] default_publisher_id = None log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:02.970 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] default_schedule_zone = None log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:02.970 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] defer_iptables_apply = False log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:02.971 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] dhcp_domain = novalocal log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:02.971 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] dhcp_lease_time = 86400 log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:02.971 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] dhcpbridge = /usr/local/bin/nova-dhcpbridge log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:02.972 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] dhcpbridge_flagfile = ['/etc/nova/nova.conf'] log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:02.972 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] dmz_cidr = [] log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:02.973 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] dmz_mask = 255.255.255.0 log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:02.973 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] dmz_net = 10.0.0.0 log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:02.973 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] dns_server = [] log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:02.974 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] dns_update_periodic_interval = -1 log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:02.975 DEBUG 
oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] dnsmasq_config_file = log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:02.975 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] ebtables_exec_attempts = 3 log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:02.975 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] ebtables_retry_interval = 1.0 log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:02.976 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] ec2_listen = 0.0.0.0 log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:02.980 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] ec2_listen_port = 8773 log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:02.981 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] ec2_private_dns_show_ip = False log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:02.981 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] ec2_strict_validation = True log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:02.994 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] ec2_timestamp_expiry = 300 log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:02.994 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] ec2_workers = 3 log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:02.994 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] enable_new_services = True log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:02.995 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] enabled_apis = ['ec2', 'osapi_compute', 'metadata'] log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:02.995 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] enabled_ssl_apis = [] log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:02.996 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] fake_call = False log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:02.996 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] fake_network = False log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:02.997 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] fatal_exception_format_errors = False log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:02.997 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] firewall_driver = nova.virt.firewall.NoopFirewallDriver log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:02.998 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] fixed_ip_disassociate_timeout = 600 log_opt_values 
/usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:02.998 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] fixed_range_v6 = fd00::/48 log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:02.998 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] flat_injected = False log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:02.999 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] flat_interface = eth3 log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:02.999 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] flat_network_bridge = vmnet log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:02.999 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] flat_network_dns = 8.8.4.4 log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.000 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] floating_ip_dns_manager = nova.network.noop_dns_driver.NoopDNSDriver log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.000 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] force_config_drive = always log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.001 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] force_dhcp_release = True log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.001 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] force_raw_images = True log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.002 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] force_snat_range = [] log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.002 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] forward_bridge_interface = ['all'] log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.003 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] gateway = None log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.003 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] gateway_v6 = None log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.003 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] heal_instance_info_cache_interval = 60 log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.004 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] host = devstack log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.004 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] image_cache_manager_interval = 2400 log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.005 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] 
image_cache_subdirectory_name = _base log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.005 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] injected_network_template = /opt/stack/new/nova/nova/virt/interfaces.template log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.006 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] instance_build_timeout = 0 log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.006 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] instance_delete_interval = 300 log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.006 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] instance_dns_domain = log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.007 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] instance_dns_manager = nova.network.noop_dns_driver.NoopDNSDriver log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.007 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] instance_format = [instance: %(uuid)s] log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.008 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] instance_name_template = instance-%08x log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.008 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] instance_usage_audit = False log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.009 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] instance_usage_audit_period = month log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.009 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] instance_uuid_format = [instance: %(uuid)s] log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.009 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] instances_path = /opt/stack/data/nova/instances log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.010 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] internal_service_availability_zone = internal log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.010 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] iptables_bottom_regex = log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.011 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] iptables_drop_action = DROP log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.011 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] iptables_top_regex = log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.012 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] ipv6_backend = rfc2462 
log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.012 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] key_file = private/cakey.pem log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.013 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] keys_path = /opt/stack/data/nova/keys log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.013 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] keystone_ec2_insecure = False log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.014 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] keystone_ec2_url = http://192.168.33.1:5000/v2.0/ec2tokens log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.014 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] l3_lib = nova.network.l3.LinuxNetL3 log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.014 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] linuxnet_interface_driver = nova.network.linux_net.LinuxBridgeInterfaceDriver log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.015 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] linuxnet_ovs_integration_bridge = br-int log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.015 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] live_migration_retry_count = 30 log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.016 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] lockout_attempts = 5 log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.016 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] lockout_minutes = 15 log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.017 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] lockout_window = 15 log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.017 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] log_config_append = None log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.017 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] log_date_format = %Y-%m-%d %H:%M:%S log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.018 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] log_dir = None log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.018 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] log_file = None log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.019 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] log_format = None log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.019 DEBUG 
oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] log_options = True log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.019 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] logging_context_format_string = %(asctime)s.%(msecs)03d %(levelname)s %(name)s [%(request_id)s %(user_name)s %(project_name)s] %(instance)s%(message)s log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.020 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] logging_debug_format_suffix = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.020 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] logging_default_format_string = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.021 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] logging_exception_prefix = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.021 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] max_age = 0 log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.022 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] max_concurrent_builds = 10 log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.022 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] max_header_line = 16384 log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.022 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] max_local_block_devices = 3 log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.023 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] maximum_instance_delete_attempts = 5 log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.023 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] memcached_servers = None log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.025 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] metadata_host = 192.168.33.1 log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.025 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] metadata_listen = 0.0.0.0 log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.027 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] metadata_listen_port = 8775 log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.028 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] metadata_manager = nova.api.manager.MetadataManager log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.029 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] metadata_port = 
8775 log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.029 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] metadata_workers = 3 log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.030 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] migrate_max_retries = -1 log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.030 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] mkisofs_cmd = genisoimage log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.030 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] monkey_patch = False log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.031 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] monkey_patch_modules = ['nova.api.ec2.cloud:nova.notifications.notify_decorator', 'nova.compute.api:nova.notifications.notify_decorator'] log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.031 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] multi_host = True log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.032 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] multi_instance_display_name_template = %(name)s-%(count)d log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.032 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] my_block_storage_ip = 192.168.33.1 log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.033 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] my_ip = 192.168.33.1 log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.034 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] network_allocate_retries = 0 log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.034 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] network_api_class = nova.network.api.API log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.035 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] network_device_mtu = None log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.035 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] network_driver = nova.network.linux_net log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.036 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] network_manager = nova.network.manager.FlatDHCPManager log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.036 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] network_size = 256 log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.037 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] network_topic = network log_opt_values 
/usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.037 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] networks_path = /opt/stack/data/nova/networks log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.038 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.038 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] notification_driver = [] log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.039 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] notification_topics = ['notifications'] log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.041 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] notify_api_faults = False log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.041 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] notify_on_state_change = None log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.042 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] null_kernel = nokernel log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.042 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] num_networks = 1 log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.043 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] osapi_compute_listen = 0.0.0.0 log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.043 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] osapi_compute_listen_port = 8774 log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.044 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] osapi_compute_unique_server_name_scope = log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.044 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] osapi_compute_workers = 3 log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.045 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] ovs_vsctl_timeout = 120 log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.045 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] password_length = 12 log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.046 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] pci_alias = [] log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.046 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] pci_passthrough_whitelist = [] log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.047 DEBUG oslo_service.service 
[req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] periodic_enable = True log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.047 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] periodic_fuzzy_delay = 60 log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.048 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] policy_default_rule = default log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.048 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] policy_dirs = ['policy.d'] log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.049 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] policy_file = policy.json log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.049 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] preallocate_images = none log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.051 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] project_cert_subject = /C=US/ST=California/O=OpenStack/OU=NovaDev/CN=project-ca-%.16s-%s log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.051 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] public_interface = eth4 log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.052 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] publish_errors = False log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.053 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] pybasedir = /opt/stack/new/nova log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.053 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] quota_cores = 20 log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.054 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] quota_driver = nova.quota.DbQuotaDriver log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.055 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] quota_fixed_ips = -1 log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.055 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] quota_floating_ips = 10 log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.056 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] quota_injected_file_content_bytes = 10240 log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.057 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] quota_injected_file_path_length = 255 log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.057 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] quota_injected_files = 5 log_opt_values 
/usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.063 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] quota_instances = 10 log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.068 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] quota_key_pairs = 100 log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.068 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] quota_metadata_items = 128 log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.069 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] quota_ram = 51200 log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.076 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] quota_security_group_rules = 20 log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.076 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] quota_security_groups = 10 log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.077 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] quota_server_group_members = 10 log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.078 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] quota_server_groups = 10 log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.078 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] reboot_timeout = 0 log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.079 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] reclaim_instance_interval = 0 log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.079 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] remove_unused_base_images = True log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.079 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.080 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] report_interval = 10 log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.080 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] rescue_timeout = 0 log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.081 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] reservation_expire = 86400 log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.081 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] reserved_host_disk_mb = 0 log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.082 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] reserved_host_memory_mb = 512 log_opt_values 
/usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.082 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] resize_confirm_window = 0 log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.082 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] resize_fs_using_block_device = False log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.083 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] resume_guests_state_on_host_boot = False log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.084 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] rootwrap_config = /etc/nova/rootwrap.conf log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.084 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] routing_source_ip = 192.168.33.1 log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.085 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] rpc_backend = rabbit log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.087 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] rpc_response_timeout = 60 log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.088 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] run_external_periodic_tasks = True log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.088 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] running_deleted_instance_action = reap log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.089 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.089 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] running_deleted_instance_timeout = 0 log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.090 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] scheduler_available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.090 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] scheduler_default_filters = ['RetryFilter', 'AvailabilityZoneFilter', 'RamFilter', 'ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.090 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] scheduler_instance_sync_interval = 120 log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.091 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] scheduler_manager = nova.scheduler.manager.SchedulerManager log_opt_values 
/usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.091 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] scheduler_max_attempts = 3 log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.092 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] scheduler_topic = scheduler log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.092 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] scheduler_tracks_instance_changes = True log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.093 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] scheduler_weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.093 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] secure_proxy_ssl_header = None log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.096 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] security_group_api = nova log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.096 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] send_arp_for_ha = True log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.096 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] send_arp_for_ha_count = 3 log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.097 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] service_down_time = 60 log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.097 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] servicegroup_driver = db log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.098 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] share_dhcp_address = False log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.101 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] shelved_offload_time = 0 log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.101 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] shelved_poll_interval = 3600 log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.102 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] shutdown_timeout = 60 log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.102 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] snapshot_name_template = snapshot-%s log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.103 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] ssl_ca_file = None log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.103 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] 
ssl_cert_file = None log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.103 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] ssl_key_file = None log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.104 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] state_path = /opt/stack/data/nova log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.104 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] sync_power_state_interval = 600 log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.105 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] syslog_log_facility = LOG_USER log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.105 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] tcp_keepidle = 600 log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.105 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] teardown_unused_network_gateway = False log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.106 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] tempdir = None log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.108 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] transport_url = None log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.108 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] until_refresh = 0 log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.109 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] update_dns_entries = False log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.112 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] update_resources_interval = 0 log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.113 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] use_cow_images = True log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.114 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] use_forwarded_for = False log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.114 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] use_ipv6 = False log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.114 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] use_network_dns_servers = False log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.115 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] use_project_ca = False log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.115 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] use_single_default_gateway = 
False log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.116 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] use_stderr = True log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.116 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] use_syslog = False log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.116 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] use_syslog_rfc_format = True log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.117 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] user_cert_subject = /C=US/ST=California/O=OpenStack/OU=NovaDev/CN=%.16s-%.16s-%s log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.117 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] vcpu_pin_set = None log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.117 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] vendordata_driver = nova.api.metadata.vendordata_json.JsonFileVendorData log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.118 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] verbose = True log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.118 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] vif_plugging_is_fatal = True log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.119 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] vif_plugging_timeout = 300 log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.119 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] virt_mkfs = [] log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.119 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] vlan_interface = eth3 log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.120 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] vlan_start = 100 log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.120 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] volume_api_class = nova.volume.cinder.API log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.121 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] volume_usage_poll_interval = 0 log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.121 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] vpn_flavor = m1.tiny log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.122 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] vpn_image_id = 0 log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.122 DEBUG oslo_service.service 
[req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] vpn_ip = 192.168.33.1 log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.122 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] vpn_key_suffix = -vpn log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.123 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] vpn_start = 1000 log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.123 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] wsgi_default_pool_size = 1000 log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.124 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] wsgi_keep_alive = True log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.124 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] wsgi_log_format = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2225 2015-08-07 16:55:03.124 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2233 2015-08-07 16:55:03.125 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] ephemeral_storage_encryption.enabled = False log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2233 2015-08-07 16:55:03.125 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2233 2015-08-07 16:55:03.126 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] keymgr.api_class = nova.keymgr.conf_key_mgr.ConfKeyManager log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2233 2015-08-07 16:55:03.126 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] glance.allowed_direct_url_schemes = [] log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2233 2015-08-07 16:55:03.127 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] glance.api_insecure = False log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2233 2015-08-07 16:55:03.127 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] glance.api_servers = ['http://192.168.33.1:9292'] log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2233 2015-08-07 16:55:03.128 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] glance.host = 192.168.33.1 log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2233 2015-08-07 16:55:03.128 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] glance.num_retries = 0 log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2233 2015-08-07 16:55:03.128 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] glance.port = 9292 log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2233 2015-08-07 16:55:03.131 DEBUG oslo_service.service 
[req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] glance.protocol = http log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2233 2015-08-07 16:55:03.131 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2233 2015-08-07 16:55:03.132 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2233 2015-08-07 16:55:03.132 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] oslo_messaging_rabbit.fake_rabbit = False log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2233 2015-08-07 16:55:03.133 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2233 2015-08-07 16:55:03.133 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2233 2015-08-07 16:55:03.134 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2233 2015-08-07 16:55:03.134 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] oslo_messaging_rabbit.kombu_reconnect_timeout = 60 log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2233 2015-08-07 16:55:03.134 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] oslo_messaging_rabbit.kombu_ssl_ca_certs = log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2233 2015-08-07 16:55:03.135 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] oslo_messaging_rabbit.kombu_ssl_certfile = log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2233 2015-08-07 16:55:03.135 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] oslo_messaging_rabbit.kombu_ssl_keyfile = log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2233 2015-08-07 16:55:03.136 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] oslo_messaging_rabbit.kombu_ssl_version = log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2233 2015-08-07 16:55:03.136 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2233 2015-08-07 16:55:03.137 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] oslo_messaging_rabbit.rabbit_host = localhost log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2233 2015-08-07 16:55:03.137 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] oslo_messaging_rabbit.rabbit_hosts = ['192.168.33.1'] log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2233 2015-08-07 16:55:03.138 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values 
/usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2233 2015-08-07 16:55:03.138 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] oslo_messaging_rabbit.rabbit_max_retries = 0 log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2233 2015-08-07 16:55:03.139 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] oslo_messaging_rabbit.rabbit_password = **** log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2233 2015-08-07 16:55:03.139 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] oslo_messaging_rabbit.rabbit_port = 5672 log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2233 2015-08-07 16:55:03.139 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2233 2015-08-07 16:55:03.140 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2233 2015-08-07 16:55:03.140 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] oslo_messaging_rabbit.rabbit_use_ssl = False log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2233 2015-08-07 16:55:03.141 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] oslo_messaging_rabbit.rabbit_userid = stackrabbit log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2233 2015-08-07 16:55:03.141 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] oslo_messaging_rabbit.rabbit_virtual_host = / log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2233 2015-08-07 16:55:03.142 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2233 2015-08-07 16:55:03.142 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] oslo_messaging_rabbit.send_single_reply = False log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2233 2015-08-07 16:55:03.142 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] osapi_v3.enabled = True log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2233 2015-08-07 16:55:03.143 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] osapi_v3.extensions_blacklist = [] log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2233 2015-08-07 16:55:03.143 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] osapi_v3.extensions_whitelist = [] log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2233 2015-08-07 16:55:03.144 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] xenserver.agent_path = usr/sbin/xe-update-networking log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2233 2015-08-07 16:55:03.144 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] xenserver.agent_resetnetwork_timeout = 60 log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2233 2015-08-07 16:55:03.145 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] 
xenserver.agent_timeout = 30 log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2233 2015-08-07 16:55:03.145 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] xenserver.agent_version_timeout = 300 log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2233 2015-08-07 16:55:03.145 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] xenserver.block_device_creation_timeout = 10 log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2233 2015-08-07 16:55:03.146 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] xenserver.cache_images = all log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2233 2015-08-07 16:55:03.146 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] xenserver.check_host = True log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2233 2015-08-07 16:55:03.147 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] xenserver.connection_concurrent = 5 log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2233 2015-08-07 16:55:03.147 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] xenserver.connection_password = **** log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2233 2015-08-07 16:55:03.148 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] xenserver.connection_url = http://192.168.33.2 log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2233 2015-08-07 16:55:03.148 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] xenserver.connection_username = root log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2233 2015-08-07 16:55:03.149 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] xenserver.default_os_type = linux log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2233 2015-08-07 16:55:03.149 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] xenserver.disable_agent = False log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2233 2015-08-07 16:55:03.149 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] xenserver.image_compression_level = None log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2233 2015-08-07 16:55:03.150 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] xenserver.image_upload_handler = nova.virt.xenapi.image.glance.GlanceStore log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2233 2015-08-07 16:55:03.150 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] xenserver.introduce_vdi_retry_wait = 20 log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2233 2015-08-07 16:55:03.151 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] xenserver.ipxe_boot_menu_url = None log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2233 2015-08-07 16:55:03.151 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] xenserver.ipxe_mkisofs_cmd = mkisofs log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2233 2015-08-07 16:55:03.152 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] 
xenserver.ipxe_network_name = None log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2233 2015-08-07 16:55:03.152 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] xenserver.iqn_prefix = iqn.2010-10.org.openstack log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2233 2015-08-07 16:55:03.152 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] xenserver.login_timeout = 10 log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2233 2015-08-07 16:55:03.153 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] xenserver.max_kernel_ramdisk_size = 16777216 log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2233 2015-08-07 16:55:03.153 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] xenserver.num_vbd_unplug_retries = 10 log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2233 2015-08-07 16:55:03.154 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] xenserver.ovs_integration_bridge = xapi1 log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2233 2015-08-07 16:55:03.154 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] xenserver.remap_vbd_dev = False log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2233 2015-08-07 16:55:03.155 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] xenserver.remap_vbd_dev_prefix = sd log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2233 2015-08-07 16:55:03.155 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] xenserver.running_timeout = 60 log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2233 2015-08-07 16:55:03.156 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] xenserver.sparse_copy = True log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2233 2015-08-07 16:55:03.156 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] xenserver.sr_base_path = /var/run/sr-mount log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2233 2015-08-07 16:55:03.156 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] xenserver.sr_matching_filter = default-sr:true log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2233 2015-08-07 16:55:03.157 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] xenserver.target_host = None log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2233 2015-08-07 16:55:03.157 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] xenserver.target_port = 3260 log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2233 2015-08-07 16:55:03.158 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] xenserver.torrent_images = none log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2233 2015-08-07 16:55:03.158 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] xenserver.use_agent_default = False log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2233 2015-08-07 16:55:03.159 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] xenserver.use_join_force = True log_opt_values 
/usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2233 2015-08-07 16:55:03.159 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] xenserver.vhd_coalesce_max_attempts = 20 log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2233 2015-08-07 16:55:03.159 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] xenserver.vhd_coalesce_poll_interval = 5.0 log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2233 2015-08-07 16:55:03.160 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] xenserver.vif_driver = nova.virt.xenapi.vif.XenAPIBridgeDriver log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2233 2015-08-07 16:55:03.160 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] mks.enabled = False log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2233 2015-08-07 16:55:03.161 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] mks.mksproxy_base_url = http://127.0.0.1:6090/ log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2233 2015-08-07 16:55:03.161 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] remote_debug.host = None log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2233 2015-08-07 16:55:03.161 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] remote_debug.port = None log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2233 2015-08-07 16:55:03.162 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] upgrade_levels.baseapi = None log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2233 2015-08-07 16:55:03.162 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] upgrade_levels.cells = None log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2233 2015-08-07 16:55:03.163 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] upgrade_levels.compute = None log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2233 2015-08-07 16:55:03.163 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] upgrade_levels.conductor = None log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2233 2015-08-07 16:55:03.164 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] upgrade_levels.console = None log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2233 2015-08-07 16:55:03.164 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] upgrade_levels.consoleauth = None log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2233 2015-08-07 16:55:03.165 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] upgrade_levels.network = None log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2233 2015-08-07 16:55:03.165 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] upgrade_levels.scheduler = None log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2233 2015-08-07 16:55:03.165 WARNING oslo_config.cfg [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] Option "vnc_enabled" from group "DEFAULT" is deprecated. Use option "enabled" from group "vnc". 
2015-08-07 16:55:03.166 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] vnc.enabled = False log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2233 2015-08-07 16:55:03.166 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] vnc.keymap = en-us log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2233 2015-08-07 16:55:03.167 WARNING oslo_config.cfg [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] Option "novncproxy_base_url" from group "DEFAULT" is deprecated. Use option "novncproxy_base_url" from group "vnc". 2015-08-07 16:55:03.167 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] vnc.novncproxy_base_url = http://192.168.33.1:6080/vnc_auto.html log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2233 2015-08-07 16:55:03.168 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] vnc.vncserver_listen = 127.0.0.1 log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2233 2015-08-07 16:55:03.168 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] vnc.vncserver_proxyclient_address = 127.0.0.1 log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2233 2015-08-07 16:55:03.168 WARNING oslo_config.cfg [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] Option "xvpvncproxy_base_url" from group "DEFAULT" is deprecated. Use option "xvpvncproxy_base_url" from group "vnc". 2015-08-07 16:55:03.169 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] vnc.xvpvncproxy_base_url = http://192.168.33.1:6081/console log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2233 2015-08-07 16:55:03.169 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] conductor.manager = nova.conductor.manager.ConductorManager log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2233 2015-08-07 16:55:03.170 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] conductor.topic = conductor log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2233 2015-08-07 16:55:03.170 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] conductor.use_local = False log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2233 2015-08-07 16:55:03.170 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] conductor.workers = None log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2233 2015-08-07 16:55:03.171 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] serial_console.base_url = ws://127.0.0.1:6083/ log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2233 2015-08-07 16:55:03.171 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] serial_console.enabled = False log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2233 2015-08-07 16:55:03.172 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] serial_console.listen = 127.0.0.1 log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2233 2015-08-07 16:55:03.172 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] serial_console.port_range = 10000:20000 log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2233 2015-08-07 16:55:03.172 DEBUG 
oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2233 2015-08-07 16:55:03.173 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] ssl.ca_file = None log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2233 2015-08-07 16:55:03.173 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] ssl.cert_file = None log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2233 2015-08-07 16:55:03.174 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] ssl.key_file = None log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2233 2015-08-07 16:55:03.174 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] oslo_concurrency.disable_process_locking = False log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2233 2015-08-07 16:55:03.174 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] oslo_concurrency.lock_path = /opt/stack/data/nova log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2233 2015-08-07 16:55:03.175 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] spice.agent_enabled = True log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2233 2015-08-07 16:55:03.175 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] spice.enabled = False log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2233 2015-08-07 16:55:03.176 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] spice.html5proxy_base_url = http://192.168.33.1:6082/spice_auto.html log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2233 2015-08-07 16:55:03.176 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] spice.keymap = en-us log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2233 2015-08-07 16:55:03.177 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] spice.server_listen = 127.0.0.1 log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2233 2015-08-07 16:55:03.177 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2233 2015-08-07 16:55:03.178 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] rdp.enabled = False log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2233 2015-08-07 16:55:03.179 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] rdp.html5_proxy_base_url = http://127.0.0.1:6083/ log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2233 2015-08-07 16:55:03.179 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] oslo_middleware.secure_proxy_ssl_header = X-Forwarded-Proto log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2233 2015-08-07 16:55:03.181 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] workarounds.destroy_after_evacuate = True log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2233 2015-08-07 16:55:03.181 DEBUG oslo_service.service 
[req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] workarounds.disable_libvirt_livesnapshot = True log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2233 2015-08-07 16:55:03.184 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] workarounds.disable_rootwrap = False log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2233 2015-08-07 16:55:03.184 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2233 2015-08-07 16:55:03.185 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] cells.bandwidth_update_interval = 600 log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2233 2015-08-07 16:55:03.186 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] cells.call_timeout = 60 log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2233 2015-08-07 16:55:03.186 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] cells.capabilities = ['hypervisor=xenserver;kvm', 'os=linux;windows'] log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2233 2015-08-07 16:55:03.188 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] cells.cell_type = compute log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2233 2015-08-07 16:55:03.188 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] cells.enable = False log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2233 2015-08-07 16:55:03.189 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] cells.manager = nova.cells.manager.CellsManager log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2233 2015-08-07 16:55:03.189 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] cells.mute_child_interval = 300 log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2233 2015-08-07 16:55:03.189 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] cells.name = nova log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2233 2015-08-07 16:55:03.196 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] cells.reserve_percent = 10.0 log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2233 2015-08-07 16:55:03.197 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] cells.topic = cells log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2233 2015-08-07 16:55:03.197 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] database.backend = sqlalchemy log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2233 2015-08-07 16:55:03.198 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] database.connection = **** log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2233 2015-08-07 16:55:03.198 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] database.connection_debug = 0 log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2233 2015-08-07 16:55:03.199 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] database.connection_trace = False 
log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2233 2015-08-07 16:55:03.199 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] database.db_inc_retry_interval = True log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2233 2015-08-07 16:55:03.199 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] database.db_max_retries = 20 log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2233 2015-08-07 16:55:03.200 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] database.db_max_retry_interval = 10 log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2233 2015-08-07 16:55:03.200 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] database.db_retry_interval = 1 log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2233 2015-08-07 16:55:03.201 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] database.idle_timeout = 3600 log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2233 2015-08-07 16:55:03.201 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] database.max_overflow = None log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2233 2015-08-07 16:55:03.202 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] database.max_pool_size = None log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2233 2015-08-07 16:55:03.202 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] database.max_retries = 10 log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2233 2015-08-07 16:55:03.202 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] database.min_pool_size = 1 log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2233 2015-08-07 16:55:03.203 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] database.mysql_sql_mode = TRADITIONAL log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2233 2015-08-07 16:55:03.203 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] database.pool_timeout = None log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2233 2015-08-07 16:55:03.204 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] database.retry_interval = 10 log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2233 2015-08-07 16:55:03.204 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] database.slave_connection = **** log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2233 2015-08-07 16:55:03.204 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] database.sqlite_db = nova.sqlite log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2233 2015-08-07 16:55:03.205 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] database.sqlite_synchronous = True log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2233 2015-08-07 16:55:03.205 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] database.use_db_reconnect = False log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2233 2015-08-07 16:55:03.206 DEBUG oslo_service.service 
[req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] database.use_tpool = False log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2233 2015-08-07 16:55:03.206 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] api_database.connection = **** log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2233 2015-08-07 16:55:03.207 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] api_database.connection_debug = 0 log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2233 2015-08-07 16:55:03.207 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] api_database.connection_trace = False log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2233 2015-08-07 16:55:03.207 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] api_database.idle_timeout = 3600 log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2233 2015-08-07 16:55:03.208 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] api_database.max_overflow = None log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2233 2015-08-07 16:55:03.208 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] api_database.max_pool_size = None log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2233 2015-08-07 16:55:03.209 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] api_database.max_retries = 10 log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2233 2015-08-07 16:55:03.210 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] api_database.mysql_sql_mode = TRADITIONAL log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2233 2015-08-07 16:55:03.210 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] api_database.pool_timeout = None log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2233 2015-08-07 16:55:03.211 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] api_database.retry_interval = 10 log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2233 2015-08-07 16:55:03.211 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] api_database.slave_connection = **** log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2233 2015-08-07 16:55:03.212 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] api_database.sqlite_synchronous = True log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2233 2015-08-07 16:55:03.212 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] cinder.cafile = None log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2233 2015-08-07 16:55:03.213 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] cinder.catalog_info = volumev2:cinderv2:publicURL log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2233 2015-08-07 16:55:03.214 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] cinder.certfile = None log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2233 2015-08-07 16:55:03.220 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] cinder.cross_az_attach = True log_opt_values 
/usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2233
2015-08-07 16:55:03.221 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] cinder.endpoint_template = None log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2233
2015-08-07 16:55:03.224 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] cinder.http_retries = 3 log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2233
2015-08-07 16:55:03.225 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] cinder.insecure = False log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2233
2015-08-07 16:55:03.225 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] cinder.keyfile = None log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2233
2015-08-07 16:55:03.226 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] cinder.os_region_name = RegionOne log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2233
2015-08-07 16:55:03.226 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] cinder.timeout = None log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2233
2015-08-07 16:55:03.227 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2233
2015-08-07 16:55:03.227 DEBUG oslo_service.service [req-4c7dddda-19af-41b8-b17d-95087c294cc3 None None] ******************************************************************************** log_opt_values /usr/local/lib/python2.7/dist-packages/oslo_config/cfg.py:2235
2015-08-07 16:55:03.228 13318 INFO nova.service [-] Starting compute node (version 12.0.0)
2015-08-07 16:55:03.942 ERROR nova.compute.manager [req-a3b305a0-f36d-4170-96d1-4bb59b4255ac None None] No compute node record for host devstack
2015-08-07 16:55:03.943 DEBUG nova.virt.xenapi.host [req-a3b305a0-f36d-4170-96d1-4bb59b4255ac None None] Updating host stats update_status /opt/stack/new/nova/nova/virt/xenapi/host.py:234
2015-08-07 16:55:04.158 DEBUG oslo_concurrency.lockutils [req-a3b305a0-f36d-4170-96d1-4bb59b4255ac None None] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" acquired by "do_scan" :: waited 0.001s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251
2015-08-07 16:55:04.158 DEBUG nova.virt.xenapi.vm_utils [req-a3b305a0-f36d-4170-96d1-4bb59b4255ac None None] Scanning SR OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99 do_scan /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1839
2015-08-07 16:55:04.572 DEBUG oslo_concurrency.lockutils [req-a3b305a0-f36d-4170-96d1-4bb59b4255ac None None] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" released by "do_scan" :: held 0.415s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262
2015-08-07 16:55:04.876 WARNING nova.compute.monitors [req-a3b305a0-f36d-4170-96d1-4bb59b4255ac None None] Excluding CPU monitor virt_driver. Not in the list of enabled monitors (CONF.compute_monitors).
2015-08-07 16:55:04.880 INFO nova.compute.resource_tracker [req-a3b305a0-f36d-4170-96d1-4bb59b4255ac None None] Auditing locally available compute resources for node localhost.localdomain
2015-08-07 16:55:04.881 DEBUG nova.virt.xenapi.host [req-a3b305a0-f36d-4170-96d1-4bb59b4255ac None None] Updating host stats update_status /opt/stack/new/nova/nova/virt/xenapi/host.py:234
2015-08-07 16:55:05.145 DEBUG oslo_concurrency.lockutils [req-a3b305a0-f36d-4170-96d1-4bb59b4255ac None None] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" acquired by "do_scan" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251
2015-08-07 16:55:05.146 DEBUG nova.virt.xenapi.vm_utils [req-a3b305a0-f36d-4170-96d1-4bb59b4255ac None None] Scanning SR OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99 do_scan /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1839
2015-08-07 16:55:05.568 DEBUG oslo_concurrency.lockutils [req-a3b305a0-f36d-4170-96d1-4bb59b4255ac None None] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" released by "do_scan" :: held 0.423s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262
2015-08-07 16:55:05.865 DEBUG nova.compute.resource_tracker [req-a3b305a0-f36d-4170-96d1-4bb59b4255ac None None] Hypervisor: free VCPUs: 2 _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:524
2015-08-07 16:55:05.865 DEBUG nova.compute.resource_tracker [req-a3b305a0-f36d-4170-96d1-4bb59b4255ac None None] Hypervisor: assignable PCI devices: [] _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:532
2015-08-07 16:55:05.866 DEBUG nova.compute.resource_tracker [req-a3b305a0-f36d-4170-96d1-4bb59b4255ac None None] Hypervisor/Node resource view: name=localhost.localdomain free_ram=989MB free_disk=17GB free_vcpus=2 pci_devices=[] _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:546
2015-08-07 16:55:05.866 DEBUG oslo_concurrency.lockutils [req-a3b305a0-f36d-4170-96d1-4bb59b4255ac None None] Lock "compute_resources" acquired by "_update_available_resource" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251
2015-08-07 16:55:06.029 WARNING nova.compute.resource_tracker [req-a3b305a0-f36d-4170-96d1-4bb59b4255ac None None] No compute node record for devstack:localhost.localdomain
2015-08-07 16:55:06.105 INFO nova.compute.resource_tracker [req-a3b305a0-f36d-4170-96d1-4bb59b4255ac None None] Compute_service record created for devstack:localhost.localdomain
2015-08-07 16:55:06.313 INFO nova.compute.resource_tracker [req-a3b305a0-f36d-4170-96d1-4bb59b4255ac None None] Total usable vcpus: 8, total allocated vcpus: 6
2015-08-07 16:55:06.314 INFO nova.compute.resource_tracker [req-a3b305a0-f36d-4170-96d1-4bb59b4255ac None None] Final resource view: name=localhost.localdomain phys_ram=8187MB used_ram=512MB phys_disk=27GB used_disk=0GB total_vcpus=8 used_vcpus=6 pci_stats=None
2015-08-07 16:55:06.416 INFO nova.compute.resource_tracker [req-a3b305a0-f36d-4170-96d1-4bb59b4255ac None None] Compute_service record updated for devstack:localhost.localdomain
2015-08-07 16:55:06.417 DEBUG oslo_concurrency.lockutils [req-a3b305a0-f36d-4170-96d1-4bb59b4255ac None None] Lock "compute_resources" released by "_update_available_resource" :: held 0.551s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262
2015-08-07 16:55:06.417 DEBUG nova.service [req-a3b305a0-f36d-4170-96d1-4bb59b4255ac None None] Creating RPC server for service compute start /opt/stack/new/nova/nova/service.py:178
2015-08-07 16:55:06.433 INFO oslo_messaging._drivers.impl_rabbit [req-a3b305a0-f36d-4170-96d1-4bb59b4255ac None None] Connecting to AMQP server on 192.168.33.1:5672
2015-08-07 16:55:06.456 INFO oslo_messaging._drivers.impl_rabbit [req-a3b305a0-f36d-4170-96d1-4bb59b4255ac None None] Connected to AMQP server on 192.168.33.1:5672
2015-08-07 16:55:06.476 DEBUG nova.service [req-a3b305a0-f36d-4170-96d1-4bb59b4255ac None None] Join ServiceGroup membership for this service compute start /opt/stack/new/nova/nova/service.py:196
2015-08-07 16:55:06.477 DEBUG nova.servicegroup.drivers.db [req-a3b305a0-f36d-4170-96d1-4bb59b4255ac None None] DB_Driver: join new ServiceGroup member devstack to the compute group, service = join /opt/stack/new/nova/nova/servicegroup/drivers/db.py:47
2015-08-07 16:55:11.593 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.89 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117
2015-08-07 16:55:21.527 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.95 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117
2015-08-07 16:55:31.535 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.94 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117
2015-08-07 16:55:36.478 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213
2015-08-07 16:55:36.558 WARNING nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] While synchronizing instance power states, found 0 instances in the database and 1 instances on the hypervisor.
2015-08-07 16:55:36.559 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 24.94 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117
2015-08-07 16:55:41.531 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.95 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117
2015-08-07 16:55:51.551 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.93 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117
2015-08-07 16:56:01.500 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213
2015-08-07 16:56:01.501 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 0.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117
2015-08-07 16:56:01.506 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213
2015-08-07 16:56:01.513 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213
2015-08-07 16:56:01.513 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213
2015-08-07 16:56:01.514 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213
2015-08-07 16:56:01.514 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /opt/stack/new/nova/nova/compute/manager.py:5976
2015-08-07 16:56:01.515 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213
2015-08-07 16:56:01.543 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.94 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117
2015-08-07 16:56:01.555 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Auditing locally available compute resources for node localhost.localdomain
2015-08-07 16:56:01.555 DEBUG nova.virt.xenapi.host [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Updating host stats update_status /opt/stack/new/nova/nova/virt/xenapi/host.py:234
2015-08-07 16:56:01.804 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" acquired by "do_scan" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251
2015-08-07 16:56:01.805 DEBUG nova.virt.xenapi.vm_utils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Scanning SR OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99 do_scan /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1839
2015-08-07 16:56:02.177 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" released by "do_scan" :: held 0.373s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262
2015-08-07 16:56:02.416 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor: free VCPUs: 2 _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:524
2015-08-07 16:56:02.417 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor: assignable PCI devices: [] _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:532
2015-08-07 16:56:02.417 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor/Node resource view: name=localhost.localdomain free_ram=989MB free_disk=17GB free_vcpus=2 pci_devices=[] _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:546
2015-08-07 16:56:02.418 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "compute_resources" acquired by "_update_available_resource" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251
2015-08-07 16:56:02.581 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Total usable vcpus: 8, total allocated vcpus: 6
2015-08-07 16:56:02.582 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Final resource view: name=localhost.localdomain phys_ram=8187MB used_ram=512MB phys_disk=27GB used_disk=0GB total_vcpus=8 used_vcpus=6 pci_stats=None
2015-08-07 16:56:02.672 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Compute_service record updated for devstack:localhost.localdomain
2015-08-07 16:56:02.673 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "compute_resources" released by "_update_available_resource" :: held 0.255s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262
2015-08-07 16:56:02.674 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213
2015-08-07 16:56:02.675 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213
2015-08-07 16:56:02.676 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213
2015-08-07 16:56:02.676 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Starting heal instance info cache _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5315
2015-08-07 16:56:02.677 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Rebuilding the list of instances to heal _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5319
2015-08-07 16:56:02.739 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Didn't find any instances for network info cache update. _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5386
2015-08-07 16:56:02.740 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 58.76 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117
2015-08-07 16:56:11.534 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.95 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117
2015-08-07 16:56:21.558 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.93 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117
2015-08-07 16:56:31.552 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.93 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117
2015-08-07 16:56:41.576 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.91 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117
2015-08-07 16:56:51.582 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.91 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117
2015-08-07 16:57:01.499 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213
2015-08-07 16:57:01.501 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213
2015-08-07 16:57:01.564 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 0.01 seconds _run_loop
/usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 16:57:01.573 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 16:57:01.574 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /opt/stack/new/nova/nova/compute/manager.py:5976 2015-08-07 16:57:01.574 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 16:57:01.612 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Auditing locally available compute resources for node localhost.localdomain 2015-08-07 16:57:01.613 DEBUG nova.virt.xenapi.host [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Updating host stats update_status /opt/stack/new/nova/nova/virt/xenapi/host.py:234 2015-08-07 16:57:01.966 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.52 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 16:57:02.039 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" acquired by "do_scan" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 16:57:02.040 DEBUG nova.virt.xenapi.vm_utils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Scanning SR OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99 do_scan /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1839 2015-08-07 16:57:02.417 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" released by "do_scan" :: held 0.378s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 16:57:02.641 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor: free VCPUs: 2 _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:524 2015-08-07 16:57:02.642 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor: assignable PCI devices: [] _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:532 2015-08-07 16:57:02.642 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor/Node resource view: name=localhost.localdomain free_ram=989MB free_disk=16GB free_vcpus=2 pci_devices=[] _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:546 2015-08-07 16:57:02.643 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "compute_resources" acquired by "_update_available_resource" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 16:57:02.796 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Total usable vcpus: 8, total allocated vcpus: 6 2015-08-07 16:57:02.797 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Final 
resource view: name=localhost.localdomain phys_ram=8187MB used_ram=512MB phys_disk=27GB used_disk=0GB total_vcpus=8 used_vcpus=6 pci_stats=None 2015-08-07 16:57:02.884 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Compute_service record updated for devstack:localhost.localdomain 2015-08-07 16:57:02.885 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "compute_resources" released by "_update_available_resource" :: held 0.242s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 16:57:02.886 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 16:57:02.886 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 16:57:02.887 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 16:57:02.887 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Starting heal instance info cache _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5315 2015-08-07 16:57:02.887 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Rebuilding the list of instances to heal _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5319 2015-08-07 16:57:02.934 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Didn't find any instances for network info cache update. 
_heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5386 2015-08-07 16:57:02.935 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 0.93 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 16:57:03.868 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 16:57:03.870 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 16:57:03.870 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 16:57:03.871 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 57.64 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 16:57:11.599 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.89 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 16:57:21.544 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.95 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 16:57:32.087 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.40 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 16:57:41.853 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.64 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 16:57:52.501 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 8.99 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 16:58:01.509 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 16:58:01.510 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /opt/stack/new/nova/nova/compute/manager.py:5976 2015-08-07 16:58:01.510 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 1.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 16:58:01.884 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.61 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 16:58:02.513 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 16:58:02.516 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 16:58:02.558 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Auditing locally available compute resources for node localhost.localdomain 2015-08-07 16:58:02.559 DEBUG nova.virt.xenapi.host [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Updating host stats update_status /opt/stack/new/nova/nova/virt/xenapi/host.py:234 2015-08-07 16:58:02.800 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" acquired by "do_scan" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 16:58:02.801 DEBUG nova.virt.xenapi.vm_utils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Scanning SR OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99 do_scan /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1839 2015-08-07 16:58:03.207 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" released by "do_scan" :: held 0.407s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 16:58:03.439 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor: free VCPUs: 2 _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:524 2015-08-07 16:58:03.440 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor: assignable PCI devices: [] _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:532 2015-08-07 16:58:03.440 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor/Node resource view: name=localhost.localdomain free_ram=989MB free_disk=16GB free_vcpus=2 pci_devices=[] _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:546 2015-08-07 16:58:03.441 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "compute_resources" acquired by "_update_available_resource" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 16:58:03.563 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Total usable vcpus: 8, total allocated vcpus: 6 2015-08-07 16:58:03.564 INFO nova.compute.resource_tracker 
[req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Final resource view: name=localhost.localdomain phys_ram=8187MB used_ram=512MB phys_disk=27GB used_disk=0GB total_vcpus=8 used_vcpus=6 pci_stats=None 2015-08-07 16:58:03.798 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Compute_service record updated for devstack:localhost.localdomain 2015-08-07 16:58:03.798 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "compute_resources" released by "_update_available_resource" :: held 0.357s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 16:58:03.799 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 16:58:03.800 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 16:58:03.800 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Starting heal instance info cache _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5315 2015-08-07 16:58:03.800 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Rebuilding the list of instances to heal _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5319 2015-08-07 16:58:03.854 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Didn't find any instances for network info cache update. _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5386 2015-08-07 16:58:03.855 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 0.71 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 16:58:04.563 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 16:58:04.564 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 16:58:04.565 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 16:58:04.565 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 0.94 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 16:58:05.510 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 16:58:05.630 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 56.88 
seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 16:58:12.862 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 8.63 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 16:58:22.823 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 8.67 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 16:58:32.760 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 8.73 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 16:58:41.574 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.92 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 16:58:51.558 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.94 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 16:59:01.589 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.91 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 16:59:02.508 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 16:59:02.509 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 0.99 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 16:59:03.500 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 16:59:03.501 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 0.01 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 16:59:03.509 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 16:59:03.510 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /opt/stack/new/nova/nova/compute/manager.py:5976 2015-08-07 16:59:03.511 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 1.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 16:59:04.508 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 16:59:04.545 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Auditing locally available compute resources for node localhost.localdomain 2015-08-07 16:59:04.546 DEBUG nova.virt.xenapi.host [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Updating host stats update_status /opt/stack/new/nova/nova/virt/xenapi/host.py:234 2015-08-07 16:59:04.795 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" acquired by "do_scan" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 16:59:04.795 DEBUG nova.virt.xenapi.vm_utils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Scanning SR OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99 do_scan /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1839 2015-08-07 16:59:05.287 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" released by "do_scan" :: held 0.492s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 16:59:05.515 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor: free VCPUs: 2 _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:524 2015-08-07 16:59:05.515 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor: assignable PCI devices: [] _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:532 2015-08-07 16:59:05.516 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor/Node resource view: name=localhost.localdomain free_ram=989MB free_disk=16GB free_vcpus=2 pci_devices=[] _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:546 2015-08-07 16:59:05.516 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "compute_resources" acquired by "_update_available_resource" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 16:59:05.652 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Total usable vcpus: 8, total allocated vcpus: 6 2015-08-07 16:59:05.653 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Final resource view: name=localhost.localdomain phys_ram=8187MB used_ram=512MB phys_disk=27GB used_disk=0GB total_vcpus=8 used_vcpus=6 pci_stats=None 2015-08-07 16:59:05.716 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Compute_service record updated for devstack:localhost.localdomain 2015-08-07 16:59:05.717 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock 
"compute_resources" released by "_update_available_resource" :: held 0.201s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 16:59:05.718 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 16:59:05.719 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 16:59:05.719 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 16:59:05.720 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Starting heal instance info cache _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5315 2015-08-07 16:59:05.720 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Rebuilding the list of instances to heal _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5319 2015-08-07 16:59:05.784 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Didn't find any instances for network info cache update. _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5386 2015-08-07 16:59:05.785 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 16:59:05.854 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 2.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 16:59:07.855 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 16:59:07.856 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 16:59:07.856 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 53.65 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 16:59:11.588 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.91 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 16:59:22.059 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.44 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 16:59:31.571 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.93 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 16:59:41.886 13318 DEBUG 
oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.63 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 16:59:51.744 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.78 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:00:01.510 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:00:01.510 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Cleaning up deleted instances _run_pending_deletes /opt/stack/new/nova/nova/compute/manager.py:6276 2015-08-07 17:00:01.578 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] There are 0 instances to clean _run_pending_deletes /opt/stack/new/nova/nova/compute/manager.py:6285 2015-08-07 17:00:01.579 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 1.92 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:00:01.955 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.57 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:00:03.501 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:00:03.502 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 0.01 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:00:03.509 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:00:03.510 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 1.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:00:04.541 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:00:04.542 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /opt/stack/new/nova/nova/compute/manager.py:5976 2015-08-07 17:00:04.542 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:00:04.770 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Auditing locally available compute resources for node localhost.localdomain 2015-08-07 17:00:04.771 DEBUG nova.virt.xenapi.host [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Updating host stats update_status /opt/stack/new/nova/nova/virt/xenapi/host.py:234 2015-08-07 17:00:07.059 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" acquired by "do_scan" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:00:07.060 DEBUG nova.virt.xenapi.vm_utils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Scanning SR OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99 do_scan /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1839 2015-08-07 17:00:08.074 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" released by "do_scan" :: held 1.015s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:00:08.432 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor: free VCPUs: 2 _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:524 2015-08-07 17:00:08.433 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor: assignable PCI devices: [] _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:532 2015-08-07 17:00:08.433 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor/Node resource view: name=localhost.localdomain free_ram=989MB free_disk=16GB free_vcpus=2 pci_devices=[] _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:546 2015-08-07 17:00:08.434 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "compute_resources" acquired by "_update_available_resource" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:00:08.644 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Total usable vcpus: 8, total allocated vcpus: 6 2015-08-07 17:00:08.644 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Final resource view: name=localhost.localdomain phys_ram=8187MB used_ram=512MB phys_disk=27GB used_disk=0GB total_vcpus=8 used_vcpus=6 pci_stats=None 2015-08-07 17:00:08.771 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Compute_service record updated for devstack:localhost.localdomain 2015-08-07 17:00:08.772 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "compute_resources" released by "_update_available_resource" :: held 0.338s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:00:08.773 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 
None None] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:00:08.773 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:00:08.774 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:00:08.774 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Starting heal instance info cache _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5315 2015-08-07 17:00:08.775 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Rebuilding the list of instances to heal _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5319 2015-08-07 17:00:08.834 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Didn't find any instances for network info cache update. _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5386 2015-08-07 17:00:08.834 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 1.97 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:00:10.802 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:00:10.803 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:00:10.803 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 52.70 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:00:11.580 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.94 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:00:18.544 DEBUG oslo_concurrency.lockutils [req-689a4726-24fd-4d45-8bf6-79e1f67ae74b tempest-AggregatesAdminTestJSON-933043542 tempest-AggregatesAdminTestJSON-642246395] Lock "8e4a2813-87d8-4829-bb17-5a0511bfffcb" acquired by "_locked_do_build_and_run_instance" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:00:18.772 INFO nova.compute.manager [req-689a4726-24fd-4d45-8bf6-79e1f67ae74b tempest-AggregatesAdminTestJSON-933043542 tempest-AggregatesAdminTestJSON-642246395] [instance: 8e4a2813-87d8-4829-bb17-5a0511bfffcb] Starting instance... 
2015-08-07 17:00:19.171 DEBUG oslo_concurrency.lockutils [req-689a4726-24fd-4d45-8bf6-79e1f67ae74b tempest-AggregatesAdminTestJSON-933043542 tempest-AggregatesAdminTestJSON-642246395] Lock "compute_resources" acquired by "instance_claim" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:00:19.171 DEBUG nova.compute.resource_tracker [req-689a4726-24fd-4d45-8bf6-79e1f67ae74b tempest-AggregatesAdminTestJSON-933043542 tempest-AggregatesAdminTestJSON-642246395] Memory overhead for 64 MB instance; 5 MB instance_claim /opt/stack/new/nova/nova/compute/resource_tracker.py:126 2015-08-07 17:00:19.178 INFO nova.compute.claims [req-689a4726-24fd-4d45-8bf6-79e1f67ae74b tempest-AggregatesAdminTestJSON-933043542 tempest-AggregatesAdminTestJSON-642246395] [instance: 8e4a2813-87d8-4829-bb17-5a0511bfffcb] Attempting claim: memory 69 MB, disk 0 GB 2015-08-07 17:00:19.179 INFO nova.compute.claims [req-689a4726-24fd-4d45-8bf6-79e1f67ae74b tempest-AggregatesAdminTestJSON-933043542 tempest-AggregatesAdminTestJSON-642246395] [instance: 8e4a2813-87d8-4829-bb17-5a0511bfffcb] Total memory: 8187 MB, used: 512.00 MB 2015-08-07 17:00:19.179 INFO nova.compute.claims [req-689a4726-24fd-4d45-8bf6-79e1f67ae74b tempest-AggregatesAdminTestJSON-933043542 tempest-AggregatesAdminTestJSON-642246395] [instance: 8e4a2813-87d8-4829-bb17-5a0511bfffcb] memory limit: 12280.50 MB, free: 11768.50 MB 2015-08-07 17:00:19.180 INFO nova.compute.claims [req-689a4726-24fd-4d45-8bf6-79e1f67ae74b tempest-AggregatesAdminTestJSON-933043542 tempest-AggregatesAdminTestJSON-642246395] [instance: 8e4a2813-87d8-4829-bb17-5a0511bfffcb] Total disk: 27 GB, used: 0.00 GB 2015-08-07 17:00:19.180 INFO nova.compute.claims [req-689a4726-24fd-4d45-8bf6-79e1f67ae74b tempest-AggregatesAdminTestJSON-933043542 tempest-AggregatesAdminTestJSON-642246395] [instance: 8e4a2813-87d8-4829-bb17-5a0511bfffcb] disk limit not specified, defaulting to unlimited 2015-08-07 17:00:19.235 DEBUG nova.compute.resources.vcpu [req-689a4726-24fd-4d45-8bf6-79e1f67ae74b tempest-AggregatesAdminTestJSON-933043542 tempest-AggregatesAdminTestJSON-642246395] Total CPUs: 8 VCPUs, used: 0.00 VCPUs test /opt/stack/new/nova/nova/compute/resources/vcpu.py:52 2015-08-07 17:00:19.236 DEBUG nova.compute.resources.vcpu [req-689a4726-24fd-4d45-8bf6-79e1f67ae74b tempest-AggregatesAdminTestJSON-933043542 tempest-AggregatesAdminTestJSON-642246395] CPUs limit not specified, defaulting to unlimited test /opt/stack/new/nova/nova/compute/resources/vcpu.py:56 2015-08-07 17:00:19.236 INFO nova.compute.claims [req-689a4726-24fd-4d45-8bf6-79e1f67ae74b tempest-AggregatesAdminTestJSON-933043542 tempest-AggregatesAdminTestJSON-642246395] [instance: 8e4a2813-87d8-4829-bb17-5a0511bfffcb] Claim successful 2015-08-07 17:00:19.588 DEBUG oslo_concurrency.lockutils [req-689a4726-24fd-4d45-8bf6-79e1f67ae74b tempest-AggregatesAdminTestJSON-933043542 tempest-AggregatesAdminTestJSON-642246395] Lock "compute_resources" released by "instance_claim" :: held 0.417s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:00:20.783 DEBUG oslo_concurrency.lockutils [req-689a4726-24fd-4d45-8bf6-79e1f67ae74b tempest-AggregatesAdminTestJSON-933043542 tempest-AggregatesAdminTestJSON-642246395] Lock "compute_resources" acquired by "update_usage" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:00:20.889 DEBUG oslo_concurrency.lockutils [req-689a4726-24fd-4d45-8bf6-79e1f67ae74b 
tempest-AggregatesAdminTestJSON-933043542 tempest-AggregatesAdminTestJSON-642246395] Lock "compute_resources" released by "update_usage" :: held 0.106s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:00:20.890 DEBUG nova.compute.utils [req-689a4726-24fd-4d45-8bf6-79e1f67ae74b tempest-AggregatesAdminTestJSON-933043542 tempest-AggregatesAdminTestJSON-642246395] Using /dev/xvd instead of None get_next_device_name /opt/stack/new/nova/nova/compute/utils.py:163 2015-08-07 17:00:20.894 13318 DEBUG nova.compute.manager [-] [instance: 8e4a2813-87d8-4829-bb17-5a0511bfffcb] Allocating IP information in the background. _allocate_network_async /opt/stack/new/nova/nova/compute/manager.py:1513 2015-08-07 17:00:20.895 13318 DEBUG nova.utils [-] Reloading cached file /etc/nova/policy.json read_cached_file /opt/stack/new/nova/nova/utils.py:1352 2015-08-07 17:00:20.903 13318 DEBUG nova.openstack.common.policy [-] Reloaded policy file: /etc/nova/policy.json _load_policy_file /opt/stack/new/nova/nova/openstack/common/policy.py:295 2015-08-07 17:00:20.904 13318 DEBUG oslo_concurrency.lockutils [-] Acquired semaphore "refresh_cache-8e4a2813-87d8-4829-bb17-5a0511bfffcb" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:197 2015-08-07 17:00:20.908 13318 INFO oslo_messaging._drivers.impl_rabbit [-] Connecting to AMQP server on 192.168.33.1:5672 2015-08-07 17:00:20.938 13318 INFO oslo_messaging._drivers.impl_rabbit [-] Connected to AMQP server on 192.168.33.1:5672 2015-08-07 17:00:21.597 DEBUG nova.virt.xenapi.vmops [req-689a4726-24fd-4d45-8bf6-79e1f67ae74b tempest-AggregatesAdminTestJSON-933043542 tempest-AggregatesAdminTestJSON-642246395] [instance: 8e4a2813-87d8-4829-bb17-5a0511bfffcb] Block device information present: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} spawn /opt/stack/new/nova/nova/virt/xenapi/vmops.py:408 2015-08-07 17:00:21.599 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.92 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:00:21.612 DEBUG nova.virt.xenapi.vm_utils [req-689a4726-24fd-4d45-8bf6-79e1f67ae74b tempest-AggregatesAdminTestJSON-933043542 tempest-AggregatesAdminTestJSON-642246395] Detected vhd format for image a69d8d55-1745-492f-be26-cc75d64fc94d determine_disk_image_type /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1633 2015-08-07 17:00:21.612 DEBUG nova.virt.xenapi.vmops [req-689a4726-24fd-4d45-8bf6-79e1f67ae74b tempest-AggregatesAdminTestJSON-933043542 tempest-AggregatesAdminTestJSON-642246395] [instance: 8e4a2813-87d8-4829-bb17-5a0511bfffcb] Updating progress to 10 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:00:21.909 DEBUG nova.virt.xenapi.vmops [req-689a4726-24fd-4d45-8bf6-79e1f67ae74b tempest-AggregatesAdminTestJSON-933043542 tempest-AggregatesAdminTestJSON-642246395] [instance: 8e4a2813-87d8-4829-bb17-5a0511bfffcb] block device info: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} _connect_cinder_volumes /opt/stack/new/nova/nova/virt/xenapi/vmops.py:376 2015-08-07 17:00:21.918 DEBUG oslo_concurrency.lockutils [req-689a4726-24fd-4d45-8bf6-79e1f67ae74b tempest-AggregatesAdminTestJSON-933043542 tempest-AggregatesAdminTestJSON-642246395] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" acquired by "_create_cached_image_impl" :: waited 0.000s inner 
/usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:00:21.922 DEBUG nova.virt.xenapi.vm_utils [req-689a4726-24fd-4d45-8bf6-79e1f67ae74b tempest-AggregatesAdminTestJSON-933043542 tempest-AggregatesAdminTestJSON-642246395] [instance: 8e4a2813-87d8-4829-bb17-5a0511bfffcb] Asking xapi to fetch vhd image a69d8d55-1745-492f-be26-cc75d64fc94d _fetch_vhd_image /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1425 2015-08-07 17:00:21.943 DEBUG nova.virt.xenapi.client.session [req-689a4726-24fd-4d45-8bf6-79e1f67ae74b tempest-AggregatesAdminTestJSON-933043542 tempest-AggregatesAdminTestJSON-642246395] glance.download_vhd attempt 1/1, callback_result: 192.168.33.1 call_plugin_serialized_with_retry /opt/stack/new/nova/nova/virt/xenapi/client/session.py:245 2015-08-07 17:00:26.465 DEBUG oslo_concurrency.lockutils [req-689a4726-24fd-4d45-8bf6-79e1f67ae74b tempest-AggregatesAdminTestJSON-933043542 tempest-AggregatesAdminTestJSON-642246395] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" acquired by "do_scan" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:00:26.466 DEBUG nova.virt.xenapi.vm_utils [req-689a4726-24fd-4d45-8bf6-79e1f67ae74b tempest-AggregatesAdminTestJSON-933043542 tempest-AggregatesAdminTestJSON-642246395] Scanning SR OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99 do_scan /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1839 2015-08-07 17:00:27.024 DEBUG oslo_concurrency.lockutils [req-689a4726-24fd-4d45-8bf6-79e1f67ae74b tempest-AggregatesAdminTestJSON-933043542 tempest-AggregatesAdminTestJSON-642246395] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" released by "do_scan" :: held 0.559s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:00:27.025 DEBUG nova.virt.xenapi.vm_utils [req-689a4726-24fd-4d45-8bf6-79e1f67ae74b tempest-AggregatesAdminTestJSON-933043542 tempest-AggregatesAdminTestJSON-642246395] [instance: 8e4a2813-87d8-4829-bb17-5a0511bfffcb] Fetched VDIs of type 'root' with UUID 'adc68f5c-f9ed-4dfe-862f-8738fb519f84' _fetch_image /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1361 2015-08-07 17:00:29.575 DEBUG nova.virt.xenapi.vm_utils [req-689a4726-24fd-4d45-8bf6-79e1f67ae74b tempest-AggregatesAdminTestJSON-933043542 tempest-AggregatesAdminTestJSON-642246395] Cloned VDI OpaqueRef:eef34df5-9647-6fde-d1a9-379603f9c595 from VDI OpaqueRef:f392c64e-7886-a0cc-d3a0-780674167307 _clone_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:571 2015-08-07 17:00:30.584 DEBUG oslo_concurrency.lockutils [req-689a4726-24fd-4d45-8bf6-79e1f67ae74b tempest-AggregatesAdminTestJSON-933043542 tempest-AggregatesAdminTestJSON-642246395] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" released by "_create_cached_image_impl" :: held 8.666s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:00:30.585 INFO nova.virt.xenapi.vm_utils [req-689a4726-24fd-4d45-8bf6-79e1f67ae74b tempest-AggregatesAdminTestJSON-933043542 tempest-AggregatesAdminTestJSON-642246395] Image creation data, cacheable: True, downloaded: True duration: 8.67 secs for image a69d8d55-1745-492f-be26-cc75d64fc94d 2015-08-07 17:00:31.586 DEBUG nova.virt.xenapi.vmops [req-689a4726-24fd-4d45-8bf6-79e1f67ae74b tempest-AggregatesAdminTestJSON-933043542 tempest-AggregatesAdminTestJSON-642246395] [instance: 8e4a2813-87d8-4829-bb17-5a0511bfffcb] Updating progress to 20 _update_instance_progress 
/opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:00:31.606 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.92 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:00:32.063 DEBUG nova.virt.xenapi.vmops [req-689a4726-24fd-4d45-8bf6-79e1f67ae74b tempest-AggregatesAdminTestJSON-933043542 tempest-AggregatesAdminTestJSON-642246395] [instance: 8e4a2813-87d8-4829-bb17-5a0511bfffcb] Updating progress to 30 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:00:32.440 DEBUG nova.virt.xenapi.vmops [req-689a4726-24fd-4d45-8bf6-79e1f67ae74b tempest-AggregatesAdminTestJSON-933043542 tempest-AggregatesAdminTestJSON-642246395] [instance: 8e4a2813-87d8-4829-bb17-5a0511bfffcb] Using PV kernel: True _create_vm_record /opt/stack/new/nova/nova/virt/xenapi/vmops.py:679 2015-08-07 17:00:32.453 DEBUG nova.virt.xenapi.vm_utils [req-689a4726-24fd-4d45-8bf6-79e1f67ae74b tempest-AggregatesAdminTestJSON-933043542 tempest-AggregatesAdminTestJSON-642246395] [instance: 8e4a2813-87d8-4829-bb17-5a0511bfffcb] Created VM create_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:312 2015-08-07 17:00:32.453 DEBUG nova.virt.xenapi.vmops [req-689a4726-24fd-4d45-8bf6-79e1f67ae74b tempest-AggregatesAdminTestJSON-933043542 tempest-AggregatesAdminTestJSON-642246395] [instance: 8e4a2813-87d8-4829-bb17-5a0511bfffcb] Updating progress to 40 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:00:32.821 DEBUG nova.virt.xenapi.vm_utils [req-689a4726-24fd-4d45-8bf6-79e1f67ae74b tempest-AggregatesAdminTestJSON-933043542 tempest-AggregatesAdminTestJSON-642246395] Creating disk-type VBD for VM OpaqueRef:b5bde493-4580-3684-f46b-d5f58a7c1fae, VDI OpaqueRef:eef34df5-9647-6fde-d1a9-379603f9c595 ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:00:32.830 DEBUG nova.virt.xenapi.vm_utils [req-689a4726-24fd-4d45-8bf6-79e1f67ae74b tempest-AggregatesAdminTestJSON-933043542 tempest-AggregatesAdminTestJSON-642246395] Created VBD OpaqueRef:94b7f922-c64d-3eb5-52fe-7450cebc1854 for VM OpaqueRef:b5bde493-4580-3684-f46b-d5f58a7c1fae, VDI OpaqueRef:eef34df5-9647-6fde-d1a9-379603f9c595. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:00:33.482 DEBUG nova.virt.xenapi.vm_utils [req-689a4726-24fd-4d45-8bf6-79e1f67ae74b tempest-AggregatesAdminTestJSON-933043542 tempest-AggregatesAdminTestJSON-642246395] Created VDI OpaqueRef:f67db73c-1d1d-fdae-7c88-7fd78b5a9892 (config-2, 67108864, False) on OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99. create_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:509 2015-08-07 17:00:33.495 DEBUG nova.virt.xenapi.vm_utils [req-689a4726-24fd-4d45-8bf6-79e1f67ae74b tempest-AggregatesAdminTestJSON-933043542 tempest-AggregatesAdminTestJSON-642246395] Creating disk-type VBD for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:f67db73c-1d1d-fdae-7c88-7fd78b5a9892 ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:00:33.507 DEBUG nova.virt.xenapi.vm_utils [req-689a4726-24fd-4d45-8bf6-79e1f67ae74b tempest-AggregatesAdminTestJSON-933043542 tempest-AggregatesAdminTestJSON-642246395] Created VBD OpaqueRef:67dc3cd8-1902-28e6-c125-9f5d26078fee for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:f67db73c-1d1d-fdae-7c88-7fd78b5a9892. 
create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:00:33.508 DEBUG nova.virt.xenapi.vm_utils [req-689a4726-24fd-4d45-8bf6-79e1f67ae74b tempest-AggregatesAdminTestJSON-933043542 tempest-AggregatesAdminTestJSON-642246395] Plugging VBD OpaqueRef:67dc3cd8-1902-28e6-c125-9f5d26078fee ... vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2191 2015-08-07 17:00:33.509 DEBUG oslo_concurrency.lockutils [req-689a4726-24fd-4d45-8bf6-79e1f67ae74b tempest-AggregatesAdminTestJSON-933043542 tempest-AggregatesAdminTestJSON-642246395] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_plug" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:00:35.541 DEBUG oslo_concurrency.lockutils [req-689a4726-24fd-4d45-8bf6-79e1f67ae74b tempest-AggregatesAdminTestJSON-933043542 tempest-AggregatesAdminTestJSON-642246395] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_plug" :: held 2.032s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:00:35.543 DEBUG nova.virt.xenapi.vm_utils [req-689a4726-24fd-4d45-8bf6-79e1f67ae74b tempest-AggregatesAdminTestJSON-933043542 tempest-AggregatesAdminTestJSON-642246395] Plugging VBD OpaqueRef:67dc3cd8-1902-28e6-c125-9f5d26078fee done. vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2194 2015-08-07 17:00:35.552 DEBUG nova.virt.xenapi.vm_utils [req-689a4726-24fd-4d45-8bf6-79e1f67ae74b tempest-AggregatesAdminTestJSON-933043542 tempest-AggregatesAdminTestJSON-642246395] VBD OpaqueRef:67dc3cd8-1902-28e6-c125-9f5d26078fee plugged as xvdc vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2197 2015-08-07 17:00:41.635 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.91 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:00:48.716 DEBUG oslo_concurrency.lockutils [req-6bd06e2e-ef14-43f6-972f-6dcc111354ee tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] Lock "70a01a0b-d094-423f-b85d-bbceb324656e" acquired by "_locked_do_build_and_run_instance" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:00:48.789 INFO nova.compute.manager [req-6bd06e2e-ef14-43f6-972f-6dcc111354ee tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] [instance: 70a01a0b-d094-423f-b85d-bbceb324656e] Starting instance... 
2015-08-07 17:00:49.077 DEBUG oslo_concurrency.lockutils [req-6bd06e2e-ef14-43f6-972f-6dcc111354ee tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] Lock "compute_resources" acquired by "instance_claim" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:00:49.078 DEBUG nova.compute.resource_tracker [req-6bd06e2e-ef14-43f6-972f-6dcc111354ee tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] Memory overhead for 64 MB instance; 5 MB instance_claim /opt/stack/new/nova/nova/compute/resource_tracker.py:126 2015-08-07 17:00:49.086 INFO nova.compute.claims [req-6bd06e2e-ef14-43f6-972f-6dcc111354ee tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] [instance: 70a01a0b-d094-423f-b85d-bbceb324656e] Attempting claim: memory 69 MB, disk 0 GB 2015-08-07 17:00:49.086 INFO nova.compute.claims [req-6bd06e2e-ef14-43f6-972f-6dcc111354ee tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] [instance: 70a01a0b-d094-423f-b85d-bbceb324656e] Total memory: 8187 MB, used: 581.00 MB 2015-08-07 17:00:49.087 INFO nova.compute.claims [req-6bd06e2e-ef14-43f6-972f-6dcc111354ee tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] [instance: 70a01a0b-d094-423f-b85d-bbceb324656e] memory limit: 12280.50 MB, free: 11699.50 MB 2015-08-07 17:00:49.087 INFO nova.compute.claims [req-6bd06e2e-ef14-43f6-972f-6dcc111354ee tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] [instance: 70a01a0b-d094-423f-b85d-bbceb324656e] Total disk: 27 GB, used: 0.00 GB 2015-08-07 17:00:49.088 INFO nova.compute.claims [req-6bd06e2e-ef14-43f6-972f-6dcc111354ee tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] [instance: 70a01a0b-d094-423f-b85d-bbceb324656e] disk limit not specified, defaulting to unlimited 2015-08-07 17:00:49.108 DEBUG nova.compute.resources.vcpu [req-6bd06e2e-ef14-43f6-972f-6dcc111354ee tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] Total CPUs: 8 VCPUs, used: 1.00 VCPUs test /opt/stack/new/nova/nova/compute/resources/vcpu.py:52 2015-08-07 17:00:49.109 DEBUG nova.compute.resources.vcpu [req-6bd06e2e-ef14-43f6-972f-6dcc111354ee tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] CPUs limit not specified, defaulting to unlimited test /opt/stack/new/nova/nova/compute/resources/vcpu.py:56 2015-08-07 17:00:49.109 INFO nova.compute.claims [req-6bd06e2e-ef14-43f6-972f-6dcc111354ee tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] [instance: 70a01a0b-d094-423f-b85d-bbceb324656e] Claim successful 2015-08-07 17:00:49.511 DEBUG oslo_concurrency.lockutils [req-6bd06e2e-ef14-43f6-972f-6dcc111354ee tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] Lock "compute_resources" released by "instance_claim" :: held 0.434s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:00:49.780 DEBUG oslo_concurrency.lockutils [req-6bd06e2e-ef14-43f6-972f-6dcc111354ee tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] Lock "compute_resources" acquired by "update_usage" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:00:49.893 DEBUG oslo_concurrency.lockutils [req-6bd06e2e-ef14-43f6-972f-6dcc111354ee tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] Lock 
"compute_resources" released by "update_usage" :: held 0.113s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:00:49.894 DEBUG nova.compute.utils [req-6bd06e2e-ef14-43f6-972f-6dcc111354ee tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] Using /dev/xvd instead of None get_next_device_name /opt/stack/new/nova/nova/compute/utils.py:163 2015-08-07 17:00:49.899 13318 DEBUG nova.compute.manager [-] [instance: 70a01a0b-d094-423f-b85d-bbceb324656e] Allocating IP information in the background. _allocate_network_async /opt/stack/new/nova/nova/compute/manager.py:1513 2015-08-07 17:00:49.900 13318 DEBUG oslo_concurrency.lockutils [-] Acquired semaphore "refresh_cache-70a01a0b-d094-423f-b85d-bbceb324656e" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:197 2015-08-07 17:00:51.020 DEBUG nova.virt.xenapi.vmops [req-6bd06e2e-ef14-43f6-972f-6dcc111354ee tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] [instance: 70a01a0b-d094-423f-b85d-bbceb324656e] Block device information present: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} spawn /opt/stack/new/nova/nova/virt/xenapi/vmops.py:408 2015-08-07 17:00:51.038 DEBUG nova.virt.xenapi.vm_utils [req-6bd06e2e-ef14-43f6-972f-6dcc111354ee tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] Detected vhd format for image a69d8d55-1745-492f-be26-cc75d64fc94d determine_disk_image_type /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1633 2015-08-07 17:00:51.041 DEBUG nova.virt.xenapi.vmops [req-6bd06e2e-ef14-43f6-972f-6dcc111354ee tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] [instance: 70a01a0b-d094-423f-b85d-bbceb324656e] Updating progress to 10 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:00:51.575 DEBUG nova.virt.xenapi.vmops [req-6bd06e2e-ef14-43f6-972f-6dcc111354ee tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] [instance: 70a01a0b-d094-423f-b85d-bbceb324656e] block device info: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} _connect_cinder_volumes /opt/stack/new/nova/nova/virt/xenapi/vmops.py:376 2015-08-07 17:00:51.591 DEBUG oslo_concurrency.lockutils [req-6bd06e2e-ef14-43f6-972f-6dcc111354ee tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" acquired by "_create_cached_image_impl" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:00:51.648 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.90 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:00:53.022 DEBUG nova.virt.xenapi.vm_utils [req-6bd06e2e-ef14-43f6-972f-6dcc111354ee tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] Cloned VDI OpaqueRef:98a91c28-56df-bf9e-1dc0-d4459366c590 from VDI OpaqueRef:f392c64e-7886-a0cc-d3a0-780674167307 _clone_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:571 2015-08-07 17:00:53.818 DEBUG oslo_concurrency.lockutils [req-6bd06e2e-ef14-43f6-972f-6dcc111354ee tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" released by "_create_cached_image_impl" :: held 2.227s inner 
/usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:00:53.819 INFO nova.virt.xenapi.vm_utils [req-6bd06e2e-ef14-43f6-972f-6dcc111354ee tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] Image creation data, cacheable: True, downloaded: False duration: 2.24 secs for image a69d8d55-1745-492f-be26-cc75d64fc94d 2015-08-07 17:00:54.619 DEBUG nova.virt.xenapi.vmops [req-6bd06e2e-ef14-43f6-972f-6dcc111354ee tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] [instance: 70a01a0b-d094-423f-b85d-bbceb324656e] Updating progress to 20 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:00:54.926 DEBUG nova.virt.xenapi.vmops [req-6bd06e2e-ef14-43f6-972f-6dcc111354ee tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] [instance: 70a01a0b-d094-423f-b85d-bbceb324656e] Updating progress to 30 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:00:55.220 DEBUG nova.virt.xenapi.vmops [req-6bd06e2e-ef14-43f6-972f-6dcc111354ee tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] [instance: 70a01a0b-d094-423f-b85d-bbceb324656e] Using PV kernel: True _create_vm_record /opt/stack/new/nova/nova/virt/xenapi/vmops.py:679 2015-08-07 17:00:55.255 DEBUG nova.virt.xenapi.vm_utils [req-6bd06e2e-ef14-43f6-972f-6dcc111354ee tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] [instance: 70a01a0b-d094-423f-b85d-bbceb324656e] Created VM create_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:312 2015-08-07 17:00:55.256 DEBUG nova.virt.xenapi.vmops [req-6bd06e2e-ef14-43f6-972f-6dcc111354ee tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] [instance: 70a01a0b-d094-423f-b85d-bbceb324656e] Updating progress to 40 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:00:55.525 DEBUG nova.virt.xenapi.vm_utils [req-6bd06e2e-ef14-43f6-972f-6dcc111354ee tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] Creating disk-type VBD for VM OpaqueRef:5bd1717d-be70-22ba-ee5e-2862cee919ea, VDI OpaqueRef:98a91c28-56df-bf9e-1dc0-d4459366c590 ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:00:55.534 DEBUG nova.virt.xenapi.vm_utils [req-6bd06e2e-ef14-43f6-972f-6dcc111354ee tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] Created VBD OpaqueRef:8d9b2809-1369-18ca-6e70-e6b75053c91a for VM OpaqueRef:5bd1717d-be70-22ba-ee5e-2862cee919ea, VDI OpaqueRef:98a91c28-56df-bf9e-1dc0-d4459366c590. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:00:56.032 DEBUG nova.virt.xenapi.vm_utils [req-6bd06e2e-ef14-43f6-972f-6dcc111354ee tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] Created VDI OpaqueRef:95165fe7-1b0c-17a3-427f-031704ef7fd7 (config-2, 67108864, False) on OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99. create_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:509 2015-08-07 17:00:56.037 DEBUG nova.virt.xenapi.vm_utils [req-6bd06e2e-ef14-43f6-972f-6dcc111354ee tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] Creating disk-type VBD for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:95165fe7-1b0c-17a3-427f-031704ef7fd7 ... 
create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:00:56.049 DEBUG nova.virt.xenapi.vm_utils [req-6bd06e2e-ef14-43f6-972f-6dcc111354ee tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] Created VBD OpaqueRef:0ac86c17-8fd7-6a85-ca9c-08ec4eb66271 for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:95165fe7-1b0c-17a3-427f-031704ef7fd7. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:00:56.050 DEBUG nova.virt.xenapi.vm_utils [req-6bd06e2e-ef14-43f6-972f-6dcc111354ee tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] Plugging VBD OpaqueRef:0ac86c17-8fd7-6a85-ca9c-08ec4eb66271 ... vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2191 2015-08-07 17:00:56.051 DEBUG oslo_concurrency.lockutils [req-6bd06e2e-ef14-43f6-972f-6dcc111354ee tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_plug" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:00:58.669 DEBUG oslo_concurrency.lockutils [req-6bd06e2e-ef14-43f6-972f-6dcc111354ee tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_plug" :: held 2.618s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:00:58.670 DEBUG nova.virt.xenapi.vm_utils [req-6bd06e2e-ef14-43f6-972f-6dcc111354ee tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] Plugging VBD OpaqueRef:0ac86c17-8fd7-6a85-ca9c-08ec4eb66271 done. vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2194 2015-08-07 17:00:58.677 DEBUG nova.virt.xenapi.vm_utils [req-6bd06e2e-ef14-43f6-972f-6dcc111354ee tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] VBD OpaqueRef:0ac86c17-8fd7-6a85-ca9c-08ec4eb66271 plugged as xvdd vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2197 2015-08-07 17:00:59.260 13318 DEBUG nova.network.base_api [-] [instance: 8e4a2813-87d8-4829-bb17-5a0511bfffcb] Updating instance_info_cache with network_info: [VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.2'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:51:f1:f1', 'active': False, 'type': u'bridge', 'id': u'af6e0cdc-912b-419c-8f6e-5930a302deda', 'qbg_params': None})] update_instance_cache_with_nw_info /opt/stack/new/nova/nova/network/base_api.py:43 2015-08-07 17:00:59.290 13318 DEBUG 
oslo_concurrency.lockutils [-] Releasing semaphore "refresh_cache-8e4a2813-87d8-4829-bb17-5a0511bfffcb" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:210 2015-08-07 17:00:59.291 13318 DEBUG nova.compute.manager [-] [instance: 8e4a2813-87d8-4829-bb17-5a0511bfffcb] Instance network_info: |[VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.2'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:51:f1:f1', 'active': False, 'type': u'bridge', 'id': u'af6e0cdc-912b-419c-8f6e-5930a302deda', 'qbg_params': None})]| _allocate_network_async /opt/stack/new/nova/nova/compute/manager.py:1531 2015-08-07 17:00:59.295 WARNING nova.virt.configdrive [req-689a4726-24fd-4d45-8bf6-79e1f67ae74b tempest-AggregatesAdminTestJSON-933043542 tempest-AggregatesAdminTestJSON-642246395] The setting "always" will be deprecated in the Liberty version. Please use "True" instead 2015-08-07 17:00:59.296 DEBUG nova.objects.instance [req-689a4726-24fd-4d45-8bf6-79e1f67ae74b tempest-AggregatesAdminTestJSON-933043542 tempest-AggregatesAdminTestJSON-642246395] Lazy-loading `ec2_ids' on Instance uuid 8e4a2813-87d8-4829-bb17-5a0511bfffcb obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:00:59.439 DEBUG oslo_concurrency.processutils [req-689a4726-24fd-4d45-8bf6-79e1f67ae74b tempest-AggregatesAdminTestJSON-933043542 tempest-AggregatesAdminTestJSON-642246395] Running cmd (subprocess): genisoimage -o /tmp/tmpqRBgcQ/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmpd00vDP execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:00:59.546 DEBUG oslo_concurrency.processutils [req-689a4726-24fd-4d45-8bf6-79e1f67ae74b tempest-AggregatesAdminTestJSON-933043542 tempest-AggregatesAdminTestJSON-642246395] CMD "genisoimage -o /tmp/tmpqRBgcQ/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmpd00vDP" returned: 0 in 0.106s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:00:59.551 DEBUG oslo_concurrency.processutils [req-689a4726-24fd-4d45-8bf6-79e1f67ae74b tempest-AggregatesAdminTestJSON-933043542 tempest-AggregatesAdminTestJSON-642246395] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmpqRBgcQ/configdrive of=/dev/xvdc oflag=direct,sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:00:59.654 13318 DEBUG nova.network.base_api [-] [instance: 70a01a0b-d094-423f-b85d-bbceb324656e] Updating instance_info_cache with 
network_info: [VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.4'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:01:ce:c8', 'active': False, 'type': u'bridge', 'id': u'c019c9e9-4973-4b46-8c5d-3535253dedf0', 'qbg_params': None})] update_instance_cache_with_nw_info /opt/stack/new/nova/nova/network/base_api.py:43 2015-08-07 17:00:59.699 13318 DEBUG oslo_concurrency.lockutils [-] Releasing semaphore "refresh_cache-70a01a0b-d094-423f-b85d-bbceb324656e" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:210 2015-08-07 17:00:59.700 13318 DEBUG nova.compute.manager [-] [instance: 70a01a0b-d094-423f-b85d-bbceb324656e] Instance network_info: |[VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.4'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:01:ce:c8', 'active': False, 'type': u'bridge', 'id': u'c019c9e9-4973-4b46-8c5d-3535253dedf0', 'qbg_params': None})]| _allocate_network_async /opt/stack/new/nova/nova/compute/manager.py:1531 2015-08-07 17:00:59.702 WARNING nova.virt.configdrive [req-6bd06e2e-ef14-43f6-972f-6dcc111354ee tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] The setting "always" will be deprecated in the Liberty version. 
Please use "True" instead 2015-08-07 17:00:59.703 DEBUG nova.objects.instance [req-6bd06e2e-ef14-43f6-972f-6dcc111354ee tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] Lazy-loading `ec2_ids' on Instance uuid 70a01a0b-d094-423f-b85d-bbceb324656e obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:00:59.753 DEBUG oslo_concurrency.processutils [req-6bd06e2e-ef14-43f6-972f-6dcc111354ee tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] Running cmd (subprocess): genisoimage -o /tmp/tmplGAdWD/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmpq5A8qP execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:00:59.925 DEBUG oslo_concurrency.processutils [req-6bd06e2e-ef14-43f6-972f-6dcc111354ee tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] CMD "genisoimage -o /tmp/tmplGAdWD/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmpq5A8qP" returned: 0 in 0.171s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:00:59.931 DEBUG oslo_concurrency.processutils [req-6bd06e2e-ef14-43f6-972f-6dcc111354ee tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmplGAdWD/configdrive of=/dev/xvdd oflag=direct,sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:01:03.507 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:01:03.509 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 1.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:01:03.598 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 7.95 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:01:04.511 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:01:04.512 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 1.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:01:05.508 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:01:05.509 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /opt/stack/new/nova/nova/compute/manager.py:5976 2015-08-07 17:01:05.510 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:01:05.549 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Auditing locally available compute resources for node localhost.localdomain 2015-08-07 17:01:05.550 DEBUG nova.virt.xenapi.host [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Updating host stats update_status /opt/stack/new/nova/nova/virt/xenapi/host.py:234 2015-08-07 17:01:05.899 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" acquired by "do_scan" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:01:05.900 DEBUG nova.virt.xenapi.vm_utils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Scanning SR OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99 do_scan /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1839 2015-08-07 17:01:06.925 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" released by "do_scan" :: held 1.026s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:01:07.247 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor: free VCPUs: 2 _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:524 2015-08-07 17:01:07.248 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor: assignable PCI devices: [] _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:532 2015-08-07 17:01:07.248 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor/Node resource view: name=localhost.localdomain free_ram=989MB free_disk=16GB free_vcpus=2 pci_devices=[] _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:546 2015-08-07 17:01:07.249 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "compute_resources" acquired by "_update_available_resource" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:01:07.527 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Total usable vcpus: 8, total allocated vcpus: 6 2015-08-07 17:01:07.528 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Final resource view: name=localhost.localdomain phys_ram=8187MB used_ram=650MB phys_disk=27GB used_disk=0GB total_vcpus=8 used_vcpus=6 pci_stats=None 2015-08-07 17:01:07.754 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Compute_service record updated for devstack:localhost.localdomain 2015-08-07 17:01:07.755 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "compute_resources" released by "_update_available_resource" :: held 0.506s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:01:07.755 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 
None None] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:01:07.756 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:01:07.756 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:01:07.757 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Starting heal instance info cache _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5315 2015-08-07 17:01:07.757 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Rebuilding the list of instances to heal _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5319 2015-08-07 17:01:07.823 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: 8e4a2813-87d8-4829-bb17-5a0511bfffcb] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5329 2015-08-07 17:01:07.823 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: 70a01a0b-d094-423f-b85d-bbceb324656e] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5329 2015-08-07 17:01:07.824 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Didn't find any instances for network info cache update. 
_heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5386 2015-08-07 17:01:07.824 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:01:07.988 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 1.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:01:08.988 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:01:08.990 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 2.52 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:01:11.510 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:01:11.511 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 53.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:01:11.739 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.81 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:01:14.247 DEBUG oslo_concurrency.processutils [req-689a4726-24fd-4d45-8bf6-79e1f67ae74b tempest-AggregatesAdminTestJSON-933043542 tempest-AggregatesAdminTestJSON-642246395] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmpqRBgcQ/configdrive of=/dev/xvdc oflag=direct,sync" returned: 0 in 14.695s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:01:14.251 DEBUG oslo_concurrency.processutils [req-689a4726-24fd-4d45-8bf6-79e1f67ae74b tempest-AggregatesAdminTestJSON-933043542 tempest-AggregatesAdminTestJSON-642246395] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:01:15.386 DEBUG oslo_concurrency.processutils [req-689a4726-24fd-4d45-8bf6-79e1f67ae74b tempest-AggregatesAdminTestJSON-933043542 tempest-AggregatesAdminTestJSON-642246395] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf sync" returned: 0 in 1.134s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:01:15.389 DEBUG nova.virt.xenapi.vm_utils [req-689a4726-24fd-4d45-8bf6-79e1f67ae74b tempest-AggregatesAdminTestJSON-933043542 tempest-AggregatesAdminTestJSON-642246395] Destroying VBD for VDI OpaqueRef:f67db73c-1d1d-fdae-7c88-7fd78b5a9892 ... 
vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2207 2015-08-07 17:01:15.391 DEBUG oslo_concurrency.lockutils [req-689a4726-24fd-4d45-8bf6-79e1f67ae74b tempest-AggregatesAdminTestJSON-933043542 tempest-AggregatesAdminTestJSON-642246395] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_unplug" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:01:15.984 DEBUG oslo_concurrency.processutils [req-6bd06e2e-ef14-43f6-972f-6dcc111354ee tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmplGAdWD/configdrive of=/dev/xvdd oflag=direct,sync" returned: 0 in 16.053s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:01:15.986 DEBUG oslo_concurrency.processutils [req-6bd06e2e-ef14-43f6-972f-6dcc111354ee tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:01:16.505 DEBUG oslo_concurrency.processutils [req-6bd06e2e-ef14-43f6-972f-6dcc111354ee tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf sync" returned: 0 in 0.519s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:01:16.508 DEBUG nova.virt.xenapi.vm_utils [req-6bd06e2e-ef14-43f6-972f-6dcc111354ee tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] Destroying VBD for VDI OpaqueRef:95165fe7-1b0c-17a3-427f-031704ef7fd7 ... vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2207 2015-08-07 17:01:16.872 DEBUG oslo_concurrency.lockutils [req-689a4726-24fd-4d45-8bf6-79e1f67ae74b tempest-AggregatesAdminTestJSON-933043542 tempest-AggregatesAdminTestJSON-642246395] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_unplug" :: held 1.482s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:01:16.875 DEBUG oslo_concurrency.lockutils [req-6bd06e2e-ef14-43f6-972f-6dcc111354ee tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_unplug" :: waited 0.365s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:01:16.883 DEBUG nova.virt.xenapi.vm_utils [req-689a4726-24fd-4d45-8bf6-79e1f67ae74b tempest-AggregatesAdminTestJSON-933043542 tempest-AggregatesAdminTestJSON-642246395] Destroying VBD for VDI OpaqueRef:f67db73c-1d1d-fdae-7c88-7fd78b5a9892 done. vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2215 2015-08-07 17:01:16.884 DEBUG nova.virt.xenapi.vm_utils [req-689a4726-24fd-4d45-8bf6-79e1f67ae74b tempest-AggregatesAdminTestJSON-933043542 tempest-AggregatesAdminTestJSON-642246395] Creating disk-type VBD for VM OpaqueRef:b5bde493-4580-3684-f46b-d5f58a7c1fae, VDI OpaqueRef:f67db73c-1d1d-fdae-7c88-7fd78b5a9892 ... 
create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:01:16.910 DEBUG nova.virt.xenapi.vm_utils [req-689a4726-24fd-4d45-8bf6-79e1f67ae74b tempest-AggregatesAdminTestJSON-933043542 tempest-AggregatesAdminTestJSON-642246395] Created VBD OpaqueRef:d43eabfd-2e65-a8f3-7333-2f6d754f42de for VM OpaqueRef:b5bde493-4580-3684-f46b-d5f58a7c1fae, VDI OpaqueRef:f67db73c-1d1d-fdae-7c88-7fd78b5a9892. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:01:16.912 DEBUG nova.objects.instance [req-689a4726-24fd-4d45-8bf6-79e1f67ae74b tempest-AggregatesAdminTestJSON-933043542 tempest-AggregatesAdminTestJSON-642246395] Lazy-loading `pci_devices' on Instance uuid 8e4a2813-87d8-4829-bb17-5a0511bfffcb obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:01:17.050 DEBUG nova.virt.xenapi.vmops [req-689a4726-24fd-4d45-8bf6-79e1f67ae74b tempest-AggregatesAdminTestJSON-933043542 tempest-AggregatesAdminTestJSON-642246395] [instance: 8e4a2813-87d8-4829-bb17-5a0511bfffcb] Updating progress to 50 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:01:17.640 DEBUG oslo_concurrency.lockutils [req-689a4726-24fd-4d45-8bf6-79e1f67ae74b tempest-AggregatesAdminTestJSON-933043542 tempest-AggregatesAdminTestJSON-642246395] Lock "xenstore-8e4a2813-87d8-4829-bb17-5a0511bfffcb" acquired by "store_meta" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:01:17.641 DEBUG oslo_concurrency.lockutils [req-689a4726-24fd-4d45-8bf6-79e1f67ae74b tempest-AggregatesAdminTestJSON-933043542 tempest-AggregatesAdminTestJSON-642246395] Lock "xenstore-8e4a2813-87d8-4829-bb17-5a0511bfffcb" released by "store_meta" :: held 0.001s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:01:17.642 DEBUG oslo_concurrency.lockutils [req-689a4726-24fd-4d45-8bf6-79e1f67ae74b tempest-AggregatesAdminTestJSON-933043542 tempest-AggregatesAdminTestJSON-642246395] Lock "xenstore-8e4a2813-87d8-4829-bb17-5a0511bfffcb" acquired by "store_auto_disk_config" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:01:17.658 DEBUG oslo_concurrency.lockutils [req-689a4726-24fd-4d45-8bf6-79e1f67ae74b tempest-AggregatesAdminTestJSON-933043542 tempest-AggregatesAdminTestJSON-642246395] Lock "xenstore-8e4a2813-87d8-4829-bb17-5a0511bfffcb" released by "store_auto_disk_config" :: held 0.016s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:01:17.660 DEBUG nova.virt.xenapi.vmops [req-689a4726-24fd-4d45-8bf6-79e1f67ae74b tempest-AggregatesAdminTestJSON-933043542 tempest-AggregatesAdminTestJSON-642246395] [instance: 8e4a2813-87d8-4829-bb17-5a0511bfffcb] Injecting hostname (tempest-test-server-71589265) into xenstore _inject_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1940 2015-08-07 17:01:17.661 DEBUG oslo_concurrency.lockutils [req-689a4726-24fd-4d45-8bf6-79e1f67ae74b tempest-AggregatesAdminTestJSON-933043542 tempest-AggregatesAdminTestJSON-642246395] Lock "xenstore-8e4a2813-87d8-4829-bb17-5a0511bfffcb" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:01:17.672 DEBUG oslo_concurrency.lockutils [req-689a4726-24fd-4d45-8bf6-79e1f67ae74b tempest-AggregatesAdminTestJSON-933043542 tempest-AggregatesAdminTestJSON-642246395] Lock "xenstore-8e4a2813-87d8-4829-bb17-5a0511bfffcb" released by 
"update_hostname" :: held 0.011s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:01:17.673 DEBUG nova.virt.xenapi.vmops [req-689a4726-24fd-4d45-8bf6-79e1f67ae74b tempest-AggregatesAdminTestJSON-933043542 tempest-AggregatesAdminTestJSON-642246395] [instance: 8e4a2813-87d8-4829-bb17-5a0511bfffcb] Injecting network info to xenstore inject_network_info /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1869 2015-08-07 17:01:17.674 DEBUG oslo_concurrency.lockutils [req-689a4726-24fd-4d45-8bf6-79e1f67ae74b tempest-AggregatesAdminTestJSON-933043542 tempest-AggregatesAdminTestJSON-642246395] Lock "xenstore-8e4a2813-87d8-4829-bb17-5a0511bfffcb" acquired by "update_nwinfo" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:01:18.002 DEBUG oslo_concurrency.lockutils [req-689a4726-24fd-4d45-8bf6-79e1f67ae74b tempest-AggregatesAdminTestJSON-933043542 tempest-AggregatesAdminTestJSON-642246395] Lock "xenstore-8e4a2813-87d8-4829-bb17-5a0511bfffcb" released by "update_nwinfo" :: held 0.328s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:01:18.003 DEBUG nova.virt.xenapi.vmops [req-689a4726-24fd-4d45-8bf6-79e1f67ae74b tempest-AggregatesAdminTestJSON-933043542 tempest-AggregatesAdminTestJSON-642246395] [instance: 8e4a2813-87d8-4829-bb17-5a0511bfffcb] Updating progress to 60 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:01:18.315 DEBUG oslo_concurrency.lockutils [req-6bd06e2e-ef14-43f6-972f-6dcc111354ee tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_unplug" :: held 1.440s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:01:18.327 DEBUG nova.virt.xenapi.vm_utils [req-6bd06e2e-ef14-43f6-972f-6dcc111354ee tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] Destroying VBD for VDI OpaqueRef:95165fe7-1b0c-17a3-427f-031704ef7fd7 done. vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2215 2015-08-07 17:01:18.328 DEBUG nova.virt.xenapi.vm_utils [req-6bd06e2e-ef14-43f6-972f-6dcc111354ee tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] Creating disk-type VBD for VM OpaqueRef:5bd1717d-be70-22ba-ee5e-2862cee919ea, VDI OpaqueRef:95165fe7-1b0c-17a3-427f-031704ef7fd7 ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:01:18.343 DEBUG nova.virt.xenapi.vm_utils [req-6bd06e2e-ef14-43f6-972f-6dcc111354ee tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] Created VBD OpaqueRef:bc46fb40-76ed-b325-d4bd-0ed644e71a24 for VM OpaqueRef:5bd1717d-be70-22ba-ee5e-2862cee919ea, VDI OpaqueRef:95165fe7-1b0c-17a3-427f-031704ef7fd7. 
create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:01:18.347 DEBUG nova.objects.instance [req-6bd06e2e-ef14-43f6-972f-6dcc111354ee tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] Lazy-loading `pci_devices' on Instance uuid 70a01a0b-d094-423f-b85d-bbceb324656e obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:01:18.492 DEBUG nova.virt.xenapi.vmops [req-6bd06e2e-ef14-43f6-972f-6dcc111354ee tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] [instance: 70a01a0b-d094-423f-b85d-bbceb324656e] Updating progress to 50 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:01:18.663 DEBUG nova.virt.xenapi.vmops [req-689a4726-24fd-4d45-8bf6-79e1f67ae74b tempest-AggregatesAdminTestJSON-933043542 tempest-AggregatesAdminTestJSON-642246395] [instance: 8e4a2813-87d8-4829-bb17-5a0511bfffcb] Creating vifs _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1892 2015-08-07 17:01:18.673 DEBUG nova.virt.xenapi.vmops [req-689a4726-24fd-4d45-8bf6-79e1f67ae74b tempest-AggregatesAdminTestJSON-933043542 tempest-AggregatesAdminTestJSON-642246395] [instance: 8e4a2813-87d8-4829-bb17-5a0511bfffcb] Creating VIF for network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1902 2015-08-07 17:01:18.682 DEBUG nova.virt.xenapi.vmops [req-689a4726-24fd-4d45-8bf6-79e1f67ae74b tempest-AggregatesAdminTestJSON-933043542 tempest-AggregatesAdminTestJSON-642246395] [instance: 8e4a2813-87d8-4829-bb17-5a0511bfffcb] Created VIF OpaqueRef:9199b893-a501-1d06-9105-25bfe571c878, network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1906 2015-08-07 17:01:18.682 DEBUG nova.virt.xenapi.vmops [req-689a4726-24fd-4d45-8bf6-79e1f67ae74b tempest-AggregatesAdminTestJSON-933043542 tempest-AggregatesAdminTestJSON-642246395] [instance: 8e4a2813-87d8-4829-bb17-5a0511bfffcb] Updating progress to 70 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:01:18.773 DEBUG oslo_concurrency.lockutils [req-6bd06e2e-ef14-43f6-972f-6dcc111354ee tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] Lock "xenstore-70a01a0b-d094-423f-b85d-bbceb324656e" acquired by "store_meta" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:01:18.774 DEBUG oslo_concurrency.lockutils [req-6bd06e2e-ef14-43f6-972f-6dcc111354ee tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] Lock "xenstore-70a01a0b-d094-423f-b85d-bbceb324656e" released by "store_meta" :: held 0.001s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:01:18.775 DEBUG oslo_concurrency.lockutils [req-6bd06e2e-ef14-43f6-972f-6dcc111354ee tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] Lock "xenstore-70a01a0b-d094-423f-b85d-bbceb324656e" acquired by "store_auto_disk_config" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:01:18.801 DEBUG oslo_concurrency.lockutils [req-6bd06e2e-ef14-43f6-972f-6dcc111354ee tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] Lock "xenstore-70a01a0b-d094-423f-b85d-bbceb324656e" released by "store_auto_disk_config" :: held 0.026s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 
17:01:18.801 DEBUG nova.virt.xenapi.vmops [req-6bd06e2e-ef14-43f6-972f-6dcc111354ee tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] [instance: 70a01a0b-d094-423f-b85d-bbceb324656e] Injecting hostname (tempest.common.compute-instance-922478861) into xenstore _inject_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1940 2015-08-07 17:01:18.802 DEBUG oslo_concurrency.lockutils [req-6bd06e2e-ef14-43f6-972f-6dcc111354ee tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] Lock "xenstore-70a01a0b-d094-423f-b85d-bbceb324656e" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:01:18.814 DEBUG oslo_concurrency.lockutils [req-6bd06e2e-ef14-43f6-972f-6dcc111354ee tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] Lock "xenstore-70a01a0b-d094-423f-b85d-bbceb324656e" released by "update_hostname" :: held 0.012s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:01:18.815 DEBUG nova.virt.xenapi.vmops [req-6bd06e2e-ef14-43f6-972f-6dcc111354ee tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] [instance: 70a01a0b-d094-423f-b85d-bbceb324656e] Injecting network info to xenstore inject_network_info /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1869 2015-08-07 17:01:18.815 DEBUG oslo_concurrency.lockutils [req-6bd06e2e-ef14-43f6-972f-6dcc111354ee tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] Lock "xenstore-70a01a0b-d094-423f-b85d-bbceb324656e" acquired by "update_nwinfo" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:01:19.113 DEBUG nova.virt.xenapi.vmops [req-689a4726-24fd-4d45-8bf6-79e1f67ae74b tempest-AggregatesAdminTestJSON-933043542 tempest-AggregatesAdminTestJSON-642246395] [instance: 8e4a2813-87d8-4829-bb17-5a0511bfffcb] Starting instance _start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:327 2015-08-07 17:01:19.143 DEBUG oslo_concurrency.lockutils [req-6bd06e2e-ef14-43f6-972f-6dcc111354ee tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] Lock "xenstore-70a01a0b-d094-423f-b85d-bbceb324656e" released by "update_nwinfo" :: held 0.328s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:01:19.144 DEBUG nova.virt.xenapi.vmops [req-6bd06e2e-ef14-43f6-972f-6dcc111354ee tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] [instance: 70a01a0b-d094-423f-b85d-bbceb324656e] Updating progress to 60 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:01:19.403 DEBUG nova.virt.xenapi.vmops [req-6bd06e2e-ef14-43f6-972f-6dcc111354ee tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] [instance: 70a01a0b-d094-423f-b85d-bbceb324656e] Creating vifs _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1892 2015-08-07 17:01:19.411 DEBUG nova.virt.xenapi.vmops [req-6bd06e2e-ef14-43f6-972f-6dcc111354ee tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] [instance: 70a01a0b-d094-423f-b85d-bbceb324656e] Creating VIF for network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1902 2015-08-07 17:01:19.423 DEBUG nova.virt.xenapi.vmops [req-6bd06e2e-ef14-43f6-972f-6dcc111354ee tempest-MigrationsAdminTest-668975472 
tempest-MigrationsAdminTest-2113731473] [instance: 70a01a0b-d094-423f-b85d-bbceb324656e] Created VIF OpaqueRef:8a5cd77c-5706-1f80-77f3-ce732169f0d1, network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1906 2015-08-07 17:01:19.425 DEBUG nova.virt.xenapi.vmops [req-6bd06e2e-ef14-43f6-972f-6dcc111354ee tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] [instance: 70a01a0b-d094-423f-b85d-bbceb324656e] Updating progress to 70 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:01:19.663 DEBUG nova.virt.xenapi.vmops [req-6bd06e2e-ef14-43f6-972f-6dcc111354ee tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] [instance: 70a01a0b-d094-423f-b85d-bbceb324656e] Starting instance _start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:327 2015-08-07 17:01:21.687 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.86 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:01:31.709 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.84 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:01:34.179 DEBUG nova.virt.xenapi.vmops [req-689a4726-24fd-4d45-8bf6-79e1f67ae74b tempest-AggregatesAdminTestJSON-933043542 tempest-AggregatesAdminTestJSON-642246395] [instance: 8e4a2813-87d8-4829-bb17-5a0511bfffcb] Waiting for instance state to become running _wait_for_instance_to_start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:777 2015-08-07 17:01:34.221 DEBUG nova.virt.xenapi.vmops [req-689a4726-24fd-4d45-8bf6-79e1f67ae74b tempest-AggregatesAdminTestJSON-933043542 tempest-AggregatesAdminTestJSON-642246395] [instance: 8e4a2813-87d8-4829-bb17-5a0511bfffcb] Updating progress to 80 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:01:34.767 DEBUG nova.virt.xenapi.vmops [req-689a4726-24fd-4d45-8bf6-79e1f67ae74b tempest-AggregatesAdminTestJSON-933043542 tempest-AggregatesAdminTestJSON-642246395] [instance: 8e4a2813-87d8-4829-bb17-5a0511bfffcb] Skip agent setup, not enabled. 
_configure_new_instance_with_agent /opt/stack/new/nova/nova/virt/xenapi/vmops.py:788 2015-08-07 17:01:34.768 DEBUG nova.virt.xenapi.vmops [req-689a4726-24fd-4d45-8bf6-79e1f67ae74b tempest-AggregatesAdminTestJSON-933043542 tempest-AggregatesAdminTestJSON-642246395] [instance: 8e4a2813-87d8-4829-bb17-5a0511bfffcb] Removing hostname from xenstore _remove_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1949 2015-08-07 17:01:34.769 DEBUG oslo_concurrency.lockutils [req-689a4726-24fd-4d45-8bf6-79e1f67ae74b tempest-AggregatesAdminTestJSON-933043542 tempest-AggregatesAdminTestJSON-642246395] Lock "xenstore-8e4a2813-87d8-4829-bb17-5a0511bfffcb" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:01:34.776 DEBUG oslo_concurrency.lockutils [req-689a4726-24fd-4d45-8bf6-79e1f67ae74b tempest-AggregatesAdminTestJSON-933043542 tempest-AggregatesAdminTestJSON-642246395] Lock "xenstore-8e4a2813-87d8-4829-bb17-5a0511bfffcb" released by "update_hostname" :: held 0.007s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:01:34.776 DEBUG nova.virt.xenapi.vmops [req-689a4726-24fd-4d45-8bf6-79e1f67ae74b tempest-AggregatesAdminTestJSON-933043542 tempest-AggregatesAdminTestJSON-642246395] [instance: 8e4a2813-87d8-4829-bb17-5a0511bfffcb] Updating progress to 90 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:01:35.161 DEBUG nova.virt.xenapi.vmops [req-689a4726-24fd-4d45-8bf6-79e1f67ae74b tempest-AggregatesAdminTestJSON-933043542 tempest-AggregatesAdminTestJSON-642246395] [instance: 8e4a2813-87d8-4829-bb17-5a0511bfffcb] Updating progress to 100 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:01:35.480 DEBUG nova.virt.xenapi.vmops [req-6bd06e2e-ef14-43f6-972f-6dcc111354ee tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] [instance: 70a01a0b-d094-423f-b85d-bbceb324656e] Waiting for instance state to become running _wait_for_instance_to_start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:777 2015-08-07 17:01:35.488 DEBUG nova.compute.manager [req-689a4726-24fd-4d45-8bf6-79e1f67ae74b tempest-AggregatesAdminTestJSON-933043542 tempest-AggregatesAdminTestJSON-642246395] [instance: 8e4a2813-87d8-4829-bb17-5a0511bfffcb] Checking state _get_power_state /opt/stack/new/nova/nova/compute/manager.py:1296 2015-08-07 17:01:35.540 DEBUG nova.virt.xenapi.vmops [req-6bd06e2e-ef14-43f6-972f-6dcc111354ee tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] [instance: 70a01a0b-d094-423f-b85d-bbceb324656e] Updating progress to 80 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:01:36.248 DEBUG oslo_concurrency.lockutils [req-689a4726-24fd-4d45-8bf6-79e1f67ae74b tempest-AggregatesAdminTestJSON-933043542 tempest-AggregatesAdminTestJSON-642246395] Lock "8e4a2813-87d8-4829-bb17-5a0511bfffcb" released by "_locked_do_build_and_run_instance" :: held 77.704s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:01:36.370 DEBUG nova.virt.xenapi.vmops [req-6bd06e2e-ef14-43f6-972f-6dcc111354ee tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] [instance: 70a01a0b-d094-423f-b85d-bbceb324656e] Skip agent setup, not enabled. 
_configure_new_instance_with_agent /opt/stack/new/nova/nova/virt/xenapi/vmops.py:788 2015-08-07 17:01:36.370 DEBUG nova.virt.xenapi.vmops [req-6bd06e2e-ef14-43f6-972f-6dcc111354ee tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] [instance: 70a01a0b-d094-423f-b85d-bbceb324656e] Removing hostname from xenstore _remove_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1949 2015-08-07 17:01:36.371 DEBUG oslo_concurrency.lockutils [req-6bd06e2e-ef14-43f6-972f-6dcc111354ee tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] Lock "xenstore-70a01a0b-d094-423f-b85d-bbceb324656e" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:01:36.387 DEBUG oslo_concurrency.lockutils [req-6bd06e2e-ef14-43f6-972f-6dcc111354ee tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] Lock "xenstore-70a01a0b-d094-423f-b85d-bbceb324656e" released by "update_hostname" :: held 0.016s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:01:36.388 DEBUG nova.virt.xenapi.vmops [req-6bd06e2e-ef14-43f6-972f-6dcc111354ee tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] [instance: 70a01a0b-d094-423f-b85d-bbceb324656e] Updating progress to 90 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:01:36.984 DEBUG nova.virt.xenapi.vmops [req-6bd06e2e-ef14-43f6-972f-6dcc111354ee tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] [instance: 70a01a0b-d094-423f-b85d-bbceb324656e] Updating progress to 100 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:01:37.510 DEBUG nova.compute.manager [req-6bd06e2e-ef14-43f6-972f-6dcc111354ee tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] [instance: 70a01a0b-d094-423f-b85d-bbceb324656e] Checking state _get_power_state /opt/stack/new/nova/nova/compute/manager.py:1296 2015-08-07 17:01:38.012 DEBUG oslo_concurrency.lockutils [req-6bd06e2e-ef14-43f6-972f-6dcc111354ee tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] Lock "70a01a0b-d094-423f-b85d-bbceb324656e" released by "_locked_do_build_and_run_instance" :: held 49.296s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:01:39.748 DEBUG nova.compute.manager [req-d408651f-6696-47c3-b636-24bbd04db0c5 tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] [instance: 70a01a0b-d094-423f-b85d-bbceb324656e] Stashing vm_state: active _prep_resize /opt/stack/new/nova/nova/compute/manager.py:3514 2015-08-07 17:01:40.019 DEBUG oslo_concurrency.lockutils [req-d408651f-6696-47c3-b636-24bbd04db0c5 tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] Lock "compute_resources" acquired by "resize_claim" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:01:40.020 DEBUG nova.compute.resource_tracker [req-d408651f-6696-47c3-b636-24bbd04db0c5 tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] Memory overhead for 128 MB instance; 6 MB resize_claim /opt/stack/new/nova/nova/compute/resource_tracker.py:174 2015-08-07 17:01:40.031 INFO nova.compute.claims [req-d408651f-6696-47c3-b636-24bbd04db0c5 tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] [instance: 
70a01a0b-d094-423f-b85d-bbceb324656e] Attempting claim: memory 134 MB, disk 0 GB 2015-08-07 17:01:40.032 INFO nova.compute.claims [req-d408651f-6696-47c3-b636-24bbd04db0c5 tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] [instance: 70a01a0b-d094-423f-b85d-bbceb324656e] Total memory: 8187 MB, used: 650.00 MB 2015-08-07 17:01:40.033 INFO nova.compute.claims [req-d408651f-6696-47c3-b636-24bbd04db0c5 tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] [instance: 70a01a0b-d094-423f-b85d-bbceb324656e] memory limit: 12280.50 MB, free: 11630.50 MB 2015-08-07 17:01:40.033 INFO nova.compute.claims [req-d408651f-6696-47c3-b636-24bbd04db0c5 tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] [instance: 70a01a0b-d094-423f-b85d-bbceb324656e] Total disk: 27 GB, used: 0.00 GB 2015-08-07 17:01:40.033 INFO nova.compute.claims [req-d408651f-6696-47c3-b636-24bbd04db0c5 tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] [instance: 70a01a0b-d094-423f-b85d-bbceb324656e] disk limit not specified, defaulting to unlimited 2015-08-07 17:01:40.106 DEBUG nova.compute.resources.vcpu [req-d408651f-6696-47c3-b636-24bbd04db0c5 tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] Total CPUs: 8 VCPUs, used: 2.00 VCPUs test /opt/stack/new/nova/nova/compute/resources/vcpu.py:52 2015-08-07 17:01:40.106 DEBUG nova.compute.resources.vcpu [req-d408651f-6696-47c3-b636-24bbd04db0c5 tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] CPUs limit not specified, defaulting to unlimited test /opt/stack/new/nova/nova/compute/resources/vcpu.py:56 2015-08-07 17:01:40.109 INFO nova.compute.claims [req-d408651f-6696-47c3-b636-24bbd04db0c5 tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] [instance: 70a01a0b-d094-423f-b85d-bbceb324656e] Claim successful 2015-08-07 17:01:40.172 INFO nova.compute.resource_tracker [req-d408651f-6696-47c3-b636-24bbd04db0c5 tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] Updating from migration 70a01a0b-d094-423f-b85d-bbceb324656e 2015-08-07 17:01:40.286 DEBUG oslo_concurrency.lockutils [req-d408651f-6696-47c3-b636-24bbd04db0c5 tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] Lock "compute_resources" released by "resize_claim" :: held 0.266s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:01:40.286 INFO nova.compute.manager [req-d408651f-6696-47c3-b636-24bbd04db0c5 tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] [instance: 70a01a0b-d094-423f-b85d-bbceb324656e] Migrating 2015-08-07 17:01:40.433 DEBUG oslo_concurrency.lockutils [req-d408651f-6696-47c3-b636-24bbd04db0c5 tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] Acquired semaphore "refresh_cache-70a01a0b-d094-423f-b85d-bbceb324656e" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:197 2015-08-07 17:01:40.787 DEBUG nova.network.base_api [req-d408651f-6696-47c3-b636-24bbd04db0c5 tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] [instance: 70a01a0b-d094-423f-b85d-bbceb324656e] Updating instance_info_cache with network_info: [VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.4'})], 
'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:01:ce:c8', 'active': False, 'type': u'bridge', 'id': u'c019c9e9-4973-4b46-8c5d-3535253dedf0', 'qbg_params': None})] update_instance_cache_with_nw_info /opt/stack/new/nova/nova/network/base_api.py:43 2015-08-07 17:01:40.887 DEBUG oslo_concurrency.lockutils [req-d408651f-6696-47c3-b636-24bbd04db0c5 tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] Releasing semaphore "refresh_cache-70a01a0b-d094-423f-b85d-bbceb324656e" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:210 2015-08-07 17:01:41.781 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.77 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:01:41.872 DEBUG nova.virt.xenapi.vmops [req-d408651f-6696-47c3-b636-24bbd04db0c5 tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] [instance: 70a01a0b-d094-423f-b85d-bbceb324656e] Updating progress to 0 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:01:42.416 DEBUG nova.virt.xenapi.vm_utils [req-d408651f-6696-47c3-b636-24bbd04db0c5 tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] [instance: 70a01a0b-d094-423f-b85d-bbceb324656e] Starting snapshot for VM _snapshot_attached_here_impl /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:714 2015-08-07 17:01:42.482 DEBUG oslo_concurrency.lockutils [req-d408651f-6696-47c3-b636-24bbd04db0c5 tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" acquired by "do_scan" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:01:42.483 DEBUG nova.virt.xenapi.vm_utils [req-d408651f-6696-47c3-b636-24bbd04db0c5 tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] Scanning SR OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99 do_scan /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1839 2015-08-07 17:01:44.992 DEBUG oslo_concurrency.lockutils [req-d408651f-6696-47c3-b636-24bbd04db0c5 tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" released by "do_scan" :: held 2.510s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:01:45.008 DEBUG nova.virt.xenapi.vm_utils [req-d408651f-6696-47c3-b636-24bbd04db0c5 tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] VHD d5e8528b-c0bc-4efb-b046-260e106a955b has parent 4027f457-a9bb-499a-8844-79fc67f11377 _get_vhd_parent_uuid /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2006 2015-08-07 17:01:45.050 DEBUG nova.virt.xenapi.vm_utils 
[req-d408651f-6696-47c3-b636-24bbd04db0c5 tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] VHD d5e8528b-c0bc-4efb-b046-260e106a955b has parent 4027f457-a9bb-499a-8844-79fc67f11377 _get_vhd_parent_uuid /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2006 2015-08-07 17:01:45.118 DEBUG nova.virt.xenapi.vm_utils [req-d408651f-6696-47c3-b636-24bbd04db0c5 tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] VHD e7bedd8a-8c78-4b6a-b5c9-9c3117e67ec1 has parent 4027f457-a9bb-499a-8844-79fc67f11377 _get_vhd_parent_uuid /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2006 2015-08-07 17:01:45.127 DEBUG nova.virt.xenapi.vm_utils [req-d408651f-6696-47c3-b636-24bbd04db0c5 tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] VHD adc68f5c-f9ed-4dfe-862f-8738fb519f84 has parent 4027f457-a9bb-499a-8844-79fc67f11377 _get_vhd_parent_uuid /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2006 2015-08-07 17:01:45.185 DEBUG nova.virt.xenapi.vm_utils [req-d408651f-6696-47c3-b636-24bbd04db0c5 tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] [instance: 70a01a0b-d094-423f-b85d-bbceb324656e] No snapshots to remove. _delete_snapshots_in_vdi_chain /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:677 2015-08-07 17:01:48.118 DEBUG oslo_concurrency.lockutils [req-78a245c0-49d5-4305-baca-47725060b18c tempest-AggregatesAdminTestJSON-933043542 tempest-AggregatesAdminTestJSON-642246395] Lock "8e4a2813-87d8-4829-bb17-5a0511bfffcb" acquired by "do_terminate_instance" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:01:48.119 DEBUG oslo_concurrency.lockutils [req-78a245c0-49d5-4305-baca-47725060b18c tempest-AggregatesAdminTestJSON-933043542 tempest-AggregatesAdminTestJSON-642246395] Lock "8e4a2813-87d8-4829-bb17-5a0511bfffcb-events" acquired by "_clear_events" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:01:48.119 DEBUG oslo_concurrency.lockutils [req-78a245c0-49d5-4305-baca-47725060b18c tempest-AggregatesAdminTestJSON-933043542 tempest-AggregatesAdminTestJSON-642246395] Lock "8e4a2813-87d8-4829-bb17-5a0511bfffcb-events" released by "_clear_events" :: held 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:01:48.122 INFO nova.compute.manager [req-78a245c0-49d5-4305-baca-47725060b18c tempest-AggregatesAdminTestJSON-933043542 tempest-AggregatesAdminTestJSON-642246395] [instance: 8e4a2813-87d8-4829-bb17-5a0511bfffcb] Terminating instance 2015-08-07 17:01:48.124 INFO nova.virt.xenapi.vmops [req-78a245c0-49d5-4305-baca-47725060b18c tempest-AggregatesAdminTestJSON-933043542 tempest-AggregatesAdminTestJSON-642246395] [instance: 8e4a2813-87d8-4829-bb17-5a0511bfffcb] Destroying VM 2015-08-07 17:01:48.184 DEBUG nova.virt.xenapi.vm_utils [req-78a245c0-49d5-4305-baca-47725060b18c tempest-AggregatesAdminTestJSON-933043542 tempest-AggregatesAdminTestJSON-642246395] [instance: 8e4a2813-87d8-4829-bb17-5a0511bfffcb] Shutting down VM (hard) hard_shutdown_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:348 2015-08-07 17:01:50.032 DEBUG nova.virt.xenapi.vm_utils [req-d408651f-6696-47c3-b636-24bbd04db0c5 tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] [instance: 70a01a0b-d094-423f-b85d-bbceb324656e] Parent has other children, coalesce is unlikely. 
_wait_for_vhd_coalesce /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2092 2015-08-07 17:01:50.053 DEBUG oslo_concurrency.lockutils [req-d408651f-6696-47c3-b636-24bbd04db0c5 tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" acquired by "do_scan" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:01:50.054 DEBUG nova.virt.xenapi.vm_utils [req-d408651f-6696-47c3-b636-24bbd04db0c5 tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] Scanning SR OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99 do_scan /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1839 2015-08-07 17:01:50.857 DEBUG oslo_concurrency.lockutils [req-d408651f-6696-47c3-b636-24bbd04db0c5 tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" released by "do_scan" :: held 0.804s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:01:50.904 DEBUG nova.virt.xenapi.vm_utils [req-d408651f-6696-47c3-b636-24bbd04db0c5 tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] VHD c3a1a88f-63cb-4b2d-b6e2-aaec29bc1b7e has parent e04514d4-fead-4578-961c-4d1799c2dced _get_vhd_parent_uuid /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2006 2015-08-07 17:01:50.914 DEBUG nova.virt.xenapi.vm_utils [req-d408651f-6696-47c3-b636-24bbd04db0c5 tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] VHD e04514d4-fead-4578-961c-4d1799c2dced has parent 4027f457-a9bb-499a-8844-79fc67f11377 _get_vhd_parent_uuid /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2006 2015-08-07 17:01:50.931 DEBUG nova.virt.xenapi.vmops [req-d408651f-6696-47c3-b636-24bbd04db0c5 tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] [instance: 70a01a0b-d094-423f-b85d-bbceb324656e] Updating progress to 20 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:01:51.240 DEBUG nova.virt.xenapi.vm_utils [req-d408651f-6696-47c3-b636-24bbd04db0c5 tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] [instance: 70a01a0b-d094-423f-b85d-bbceb324656e] Migrating VHD 'e04514d4-fead-4578-961c-4d1799c2dced' with seq_num 1 migrate_vhd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2559 2015-08-07 17:01:51.779 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.77 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:01:53.540 DEBUG nova.virt.xenapi.vm_utils [req-d408651f-6696-47c3-b636-24bbd04db0c5 tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] [instance: 70a01a0b-d094-423f-b85d-bbceb324656e] Migrating VHD '4027f457-a9bb-499a-8844-79fc67f11377' with seq_num 2 migrate_vhd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2559 2015-08-07 17:01:54.254 DEBUG oslo_concurrency.lockutils [req-a29e3a87-767b-40ce-92e4-9fe193b4db86 tempest-FixedIPsTestJson-753908824 tempest-FixedIPsTestJson-455161950] Lock "45ad1b4a-a958-4bde-b784-c49cfb174b36" acquired by "_locked_do_build_and_run_instance" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:01:54.539 INFO nova.compute.manager [req-a29e3a87-767b-40ce-92e4-9fe193b4db86 tempest-FixedIPsTestJson-753908824 tempest-FixedIPsTestJson-455161950] [instance: 
45ad1b4a-a958-4bde-b784-c49cfb174b36] Starting instance... 2015-08-07 17:01:55.338 DEBUG oslo_concurrency.lockutils [req-a29e3a87-767b-40ce-92e4-9fe193b4db86 tempest-FixedIPsTestJson-753908824 tempest-FixedIPsTestJson-455161950] Lock "compute_resources" acquired by "instance_claim" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:01:55.339 DEBUG nova.compute.resource_tracker [req-a29e3a87-767b-40ce-92e4-9fe193b4db86 tempest-FixedIPsTestJson-753908824 tempest-FixedIPsTestJson-455161950] Memory overhead for 64 MB instance; 5 MB instance_claim /opt/stack/new/nova/nova/compute/resource_tracker.py:126 2015-08-07 17:01:55.350 INFO nova.compute.claims [req-a29e3a87-767b-40ce-92e4-9fe193b4db86 tempest-FixedIPsTestJson-753908824 tempest-FixedIPsTestJson-455161950] [instance: 45ad1b4a-a958-4bde-b784-c49cfb174b36] Attempting claim: memory 69 MB, disk 0 GB 2015-08-07 17:01:55.351 INFO nova.compute.claims [req-a29e3a87-767b-40ce-92e4-9fe193b4db86 tempest-FixedIPsTestJson-753908824 tempest-FixedIPsTestJson-455161950] [instance: 45ad1b4a-a958-4bde-b784-c49cfb174b36] Total memory: 8187 MB, used: 784.00 MB 2015-08-07 17:01:55.351 INFO nova.compute.claims [req-a29e3a87-767b-40ce-92e4-9fe193b4db86 tempest-FixedIPsTestJson-753908824 tempest-FixedIPsTestJson-455161950] [instance: 45ad1b4a-a958-4bde-b784-c49cfb174b36] memory limit: 12280.50 MB, free: 11496.50 MB 2015-08-07 17:01:55.352 INFO nova.compute.claims [req-a29e3a87-767b-40ce-92e4-9fe193b4db86 tempest-FixedIPsTestJson-753908824 tempest-FixedIPsTestJson-455161950] [instance: 45ad1b4a-a958-4bde-b784-c49cfb174b36] Total disk: 27 GB, used: 0.00 GB 2015-08-07 17:01:55.352 INFO nova.compute.claims [req-a29e3a87-767b-40ce-92e4-9fe193b4db86 tempest-FixedIPsTestJson-753908824 tempest-FixedIPsTestJson-455161950] [instance: 45ad1b4a-a958-4bde-b784-c49cfb174b36] disk limit not specified, defaulting to unlimited 2015-08-07 17:01:55.457 DEBUG nova.compute.resources.vcpu [req-a29e3a87-767b-40ce-92e4-9fe193b4db86 tempest-FixedIPsTestJson-753908824 tempest-FixedIPsTestJson-455161950] Total CPUs: 8 VCPUs, used: 3.00 VCPUs test /opt/stack/new/nova/nova/compute/resources/vcpu.py:52 2015-08-07 17:01:55.458 DEBUG nova.compute.resources.vcpu [req-a29e3a87-767b-40ce-92e4-9fe193b4db86 tempest-FixedIPsTestJson-753908824 tempest-FixedIPsTestJson-455161950] CPUs limit not specified, defaulting to unlimited test /opt/stack/new/nova/nova/compute/resources/vcpu.py:56 2015-08-07 17:01:55.458 INFO nova.compute.claims [req-a29e3a87-767b-40ce-92e4-9fe193b4db86 tempest-FixedIPsTestJson-753908824 tempest-FixedIPsTestJson-455161950] [instance: 45ad1b4a-a958-4bde-b784-c49cfb174b36] Claim successful 2015-08-07 17:01:56.475 DEBUG oslo_concurrency.lockutils [req-a29e3a87-767b-40ce-92e4-9fe193b4db86 tempest-FixedIPsTestJson-753908824 tempest-FixedIPsTestJson-455161950] Lock "compute_resources" released by "instance_claim" :: held 1.137s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:01:56.903 DEBUG oslo_concurrency.lockutils [req-a29e3a87-767b-40ce-92e4-9fe193b4db86 tempest-FixedIPsTestJson-753908824 tempest-FixedIPsTestJson-455161950] Lock "compute_resources" acquired by "update_usage" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:01:57.008 DEBUG oslo_concurrency.lockutils [req-a29e3a87-767b-40ce-92e4-9fe193b4db86 tempest-FixedIPsTestJson-753908824 tempest-FixedIPsTestJson-455161950] Lock "compute_resources" released by 
"update_usage" :: held 0.105s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:01:57.010 DEBUG nova.compute.utils [req-a29e3a87-767b-40ce-92e4-9fe193b4db86 tempest-FixedIPsTestJson-753908824 tempest-FixedIPsTestJson-455161950] Using /dev/xvd instead of None get_next_device_name /opt/stack/new/nova/nova/compute/utils.py:163 2015-08-07 17:01:57.016 13318 DEBUG nova.compute.manager [-] [instance: 45ad1b4a-a958-4bde-b784-c49cfb174b36] Allocating IP information in the background. _allocate_network_async /opt/stack/new/nova/nova/compute/manager.py:1513 2015-08-07 17:01:57.017 13318 DEBUG oslo_concurrency.lockutils [-] Acquired semaphore "refresh_cache-45ad1b4a-a958-4bde-b784-c49cfb174b36" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:197 2015-08-07 17:01:57.149 DEBUG nova.virt.xenapi.vmops [req-d408651f-6696-47c3-b636-24bbd04db0c5 tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] [instance: 70a01a0b-d094-423f-b85d-bbceb324656e] Migrated root base vhds transfer_immutable_vhds /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1036 2015-08-07 17:01:57.150 DEBUG nova.virt.xenapi.vmops [req-d408651f-6696-47c3-b636-24bbd04db0c5 tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] [instance: 70a01a0b-d094-423f-b85d-bbceb324656e] Updating progress to 40 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:01:57.595 DEBUG nova.virt.xenapi.vmops [req-d408651f-6696-47c3-b636-24bbd04db0c5 tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] [instance: 70a01a0b-d094-423f-b85d-bbceb324656e] Migrated all base vhds. _process_ephemeral_chain_recursive /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1051 2015-08-07 17:01:57.615 DEBUG nova.virt.xenapi.vm_utils [req-d408651f-6696-47c3-b636-24bbd04db0c5 tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] [instance: 70a01a0b-d094-423f-b85d-bbceb324656e] Shutting down VM (cleanly) clean_shutdown_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:333 2015-08-07 17:01:58.073 DEBUG nova.virt.xenapi.vmops [req-78a245c0-49d5-4305-baca-47725060b18c tempest-AggregatesAdminTestJSON-933043542 tempest-AggregatesAdminTestJSON-642246395] [instance: 8e4a2813-87d8-4829-bb17-5a0511bfffcb] Destroying VDIs _destroy_vdis /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1428 2015-08-07 17:01:58.124 DEBUG nova.virt.xenapi.vm_utils [req-78a245c0-49d5-4305-baca-47725060b18c tempest-AggregatesAdminTestJSON-933043542 tempest-AggregatesAdminTestJSON-642246395] VDI e7bedd8a-8c78-4b6a-b5c9-9c3117e67ec1 is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:01:58.246 DEBUG nova.virt.xenapi.vm_utils [req-78a245c0-49d5-4305-baca-47725060b18c tempest-AggregatesAdminTestJSON-933043542 tempest-AggregatesAdminTestJSON-642246395] VDI c02a7796-2457-4d84-bd35-299930ba6501 is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:01:59.427 DEBUG nova.virt.xenapi.vmops [req-a29e3a87-767b-40ce-92e4-9fe193b4db86 tempest-FixedIPsTestJson-753908824 tempest-FixedIPsTestJson-455161950] [instance: 45ad1b4a-a958-4bde-b784-c49cfb174b36] Block device information present: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} spawn /opt/stack/new/nova/nova/virt/xenapi/vmops.py:408 2015-08-07 17:01:59.445 DEBUG nova.virt.xenapi.vm_utils [req-a29e3a87-767b-40ce-92e4-9fe193b4db86 
tempest-FixedIPsTestJson-753908824 tempest-FixedIPsTestJson-455161950] Detected vhd format for image a69d8d55-1745-492f-be26-cc75d64fc94d determine_disk_image_type /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1633 2015-08-07 17:01:59.446 DEBUG nova.virt.xenapi.vmops [req-a29e3a87-767b-40ce-92e4-9fe193b4db86 tempest-FixedIPsTestJson-753908824 tempest-FixedIPsTestJson-455161950] [instance: 45ad1b4a-a958-4bde-b784-c49cfb174b36] Updating progress to 10 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:02:01.699 DEBUG nova.virt.xenapi.vmops [req-a29e3a87-767b-40ce-92e4-9fe193b4db86 tempest-FixedIPsTestJson-753908824 tempest-FixedIPsTestJson-455161950] [instance: 45ad1b4a-a958-4bde-b784-c49cfb174b36] block device info: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} _connect_cinder_volumes /opt/stack/new/nova/nova/virt/xenapi/vmops.py:376 2015-08-07 17:02:01.716 DEBUG oslo_concurrency.lockutils [req-a29e3a87-767b-40ce-92e4-9fe193b4db86 tempest-FixedIPsTestJson-753908824 tempest-FixedIPsTestJson-455161950] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" acquired by "_create_cached_image_impl" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:02:02.017 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.53 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:02:04.531 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:02:04.534 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 0.97 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:02:05.186 DEBUG nova.virt.xenapi.vmops [req-78a245c0-49d5-4305-baca-47725060b18c tempest-AggregatesAdminTestJSON-933043542 tempest-AggregatesAdminTestJSON-642246395] [instance: 8e4a2813-87d8-4829-bb17-5a0511bfffcb] Using RAW or VHD, skipping kernel and ramdisk deletion _destroy_kernel_ramdisk /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1456 2015-08-07 17:02:05.277 DEBUG nova.virt.xenapi.vm_utils [req-78a245c0-49d5-4305-baca-47725060b18c tempest-AggregatesAdminTestJSON-933043542 tempest-AggregatesAdminTestJSON-642246395] [instance: 8e4a2813-87d8-4829-bb17-5a0511bfffcb] VM destroyed destroy_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:324 2015-08-07 17:02:05.278 DEBUG nova.compute.manager [req-78a245c0-49d5-4305-baca-47725060b18c tempest-AggregatesAdminTestJSON-933043542 tempest-AggregatesAdminTestJSON-642246395] [instance: 8e4a2813-87d8-4829-bb17-5a0511bfffcb] Deallocating network for instance _deallocate_network /opt/stack/new/nova/nova/compute/manager.py:1792 2015-08-07 17:02:05.501 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:02:05.502 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 0.01 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:02:05.701 
DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:02:05.702 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /opt/stack/new/nova/nova/compute/manager.py:5976 2015-08-07 17:02:05.703 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 0.80 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:02:06.621 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:02:06.990 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Auditing locally available compute resources for node localhost.localdomain 2015-08-07 17:02:06.990 DEBUG nova.virt.xenapi.host [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Updating host stats update_status /opt/stack/new/nova/nova/virt/xenapi/host.py:234 2015-08-07 17:02:11.829 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" acquired by "do_scan" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:02:11.830 DEBUG nova.virt.xenapi.vm_utils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Scanning SR OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99 do_scan /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1839 2015-08-07 17:02:13.720 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 7.93 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:02:16.074 13318 DEBUG nova.network.base_api [-] [instance: 45ad1b4a-a958-4bde-b784-c49cfb174b36] Updating instance_info_cache with network_info: [VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.5'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:d7:53:ea', 'active': False, 'type': u'bridge', 'id': u'a4e0e18c-7e55-4842-bd93-3292db6bb1c8', 'qbg_params': None})] update_instance_cache_with_nw_info /opt/stack/new/nova/nova/network/base_api.py:43 2015-08-07 17:02:16.150 13318 DEBUG oslo_concurrency.lockutils [-] Releasing semaphore 
"refresh_cache-45ad1b4a-a958-4bde-b784-c49cfb174b36" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:210 2015-08-07 17:02:16.151 13318 DEBUG nova.compute.manager [-] [instance: 45ad1b4a-a958-4bde-b784-c49cfb174b36] Instance network_info: |[VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.5'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:d7:53:ea', 'active': False, 'type': u'bridge', 'id': u'a4e0e18c-7e55-4842-bd93-3292db6bb1c8', 'qbg_params': None})]| _allocate_network_async /opt/stack/new/nova/nova/compute/manager.py:1531 2015-08-07 17:02:19.284 DEBUG nova.virt.xenapi.vm_utils [req-a29e3a87-767b-40ce-92e4-9fe193b4db86 tempest-FixedIPsTestJson-753908824 tempest-FixedIPsTestJson-455161950] Cloned VDI OpaqueRef:ec3f53e6-dd0d-2adc-000f-d923bfdd5725 from VDI OpaqueRef:f392c64e-7886-a0cc-d3a0-780674167307 _clone_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:571 2015-08-07 17:02:19.530 DEBUG nova.compute.manager [req-78a245c0-49d5-4305-baca-47725060b18c tempest-AggregatesAdminTestJSON-933043542 tempest-AggregatesAdminTestJSON-642246395] [instance: 8e4a2813-87d8-4829-bb17-5a0511bfffcb] terminating bdm BlockDeviceMapping(boot_index=0,connection_info=None,created_at=2015-08-07T17:00:17Z,delete_on_termination=True,deleted=False,deleted_at=None,destination_type='local',device_name='/dev/xvda',device_type='disk',disk_bus=None,guest_format=None,id=1,image_id='a69d8d55-1745-492f-be26-cc75d64fc94d',instance=,instance_uuid=8e4a2813-87d8-4829-bb17-5a0511bfffcb,no_device=False,snapshot_id=None,source_type='image',updated_at=2015-08-07T17:00:20Z,volume_id=None,volume_size=None) _cleanup_volumes /opt/stack/new/nova/nova/compute/manager.py:2272 2015-08-07 17:02:19.806 DEBUG oslo_concurrency.lockutils [req-78a245c0-49d5-4305-baca-47725060b18c tempest-AggregatesAdminTestJSON-933043542 tempest-AggregatesAdminTestJSON-642246395] Lock "compute_resources" acquired by "update_usage" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:02:19.807 DEBUG nova.objects.instance [req-78a245c0-49d5-4305-baca-47725060b18c tempest-AggregatesAdminTestJSON-933043542 tempest-AggregatesAdminTestJSON-642246395] Lazy-loading `numa_topology' on Instance uuid 8e4a2813-87d8-4829-bb17-5a0511bfffcb obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:02:20.252 DEBUG oslo_concurrency.lockutils [req-78a245c0-49d5-4305-baca-47725060b18c tempest-AggregatesAdminTestJSON-933043542 tempest-AggregatesAdminTestJSON-642246395] Lock "compute_resources" released by "update_usage" :: held 0.446s inner 
/usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:02:21.055 DEBUG oslo_concurrency.lockutils [req-78a245c0-49d5-4305-baca-47725060b18c tempest-AggregatesAdminTestJSON-933043542 tempest-AggregatesAdminTestJSON-642246395] Lock "8e4a2813-87d8-4829-bb17-5a0511bfffcb" released by "do_terminate_instance" :: held 32.937s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:02:21.558 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" released by "do_scan" :: held 9.729s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:02:21.981 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.67 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:02:22.079 DEBUG oslo_concurrency.lockutils [req-a29e3a87-767b-40ce-92e4-9fe193b4db86 tempest-FixedIPsTestJson-753908824 tempest-FixedIPsTestJson-455161950] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" released by "_create_cached_image_impl" :: held 20.363s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:02:22.080 INFO nova.virt.xenapi.vm_utils [req-a29e3a87-767b-40ce-92e4-9fe193b4db86 tempest-FixedIPsTestJson-753908824 tempest-FixedIPsTestJson-455161950] Image creation data, cacheable: True, downloaded: False duration: 20.38 secs for image a69d8d55-1745-492f-be26-cc75d64fc94d 2015-08-07 17:02:22.330 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor: free VCPUs: 1 _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:524 2015-08-07 17:02:22.330 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor: assignable PCI devices: [] _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:532 2015-08-07 17:02:22.331 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor/Node resource view: name=localhost.localdomain free_ram=922MB free_disk=16GB free_vcpus=1 pci_devices=[] _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:546 2015-08-07 17:02:22.331 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "compute_resources" acquired by "_update_available_resource" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:02:23.109 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Updating from migration 70a01a0b-d094-423f-b85d-bbceb324656e 2015-08-07 17:02:23.110 DEBUG nova.objects.instance [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lazy-loading `new_flavor' on Instance uuid 70a01a0b-d094-423f-b85d-bbceb324656e obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:02:23.517 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Total usable vcpus: 8, total allocated vcpus: 7 2015-08-07 17:02:23.518 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Final resource view: name=localhost.localdomain phys_ram=8187MB used_ram=784MB phys_disk=27GB used_disk=0GB total_vcpus=8 used_vcpus=7 pci_stats=None 2015-08-07 17:02:23.777 INFO nova.compute.resource_tracker 
[req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Compute_service record updated for devstack:localhost.localdomain 2015-08-07 17:02:23.778 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "compute_resources" released by "_update_available_resource" :: held 1.447s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:02:23.778 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:02:23.779 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:02:23.779 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:02:23.780 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Starting heal instance info cache _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5315 2015-08-07 17:02:23.780 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Rebuilding the list of instances to heal _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5319 2015-08-07 17:02:23.887 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: 45ad1b4a-a958-4bde-b784-c49cfb174b36] Skipping network cache update for instance because it is Building. 
_heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5329 2015-08-07 17:02:23.888 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Acquired semaphore "refresh_cache-70a01a0b-d094-423f-b85d-bbceb324656e" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:197 2015-08-07 17:02:23.888 DEBUG nova.objects.instance [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lazy-loading `flavor' on Instance uuid 70a01a0b-d094-423f-b85d-bbceb324656e obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:02:24.909 DEBUG nova.network.base_api [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: 70a01a0b-d094-423f-b85d-bbceb324656e] Updating instance_info_cache with network_info: [VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.4'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:01:ce:c8', 'active': False, 'type': u'bridge', 'id': u'c019c9e9-4973-4b46-8c5d-3535253dedf0', 'qbg_params': None})] update_instance_cache_with_nw_info /opt/stack/new/nova/nova/network/base_api.py:43 2015-08-07 17:02:25.027 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Releasing semaphore "refresh_cache-70a01a0b-d094-423f-b85d-bbceb324656e" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:210 2015-08-07 17:02:25.028 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: 70a01a0b-d094-423f-b85d-bbceb324656e] Updated the network info_cache for instance _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5376 2015-08-07 17:02:25.029 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 0.89 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:02:25.809 DEBUG nova.virt.xenapi.vmops [req-a29e3a87-767b-40ce-92e4-9fe193b4db86 tempest-FixedIPsTestJson-753908824 tempest-FixedIPsTestJson-455161950] [instance: 45ad1b4a-a958-4bde-b784-c49cfb174b36] Updating progress to 20 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:02:25.917 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:02:25.918 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks 
/usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:02:25.919 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 38.59 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:02:26.277 DEBUG nova.virt.xenapi.vm_utils [req-d408651f-6696-47c3-b636-24bbd04db0c5 tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] [instance: 70a01a0b-d094-423f-b85d-bbceb324656e] Migrating VHD 'd5e8528b-c0bc-4efb-b046-260e106a955b' with seq_num 0 migrate_vhd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2559 2015-08-07 17:02:27.056 DEBUG nova.virt.xenapi.vmops [req-a29e3a87-767b-40ce-92e4-9fe193b4db86 tempest-FixedIPsTestJson-753908824 tempest-FixedIPsTestJson-455161950] [instance: 45ad1b4a-a958-4bde-b784-c49cfb174b36] Updating progress to 30 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:02:27.872 DEBUG nova.virt.xenapi.vmops [req-a29e3a87-767b-40ce-92e4-9fe193b4db86 tempest-FixedIPsTestJson-753908824 tempest-FixedIPsTestJson-455161950] [instance: 45ad1b4a-a958-4bde-b784-c49cfb174b36] Using PV kernel: True _create_vm_record /opt/stack/new/nova/nova/virt/xenapi/vmops.py:679 2015-08-07 17:02:27.936 DEBUG nova.virt.xenapi.vm_utils [req-a29e3a87-767b-40ce-92e4-9fe193b4db86 tempest-FixedIPsTestJson-753908824 tempest-FixedIPsTestJson-455161950] [instance: 45ad1b4a-a958-4bde-b784-c49cfb174b36] Created VM create_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:312 2015-08-07 17:02:27.938 DEBUG nova.virt.xenapi.vmops [req-a29e3a87-767b-40ce-92e4-9fe193b4db86 tempest-FixedIPsTestJson-753908824 tempest-FixedIPsTestJson-455161950] [instance: 45ad1b4a-a958-4bde-b784-c49cfb174b36] Updating progress to 40 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:02:28.383 DEBUG nova.virt.xenapi.vm_utils [req-a29e3a87-767b-40ce-92e4-9fe193b4db86 tempest-FixedIPsTestJson-753908824 tempest-FixedIPsTestJson-455161950] Creating disk-type VBD for VM OpaqueRef:cc302322-e022-29c7-1e4a-45f8247b1754, VDI OpaqueRef:ec3f53e6-dd0d-2adc-000f-d923bfdd5725 ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:02:28.403 DEBUG nova.virt.xenapi.vm_utils [req-a29e3a87-767b-40ce-92e4-9fe193b4db86 tempest-FixedIPsTestJson-753908824 tempest-FixedIPsTestJson-455161950] Created VBD OpaqueRef:909fbb08-7334-ce23-d38a-9c477435b31c for VM OpaqueRef:cc302322-e022-29c7-1e4a-45f8247b1754, VDI OpaqueRef:ec3f53e6-dd0d-2adc-000f-d923bfdd5725. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:02:29.464 DEBUG nova.virt.xenapi.vmops [req-d408651f-6696-47c3-b636-24bbd04db0c5 tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] [instance: 70a01a0b-d094-423f-b85d-bbceb324656e] Updating progress to 60 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:02:29.565 DEBUG nova.virt.xenapi.vm_utils [req-a29e3a87-767b-40ce-92e4-9fe193b4db86 tempest-FixedIPsTestJson-753908824 tempest-FixedIPsTestJson-455161950] Created VDI OpaqueRef:a1f6d7b9-c9cd-ac3b-b0ee-4646fda62e32 (config-2, 67108864, False) on OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99. 
create_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:509 2015-08-07 17:02:29.571 DEBUG nova.virt.xenapi.vm_utils [req-a29e3a87-767b-40ce-92e4-9fe193b4db86 tempest-FixedIPsTestJson-753908824 tempest-FixedIPsTestJson-455161950] Creating disk-type VBD for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:a1f6d7b9-c9cd-ac3b-b0ee-4646fda62e32 ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:02:29.581 DEBUG nova.virt.xenapi.vm_utils [req-a29e3a87-767b-40ce-92e4-9fe193b4db86 tempest-FixedIPsTestJson-753908824 tempest-FixedIPsTestJson-455161950] Created VBD OpaqueRef:2a7acc7e-b184-c34b-7d1b-9d6582f9d310 for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:a1f6d7b9-c9cd-ac3b-b0ee-4646fda62e32. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:02:29.582 DEBUG nova.virt.xenapi.vm_utils [req-a29e3a87-767b-40ce-92e4-9fe193b4db86 tempest-FixedIPsTestJson-753908824 tempest-FixedIPsTestJson-455161950] Plugging VBD OpaqueRef:2a7acc7e-b184-c34b-7d1b-9d6582f9d310 ... vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2191 2015-08-07 17:02:29.583 DEBUG oslo_concurrency.lockutils [req-a29e3a87-767b-40ce-92e4-9fe193b4db86 tempest-FixedIPsTestJson-753908824 tempest-FixedIPsTestJson-455161950] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_plug" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:02:29.824 DEBUG nova.virt.xenapi.vmops [req-d408651f-6696-47c3-b636-24bbd04db0c5 tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] [instance: 70a01a0b-d094-423f-b85d-bbceb324656e] Updating progress to 80 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:02:31.353 DEBUG nova.virt.xenapi.host [req-d408651f-6696-47c3-b636-24bbd04db0c5 tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] Updating host stats update_status /opt/stack/new/nova/nova/virt/xenapi/host.py:234 2015-08-07 17:02:31.669 DEBUG oslo_concurrency.lockutils [req-d408651f-6696-47c3-b636-24bbd04db0c5 tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" acquired by "do_scan" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:02:31.670 DEBUG nova.virt.xenapi.vm_utils [req-d408651f-6696-47c3-b636-24bbd04db0c5 tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] Scanning SR OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99 do_scan /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1839 2015-08-07 17:02:31.763 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.89 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:02:32.867 DEBUG oslo_concurrency.lockutils [req-d408651f-6696-47c3-b636-24bbd04db0c5 tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" released by "do_scan" :: held 1.197s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:02:34.237 DEBUG oslo_concurrency.lockutils [req-a29e3a87-767b-40ce-92e4-9fe193b4db86 tempest-FixedIPsTestJson-753908824 tempest-FixedIPsTestJson-455161950] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_plug" :: 
held 4.655s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:02:34.238 DEBUG nova.virt.xenapi.vm_utils [req-a29e3a87-767b-40ce-92e4-9fe193b4db86 tempest-FixedIPsTestJson-753908824 tempest-FixedIPsTestJson-455161950] Plugging VBD OpaqueRef:2a7acc7e-b184-c34b-7d1b-9d6582f9d310 done. vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2194 2015-08-07 17:02:34.250 DEBUG nova.virt.xenapi.vm_utils [req-a29e3a87-767b-40ce-92e4-9fe193b4db86 tempest-FixedIPsTestJson-753908824 tempest-FixedIPsTestJson-455161950] VBD OpaqueRef:2a7acc7e-b184-c34b-7d1b-9d6582f9d310 plugged as xvdc vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2197 2015-08-07 17:02:34.455 WARNING nova.virt.configdrive [req-a29e3a87-767b-40ce-92e4-9fe193b4db86 tempest-FixedIPsTestJson-753908824 tempest-FixedIPsTestJson-455161950] The setting "always" will be deprecated in the Liberty version. Please use "True" instead 2015-08-07 17:02:34.456 DEBUG nova.objects.instance [req-a29e3a87-767b-40ce-92e4-9fe193b4db86 tempest-FixedIPsTestJson-753908824 tempest-FixedIPsTestJson-455161950] Lazy-loading `ec2_ids' on Instance uuid 45ad1b4a-a958-4bde-b784-c49cfb174b36 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:02:34.549 DEBUG oslo_concurrency.processutils [req-a29e3a87-767b-40ce-92e4-9fe193b4db86 tempest-FixedIPsTestJson-753908824 tempest-FixedIPsTestJson-455161950] Running cmd (subprocess): genisoimage -o /tmp/tmpbEW2yK/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmpOL4AT9 execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:02:35.134 DEBUG oslo_concurrency.processutils [req-a29e3a87-767b-40ce-92e4-9fe193b4db86 tempest-FixedIPsTestJson-753908824 tempest-FixedIPsTestJson-455161950] CMD "genisoimage -o /tmp/tmpbEW2yK/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmpOL4AT9" returned: 0 in 0.585s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:02:35.140 DEBUG oslo_concurrency.processutils [req-a29e3a87-767b-40ce-92e4-9fe193b4db86 tempest-FixedIPsTestJson-753908824 tempest-FixedIPsTestJson-455161950] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmpbEW2yK/configdrive of=/dev/xvdc oflag=direct,sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:02:35.549 DEBUG oslo_concurrency.lockutils [req-d408651f-6696-47c3-b636-24bbd04db0c5 tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] Lock "70a01a0b-d094-423f-b85d-bbceb324656e-events" acquired by "_clear_events" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:02:35.550 DEBUG oslo_concurrency.lockutils [req-d408651f-6696-47c3-b636-24bbd04db0c5 tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] Lock "70a01a0b-d094-423f-b85d-bbceb324656e-events" released by "_clear_events" :: held 0.001s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:02:39.698 DEBUG oslo_concurrency.lockutils [req-d408651f-6696-47c3-b636-24bbd04db0c5 tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] Acquired semaphore "refresh_cache-70a01a0b-d094-423f-b85d-bbceb324656e" lock 
/usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:197 2015-08-07 17:02:39.943 DEBUG nova.network.base_api [req-d408651f-6696-47c3-b636-24bbd04db0c5 tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] [instance: 70a01a0b-d094-423f-b85d-bbceb324656e] Updating instance_info_cache with network_info: [VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.4'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:01:ce:c8', 'active': False, 'type': u'bridge', 'id': u'c019c9e9-4973-4b46-8c5d-3535253dedf0', 'qbg_params': None})] update_instance_cache_with_nw_info /opt/stack/new/nova/nova/network/base_api.py:43 2015-08-07 17:02:39.988 DEBUG oslo_concurrency.lockutils [req-d408651f-6696-47c3-b636-24bbd04db0c5 tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] Releasing semaphore "refresh_cache-70a01a0b-d094-423f-b85d-bbceb324656e" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:210 2015-08-07 17:02:40.623 DEBUG nova.virt.xenapi.vm_utils [req-d408651f-6696-47c3-b636-24bbd04db0c5 tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] Detected vhd format for image None determine_disk_image_type /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1633 2015-08-07 17:02:40.629 DEBUG nova.virt.xenapi.vmops [req-d408651f-6696-47c3-b636-24bbd04db0c5 tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] [instance: 70a01a0b-d094-423f-b85d-bbceb324656e] Block device information present: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} create_disks_step /opt/stack/new/nova/nova/virt/xenapi/vmops.py:278 2015-08-07 17:02:41.885 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.78 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:02:43.085 DEBUG oslo_concurrency.lockutils [req-d408651f-6696-47c3-b636-24bbd04db0c5 tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" acquired by "do_scan" :: waited 0.001s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:02:43.086 DEBUG nova.virt.xenapi.vm_utils [req-d408651f-6696-47c3-b636-24bbd04db0c5 tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] Scanning SR OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99 do_scan /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1839 2015-08-07 17:02:44.381 DEBUG oslo_concurrency.lockutils [req-d408651f-6696-47c3-b636-24bbd04db0c5 
tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" released by "do_scan" :: held 1.296s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:02:45.807 DEBUG nova.virt.xenapi.vmops [req-d408651f-6696-47c3-b636-24bbd04db0c5 tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] [instance: 70a01a0b-d094-423f-b85d-bbceb324656e] Using PV kernel: True _create_vm_record /opt/stack/new/nova/nova/virt/xenapi/vmops.py:679 2015-08-07 17:02:45.824 DEBUG nova.virt.xenapi.vm_utils [req-d408651f-6696-47c3-b636-24bbd04db0c5 tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] [instance: 70a01a0b-d094-423f-b85d-bbceb324656e] Created VM create_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:312 2015-08-07 17:02:45.838 DEBUG nova.virt.xenapi.vm_utils [req-d408651f-6696-47c3-b636-24bbd04db0c5 tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] Creating disk-type VBD for VM OpaqueRef:b05f647a-3fe2-a3d2-74b1-1151529ca78f, VDI OpaqueRef:3e733a79-7d9a-f6c2-4bac-f2f8cce012d6 ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:02:45.848 DEBUG nova.virt.xenapi.vm_utils [req-d408651f-6696-47c3-b636-24bbd04db0c5 tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] Created VBD OpaqueRef:7b4f8cb2-4fa9-43e4-d13e-147bc9249109 for VM OpaqueRef:b05f647a-3fe2-a3d2-74b1-1151529ca78f, VDI OpaqueRef:3e733a79-7d9a-f6c2-4bac-f2f8cce012d6. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:02:46.614 DEBUG nova.virt.xenapi.vm_utils [req-d408651f-6696-47c3-b636-24bbd04db0c5 tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] Created VDI OpaqueRef:c594fd46-95cb-91a6-862b-b127d10b22e1 (config-2, 67108864, False) on OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99. create_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:509 2015-08-07 17:02:46.621 DEBUG nova.virt.xenapi.vm_utils [req-d408651f-6696-47c3-b636-24bbd04db0c5 tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] Creating disk-type VBD for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:c594fd46-95cb-91a6-862b-b127d10b22e1 ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:02:46.635 DEBUG nova.virt.xenapi.vm_utils [req-d408651f-6696-47c3-b636-24bbd04db0c5 tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] Created VBD OpaqueRef:d1dff253-ff7f-bb1d-872a-8da05889234f for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:c594fd46-95cb-91a6-862b-b127d10b22e1. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:02:46.635 DEBUG nova.virt.xenapi.vm_utils [req-d408651f-6696-47c3-b636-24bbd04db0c5 tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] Plugging VBD OpaqueRef:d1dff253-ff7f-bb1d-872a-8da05889234f ... 
vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2191 2015-08-07 17:02:46.636 DEBUG oslo_concurrency.lockutils [req-d408651f-6696-47c3-b636-24bbd04db0c5 tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_plug" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:02:47.224 DEBUG oslo_concurrency.processutils [req-a29e3a87-767b-40ce-92e4-9fe193b4db86 tempest-FixedIPsTestJson-753908824 tempest-FixedIPsTestJson-455161950] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmpbEW2yK/configdrive of=/dev/xvdc oflag=direct,sync" returned: 0 in 12.084s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:02:47.226 DEBUG oslo_concurrency.processutils [req-a29e3a87-767b-40ce-92e4-9fe193b4db86 tempest-FixedIPsTestJson-753908824 tempest-FixedIPsTestJson-455161950] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:02:48.256 DEBUG oslo_concurrency.processutils [req-a29e3a87-767b-40ce-92e4-9fe193b4db86 tempest-FixedIPsTestJson-753908824 tempest-FixedIPsTestJson-455161950] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf sync" returned: 0 in 1.030s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:02:48.259 DEBUG nova.virt.xenapi.vm_utils [req-a29e3a87-767b-40ce-92e4-9fe193b4db86 tempest-FixedIPsTestJson-753908824 tempest-FixedIPsTestJson-455161950] Destroying VBD for VDI OpaqueRef:a1f6d7b9-c9cd-ac3b-b0ee-4646fda62e32 ... vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2207 2015-08-07 17:02:49.152 DEBUG oslo_concurrency.lockutils [req-d408651f-6696-47c3-b636-24bbd04db0c5 tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_plug" :: held 2.516s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:02:49.154 DEBUG nova.virt.xenapi.vm_utils [req-d408651f-6696-47c3-b636-24bbd04db0c5 tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] Plugging VBD OpaqueRef:d1dff253-ff7f-bb1d-872a-8da05889234f done. vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2194 2015-08-07 17:02:49.156 DEBUG oslo_concurrency.lockutils [req-a29e3a87-767b-40ce-92e4-9fe193b4db86 tempest-FixedIPsTestJson-753908824 tempest-FixedIPsTestJson-455161950] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_unplug" :: waited 0.895s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:02:49.172 DEBUG nova.virt.xenapi.vm_utils [req-d408651f-6696-47c3-b636-24bbd04db0c5 tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] VBD OpaqueRef:d1dff253-ff7f-bb1d-872a-8da05889234f plugged as xvdd vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2197 2015-08-07 17:02:49.302 WARNING nova.virt.configdrive [req-d408651f-6696-47c3-b636-24bbd04db0c5 tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] The setting "always" will be deprecated in the Liberty version. 
Please use "True" instead 2015-08-07 17:02:49.303 DEBUG nova.objects.instance [req-d408651f-6696-47c3-b636-24bbd04db0c5 tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] Lazy-loading `ec2_ids' on Instance uuid 70a01a0b-d094-423f-b85d-bbceb324656e obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:02:49.426 DEBUG oslo_concurrency.processutils [req-d408651f-6696-47c3-b636-24bbd04db0c5 tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] Running cmd (subprocess): genisoimage -o /tmp/tmpniGAxT/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmpYdizDO execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:02:49.621 DEBUG oslo_concurrency.processutils [req-d408651f-6696-47c3-b636-24bbd04db0c5 tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] CMD "genisoimage -o /tmp/tmpniGAxT/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmpYdizDO" returned: 0 in 0.195s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:02:49.629 DEBUG oslo_concurrency.processutils [req-d408651f-6696-47c3-b636-24bbd04db0c5 tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmpniGAxT/configdrive of=/dev/xvdd oflag=direct,sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:02:51.455 DEBUG oslo_concurrency.lockutils [req-a29e3a87-767b-40ce-92e4-9fe193b4db86 tempest-FixedIPsTestJson-753908824 tempest-FixedIPsTestJson-455161950] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_unplug" :: held 2.299s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:02:51.467 DEBUG nova.virt.xenapi.vm_utils [req-a29e3a87-767b-40ce-92e4-9fe193b4db86 tempest-FixedIPsTestJson-753908824 tempest-FixedIPsTestJson-455161950] Destroying VBD for VDI OpaqueRef:a1f6d7b9-c9cd-ac3b-b0ee-4646fda62e32 done. vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2215 2015-08-07 17:02:51.468 DEBUG nova.virt.xenapi.vm_utils [req-a29e3a87-767b-40ce-92e4-9fe193b4db86 tempest-FixedIPsTestJson-753908824 tempest-FixedIPsTestJson-455161950] Creating disk-type VBD for VM OpaqueRef:cc302322-e022-29c7-1e4a-45f8247b1754, VDI OpaqueRef:a1f6d7b9-c9cd-ac3b-b0ee-4646fda62e32 ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:02:51.477 DEBUG nova.virt.xenapi.vm_utils [req-a29e3a87-767b-40ce-92e4-9fe193b4db86 tempest-FixedIPsTestJson-753908824 tempest-FixedIPsTestJson-455161950] Created VBD OpaqueRef:f9567240-a439-367c-e6cf-6b6f2001d486 for VM OpaqueRef:cc302322-e022-29c7-1e4a-45f8247b1754, VDI OpaqueRef:a1f6d7b9-c9cd-ac3b-b0ee-4646fda62e32. 
create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:02:51.479 DEBUG nova.objects.instance [req-a29e3a87-767b-40ce-92e4-9fe193b4db86 tempest-FixedIPsTestJson-753908824 tempest-FixedIPsTestJson-455161950] Lazy-loading `pci_devices' on Instance uuid 45ad1b4a-a958-4bde-b784-c49cfb174b36 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:02:51.636 DEBUG nova.virt.xenapi.vmops [req-a29e3a87-767b-40ce-92e4-9fe193b4db86 tempest-FixedIPsTestJson-753908824 tempest-FixedIPsTestJson-455161950] [instance: 45ad1b4a-a958-4bde-b784-c49cfb174b36] Updating progress to 50 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:02:51.791 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.88 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:02:51.998 DEBUG oslo_concurrency.lockutils [req-a29e3a87-767b-40ce-92e4-9fe193b4db86 tempest-FixedIPsTestJson-753908824 tempest-FixedIPsTestJson-455161950] Lock "xenstore-45ad1b4a-a958-4bde-b784-c49cfb174b36" acquired by "store_meta" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:02:51.999 DEBUG oslo_concurrency.lockutils [req-a29e3a87-767b-40ce-92e4-9fe193b4db86 tempest-FixedIPsTestJson-753908824 tempest-FixedIPsTestJson-455161950] Lock "xenstore-45ad1b4a-a958-4bde-b784-c49cfb174b36" released by "store_meta" :: held 0.001s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:02:51.999 DEBUG oslo_concurrency.lockutils [req-a29e3a87-767b-40ce-92e4-9fe193b4db86 tempest-FixedIPsTestJson-753908824 tempest-FixedIPsTestJson-455161950] Lock "xenstore-45ad1b4a-a958-4bde-b784-c49cfb174b36" acquired by "store_auto_disk_config" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:02:52.012 DEBUG oslo_concurrency.lockutils [req-a29e3a87-767b-40ce-92e4-9fe193b4db86 tempest-FixedIPsTestJson-753908824 tempest-FixedIPsTestJson-455161950] Lock "xenstore-45ad1b4a-a958-4bde-b784-c49cfb174b36" released by "store_auto_disk_config" :: held 0.012s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:02:52.013 DEBUG nova.virt.xenapi.vmops [req-a29e3a87-767b-40ce-92e4-9fe193b4db86 tempest-FixedIPsTestJson-753908824 tempest-FixedIPsTestJson-455161950] [instance: 45ad1b4a-a958-4bde-b784-c49cfb174b36] Injecting hostname (tempest.common.compute-instance-1898917872) into xenstore _inject_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1940 2015-08-07 17:02:52.013 DEBUG oslo_concurrency.lockutils [req-a29e3a87-767b-40ce-92e4-9fe193b4db86 tempest-FixedIPsTestJson-753908824 tempest-FixedIPsTestJson-455161950] Lock "xenstore-45ad1b4a-a958-4bde-b784-c49cfb174b36" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:02:52.025 DEBUG oslo_concurrency.lockutils [req-a29e3a87-767b-40ce-92e4-9fe193b4db86 tempest-FixedIPsTestJson-753908824 tempest-FixedIPsTestJson-455161950] Lock "xenstore-45ad1b4a-a958-4bde-b784-c49cfb174b36" released by "update_hostname" :: held 0.011s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:02:52.025 DEBUG nova.virt.xenapi.vmops [req-a29e3a87-767b-40ce-92e4-9fe193b4db86 tempest-FixedIPsTestJson-753908824 tempest-FixedIPsTestJson-455161950] [instance: 
45ad1b4a-a958-4bde-b784-c49cfb174b36] Injecting network info to xenstore inject_network_info /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1869 2015-08-07 17:02:52.026 DEBUG oslo_concurrency.lockutils [req-a29e3a87-767b-40ce-92e4-9fe193b4db86 tempest-FixedIPsTestJson-753908824 tempest-FixedIPsTestJson-455161950] Lock "xenstore-45ad1b4a-a958-4bde-b784-c49cfb174b36" acquired by "update_nwinfo" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:02:52.340 DEBUG oslo_concurrency.lockutils [req-a29e3a87-767b-40ce-92e4-9fe193b4db86 tempest-FixedIPsTestJson-753908824 tempest-FixedIPsTestJson-455161950] Lock "xenstore-45ad1b4a-a958-4bde-b784-c49cfb174b36" released by "update_nwinfo" :: held 0.314s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:02:52.341 DEBUG nova.virt.xenapi.vmops [req-a29e3a87-767b-40ce-92e4-9fe193b4db86 tempest-FixedIPsTestJson-753908824 tempest-FixedIPsTestJson-455161950] [instance: 45ad1b4a-a958-4bde-b784-c49cfb174b36] Updating progress to 60 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:02:52.692 DEBUG nova.virt.xenapi.vmops [req-a29e3a87-767b-40ce-92e4-9fe193b4db86 tempest-FixedIPsTestJson-753908824 tempest-FixedIPsTestJson-455161950] [instance: 45ad1b4a-a958-4bde-b784-c49cfb174b36] Creating vifs _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1892 2015-08-07 17:02:52.707 DEBUG nova.virt.xenapi.vmops [req-a29e3a87-767b-40ce-92e4-9fe193b4db86 tempest-FixedIPsTestJson-753908824 tempest-FixedIPsTestJson-455161950] [instance: 45ad1b4a-a958-4bde-b784-c49cfb174b36] Creating VIF for network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1902 2015-08-07 17:02:52.717 DEBUG nova.virt.xenapi.vmops [req-a29e3a87-767b-40ce-92e4-9fe193b4db86 tempest-FixedIPsTestJson-753908824 tempest-FixedIPsTestJson-455161950] [instance: 45ad1b4a-a958-4bde-b784-c49cfb174b36] Created VIF OpaqueRef:f4422bd6-9856-287d-c2ee-614852586bc1, network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1906 2015-08-07 17:02:52.718 DEBUG nova.virt.xenapi.vmops [req-a29e3a87-767b-40ce-92e4-9fe193b4db86 tempest-FixedIPsTestJson-753908824 tempest-FixedIPsTestJson-455161950] [instance: 45ad1b4a-a958-4bde-b784-c49cfb174b36] Updating progress to 70 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:02:53.095 DEBUG nova.virt.xenapi.vmops [req-a29e3a87-767b-40ce-92e4-9fe193b4db86 tempest-FixedIPsTestJson-753908824 tempest-FixedIPsTestJson-455161950] [instance: 45ad1b4a-a958-4bde-b784-c49cfb174b36] Starting instance _start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:327 2015-08-07 17:03:01.797 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.87 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:03:03.208 DEBUG oslo_concurrency.processutils [req-d408651f-6696-47c3-b636-24bbd04db0c5 tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmpniGAxT/configdrive of=/dev/xvdd oflag=direct,sync" returned: 0 in 13.579s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:03:03.210 DEBUG oslo_concurrency.processutils [req-d408651f-6696-47c3-b636-24bbd04db0c5 tempest-MigrationsAdminTest-668975472 
tempest-MigrationsAdminTest-2113731473] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:03:04.281 DEBUG oslo_concurrency.processutils [req-d408651f-6696-47c3-b636-24bbd04db0c5 tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf sync" returned: 0 in 1.070s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:03:04.296 DEBUG nova.virt.xenapi.vm_utils [req-d408651f-6696-47c3-b636-24bbd04db0c5 tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] Destroying VBD for VDI OpaqueRef:c594fd46-95cb-91a6-862b-b127d10b22e1 ... vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2207 2015-08-07 17:03:04.298 DEBUG oslo_concurrency.lockutils [req-d408651f-6696-47c3-b636-24bbd04db0c5 tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_unplug" :: waited 0.001s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:03:04.526 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:03:04.528 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 1.97 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:03:06.520 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:03:06.522 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /opt/stack/new/nova/nova/compute/manager.py:5976 2015-08-07 17:03:06.523 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:03:06.525 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:03:06.526 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Starting heal instance info cache _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5315 2015-08-07 17:03:06.536 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Rebuilding the list of instances to heal _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5319 2015-08-07 17:03:06.649 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: 45ad1b4a-a958-4bde-b784-c49cfb174b36] Skipping network cache update for instance because it is Building. 
_heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5329 2015-08-07 17:03:06.651 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Acquired semaphore "refresh_cache-70a01a0b-d094-423f-b85d-bbceb324656e" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:197 2015-08-07 17:03:06.651 DEBUG nova.objects.instance [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lazy-loading `flavor' on Instance uuid 70a01a0b-d094-423f-b85d-bbceb324656e obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:03:07.486 DEBUG nova.network.base_api [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: 70a01a0b-d094-423f-b85d-bbceb324656e] Updating instance_info_cache with network_info: [VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.4'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:01:ce:c8', 'active': False, 'type': u'bridge', 'id': u'c019c9e9-4973-4b46-8c5d-3535253dedf0', 'qbg_params': None})] update_instance_cache_with_nw_info /opt/stack/new/nova/nova/network/base_api.py:43 2015-08-07 17:03:07.530 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Releasing semaphore "refresh_cache-70a01a0b-d094-423f-b85d-bbceb324656e" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:210 2015-08-07 17:03:07.532 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: 70a01a0b-d094-423f-b85d-bbceb324656e] Updated the network info_cache for instance _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5376 2015-08-07 17:03:07.532 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 0.99 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:03:08.521 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:03:08.522 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:03:08.571 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Auditing locally available compute resources for node localhost.localdomain 2015-08-07 17:03:08.572 DEBUG nova.virt.xenapi.host 
[req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Updating host stats update_status /opt/stack/new/nova/nova/virt/xenapi/host.py:234 2015-08-07 17:03:09.650 DEBUG oslo_concurrency.lockutils [req-d408651f-6696-47c3-b636-24bbd04db0c5 tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_unplug" :: held 5.352s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:03:09.681 DEBUG nova.virt.xenapi.vm_utils [req-d408651f-6696-47c3-b636-24bbd04db0c5 tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] Destroying VBD for VDI OpaqueRef:c594fd46-95cb-91a6-862b-b127d10b22e1 done. vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2215 2015-08-07 17:03:09.682 DEBUG nova.virt.xenapi.vm_utils [req-d408651f-6696-47c3-b636-24bbd04db0c5 tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] Creating disk-type VBD for VM OpaqueRef:b05f647a-3fe2-a3d2-74b1-1151529ca78f, VDI OpaqueRef:c594fd46-95cb-91a6-862b-b127d10b22e1 ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:03:09.733 DEBUG nova.virt.xenapi.vm_utils [req-d408651f-6696-47c3-b636-24bbd04db0c5 tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] Created VBD OpaqueRef:8a4a9d59-faed-3959-c35b-12b80956b0ce for VM OpaqueRef:b05f647a-3fe2-a3d2-74b1-1151529ca78f, VDI OpaqueRef:c594fd46-95cb-91a6-862b-b127d10b22e1. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:03:09.735 DEBUG nova.objects.instance [req-d408651f-6696-47c3-b636-24bbd04db0c5 tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] Lazy-loading `pci_devices' on Instance uuid 70a01a0b-d094-423f-b85d-bbceb324656e obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:03:09.970 DEBUG oslo_concurrency.lockutils [req-d408651f-6696-47c3-b636-24bbd04db0c5 tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] Lock "xenstore-70a01a0b-d094-423f-b85d-bbceb324656e" acquired by "store_meta" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:03:09.970 DEBUG oslo_concurrency.lockutils [req-d408651f-6696-47c3-b636-24bbd04db0c5 tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] Lock "xenstore-70a01a0b-d094-423f-b85d-bbceb324656e" released by "store_meta" :: held 0.001s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:03:09.971 DEBUG oslo_concurrency.lockutils [req-d408651f-6696-47c3-b636-24bbd04db0c5 tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] Lock "xenstore-70a01a0b-d094-423f-b85d-bbceb324656e" acquired by "store_auto_disk_config" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:03:10.012 DEBUG oslo_concurrency.lockutils [req-d408651f-6696-47c3-b636-24bbd04db0c5 tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] Lock "xenstore-70a01a0b-d094-423f-b85d-bbceb324656e" released by "store_auto_disk_config" :: held 0.041s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:03:10.013 DEBUG nova.virt.xenapi.vmops [req-d408651f-6696-47c3-b636-24bbd04db0c5 tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] [instance: 
70a01a0b-d094-423f-b85d-bbceb324656e] Injecting network info to xenstore inject_network_info /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1869 2015-08-07 17:03:10.013 DEBUG oslo_concurrency.lockutils [req-d408651f-6696-47c3-b636-24bbd04db0c5 tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] Lock "xenstore-70a01a0b-d094-423f-b85d-bbceb324656e" acquired by "update_nwinfo" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:03:10.481 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" acquired by "do_scan" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:03:10.482 DEBUG nova.virt.xenapi.vm_utils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Scanning SR OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99 do_scan /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1839 2015-08-07 17:03:11.406 DEBUG oslo_concurrency.lockutils [req-d408651f-6696-47c3-b636-24bbd04db0c5 tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] Lock "xenstore-70a01a0b-d094-423f-b85d-bbceb324656e" released by "update_nwinfo" :: held 1.393s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:03:11.407 DEBUG nova.virt.xenapi.vmops [req-d408651f-6696-47c3-b636-24bbd04db0c5 tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] [instance: 70a01a0b-d094-423f-b85d-bbceb324656e] Creating vifs _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1892 2015-08-07 17:03:11.426 DEBUG nova.virt.xenapi.vmops [req-d408651f-6696-47c3-b636-24bbd04db0c5 tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] [instance: 70a01a0b-d094-423f-b85d-bbceb324656e] Creating VIF for network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1902 2015-08-07 17:03:11.436 DEBUG nova.virt.xenapi.vmops [req-d408651f-6696-47c3-b636-24bbd04db0c5 tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] [instance: 70a01a0b-d094-423f-b85d-bbceb324656e] Created VIF OpaqueRef:ae8f86df-376e-5180-ca67-3d648b8a0988, network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1906 2015-08-07 17:03:11.437 DEBUG nova.virt.xenapi.vmops [req-d408651f-6696-47c3-b636-24bbd04db0c5 tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] [instance: 70a01a0b-d094-423f-b85d-bbceb324656e] Starting instance _start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:327 2015-08-07 17:03:11.946 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.72 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:03:12.803 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" released by "do_scan" :: held 2.321s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:03:13.727 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor: free VCPUs: 2 _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:524 2015-08-07 17:03:13.728 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None 
None] Hypervisor: assignable PCI devices: [] _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:532 2015-08-07 17:03:13.728 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor/Node resource view: name=localhost.localdomain free_ram=922MB free_disk=16GB free_vcpus=2 pci_devices=[] _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:546 2015-08-07 17:03:13.729 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "compute_resources" acquired by "_update_available_resource" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:03:14.445 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Updating from migration 70a01a0b-d094-423f-b85d-bbceb324656e 2015-08-07 17:03:14.445 DEBUG nova.objects.instance [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lazy-loading `old_flavor' on Instance uuid 70a01a0b-d094-423f-b85d-bbceb324656e obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:03:14.995 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Total usable vcpus: 8, total allocated vcpus: 6 2015-08-07 17:03:14.995 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Final resource view: name=localhost.localdomain phys_ram=8187MB used_ram=784MB phys_disk=27GB used_disk=0GB total_vcpus=8 used_vcpus=6 pci_stats=None 2015-08-07 17:03:15.273 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Compute_service record updated for devstack:localhost.localdomain 2015-08-07 17:03:15.274 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "compute_resources" released by "_update_available_resource" :: held 1.545s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:03:15.275 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:03:15.275 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:03:15.277 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:03:15.346 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 5.99 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:03:15.547 DEBUG nova.virt.xenapi.vmops [req-a29e3a87-767b-40ce-92e4-9fe193b4db86 tempest-FixedIPsTestJson-753908824 tempest-FixedIPsTestJson-455161950] [instance: 45ad1b4a-a958-4bde-b784-c49cfb174b36] Waiting for instance state to become running _wait_for_instance_to_start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:777 2015-08-07 17:03:15.611 DEBUG nova.virt.xenapi.vmops [req-a29e3a87-767b-40ce-92e4-9fe193b4db86 tempest-FixedIPsTestJson-753908824 
tempest-FixedIPsTestJson-455161950] [instance: 45ad1b4a-a958-4bde-b784-c49cfb174b36] Updating progress to 80 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:03:16.207 DEBUG nova.virt.xenapi.vmops [req-a29e3a87-767b-40ce-92e4-9fe193b4db86 tempest-FixedIPsTestJson-753908824 tempest-FixedIPsTestJson-455161950] [instance: 45ad1b4a-a958-4bde-b784-c49cfb174b36] Skip agent setup, not enabled. _configure_new_instance_with_agent /opt/stack/new/nova/nova/virt/xenapi/vmops.py:788 2015-08-07 17:03:16.208 DEBUG nova.virt.xenapi.vmops [req-a29e3a87-767b-40ce-92e4-9fe193b4db86 tempest-FixedIPsTestJson-753908824 tempest-FixedIPsTestJson-455161950] [instance: 45ad1b4a-a958-4bde-b784-c49cfb174b36] Removing hostname from xenstore _remove_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1949 2015-08-07 17:03:16.208 DEBUG oslo_concurrency.lockutils [req-a29e3a87-767b-40ce-92e4-9fe193b4db86 tempest-FixedIPsTestJson-753908824 tempest-FixedIPsTestJson-455161950] Lock "xenstore-45ad1b4a-a958-4bde-b784-c49cfb174b36" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:03:16.217 DEBUG oslo_concurrency.lockutils [req-a29e3a87-767b-40ce-92e4-9fe193b4db86 tempest-FixedIPsTestJson-753908824 tempest-FixedIPsTestJson-455161950] Lock "xenstore-45ad1b4a-a958-4bde-b784-c49cfb174b36" released by "update_hostname" :: held 0.009s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:03:16.218 DEBUG nova.virt.xenapi.vmops [req-a29e3a87-767b-40ce-92e4-9fe193b4db86 tempest-FixedIPsTestJson-753908824 tempest-FixedIPsTestJson-455161950] [instance: 45ad1b4a-a958-4bde-b784-c49cfb174b36] Updating progress to 90 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:03:16.555 DEBUG nova.virt.xenapi.vmops [req-a29e3a87-767b-40ce-92e4-9fe193b4db86 tempest-FixedIPsTestJson-753908824 tempest-FixedIPsTestJson-455161950] [instance: 45ad1b4a-a958-4bde-b784-c49cfb174b36] Updating progress to 100 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:03:16.846 DEBUG nova.compute.manager [req-a29e3a87-767b-40ce-92e4-9fe193b4db86 tempest-FixedIPsTestJson-753908824 tempest-FixedIPsTestJson-455161950] [instance: 45ad1b4a-a958-4bde-b784-c49cfb174b36] Checking state _get_power_state /opt/stack/new/nova/nova/compute/manager.py:1296 2015-08-07 17:03:17.366 DEBUG oslo_concurrency.lockutils [req-a29e3a87-767b-40ce-92e4-9fe193b4db86 tempest-FixedIPsTestJson-753908824 tempest-FixedIPsTestJson-455161950] Lock "45ad1b4a-a958-4bde-b784-c49cfb174b36" released by "_locked_do_build_and_run_instance" :: held 83.112s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:03:20.380 DEBUG oslo_concurrency.lockutils [req-832b05bb-adbe-4ec5-b142-478af7d69236 tempest-FixedIPsTestJson-753908824 tempest-FixedIPsTestJson-455161950] Lock "45ad1b4a-a958-4bde-b784-c49cfb174b36" acquired by "do_terminate_instance" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:03:20.381 DEBUG oslo_concurrency.lockutils [req-832b05bb-adbe-4ec5-b142-478af7d69236 tempest-FixedIPsTestJson-753908824 tempest-FixedIPsTestJson-455161950] Lock "45ad1b4a-a958-4bde-b784-c49cfb174b36-events" acquired by "_clear_events" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:03:20.382 DEBUG 
oslo_concurrency.lockutils [req-832b05bb-adbe-4ec5-b142-478af7d69236 tempest-FixedIPsTestJson-753908824 tempest-FixedIPsTestJson-455161950] Lock "45ad1b4a-a958-4bde-b784-c49cfb174b36-events" released by "_clear_events" :: held 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:03:20.385 INFO nova.compute.manager [req-832b05bb-adbe-4ec5-b142-478af7d69236 tempest-FixedIPsTestJson-753908824 tempest-FixedIPsTestJson-455161950] [instance: 45ad1b4a-a958-4bde-b784-c49cfb174b36] Terminating instance 2015-08-07 17:03:20.389 INFO nova.virt.xenapi.vmops [req-832b05bb-adbe-4ec5-b142-478af7d69236 tempest-FixedIPsTestJson-753908824 tempest-FixedIPsTestJson-455161950] [instance: 45ad1b4a-a958-4bde-b784-c49cfb174b36] Destroying VM 2015-08-07 17:03:20.410 DEBUG nova.virt.xenapi.vm_utils [req-832b05bb-adbe-4ec5-b142-478af7d69236 tempest-FixedIPsTestJson-753908824 tempest-FixedIPsTestJson-455161950] [instance: 45ad1b4a-a958-4bde-b784-c49cfb174b36] Shutting down VM (hard) hard_shutdown_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:348 2015-08-07 17:03:21.333 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:03:21.334 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 43.17 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:03:21.798 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.87 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:03:25.437 DEBUG nova.virt.xenapi.vmops [req-832b05bb-adbe-4ec5-b142-478af7d69236 tempest-FixedIPsTestJson-753908824 tempest-FixedIPsTestJson-455161950] [instance: 45ad1b4a-a958-4bde-b784-c49cfb174b36] Destroying VDIs _destroy_vdis /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1428 2015-08-07 17:03:25.459 DEBUG nova.virt.xenapi.vm_utils [req-832b05bb-adbe-4ec5-b142-478af7d69236 tempest-FixedIPsTestJson-753908824 tempest-FixedIPsTestJson-455161950] VDI ff4f09f1-bbd5-4446-8636-57458bfe1a90 is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:03:25.516 DEBUG nova.virt.xenapi.vm_utils [req-832b05bb-adbe-4ec5-b142-478af7d69236 tempest-FixedIPsTestJson-753908824 tempest-FixedIPsTestJson-455161950] VDI 6de2e267-a171-4b56-8eee-002d6a722e2d is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:03:25.654 DEBUG oslo_concurrency.lockutils [req-bb7ed864-16cb-4ce5-8373-482cf2f2692d tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Lock "cf8ff930-deb3-448f-bcd3-f5a84fdad9b4" acquired by "_locked_do_build_and_run_instance" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:03:25.837 INFO nova.compute.manager [req-bb7ed864-16cb-4ce5-8373-482cf2f2692d tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: cf8ff930-deb3-448f-bcd3-f5a84fdad9b4] Starting instance... 
2015-08-07 17:03:26.146 DEBUG oslo_concurrency.lockutils [req-bb7ed864-16cb-4ce5-8373-482cf2f2692d tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Lock "compute_resources" acquired by "instance_claim" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:03:26.147 DEBUG nova.compute.resource_tracker [req-bb7ed864-16cb-4ce5-8373-482cf2f2692d tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Memory overhead for 64 MB instance; 5 MB instance_claim /opt/stack/new/nova/nova/compute/resource_tracker.py:126 2015-08-07 17:03:26.156 INFO nova.compute.claims [req-bb7ed864-16cb-4ce5-8373-482cf2f2692d tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: cf8ff930-deb3-448f-bcd3-f5a84fdad9b4] Attempting claim: memory 69 MB, disk 0 GB 2015-08-07 17:03:26.157 INFO nova.compute.claims [req-bb7ed864-16cb-4ce5-8373-482cf2f2692d tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: cf8ff930-deb3-448f-bcd3-f5a84fdad9b4] Total memory: 8187 MB, used: 784.00 MB 2015-08-07 17:03:26.157 INFO nova.compute.claims [req-bb7ed864-16cb-4ce5-8373-482cf2f2692d tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: cf8ff930-deb3-448f-bcd3-f5a84fdad9b4] memory limit: 12280.50 MB, free: 11496.50 MB 2015-08-07 17:03:26.158 INFO nova.compute.claims [req-bb7ed864-16cb-4ce5-8373-482cf2f2692d tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: cf8ff930-deb3-448f-bcd3-f5a84fdad9b4] Total disk: 27 GB, used: 0.00 GB 2015-08-07 17:03:26.158 INFO nova.compute.claims [req-bb7ed864-16cb-4ce5-8373-482cf2f2692d tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: cf8ff930-deb3-448f-bcd3-f5a84fdad9b4] disk limit not specified, defaulting to unlimited 2015-08-07 17:03:26.183 DEBUG nova.compute.resources.vcpu [req-bb7ed864-16cb-4ce5-8373-482cf2f2692d tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Total CPUs: 8 VCPUs, used: 3.00 VCPUs test /opt/stack/new/nova/nova/compute/resources/vcpu.py:52 2015-08-07 17:03:26.184 DEBUG nova.compute.resources.vcpu [req-bb7ed864-16cb-4ce5-8373-482cf2f2692d tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] CPUs limit not specified, defaulting to unlimited test /opt/stack/new/nova/nova/compute/resources/vcpu.py:56 2015-08-07 17:03:26.184 INFO nova.compute.claims [req-bb7ed864-16cb-4ce5-8373-482cf2f2692d tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: cf8ff930-deb3-448f-bcd3-f5a84fdad9b4] Claim successful 2015-08-07 17:03:26.836 DEBUG oslo_concurrency.lockutils [req-bb7ed864-16cb-4ce5-8373-482cf2f2692d tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Lock "compute_resources" released by "instance_claim" :: held 0.690s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:03:27.197 DEBUG nova.virt.xenapi.vmops [req-832b05bb-adbe-4ec5-b142-478af7d69236 tempest-FixedIPsTestJson-753908824 tempest-FixedIPsTestJson-455161950] [instance: 45ad1b4a-a958-4bde-b784-c49cfb174b36] Using RAW or VHD, skipping kernel and ramdisk deletion _destroy_kernel_ramdisk /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1456 2015-08-07 17:03:27.264 DEBUG oslo_concurrency.lockutils [req-bb7ed864-16cb-4ce5-8373-482cf2f2692d 
tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Lock "compute_resources" acquired by "update_usage" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:03:27.348 DEBUG nova.virt.xenapi.vm_utils [req-832b05bb-adbe-4ec5-b142-478af7d69236 tempest-FixedIPsTestJson-753908824 tempest-FixedIPsTestJson-455161950] [instance: 45ad1b4a-a958-4bde-b784-c49cfb174b36] VM destroyed destroy_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:324 2015-08-07 17:03:27.350 DEBUG nova.compute.manager [req-832b05bb-adbe-4ec5-b142-478af7d69236 tempest-FixedIPsTestJson-753908824 tempest-FixedIPsTestJson-455161950] [instance: 45ad1b4a-a958-4bde-b784-c49cfb174b36] Deallocating network for instance _deallocate_network /opt/stack/new/nova/nova/compute/manager.py:1792 2015-08-07 17:03:27.358 DEBUG nova.virt.xenapi.vmops [req-d408651f-6696-47c3-b636-24bbd04db0c5 tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] [instance: 70a01a0b-d094-423f-b85d-bbceb324656e] Waiting for instance state to become running _wait_for_instance_to_start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:777 2015-08-07 17:03:27.442 DEBUG nova.virt.xenapi.vmops [req-d408651f-6696-47c3-b636-24bbd04db0c5 tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] [instance: 70a01a0b-d094-423f-b85d-bbceb324656e] Updating progress to 100 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:03:27.496 DEBUG oslo_concurrency.lockutils [req-bb7ed864-16cb-4ce5-8373-482cf2f2692d tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Lock "compute_resources" released by "update_usage" :: held 0.232s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:03:27.497 DEBUG nova.compute.utils [req-bb7ed864-16cb-4ce5-8373-482cf2f2692d tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Using /dev/xvd instead of None get_next_device_name /opt/stack/new/nova/nova/compute/utils.py:163 2015-08-07 17:03:27.502 13318 DEBUG nova.compute.manager [-] [instance: cf8ff930-deb3-448f-bcd3-f5a84fdad9b4] Allocating IP information in the background. 
_allocate_network_async /opt/stack/new/nova/nova/compute/manager.py:1513 2015-08-07 17:03:27.504 13318 DEBUG oslo_concurrency.lockutils [-] Acquired semaphore "refresh_cache-cf8ff930-deb3-448f-bcd3-f5a84fdad9b4" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:197 2015-08-07 17:03:28.528 DEBUG nova.virt.xenapi.vmops [req-bb7ed864-16cb-4ce5-8373-482cf2f2692d tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: cf8ff930-deb3-448f-bcd3-f5a84fdad9b4] Block device information present: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} spawn /opt/stack/new/nova/nova/virt/xenapi/vmops.py:408 2015-08-07 17:03:28.579 DEBUG nova.virt.xenapi.vm_utils [req-bb7ed864-16cb-4ce5-8373-482cf2f2692d tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Detected vhd format for image a69d8d55-1745-492f-be26-cc75d64fc94d determine_disk_image_type /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1633 2015-08-07 17:03:28.580 DEBUG nova.virt.xenapi.vmops [req-bb7ed864-16cb-4ce5-8373-482cf2f2692d tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: cf8ff930-deb3-448f-bcd3-f5a84fdad9b4] Updating progress to 10 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:03:29.113 DEBUG nova.virt.xenapi.vmops [req-bb7ed864-16cb-4ce5-8373-482cf2f2692d tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: cf8ff930-deb3-448f-bcd3-f5a84fdad9b4] block device info: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} _connect_cinder_volumes /opt/stack/new/nova/nova/virt/xenapi/vmops.py:376 2015-08-07 17:03:29.142 DEBUG oslo_concurrency.lockutils [req-bb7ed864-16cb-4ce5-8373-482cf2f2692d tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" acquired by "_create_cached_image_impl" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:03:31.581 DEBUG oslo_concurrency.lockutils [req-5c7638f3-001e-4b0c-ab43-1797aef36769 tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] Lock "70a01a0b-d094-423f-b85d-bbceb324656e" acquired by "do_confirm_resize" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:03:31.582 DEBUG nova.compute.manager [req-5c7638f3-001e-4b0c-ab43-1797aef36769 tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] [instance: 70a01a0b-d094-423f-b85d-bbceb324656e] Going to confirm migration 1 do_confirm_resize /opt/stack/new/nova/nova/compute/manager.py:3243 2015-08-07 17:03:32.085 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.59 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:03:33.414 DEBUG nova.compute.manager [req-832b05bb-adbe-4ec5-b142-478af7d69236 tempest-FixedIPsTestJson-753908824 tempest-FixedIPsTestJson-455161950] [instance: 45ad1b4a-a958-4bde-b784-c49cfb174b36] terminating bdm 
BlockDeviceMapping(boot_index=0,connection_info=None,created_at=2015-08-07T17:01:53Z,delete_on_termination=True,deleted=False,deleted_at=None,destination_type='local',device_name='/dev/xvda',device_type='disk',disk_bus=None,guest_format=None,id=3,image_id='a69d8d55-1745-492f-be26-cc75d64fc94d',instance=,instance_uuid=45ad1b4a-a958-4bde-b784-c49cfb174b36,no_device=False,snapshot_id=None,source_type='image',updated_at=2015-08-07T17:01:57Z,volume_id=None,volume_size=None) _cleanup_volumes /opt/stack/new/nova/nova/compute/manager.py:2272 2015-08-07 17:03:33.666 DEBUG nova.virt.xenapi.vm_utils [req-bb7ed864-16cb-4ce5-8373-482cf2f2692d tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Cloned VDI OpaqueRef:77d12225-613d-2bef-cbb1-f360f783d991 from VDI OpaqueRef:f392c64e-7886-a0cc-d3a0-780674167307 _clone_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:571 2015-08-07 17:03:33.710 DEBUG oslo_concurrency.lockutils [req-832b05bb-adbe-4ec5-b142-478af7d69236 tempest-FixedIPsTestJson-753908824 tempest-FixedIPsTestJson-455161950] Lock "compute_resources" acquired by "update_usage" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:03:33.711 DEBUG nova.objects.instance [req-832b05bb-adbe-4ec5-b142-478af7d69236 tempest-FixedIPsTestJson-753908824 tempest-FixedIPsTestJson-455161950] Lazy-loading `numa_topology' on Instance uuid 45ad1b4a-a958-4bde-b784-c49cfb174b36 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:03:34.228 DEBUG oslo_concurrency.lockutils [req-832b05bb-adbe-4ec5-b142-478af7d69236 tempest-FixedIPsTestJson-753908824 tempest-FixedIPsTestJson-455161950] Lock "compute_resources" released by "update_usage" :: held 0.518s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:03:36.267 DEBUG oslo_concurrency.lockutils [req-5c7638f3-001e-4b0c-ab43-1797aef36769 tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] Acquired semaphore "refresh_cache-70a01a0b-d094-423f-b85d-bbceb324656e" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:197 2015-08-07 17:03:37.057 DEBUG oslo_concurrency.lockutils [req-bb7ed864-16cb-4ce5-8373-482cf2f2692d tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" released by "_create_cached_image_impl" :: held 7.915s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:03:37.064 INFO nova.virt.xenapi.vm_utils [req-bb7ed864-16cb-4ce5-8373-482cf2f2692d tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Image creation data, cacheable: True, downloaded: False duration: 7.95 secs for image a69d8d55-1745-492f-be26-cc75d64fc94d 2015-08-07 17:03:37.074 DEBUG nova.network.base_api [req-5c7638f3-001e-4b0c-ab43-1797aef36769 tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] [instance: 70a01a0b-d094-423f-b85d-bbceb324656e] Updating instance_info_cache with network_info: [VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.4'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': 
{}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:01:ce:c8', 'active': False, 'type': u'bridge', 'id': u'c019c9e9-4973-4b46-8c5d-3535253dedf0', 'qbg_params': None})] update_instance_cache_with_nw_info /opt/stack/new/nova/nova/network/base_api.py:43 2015-08-07 17:03:37.085 DEBUG oslo_concurrency.lockutils [req-832b05bb-adbe-4ec5-b142-478af7d69236 tempest-FixedIPsTestJson-753908824 tempest-FixedIPsTestJson-455161950] Lock "45ad1b4a-a958-4bde-b784-c49cfb174b36" released by "do_terminate_instance" :: held 16.705s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:03:37.147 DEBUG oslo_concurrency.lockutils [req-5c7638f3-001e-4b0c-ab43-1797aef36769 tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] Releasing semaphore "refresh_cache-70a01a0b-d094-423f-b85d-bbceb324656e" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:210 2015-08-07 17:03:37.167 WARNING nova.virt.xenapi.vm_utils [req-5c7638f3-001e-4b0c-ab43-1797aef36769 tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] [instance: 70a01a0b-d094-423f-b85d-bbceb324656e] VM already halted, skipping shutdown... 2015-08-07 17:03:37.224 DEBUG nova.virt.xenapi.vmops [req-5c7638f3-001e-4b0c-ab43-1797aef36769 tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] [instance: 70a01a0b-d094-423f-b85d-bbceb324656e] Destroying VDIs _destroy_vdis /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1428 2015-08-07 17:03:37.237 DEBUG nova.virt.xenapi.vm_utils [req-5c7638f3-001e-4b0c-ab43-1797aef36769 tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] VDI d5e8528b-c0bc-4efb-b046-260e106a955b is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:03:37.251 DEBUG nova.virt.xenapi.vm_utils [req-5c7638f3-001e-4b0c-ab43-1797aef36769 tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] VDI 99743d47-fb18-4490-ac2e-2091765a8dac is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:03:37.452 13318 DEBUG nova.network.base_api [-] [instance: cf8ff930-deb3-448f-bcd3-f5a84fdad9b4] Updating instance_info_cache with network_info: [VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.2'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, 
u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:4b:61:5b', 'active': False, 'type': u'bridge', 'id': u'1d3147dc-aab6-4dd3-a93a-a902661ccb6b', 'qbg_params': None})] update_instance_cache_with_nw_info /opt/stack/new/nova/nova/network/base_api.py:43 2015-08-07 17:03:37.569 13318 DEBUG oslo_concurrency.lockutils [-] Releasing semaphore "refresh_cache-cf8ff930-deb3-448f-bcd3-f5a84fdad9b4" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:210 2015-08-07 17:03:37.738 13318 DEBUG nova.compute.manager [-] [instance: cf8ff930-deb3-448f-bcd3-f5a84fdad9b4] Instance network_info: |[VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.2'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:4b:61:5b', 'active': False, 'type': u'bridge', 'id': u'1d3147dc-aab6-4dd3-a93a-a902661ccb6b', 'qbg_params': None})]| _allocate_network_async /opt/stack/new/nova/nova/compute/manager.py:1531 2015-08-07 17:03:39.163 DEBUG nova.virt.xenapi.vmops [req-bb7ed864-16cb-4ce5-8373-482cf2f2692d tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: cf8ff930-deb3-448f-bcd3-f5a84fdad9b4] Updating progress to 20 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:03:39.432 DEBUG nova.virt.xenapi.vmops [req-5c7638f3-001e-4b0c-ab43-1797aef36769 tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] [instance: 70a01a0b-d094-423f-b85d-bbceb324656e] Using RAW or VHD, skipping kernel and ramdisk deletion _destroy_kernel_ramdisk /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1456 2015-08-07 17:03:39.508 DEBUG nova.virt.xenapi.vm_utils [req-5c7638f3-001e-4b0c-ab43-1797aef36769 tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] [instance: 70a01a0b-d094-423f-b85d-bbceb324656e] VM destroyed destroy_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:324 2015-08-07 17:03:39.556 DEBUG nova.virt.xenapi.vmops [req-bb7ed864-16cb-4ce5-8373-482cf2f2692d tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: cf8ff930-deb3-448f-bcd3-f5a84fdad9b4] Updating progress to 30 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:03:39.600 DEBUG oslo_concurrency.lockutils [req-5c7638f3-001e-4b0c-ab43-1797aef36769 tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] Lock "compute_resources" acquired by "drop_move_claim" :: waited 0.000s inner 
/usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:03:39.689 DEBUG oslo_concurrency.lockutils [req-5c7638f3-001e-4b0c-ab43-1797aef36769 tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] Lock "compute_resources" released by "drop_move_claim" :: held 0.089s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:03:39.955 DEBUG nova.virt.xenapi.vmops [req-bb7ed864-16cb-4ce5-8373-482cf2f2692d tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: cf8ff930-deb3-448f-bcd3-f5a84fdad9b4] Using PV kernel: True _create_vm_record /opt/stack/new/nova/nova/virt/xenapi/vmops.py:679 2015-08-07 17:03:39.968 DEBUG nova.virt.xenapi.vm_utils [req-bb7ed864-16cb-4ce5-8373-482cf2f2692d tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: cf8ff930-deb3-448f-bcd3-f5a84fdad9b4] Created VM create_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:312 2015-08-07 17:03:39.969 DEBUG nova.virt.xenapi.vmops [req-bb7ed864-16cb-4ce5-8373-482cf2f2692d tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: cf8ff930-deb3-448f-bcd3-f5a84fdad9b4] Updating progress to 40 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:03:40.109 DEBUG oslo_concurrency.lockutils [req-5c7638f3-001e-4b0c-ab43-1797aef36769 tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] Lock "70a01a0b-d094-423f-b85d-bbceb324656e" released by "do_confirm_resize" :: held 8.528s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:03:40.648 DEBUG nova.virt.xenapi.vm_utils [req-bb7ed864-16cb-4ce5-8373-482cf2f2692d tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Creating disk-type VBD for VM OpaqueRef:78df29bf-277b-07e3-4124-b88748c25666, VDI OpaqueRef:77d12225-613d-2bef-cbb1-f360f783d991 ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:03:40.659 DEBUG nova.virt.xenapi.vm_utils [req-bb7ed864-16cb-4ce5-8373-482cf2f2692d tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Created VBD OpaqueRef:00525c37-fa0c-93e4-bfd2-e03495a50e3a for VM OpaqueRef:78df29bf-277b-07e3-4124-b88748c25666, VDI OpaqueRef:77d12225-613d-2bef-cbb1-f360f783d991. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:03:41.267 DEBUG nova.virt.xenapi.vm_utils [req-bb7ed864-16cb-4ce5-8373-482cf2f2692d tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Created VDI OpaqueRef:2718286c-dee7-7d10-0aa4-7547ef097c42 (config-2, 67108864, False) on OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99. create_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:509 2015-08-07 17:03:41.271 DEBUG nova.virt.xenapi.vm_utils [req-bb7ed864-16cb-4ce5-8373-482cf2f2692d tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Creating disk-type VBD for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:2718286c-dee7-7d10-0aa4-7547ef097c42 ... 
create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:03:41.283 DEBUG nova.virt.xenapi.vm_utils [req-bb7ed864-16cb-4ce5-8373-482cf2f2692d tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Created VBD OpaqueRef:54d0ea86-fbf5-cdcd-75c3-4eb31c408c8d for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:2718286c-dee7-7d10-0aa4-7547ef097c42. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:03:41.284 DEBUG nova.virt.xenapi.vm_utils [req-bb7ed864-16cb-4ce5-8373-482cf2f2692d tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Plugging VBD OpaqueRef:54d0ea86-fbf5-cdcd-75c3-4eb31c408c8d ... vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2191 2015-08-07 17:03:41.285 DEBUG oslo_concurrency.lockutils [req-bb7ed864-16cb-4ce5-8373-482cf2f2692d tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_plug" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:03:41.751 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.92 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:03:42.021 DEBUG oslo_concurrency.lockutils [req-b8849d0f-57e0-4de5-9b9d-fab091f3e970 tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] Lock "70a01a0b-d094-423f-b85d-bbceb324656e" acquired by "do_terminate_instance" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:03:42.022 DEBUG oslo_concurrency.lockutils [req-b8849d0f-57e0-4de5-9b9d-fab091f3e970 tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] Lock "70a01a0b-d094-423f-b85d-bbceb324656e-events" acquired by "_clear_events" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:03:42.024 DEBUG oslo_concurrency.lockutils [req-b8849d0f-57e0-4de5-9b9d-fab091f3e970 tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] Lock "70a01a0b-d094-423f-b85d-bbceb324656e-events" released by "_clear_events" :: held 0.002s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:03:42.026 INFO nova.compute.manager [req-b8849d0f-57e0-4de5-9b9d-fab091f3e970 tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] [instance: 70a01a0b-d094-423f-b85d-bbceb324656e] Terminating instance 2015-08-07 17:03:42.027 INFO nova.virt.xenapi.vmops [req-b8849d0f-57e0-4de5-9b9d-fab091f3e970 tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] [instance: 70a01a0b-d094-423f-b85d-bbceb324656e] Destroying VM 2015-08-07 17:03:42.076 DEBUG nova.virt.xenapi.vm_utils [req-b8849d0f-57e0-4de5-9b9d-fab091f3e970 tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] [instance: 70a01a0b-d094-423f-b85d-bbceb324656e] Shutting down VM (hard) hard_shutdown_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:348 2015-08-07 17:03:45.436 DEBUG oslo_concurrency.lockutils [req-4e04e90c-7ade-478e-82b6-f1c842536131 tempest-FixedIPsNegativeTestJson-666106413 tempest-FixedIPsNegativeTestJson-807717977] Lock "1045cfe7-db64-498e-a1d4-a956bab07dcf" acquired by "_locked_do_build_and_run_instance" :: waited 0.000s inner 
/usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:03:45.929 INFO nova.compute.manager [req-4e04e90c-7ade-478e-82b6-f1c842536131 tempest-FixedIPsNegativeTestJson-666106413 tempest-FixedIPsNegativeTestJson-807717977] [instance: 1045cfe7-db64-498e-a1d4-a956bab07dcf] Starting instance... 2015-08-07 17:03:46.440 DEBUG oslo_concurrency.lockutils [req-bb7ed864-16cb-4ce5-8373-482cf2f2692d tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_plug" :: held 5.155s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:03:46.441 DEBUG nova.virt.xenapi.vm_utils [req-bb7ed864-16cb-4ce5-8373-482cf2f2692d tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Plugging VBD OpaqueRef:54d0ea86-fbf5-cdcd-75c3-4eb31c408c8d done. vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2194 2015-08-07 17:03:46.450 DEBUG nova.virt.xenapi.vm_utils [req-bb7ed864-16cb-4ce5-8373-482cf2f2692d tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] VBD OpaqueRef:54d0ea86-fbf5-cdcd-75c3-4eb31c408c8d plugged as xvdc vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2197 2015-08-07 17:03:46.581 DEBUG oslo_concurrency.lockutils [req-4e04e90c-7ade-478e-82b6-f1c842536131 tempest-FixedIPsNegativeTestJson-666106413 tempest-FixedIPsNegativeTestJson-807717977] Lock "compute_resources" acquired by "instance_claim" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:03:46.581 DEBUG nova.compute.resource_tracker [req-4e04e90c-7ade-478e-82b6-f1c842536131 tempest-FixedIPsNegativeTestJson-666106413 tempest-FixedIPsNegativeTestJson-807717977] Memory overhead for 64 MB instance; 5 MB instance_claim /opt/stack/new/nova/nova/compute/resource_tracker.py:126 2015-08-07 17:03:46.590 INFO nova.compute.claims [req-4e04e90c-7ade-478e-82b6-f1c842536131 tempest-FixedIPsNegativeTestJson-666106413 tempest-FixedIPsNegativeTestJson-807717977] [instance: 1045cfe7-db64-498e-a1d4-a956bab07dcf] Attempting claim: memory 69 MB, disk 0 GB 2015-08-07 17:03:46.590 INFO nova.compute.claims [req-4e04e90c-7ade-478e-82b6-f1c842536131 tempest-FixedIPsNegativeTestJson-666106413 tempest-FixedIPsNegativeTestJson-807717977] [instance: 1045cfe7-db64-498e-a1d4-a956bab07dcf] Total memory: 8187 MB, used: 715.00 MB 2015-08-07 17:03:46.591 INFO nova.compute.claims [req-4e04e90c-7ade-478e-82b6-f1c842536131 tempest-FixedIPsNegativeTestJson-666106413 tempest-FixedIPsNegativeTestJson-807717977] [instance: 1045cfe7-db64-498e-a1d4-a956bab07dcf] memory limit: 12280.50 MB, free: 11565.50 MB 2015-08-07 17:03:46.591 INFO nova.compute.claims [req-4e04e90c-7ade-478e-82b6-f1c842536131 tempest-FixedIPsNegativeTestJson-666106413 tempest-FixedIPsNegativeTestJson-807717977] [instance: 1045cfe7-db64-498e-a1d4-a956bab07dcf] Total disk: 27 GB, used: 0.00 GB 2015-08-07 17:03:46.592 INFO nova.compute.claims [req-4e04e90c-7ade-478e-82b6-f1c842536131 tempest-FixedIPsNegativeTestJson-666106413 tempest-FixedIPsNegativeTestJson-807717977] [instance: 1045cfe7-db64-498e-a1d4-a956bab07dcf] disk limit not specified, defaulting to unlimited 2015-08-07 17:03:46.599 WARNING nova.virt.configdrive [req-bb7ed864-16cb-4ce5-8373-482cf2f2692d tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] The setting "always" will be deprecated in the Liberty version. 
Please use "True" instead 2015-08-07 17:03:46.600 DEBUG nova.objects.instance [req-bb7ed864-16cb-4ce5-8373-482cf2f2692d tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Lazy-loading `ec2_ids' on Instance uuid cf8ff930-deb3-448f-bcd3-f5a84fdad9b4 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:03:46.651 DEBUG oslo_concurrency.processutils [req-bb7ed864-16cb-4ce5-8373-482cf2f2692d tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Running cmd (subprocess): genisoimage -o /tmp/tmplHQ0zf/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmppLReg0 execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:03:47.393 DEBUG nova.compute.resources.vcpu [req-4e04e90c-7ade-478e-82b6-f1c842536131 tempest-FixedIPsNegativeTestJson-666106413 tempest-FixedIPsNegativeTestJson-807717977] Total CPUs: 8 VCPUs, used: 2.00 VCPUs test /opt/stack/new/nova/nova/compute/resources/vcpu.py:52 2015-08-07 17:03:47.397 DEBUG nova.compute.resources.vcpu [req-4e04e90c-7ade-478e-82b6-f1c842536131 tempest-FixedIPsNegativeTestJson-666106413 tempest-FixedIPsNegativeTestJson-807717977] CPUs limit not specified, defaulting to unlimited test /opt/stack/new/nova/nova/compute/resources/vcpu.py:56 2015-08-07 17:03:47.398 INFO nova.compute.claims [req-4e04e90c-7ade-478e-82b6-f1c842536131 tempest-FixedIPsNegativeTestJson-666106413 tempest-FixedIPsNegativeTestJson-807717977] [instance: 1045cfe7-db64-498e-a1d4-a956bab07dcf] Claim successful 2015-08-07 17:03:47.554 DEBUG oslo_concurrency.processutils [req-bb7ed864-16cb-4ce5-8373-482cf2f2692d tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] CMD "genisoimage -o /tmp/tmplHQ0zf/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmppLReg0" returned: 0 in 0.903s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:03:47.558 DEBUG oslo_concurrency.processutils [req-bb7ed864-16cb-4ce5-8373-482cf2f2692d tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmplHQ0zf/configdrive of=/dev/xvdc oflag=direct,sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:03:49.572 DEBUG oslo_concurrency.lockutils [req-4e04e90c-7ade-478e-82b6-f1c842536131 tempest-FixedIPsNegativeTestJson-666106413 tempest-FixedIPsNegativeTestJson-807717977] Lock "compute_resources" released by "instance_claim" :: held 2.991s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:03:51.968 DEBUG oslo_concurrency.lockutils [req-4e04e90c-7ade-478e-82b6-f1c842536131 tempest-FixedIPsNegativeTestJson-666106413 tempest-FixedIPsNegativeTestJson-807717977] Lock "compute_resources" acquired by "update_usage" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:03:52.555 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.18 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:03:53.248 DEBUG oslo_concurrency.lockutils [req-4e04e90c-7ade-478e-82b6-f1c842536131 tempest-FixedIPsNegativeTestJson-666106413 tempest-FixedIPsNegativeTestJson-807717977] Lock "compute_resources" 
released by "update_usage" :: held 1.280s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:03:53.249 DEBUG nova.compute.utils [req-4e04e90c-7ade-478e-82b6-f1c842536131 tempest-FixedIPsNegativeTestJson-666106413 tempest-FixedIPsNegativeTestJson-807717977] Using /dev/xvd instead of None get_next_device_name /opt/stack/new/nova/nova/compute/utils.py:163 2015-08-07 17:03:53.254 13318 DEBUG nova.compute.manager [-] [instance: 1045cfe7-db64-498e-a1d4-a956bab07dcf] Allocating IP information in the background. _allocate_network_async /opt/stack/new/nova/nova/compute/manager.py:1513 2015-08-07 17:03:53.256 13318 DEBUG oslo_concurrency.lockutils [-] Acquired semaphore "refresh_cache-1045cfe7-db64-498e-a1d4-a956bab07dcf" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:197 2015-08-07 17:03:54.354 DEBUG nova.virt.xenapi.vmops [req-4e04e90c-7ade-478e-82b6-f1c842536131 tempest-FixedIPsNegativeTestJson-666106413 tempest-FixedIPsNegativeTestJson-807717977] [instance: 1045cfe7-db64-498e-a1d4-a956bab07dcf] Block device information present: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} spawn /opt/stack/new/nova/nova/virt/xenapi/vmops.py:408 2015-08-07 17:03:54.407 DEBUG nova.virt.xenapi.vm_utils [req-4e04e90c-7ade-478e-82b6-f1c842536131 tempest-FixedIPsNegativeTestJson-666106413 tempest-FixedIPsNegativeTestJson-807717977] Detected vhd format for image a69d8d55-1745-492f-be26-cc75d64fc94d determine_disk_image_type /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1633 2015-08-07 17:03:54.410 DEBUG nova.virt.xenapi.vmops [req-4e04e90c-7ade-478e-82b6-f1c842536131 tempest-FixedIPsNegativeTestJson-666106413 tempest-FixedIPsNegativeTestJson-807717977] [instance: 1045cfe7-db64-498e-a1d4-a956bab07dcf] Updating progress to 10 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:03:55.139 DEBUG nova.virt.xenapi.vmops [req-4e04e90c-7ade-478e-82b6-f1c842536131 tempest-FixedIPsNegativeTestJson-666106413 tempest-FixedIPsNegativeTestJson-807717977] [instance: 1045cfe7-db64-498e-a1d4-a956bab07dcf] block device info: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} _connect_cinder_volumes /opt/stack/new/nova/nova/virt/xenapi/vmops.py:376 2015-08-07 17:03:55.157 DEBUG oslo_concurrency.lockutils [req-4e04e90c-7ade-478e-82b6-f1c842536131 tempest-FixedIPsNegativeTestJson-666106413 tempest-FixedIPsNegativeTestJson-807717977] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" acquired by "_create_cached_image_impl" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:03:59.024 13318 DEBUG nova.network.base_api [-] [instance: 1045cfe7-db64-498e-a1d4-a956bab07dcf] Updating instance_info_cache with network_info: [VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.6'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': 
u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:ee:09:d5', 'active': False, 'type': u'bridge', 'id': u'77a34353-9a71-436d-952f-047dfd2422bc', 'qbg_params': None})] update_instance_cache_with_nw_info /opt/stack/new/nova/nova/network/base_api.py:43 2015-08-07 17:03:59.087 13318 DEBUG oslo_concurrency.lockutils [-] Releasing semaphore "refresh_cache-1045cfe7-db64-498e-a1d4-a956bab07dcf" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:210 2015-08-07 17:03:59.087 13318 DEBUG nova.compute.manager [-] [instance: 1045cfe7-db64-498e-a1d4-a956bab07dcf] Instance network_info: |[VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.6'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:ee:09:d5', 'active': False, 'type': u'bridge', 'id': u'77a34353-9a71-436d-952f-047dfd2422bc', 'qbg_params': None})]| _allocate_network_async /opt/stack/new/nova/nova/compute/manager.py:1531 2015-08-07 17:03:59.453 DEBUG nova.virt.xenapi.vm_utils [req-4e04e90c-7ade-478e-82b6-f1c842536131 tempest-FixedIPsNegativeTestJson-666106413 tempest-FixedIPsNegativeTestJson-807717977] Cloned VDI OpaqueRef:8972aea3-d003-d525-9a80-0111917968e6 from VDI OpaqueRef:f392c64e-7886-a0cc-d3a0-780674167307 _clone_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:571 2015-08-07 17:04:00.870 DEBUG oslo_concurrency.lockutils [req-4e04e90c-7ade-478e-82b6-f1c842536131 tempest-FixedIPsNegativeTestJson-666106413 tempest-FixedIPsNegativeTestJson-807717977] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" released by "_create_cached_image_impl" :: held 5.713s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:04:00.871 INFO nova.virt.xenapi.vm_utils [req-4e04e90c-7ade-478e-82b6-f1c842536131 tempest-FixedIPsNegativeTestJson-666106413 tempest-FixedIPsNegativeTestJson-807717977] Image creation data, cacheable: True, downloaded: False duration: 5.73 secs for image a69d8d55-1745-492f-be26-cc75d64fc94d 2015-08-07 17:04:02.467 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.79 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:04:02.750 DEBUG nova.virt.xenapi.vmops [req-b8849d0f-57e0-4de5-9b9d-fab091f3e970 tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] [instance: 70a01a0b-d094-423f-b85d-bbceb324656e] Destroying 
VDIs _destroy_vdis /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1428 2015-08-07 17:04:02.835 DEBUG nova.virt.xenapi.vm_utils [req-b8849d0f-57e0-4de5-9b9d-fab091f3e970 tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] VDI a4f5df8d-c843-4b05-bf91-c25b18858d25 is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:04:02.903 DEBUG nova.virt.xenapi.vm_utils [req-b8849d0f-57e0-4de5-9b9d-fab091f3e970 tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] VDI 9350465a-b199-4904-ab8c-1e677aa9dcd6 is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:04:04.509 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:04:05.241 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 2.27 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:04:07.781 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:04:07.782 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Starting heal instance info cache _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5315 2015-08-07 17:04:07.783 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Rebuilding the list of instances to heal _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5319 2015-08-07 17:04:08.020 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: 70a01a0b-d094-423f-b85d-bbceb324656e] Skipping network cache update for instance because it is being deleted. _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5333 2015-08-07 17:04:08.348 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: cf8ff930-deb3-448f-bcd3-f5a84fdad9b4] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5329 2015-08-07 17:04:08.349 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: 1045cfe7-db64-498e-a1d4-a956bab07dcf] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5329 2015-08-07 17:04:08.350 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Didn't find any instances for network info cache update. 
_heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5386 2015-08-07 17:04:08.351 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 0.72 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:04:09.072 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:04:09.073 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /opt/stack/new/nova/nova/compute/manager.py:5976 2015-08-07 17:04:09.074 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:04:09.074 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 0.43 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:04:09.090 DEBUG nova.virt.xenapi.vmops [req-4e04e90c-7ade-478e-82b6-f1c842536131 tempest-FixedIPsNegativeTestJson-666106413 tempest-FixedIPsNegativeTestJson-807717977] [instance: 1045cfe7-db64-498e-a1d4-a956bab07dcf] Updating progress to 20 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:04:09.509 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:04:09.510 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:04:09.511 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 1.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:04:09.516 DEBUG nova.virt.xenapi.vmops [req-b8849d0f-57e0-4de5-9b9d-fab091f3e970 tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] [instance: 70a01a0b-d094-423f-b85d-bbceb324656e] Using RAW or VHD, skipping kernel and ramdisk deletion _destroy_kernel_ramdisk /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1456 2015-08-07 17:04:09.548 DEBUG nova.virt.xenapi.vm_utils [req-b8849d0f-57e0-4de5-9b9d-fab091f3e970 tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] [instance: 70a01a0b-d094-423f-b85d-bbceb324656e] VM destroyed destroy_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:324 2015-08-07 17:04:09.549 DEBUG nova.compute.manager [req-b8849d0f-57e0-4de5-9b9d-fab091f3e970 tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] [instance: 70a01a0b-d094-423f-b85d-bbceb324656e] Deallocating network for instance _deallocate_network /opt/stack/new/nova/nova/compute/manager.py:1792 2015-08-07 17:04:09.599 DEBUG nova.virt.xenapi.vmops [req-4e04e90c-7ade-478e-82b6-f1c842536131 tempest-FixedIPsNegativeTestJson-666106413 
tempest-FixedIPsNegativeTestJson-807717977] [instance: 1045cfe7-db64-498e-a1d4-a956bab07dcf] Updating progress to 30 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:04:10.008 DEBUG nova.virt.xenapi.vmops [req-4e04e90c-7ade-478e-82b6-f1c842536131 tempest-FixedIPsNegativeTestJson-666106413 tempest-FixedIPsNegativeTestJson-807717977] [instance: 1045cfe7-db64-498e-a1d4-a956bab07dcf] Using PV kernel: True _create_vm_record /opt/stack/new/nova/nova/virt/xenapi/vmops.py:679 2015-08-07 17:04:10.034 DEBUG nova.virt.xenapi.vm_utils [req-4e04e90c-7ade-478e-82b6-f1c842536131 tempest-FixedIPsNegativeTestJson-666106413 tempest-FixedIPsNegativeTestJson-807717977] [instance: 1045cfe7-db64-498e-a1d4-a956bab07dcf] Created VM create_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:312 2015-08-07 17:04:10.035 DEBUG nova.virt.xenapi.vmops [req-4e04e90c-7ade-478e-82b6-f1c842536131 tempest-FixedIPsNegativeTestJson-666106413 tempest-FixedIPsNegativeTestJson-807717977] [instance: 1045cfe7-db64-498e-a1d4-a956bab07dcf] Updating progress to 40 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:04:10.510 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:04:10.516 DEBUG nova.virt.xenapi.vm_utils [req-4e04e90c-7ade-478e-82b6-f1c842536131 tempest-FixedIPsNegativeTestJson-666106413 tempest-FixedIPsNegativeTestJson-807717977] Creating disk-type VBD for VM OpaqueRef:c6b65872-8607-f256-7650-26736cbf6c8c, VDI OpaqueRef:8972aea3-d003-d525-9a80-0111917968e6 ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:04:10.526 DEBUG nova.virt.xenapi.vm_utils [req-4e04e90c-7ade-478e-82b6-f1c842536131 tempest-FixedIPsNegativeTestJson-666106413 tempest-FixedIPsNegativeTestJson-807717977] Created VBD OpaqueRef:90607e3c-4396-7afd-4422-8a36ee2067c9 for VM OpaqueRef:c6b65872-8607-f256-7650-26736cbf6c8c, VDI OpaqueRef:8972aea3-d003-d525-9a80-0111917968e6. 
create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:04:10.559 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Auditing locally available compute resources for node localhost.localdomain 2015-08-07 17:04:10.560 DEBUG nova.virt.xenapi.host [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Updating host stats update_status /opt/stack/new/nova/nova/virt/xenapi/host.py:234 2015-08-07 17:04:12.090 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" acquired by "do_scan" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:04:12.091 DEBUG nova.virt.xenapi.vm_utils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Scanning SR OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99 do_scan /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1839 2015-08-07 17:04:12.387 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.88 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:04:12.687 DEBUG nova.virt.xenapi.vm_utils [req-4e04e90c-7ade-478e-82b6-f1c842536131 tempest-FixedIPsNegativeTestJson-666106413 tempest-FixedIPsNegativeTestJson-807717977] Created VDI OpaqueRef:582a3b6f-ea30-d87f-83bc-9d313c9a3066 (config-2, 67108864, False) on OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99. create_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:509 2015-08-07 17:04:12.693 DEBUG nova.virt.xenapi.vm_utils [req-4e04e90c-7ade-478e-82b6-f1c842536131 tempest-FixedIPsNegativeTestJson-666106413 tempest-FixedIPsNegativeTestJson-807717977] Creating disk-type VBD for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:582a3b6f-ea30-d87f-83bc-9d313c9a3066 ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:04:12.706 DEBUG nova.virt.xenapi.vm_utils [req-4e04e90c-7ade-478e-82b6-f1c842536131 tempest-FixedIPsNegativeTestJson-666106413 tempest-FixedIPsNegativeTestJson-807717977] Created VBD OpaqueRef:6cb4b397-4e46-4d15-7911-5d14c8cab825 for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:582a3b6f-ea30-d87f-83bc-9d313c9a3066. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:04:12.707 DEBUG nova.virt.xenapi.vm_utils [req-4e04e90c-7ade-478e-82b6-f1c842536131 tempest-FixedIPsNegativeTestJson-666106413 tempest-FixedIPsNegativeTestJson-807717977] Plugging VBD OpaqueRef:6cb4b397-4e46-4d15-7911-5d14c8cab825 ... 
vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2191 2015-08-07 17:04:12.707 DEBUG oslo_concurrency.lockutils [req-4e04e90c-7ade-478e-82b6-f1c842536131 tempest-FixedIPsNegativeTestJson-666106413 tempest-FixedIPsNegativeTestJson-807717977] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_plug" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:04:13.851 DEBUG nova.compute.manager [req-b8849d0f-57e0-4de5-9b9d-fab091f3e970 tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] [instance: 70a01a0b-d094-423f-b85d-bbceb324656e] terminating bdm BlockDeviceMapping(boot_index=0,connection_info=None,created_at=2015-08-07T17:00:48Z,delete_on_termination=True,deleted=False,deleted_at=None,destination_type='local',device_name='/dev/xvda',device_type='disk',disk_bus=None,guest_format=None,id=2,image_id='a69d8d55-1745-492f-be26-cc75d64fc94d',instance=,instance_uuid=70a01a0b-d094-423f-b85d-bbceb324656e,no_device=False,snapshot_id=None,source_type='image',updated_at=2015-08-07T17:00:49Z,volume_id=None,volume_size=None) _cleanup_volumes /opt/stack/new/nova/nova/compute/manager.py:2272 2015-08-07 17:04:14.227 DEBUG oslo_concurrency.lockutils [req-b8849d0f-57e0-4de5-9b9d-fab091f3e970 tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] Lock "compute_resources" acquired by "update_usage" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:04:14.228 DEBUG nova.objects.instance [req-b8849d0f-57e0-4de5-9b9d-fab091f3e970 tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] Lazy-loading `numa_topology' on Instance uuid 70a01a0b-d094-423f-b85d-bbceb324656e obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:04:14.521 DEBUG oslo_concurrency.lockutils [req-b8849d0f-57e0-4de5-9b9d-fab091f3e970 tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] Lock "compute_resources" released by "update_usage" :: held 0.294s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:04:15.380 DEBUG oslo_concurrency.lockutils [req-b8849d0f-57e0-4de5-9b9d-fab091f3e970 tempest-MigrationsAdminTest-668975472 tempest-MigrationsAdminTest-2113731473] Lock "70a01a0b-d094-423f-b85d-bbceb324656e" released by "do_terminate_instance" :: held 33.359s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:04:16.153 DEBUG oslo_concurrency.processutils [req-bb7ed864-16cb-4ce5-8373-482cf2f2692d tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmplHQ0zf/configdrive of=/dev/xvdc oflag=direct,sync" returned: 0 in 28.596s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:04:16.155 DEBUG oslo_concurrency.processutils [req-bb7ed864-16cb-4ce5-8373-482cf2f2692d tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:04:18.763 DEBUG oslo_concurrency.processutils [req-bb7ed864-16cb-4ce5-8373-482cf2f2692d tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf sync" returned: 0 in 2.608s 
execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:04:18.766 DEBUG nova.virt.xenapi.vm_utils [req-bb7ed864-16cb-4ce5-8373-482cf2f2692d tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Destroying VBD for VDI OpaqueRef:2718286c-dee7-7d10-0aa4-7547ef097c42 ... vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2207 2015-08-07 17:04:18.816 DEBUG oslo_concurrency.lockutils [req-4e04e90c-7ade-478e-82b6-f1c842536131 tempest-FixedIPsNegativeTestJson-666106413 tempest-FixedIPsNegativeTestJson-807717977] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_plug" :: held 6.108s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:04:18.817 DEBUG nova.virt.xenapi.vm_utils [req-4e04e90c-7ade-478e-82b6-f1c842536131 tempest-FixedIPsNegativeTestJson-666106413 tempest-FixedIPsNegativeTestJson-807717977] Plugging VBD OpaqueRef:6cb4b397-4e46-4d15-7911-5d14c8cab825 done. vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2194 2015-08-07 17:04:18.818 DEBUG oslo_concurrency.lockutils [req-bb7ed864-16cb-4ce5-8373-482cf2f2692d tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_unplug" :: waited 0.051s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:04:18.822 DEBUG nova.virt.xenapi.vm_utils [req-4e04e90c-7ade-478e-82b6-f1c842536131 tempest-FixedIPsNegativeTestJson-666106413 tempest-FixedIPsNegativeTestJson-807717977] VBD OpaqueRef:6cb4b397-4e46-4d15-7911-5d14c8cab825 plugged as xvdd vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2197 2015-08-07 17:04:19.030 WARNING nova.virt.configdrive [req-4e04e90c-7ade-478e-82b6-f1c842536131 tempest-FixedIPsNegativeTestJson-666106413 tempest-FixedIPsNegativeTestJson-807717977] The setting "always" will be deprecated in the Liberty version. 
Please use "True" instead 2015-08-07 17:04:19.031 DEBUG nova.objects.instance [req-4e04e90c-7ade-478e-82b6-f1c842536131 tempest-FixedIPsNegativeTestJson-666106413 tempest-FixedIPsNegativeTestJson-807717977] Lazy-loading `ec2_ids' on Instance uuid 1045cfe7-db64-498e-a1d4-a956bab07dcf obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:04:19.095 DEBUG oslo_concurrency.processutils [req-4e04e90c-7ade-478e-82b6-f1c842536131 tempest-FixedIPsNegativeTestJson-666106413 tempest-FixedIPsNegativeTestJson-807717977] Running cmd (subprocess): genisoimage -o /tmp/tmpdQf6gn/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmpjSG6Gf execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:04:20.081 DEBUG oslo_concurrency.processutils [req-4e04e90c-7ade-478e-82b6-f1c842536131 tempest-FixedIPsNegativeTestJson-666106413 tempest-FixedIPsNegativeTestJson-807717977] CMD "genisoimage -o /tmp/tmpdQf6gn/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmpjSG6Gf" returned: 0 in 0.985s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:04:20.087 DEBUG oslo_concurrency.processutils [req-4e04e90c-7ade-478e-82b6-f1c842536131 tempest-FixedIPsNegativeTestJson-666106413 tempest-FixedIPsNegativeTestJson-807717977] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmpdQf6gn/configdrive of=/dev/xvdd oflag=direct,sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:04:21.986 DEBUG oslo_concurrency.lockutils [req-bb7ed864-16cb-4ce5-8373-482cf2f2692d tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_unplug" :: held 3.167s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:04:22.001 DEBUG nova.virt.xenapi.vm_utils [req-bb7ed864-16cb-4ce5-8373-482cf2f2692d tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Destroying VBD for VDI OpaqueRef:2718286c-dee7-7d10-0aa4-7547ef097c42 done. vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2215 2015-08-07 17:04:22.002 DEBUG nova.virt.xenapi.vm_utils [req-bb7ed864-16cb-4ce5-8373-482cf2f2692d tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Creating disk-type VBD for VM OpaqueRef:78df29bf-277b-07e3-4124-b88748c25666, VDI OpaqueRef:2718286c-dee7-7d10-0aa4-7547ef097c42 ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:04:22.044 DEBUG nova.virt.xenapi.vm_utils [req-bb7ed864-16cb-4ce5-8373-482cf2f2692d tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Created VBD OpaqueRef:a1817ecc-706b-0e80-e67a-1810aaf19e37 for VM OpaqueRef:78df29bf-277b-07e3-4124-b88748c25666, VDI OpaqueRef:2718286c-dee7-7d10-0aa4-7547ef097c42. 
create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:04:22.046 DEBUG nova.objects.instance [req-bb7ed864-16cb-4ce5-8373-482cf2f2692d tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Lazy-loading `pci_devices' on Instance uuid cf8ff930-deb3-448f-bcd3-f5a84fdad9b4 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:04:22.274 DEBUG nova.virt.xenapi.vmops [req-bb7ed864-16cb-4ce5-8373-482cf2f2692d tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: cf8ff930-deb3-448f-bcd3-f5a84fdad9b4] Updating progress to 50 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:04:22.439 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.83 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:04:22.843 DEBUG oslo_concurrency.lockutils [req-bb7ed864-16cb-4ce5-8373-482cf2f2692d tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Lock "xenstore-cf8ff930-deb3-448f-bcd3-f5a84fdad9b4" acquired by "store_meta" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:04:22.844 DEBUG oslo_concurrency.lockutils [req-bb7ed864-16cb-4ce5-8373-482cf2f2692d tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Lock "xenstore-cf8ff930-deb3-448f-bcd3-f5a84fdad9b4" released by "store_meta" :: held 0.001s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:04:22.845 DEBUG oslo_concurrency.lockutils [req-bb7ed864-16cb-4ce5-8373-482cf2f2692d tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Lock "xenstore-cf8ff930-deb3-448f-bcd3-f5a84fdad9b4" acquired by "store_auto_disk_config" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:04:22.898 DEBUG oslo_concurrency.lockutils [req-bb7ed864-16cb-4ce5-8373-482cf2f2692d tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Lock "xenstore-cf8ff930-deb3-448f-bcd3-f5a84fdad9b4" released by "store_auto_disk_config" :: held 0.054s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:04:22.900 DEBUG nova.virt.xenapi.vmops [req-bb7ed864-16cb-4ce5-8373-482cf2f2692d tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: cf8ff930-deb3-448f-bcd3-f5a84fdad9b4] Injecting hostname (tempest-server-395672701) into xenstore _inject_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1940 2015-08-07 17:04:22.901 DEBUG oslo_concurrency.lockutils [req-bb7ed864-16cb-4ce5-8373-482cf2f2692d tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Lock "xenstore-cf8ff930-deb3-448f-bcd3-f5a84fdad9b4" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:04:22.928 DEBUG oslo_concurrency.lockutils [req-bb7ed864-16cb-4ce5-8373-482cf2f2692d tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Lock "xenstore-cf8ff930-deb3-448f-bcd3-f5a84fdad9b4" released by "update_hostname" :: held 0.027s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:04:22.929 DEBUG nova.virt.xenapi.vmops [req-bb7ed864-16cb-4ce5-8373-482cf2f2692d tempest-ServersAdminTestJSON-1589181083 
tempest-ServersAdminTestJSON-1030271038] [instance: cf8ff930-deb3-448f-bcd3-f5a84fdad9b4] Injecting network info to xenstore inject_network_info /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1869 2015-08-07 17:04:22.930 DEBUG oslo_concurrency.lockutils [req-bb7ed864-16cb-4ce5-8373-482cf2f2692d tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Lock "xenstore-cf8ff930-deb3-448f-bcd3-f5a84fdad9b4" acquired by "update_nwinfo" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:04:23.243 DEBUG oslo_concurrency.lockutils [req-bb7ed864-16cb-4ce5-8373-482cf2f2692d tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Lock "xenstore-cf8ff930-deb3-448f-bcd3-f5a84fdad9b4" released by "update_nwinfo" :: held 0.313s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:04:23.243 DEBUG nova.virt.xenapi.vmops [req-bb7ed864-16cb-4ce5-8373-482cf2f2692d tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: cf8ff930-deb3-448f-bcd3-f5a84fdad9b4] Updating progress to 60 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:04:23.799 DEBUG nova.virt.xenapi.vmops [req-bb7ed864-16cb-4ce5-8373-482cf2f2692d tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: cf8ff930-deb3-448f-bcd3-f5a84fdad9b4] Creating vifs _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1892 2015-08-07 17:04:23.809 DEBUG nova.virt.xenapi.vmops [req-bb7ed864-16cb-4ce5-8373-482cf2f2692d tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: cf8ff930-deb3-448f-bcd3-f5a84fdad9b4] Creating VIF for network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1902 2015-08-07 17:04:23.824 DEBUG nova.virt.xenapi.vmops [req-bb7ed864-16cb-4ce5-8373-482cf2f2692d tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: cf8ff930-deb3-448f-bcd3-f5a84fdad9b4] Created VIF OpaqueRef:713d395b-ab8c-5376-d897-64366a8711d3, network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1906 2015-08-07 17:04:23.825 DEBUG nova.virt.xenapi.vmops [req-bb7ed864-16cb-4ce5-8373-482cf2f2692d tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: cf8ff930-deb3-448f-bcd3-f5a84fdad9b4] Updating progress to 70 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:04:24.190 DEBUG nova.virt.xenapi.vmops [req-bb7ed864-16cb-4ce5-8373-482cf2f2692d tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: cf8ff930-deb3-448f-bcd3-f5a84fdad9b4] Starting instance _start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:327 2015-08-07 17:04:27.218 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" released by "do_scan" :: held 15.128s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:04:27.697 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor: free VCPUs: 2 _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:524 2015-08-07 17:04:27.698 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None 
None] Hypervisor: assignable PCI devices: [] _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:532 2015-08-07 17:04:27.698 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor/Node resource view: name=localhost.localdomain free_ram=989MB free_disk=16GB free_vcpus=2 pci_devices=[] _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:546 2015-08-07 17:04:27.699 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "compute_resources" acquired by "_update_available_resource" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:04:28.107 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Total usable vcpus: 8, total allocated vcpus: 6 2015-08-07 17:04:28.108 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Final resource view: name=localhost.localdomain phys_ram=8187MB used_ram=650MB phys_disk=27GB used_disk=0GB total_vcpus=8 used_vcpus=6 pci_stats=None 2015-08-07 17:04:28.473 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Compute_service record updated for devstack:localhost.localdomain 2015-08-07 17:04:28.474 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "compute_resources" released by "_update_available_resource" :: held 0.775s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:04:28.475 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:04:28.476 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 6.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:04:32.821 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.45 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:04:34.475 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:04:34.476 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 27.03 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:04:35.652 DEBUG oslo_concurrency.processutils [req-4e04e90c-7ade-478e-82b6-f1c842536131 tempest-FixedIPsNegativeTestJson-666106413 tempest-FixedIPsNegativeTestJson-807717977] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmpdQf6gn/configdrive of=/dev/xvdd oflag=direct,sync" returned: 0 in 15.566s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:04:35.698 DEBUG oslo_concurrency.processutils [req-4e04e90c-7ade-478e-82b6-f1c842536131 tempest-FixedIPsNegativeTestJson-666106413 tempest-FixedIPsNegativeTestJson-807717977] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf sync execute 
/usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:04:37.313 DEBUG oslo_concurrency.processutils [req-4e04e90c-7ade-478e-82b6-f1c842536131 tempest-FixedIPsNegativeTestJson-666106413 tempest-FixedIPsNegativeTestJson-807717977] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf sync" returned: 0 in 1.614s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:04:37.320 DEBUG nova.virt.xenapi.vm_utils [req-4e04e90c-7ade-478e-82b6-f1c842536131 tempest-FixedIPsNegativeTestJson-666106413 tempest-FixedIPsNegativeTestJson-807717977] Destroying VBD for VDI OpaqueRef:582a3b6f-ea30-d87f-83bc-9d313c9a3066 ... vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2207 2015-08-07 17:04:37.322 DEBUG oslo_concurrency.lockutils [req-4e04e90c-7ade-478e-82b6-f1c842536131 tempest-FixedIPsNegativeTestJson-666106413 tempest-FixedIPsNegativeTestJson-807717977] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_unplug" :: waited 0.001s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:04:40.226 DEBUG oslo_concurrency.lockutils [req-4e04e90c-7ade-478e-82b6-f1c842536131 tempest-FixedIPsNegativeTestJson-666106413 tempest-FixedIPsNegativeTestJson-807717977] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_unplug" :: held 2.904s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:04:40.242 DEBUG nova.virt.xenapi.vm_utils [req-4e04e90c-7ade-478e-82b6-f1c842536131 tempest-FixedIPsNegativeTestJson-666106413 tempest-FixedIPsNegativeTestJson-807717977] Destroying VBD for VDI OpaqueRef:582a3b6f-ea30-d87f-83bc-9d313c9a3066 done. vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2215 2015-08-07 17:04:40.243 DEBUG nova.virt.xenapi.vm_utils [req-4e04e90c-7ade-478e-82b6-f1c842536131 tempest-FixedIPsNegativeTestJson-666106413 tempest-FixedIPsNegativeTestJson-807717977] Creating disk-type VBD for VM OpaqueRef:c6b65872-8607-f256-7650-26736cbf6c8c, VDI OpaqueRef:582a3b6f-ea30-d87f-83bc-9d313c9a3066 ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:04:40.265 DEBUG nova.virt.xenapi.vm_utils [req-4e04e90c-7ade-478e-82b6-f1c842536131 tempest-FixedIPsNegativeTestJson-666106413 tempest-FixedIPsNegativeTestJson-807717977] Created VBD OpaqueRef:f94d3d4e-494c-e6b5-d462-dceaad93d828 for VM OpaqueRef:c6b65872-8607-f256-7650-26736cbf6c8c, VDI OpaqueRef:582a3b6f-ea30-d87f-83bc-9d313c9a3066. 
create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:04:40.266 DEBUG nova.objects.instance [req-4e04e90c-7ade-478e-82b6-f1c842536131 tempest-FixedIPsNegativeTestJson-666106413 tempest-FixedIPsNegativeTestJson-807717977] Lazy-loading `pci_devices' on Instance uuid 1045cfe7-db64-498e-a1d4-a956bab07dcf obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:04:40.536 DEBUG nova.virt.xenapi.vmops [req-4e04e90c-7ade-478e-82b6-f1c842536131 tempest-FixedIPsNegativeTestJson-666106413 tempest-FixedIPsNegativeTestJson-807717977] [instance: 1045cfe7-db64-498e-a1d4-a956bab07dcf] Updating progress to 50 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:04:41.520 DEBUG oslo_concurrency.lockutils [req-4e04e90c-7ade-478e-82b6-f1c842536131 tempest-FixedIPsNegativeTestJson-666106413 tempest-FixedIPsNegativeTestJson-807717977] Lock "xenstore-1045cfe7-db64-498e-a1d4-a956bab07dcf" acquired by "store_meta" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:04:41.525 DEBUG oslo_concurrency.lockutils [req-4e04e90c-7ade-478e-82b6-f1c842536131 tempest-FixedIPsNegativeTestJson-666106413 tempest-FixedIPsNegativeTestJson-807717977] Lock "xenstore-1045cfe7-db64-498e-a1d4-a956bab07dcf" released by "store_meta" :: held 0.005s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:04:41.527 DEBUG oslo_concurrency.lockutils [req-4e04e90c-7ade-478e-82b6-f1c842536131 tempest-FixedIPsNegativeTestJson-666106413 tempest-FixedIPsNegativeTestJson-807717977] Lock "xenstore-1045cfe7-db64-498e-a1d4-a956bab07dcf" acquired by "store_auto_disk_config" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:04:41.611 DEBUG oslo_concurrency.lockutils [req-4e04e90c-7ade-478e-82b6-f1c842536131 tempest-FixedIPsNegativeTestJson-666106413 tempest-FixedIPsNegativeTestJson-807717977] Lock "xenstore-1045cfe7-db64-498e-a1d4-a956bab07dcf" released by "store_auto_disk_config" :: held 0.084s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:04:41.611 DEBUG nova.virt.xenapi.vmops [req-4e04e90c-7ade-478e-82b6-f1c842536131 tempest-FixedIPsNegativeTestJson-666106413 tempest-FixedIPsNegativeTestJson-807717977] [instance: 1045cfe7-db64-498e-a1d4-a956bab07dcf] Injecting hostname (tempest.common.compute-instance-489275371) into xenstore _inject_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1940 2015-08-07 17:04:41.613 DEBUG oslo_concurrency.lockutils [req-4e04e90c-7ade-478e-82b6-f1c842536131 tempest-FixedIPsNegativeTestJson-666106413 tempest-FixedIPsNegativeTestJson-807717977] Lock "xenstore-1045cfe7-db64-498e-a1d4-a956bab07dcf" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:04:41.639 DEBUG oslo_concurrency.lockutils [req-4e04e90c-7ade-478e-82b6-f1c842536131 tempest-FixedIPsNegativeTestJson-666106413 tempest-FixedIPsNegativeTestJson-807717977] Lock "xenstore-1045cfe7-db64-498e-a1d4-a956bab07dcf" released by "update_hostname" :: held 0.026s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:04:41.639 DEBUG nova.virt.xenapi.vmops [req-4e04e90c-7ade-478e-82b6-f1c842536131 tempest-FixedIPsNegativeTestJson-666106413 tempest-FixedIPsNegativeTestJson-807717977] [instance: 1045cfe7-db64-498e-a1d4-a956bab07dcf] Injecting network info to xenstore 
inject_network_info /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1869 2015-08-07 17:04:41.640 DEBUG oslo_concurrency.lockutils [req-4e04e90c-7ade-478e-82b6-f1c842536131 tempest-FixedIPsNegativeTestJson-666106413 tempest-FixedIPsNegativeTestJson-807717977] Lock "xenstore-1045cfe7-db64-498e-a1d4-a956bab07dcf" acquired by "update_nwinfo" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:04:43.032 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.24 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:04:43.240 DEBUG oslo_concurrency.lockutils [req-4e04e90c-7ade-478e-82b6-f1c842536131 tempest-FixedIPsNegativeTestJson-666106413 tempest-FixedIPsNegativeTestJson-807717977] Lock "xenstore-1045cfe7-db64-498e-a1d4-a956bab07dcf" released by "update_nwinfo" :: held 1.600s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:04:43.241 DEBUG nova.virt.xenapi.vmops [req-4e04e90c-7ade-478e-82b6-f1c842536131 tempest-FixedIPsNegativeTestJson-666106413 tempest-FixedIPsNegativeTestJson-807717977] [instance: 1045cfe7-db64-498e-a1d4-a956bab07dcf] Updating progress to 60 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:04:43.262 DEBUG nova.virt.xenapi.vmops [req-bb7ed864-16cb-4ce5-8373-482cf2f2692d tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: cf8ff930-deb3-448f-bcd3-f5a84fdad9b4] Waiting for instance state to become running _wait_for_instance_to_start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:777 2015-08-07 17:04:43.315 DEBUG nova.virt.xenapi.vmops [req-bb7ed864-16cb-4ce5-8373-482cf2f2692d tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: cf8ff930-deb3-448f-bcd3-f5a84fdad9b4] Updating progress to 80 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:04:44.086 DEBUG nova.virt.xenapi.vmops [req-bb7ed864-16cb-4ce5-8373-482cf2f2692d tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: cf8ff930-deb3-448f-bcd3-f5a84fdad9b4] Skip agent setup, not enabled. 
_configure_new_instance_with_agent /opt/stack/new/nova/nova/virt/xenapi/vmops.py:788 2015-08-07 17:04:44.087 DEBUG nova.virt.xenapi.vmops [req-bb7ed864-16cb-4ce5-8373-482cf2f2692d tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: cf8ff930-deb3-448f-bcd3-f5a84fdad9b4] Removing hostname from xenstore _remove_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1949 2015-08-07 17:04:44.087 DEBUG oslo_concurrency.lockutils [req-bb7ed864-16cb-4ce5-8373-482cf2f2692d tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Lock "xenstore-cf8ff930-deb3-448f-bcd3-f5a84fdad9b4" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:04:44.099 DEBUG oslo_concurrency.lockutils [req-bb7ed864-16cb-4ce5-8373-482cf2f2692d tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Lock "xenstore-cf8ff930-deb3-448f-bcd3-f5a84fdad9b4" released by "update_hostname" :: held 0.011s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:04:44.100 DEBUG nova.virt.xenapi.vmops [req-bb7ed864-16cb-4ce5-8373-482cf2f2692d tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: cf8ff930-deb3-448f-bcd3-f5a84fdad9b4] Updating progress to 90 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:04:44.114 DEBUG nova.virt.xenapi.vmops [req-4e04e90c-7ade-478e-82b6-f1c842536131 tempest-FixedIPsNegativeTestJson-666106413 tempest-FixedIPsNegativeTestJson-807717977] [instance: 1045cfe7-db64-498e-a1d4-a956bab07dcf] Creating vifs _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1892 2015-08-07 17:04:44.128 DEBUG nova.virt.xenapi.vmops [req-4e04e90c-7ade-478e-82b6-f1c842536131 tempest-FixedIPsNegativeTestJson-666106413 tempest-FixedIPsNegativeTestJson-807717977] [instance: 1045cfe7-db64-498e-a1d4-a956bab07dcf] Creating VIF for network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1902 2015-08-07 17:04:44.138 DEBUG nova.virt.xenapi.vmops [req-4e04e90c-7ade-478e-82b6-f1c842536131 tempest-FixedIPsNegativeTestJson-666106413 tempest-FixedIPsNegativeTestJson-807717977] [instance: 1045cfe7-db64-498e-a1d4-a956bab07dcf] Created VIF OpaqueRef:c4d991bb-2c07-888c-64f1-338075420f74, network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1906 2015-08-07 17:04:44.139 DEBUG nova.virt.xenapi.vmops [req-4e04e90c-7ade-478e-82b6-f1c842536131 tempest-FixedIPsNegativeTestJson-666106413 tempest-FixedIPsNegativeTestJson-807717977] [instance: 1045cfe7-db64-498e-a1d4-a956bab07dcf] Updating progress to 70 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:04:44.669 DEBUG nova.virt.xenapi.vmops [req-bb7ed864-16cb-4ce5-8373-482cf2f2692d tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: cf8ff930-deb3-448f-bcd3-f5a84fdad9b4] Updating progress to 100 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:04:44.747 DEBUG nova.virt.xenapi.vmops [req-4e04e90c-7ade-478e-82b6-f1c842536131 tempest-FixedIPsNegativeTestJson-666106413 tempest-FixedIPsNegativeTestJson-807717977] [instance: 1045cfe7-db64-498e-a1d4-a956bab07dcf] Starting instance _start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:327 2015-08-07 17:04:45.043 DEBUG nova.compute.manager 
[req-bb7ed864-16cb-4ce5-8373-482cf2f2692d tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: cf8ff930-deb3-448f-bcd3-f5a84fdad9b4] Checking state _get_power_state /opt/stack/new/nova/nova/compute/manager.py:1296 2015-08-07 17:04:46.137 DEBUG oslo_concurrency.lockutils [req-bb7ed864-16cb-4ce5-8373-482cf2f2692d tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Lock "cf8ff930-deb3-448f-bcd3-f5a84fdad9b4" released by "_locked_do_build_and_run_instance" :: held 80.483s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:04:48.866 DEBUG oslo_concurrency.lockutils [req-30255723-fc12-4837-9a26-e1606f597187 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Lock "fda3bb4d-ccb2-4500-8c24-8c38815626fa" acquired by "_locked_do_build_and_run_instance" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:04:48.981 INFO nova.compute.manager [req-30255723-fc12-4837-9a26-e1606f597187 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: fda3bb4d-ccb2-4500-8c24-8c38815626fa] Starting instance... 2015-08-07 17:04:49.357 DEBUG oslo_concurrency.lockutils [req-30255723-fc12-4837-9a26-e1606f597187 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Lock "compute_resources" acquired by "instance_claim" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:04:49.358 DEBUG nova.compute.resource_tracker [req-30255723-fc12-4837-9a26-e1606f597187 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Memory overhead for 64 MB instance; 5 MB instance_claim /opt/stack/new/nova/nova/compute/resource_tracker.py:126 2015-08-07 17:04:49.367 INFO nova.compute.claims [req-30255723-fc12-4837-9a26-e1606f597187 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: fda3bb4d-ccb2-4500-8c24-8c38815626fa] Attempting claim: memory 69 MB, disk 0 GB 2015-08-07 17:04:49.371 INFO nova.compute.claims [req-30255723-fc12-4837-9a26-e1606f597187 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: fda3bb4d-ccb2-4500-8c24-8c38815626fa] Total memory: 8187 MB, used: 650.00 MB 2015-08-07 17:04:49.372 INFO nova.compute.claims [req-30255723-fc12-4837-9a26-e1606f597187 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: fda3bb4d-ccb2-4500-8c24-8c38815626fa] memory limit: 12280.50 MB, free: 11630.50 MB 2015-08-07 17:04:49.372 INFO nova.compute.claims [req-30255723-fc12-4837-9a26-e1606f597187 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: fda3bb4d-ccb2-4500-8c24-8c38815626fa] Total disk: 27 GB, used: 0.00 GB 2015-08-07 17:04:49.373 INFO nova.compute.claims [req-30255723-fc12-4837-9a26-e1606f597187 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: fda3bb4d-ccb2-4500-8c24-8c38815626fa] disk limit not specified, defaulting to unlimited 2015-08-07 17:04:49.399 DEBUG nova.compute.resources.vcpu [req-30255723-fc12-4837-9a26-e1606f597187 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Total CPUs: 8 VCPUs, used: 2.00 VCPUs test /opt/stack/new/nova/nova/compute/resources/vcpu.py:52 2015-08-07 17:04:49.400 DEBUG nova.compute.resources.vcpu [req-30255723-fc12-4837-9a26-e1606f597187 
tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] CPUs limit not specified, defaulting to unlimited test /opt/stack/new/nova/nova/compute/resources/vcpu.py:56 2015-08-07 17:04:49.400 INFO nova.compute.claims [req-30255723-fc12-4837-9a26-e1606f597187 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: fda3bb4d-ccb2-4500-8c24-8c38815626fa] Claim successful 2015-08-07 17:04:50.095 DEBUG oslo_concurrency.lockutils [req-30255723-fc12-4837-9a26-e1606f597187 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Lock "compute_resources" released by "instance_claim" :: held 0.737s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:04:50.472 DEBUG oslo_concurrency.lockutils [req-30255723-fc12-4837-9a26-e1606f597187 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Lock "compute_resources" acquired by "update_usage" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:04:50.580 DEBUG oslo_concurrency.lockutils [req-30255723-fc12-4837-9a26-e1606f597187 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Lock "compute_resources" released by "update_usage" :: held 0.108s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:04:50.581 DEBUG nova.compute.utils [req-30255723-fc12-4837-9a26-e1606f597187 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Using /dev/xvd instead of None get_next_device_name /opt/stack/new/nova/nova/compute/utils.py:163 2015-08-07 17:04:50.586 13318 DEBUG nova.compute.manager [-] [instance: fda3bb4d-ccb2-4500-8c24-8c38815626fa] Allocating IP information in the background. 
_allocate_network_async /opt/stack/new/nova/nova/compute/manager.py:1513 2015-08-07 17:04:50.588 13318 DEBUG oslo_concurrency.lockutils [-] Acquired semaphore "refresh_cache-fda3bb4d-ccb2-4500-8c24-8c38815626fa" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:197 2015-08-07 17:04:51.733 DEBUG nova.virt.xenapi.vmops [req-30255723-fc12-4837-9a26-e1606f597187 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: fda3bb4d-ccb2-4500-8c24-8c38815626fa] Block device information present: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} spawn /opt/stack/new/nova/nova/virt/xenapi/vmops.py:408 2015-08-07 17:04:51.759 DEBUG nova.virt.xenapi.vm_utils [req-30255723-fc12-4837-9a26-e1606f597187 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Detected vhd format for image a69d8d55-1745-492f-be26-cc75d64fc94d determine_disk_image_type /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1633 2015-08-07 17:04:51.760 DEBUG nova.virt.xenapi.vmops [req-30255723-fc12-4837-9a26-e1606f597187 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: fda3bb4d-ccb2-4500-8c24-8c38815626fa] Updating progress to 10 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:04:52.204 DEBUG nova.virt.xenapi.vmops [req-30255723-fc12-4837-9a26-e1606f597187 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: fda3bb4d-ccb2-4500-8c24-8c38815626fa] block device info: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} _connect_cinder_volumes /opt/stack/new/nova/nova/virt/xenapi/vmops.py:376 2015-08-07 17:04:52.254 DEBUG oslo_concurrency.lockutils [req-30255723-fc12-4837-9a26-e1606f597187 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" acquired by "_create_cached_image_impl" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:04:52.760 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.64 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:05:01.509 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_bandwidth_usage run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:05:01.510 INFO nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Updating bandwidth usage cache 2015-08-07 17:05:11.326 DEBUG nova.virt.xenapi.vm_utils [req-30255723-fc12-4837-9a26-e1606f597187 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Cloned VDI OpaqueRef:ecdd1cf2-ba96-8651-ee3e-c54b7669ec3a from VDI OpaqueRef:f392c64e-7886-a0cc-d3a0-780674167307 _clone_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:571 2015-08-07 17:05:19.168 ERROR oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Error during ComputeManager._poll_bandwidth_usage 2015-08-07 17:05:19.168 13318 ERROR oslo_service.periodic_task Traceback (most recent call last): 2015-08-07 17:05:19.168 13318 ERROR oslo_service.periodic_task File "/usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py", line 218, in run_periodic_tasks 2015-08-07 17:05:19.168 
13318 ERROR oslo_service.periodic_task task(self, context) 2015-08-07 17:05:19.168 13318 ERROR oslo_service.periodic_task File "/opt/stack/new/nova/nova/compute/manager.py", line 5680, in _poll_bandwidth_usage 2015-08-07 17:05:19.168 13318 ERROR oslo_service.periodic_task update_cells=update_cells) 2015-08-07 17:05:19.168 13318 ERROR oslo_service.periodic_task File "/opt/stack/new/nova/nova/objects/base.py", line 493, in wrapper 2015-08-07 17:05:19.168 13318 ERROR oslo_service.periodic_task return fn(obj, *args, **kwargs) 2015-08-07 17:05:19.168 13318 ERROR oslo_service.periodic_task File "/usr/local/lib/python2.7/dist-packages/oslo_versionedobjects/base.py", line 195, in wrapper 2015-08-07 17:05:19.168 13318 ERROR oslo_service.periodic_task ctxt, self, fn.__name__, args, kwargs) 2015-08-07 17:05:19.168 13318 ERROR oslo_service.periodic_task File "/opt/stack/new/nova/nova/conductor/rpcapi.py", line 248, in object_action 2015-08-07 17:05:19.168 13318 ERROR oslo_service.periodic_task objmethod=objmethod, args=args, kwargs=kwargs) 2015-08-07 17:05:19.168 13318 ERROR oslo_service.periodic_task File "/usr/local/lib/python2.7/dist-packages/oslo_messaging/rpc/client.py", line 158, in call 2015-08-07 17:05:19.168 13318 ERROR oslo_service.periodic_task retry=self.retry) 2015-08-07 17:05:19.168 13318 ERROR oslo_service.periodic_task File "/usr/local/lib/python2.7/dist-packages/oslo_messaging/transport.py", line 90, in _send 2015-08-07 17:05:19.168 13318 ERROR oslo_service.periodic_task timeout=timeout, retry=retry) 2015-08-07 17:05:19.168 13318 ERROR oslo_service.periodic_task File "/usr/local/lib/python2.7/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 431, in send 2015-08-07 17:05:19.168 13318 ERROR oslo_service.periodic_task retry=retry) 2015-08-07 17:05:19.168 13318 ERROR oslo_service.periodic_task File "/usr/local/lib/python2.7/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 422, in _send 2015-08-07 17:05:19.168 13318 ERROR oslo_service.periodic_task raise result 2015-08-07 17:05:19.168 13318 ERROR oslo_service.periodic_task TypeError: 'NoneType' object has no attribute '__getitem__' 2015-08-07 17:05:19.168 13318 ERROR oslo_service.periodic_task Traceback (most recent call last): 2015-08-07 17:05:19.168 13318 ERROR oslo_service.periodic_task 2015-08-07 17:05:19.168 13318 ERROR oslo_service.periodic_task File "/opt/stack/new/nova/nova/conductor/manager.py", line 442, in _object_dispatch 2015-08-07 17:05:19.168 13318 ERROR oslo_service.periodic_task return getattr(target, method)(*args, **kwargs) 2015-08-07 17:05:19.168 13318 ERROR oslo_service.periodic_task 2015-08-07 17:05:19.168 13318 ERROR oslo_service.periodic_task File "/opt/stack/new/nova/nova/objects/base.py", line 493, in wrapper 2015-08-07 17:05:19.168 13318 ERROR oslo_service.periodic_task return fn(obj, *args, **kwargs) 2015-08-07 17:05:19.168 13318 ERROR oslo_service.periodic_task 2015-08-07 17:05:19.168 13318 ERROR oslo_service.periodic_task File "/usr/local/lib/python2.7/dist-packages/oslo_versionedobjects/base.py", line 211, in wrapper 2015-08-07 17:05:19.168 13318 ERROR oslo_service.periodic_task return fn(self, *args, **kwargs) 2015-08-07 17:05:19.168 13318 ERROR oslo_service.periodic_task 2015-08-07 17:05:19.168 13318 ERROR oslo_service.periodic_task File "/opt/stack/new/nova/nova/objects/bandwidth_usage.py", line 69, in create 2015-08-07 17:05:19.168 13318 ERROR oslo_service.periodic_task self._from_db_object(self._context, self, db_bw_usage) 2015-08-07 17:05:19.168 13318 ERROR oslo_service.periodic_task 
2015-08-07 17:05:19.168 13318 ERROR oslo_service.periodic_task File "/opt/stack/new/nova/nova/objects/bandwidth_usage.py", line 42, in _from_db_object 2015-08-07 17:05:19.168 13318 ERROR oslo_service.periodic_task bw_usage[field] = db_bw_usage['uuid'] 2015-08-07 17:05:19.168 13318 ERROR oslo_service.periodic_task 2015-08-07 17:05:19.168 13318 ERROR oslo_service.periodic_task TypeError: 'NoneType' object has no attribute '__getitem__' 2015-08-07 17:05:19.168 13318 ERROR oslo_service.periodic_task 2015-08-07 17:05:19.168 13318 ERROR oslo_service.periodic_task 2015-08-07 17:05:19.186 13318 WARNING oslo_service.loopingcall [-] Function > run outlasted interval by 6.79 sec 2015-08-07 17:05:19.186 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 0.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:05:19.187 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:05:19.187 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Cleaning up deleted instances _run_pending_deletes /opt/stack/new/nova/nova/compute/manager.py:6276 2015-08-07 17:05:19.450 13318 DEBUG nova.network.base_api [-] [instance: fda3bb4d-ccb2-4500-8c24-8c38815626fa] Updating instance_info_cache with network_info: [VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.5'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:46:1f:75', 'active': False, 'type': u'bridge', 'id': u'56b2e3c5-379b-4990-a2f4-5ae5440c9c97', 'qbg_params': None})] update_instance_cache_with_nw_info /opt/stack/new/nova/nova/network/base_api.py:43 2015-08-07 17:05:19.512 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] There are 3 instances to clean _run_pending_deletes /opt/stack/new/nova/nova/compute/manager.py:6285 2015-08-07 17:05:19.513 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: 45ad1b4a-a958-4bde-b784-c49cfb174b36] Instance has had 0 of 5 cleanup attempts _run_pending_deletes /opt/stack/new/nova/nova/compute/manager.py:6293 2015-08-07 17:05:19.633 13318 DEBUG oslo_concurrency.lockutils [-] Releasing semaphore "refresh_cache-fda3bb4d-ccb2-4500-8c24-8c38815626fa" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:210 2015-08-07 17:05:20.347 13318 DEBUG nova.compute.manager [-] [instance: fda3bb4d-ccb2-4500-8c24-8c38815626fa] Instance network_info: |[VIF({'profile': None, 
'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.5'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:46:1f:75', 'active': False, 'type': u'bridge', 'id': u'56b2e3c5-379b-4990-a2f4-5ae5440c9c97', 'qbg_params': None})]| _allocate_network_async /opt/stack/new/nova/nova/compute/manager.py:1531 2015-08-07 17:05:21.422 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: 70a01a0b-d094-423f-b85d-bbceb324656e] Instance has had 0 of 5 cleanup attempts _run_pending_deletes /opt/stack/new/nova/nova/compute/manager.py:6293 2015-08-07 17:05:21.432 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 7.76 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:05:22.120 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: 8e4a2813-87d8-4829-bb17-5a0511bfffcb] Instance has had 0 of 5 cleanup attempts _run_pending_deletes /opt/stack/new/nova/nova/compute/manager.py:6293 2015-08-07 17:05:22.871 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:05:22.871 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:05:22.872 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Starting heal instance info cache _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5315 2015-08-07 17:05:22.872 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Rebuilding the list of instances to heal _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5319 2015-08-07 17:05:23.111 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: 1045cfe7-db64-498e-a1d4-a956bab07dcf] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5329 2015-08-07 17:05:23.112 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: fda3bb4d-ccb2-4500-8c24-8c38815626fa] Skipping network cache update for instance because it is Building. 
_heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5329 2015-08-07 17:05:23.113 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Acquired semaphore "refresh_cache-cf8ff930-deb3-448f-bcd3-f5a84fdad9b4" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:197 2015-08-07 17:05:23.113 DEBUG nova.objects.instance [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lazy-loading `flavor' on Instance uuid cf8ff930-deb3-448f-bcd3-f5a84fdad9b4 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:05:23.494 DEBUG oslo_concurrency.lockutils [req-30255723-fc12-4837-9a26-e1606f597187 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" released by "_create_cached_image_impl" :: held 31.241s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:05:23.495 INFO nova.virt.xenapi.vm_utils [req-30255723-fc12-4837-9a26-e1606f597187 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Image creation data, cacheable: True, downloaded: False duration: 31.29 secs for image a69d8d55-1745-492f-be26-cc75d64fc94d 2015-08-07 17:05:23.681 DEBUG nova.network.base_api [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: cf8ff930-deb3-448f-bcd3-f5a84fdad9b4] Updating instance_info_cache with network_info: [VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.2'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:4b:61:5b', 'active': False, 'type': u'bridge', 'id': u'1d3147dc-aab6-4dd3-a93a-a902661ccb6b', 'qbg_params': None})] update_instance_cache_with_nw_info /opt/stack/new/nova/nova/network/base_api.py:43 2015-08-07 17:05:23.727 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Releasing semaphore "refresh_cache-cf8ff930-deb3-448f-bcd3-f5a84fdad9b4" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:210 2015-08-07 17:05:23.729 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: cf8ff930-deb3-448f-bcd3-f5a84fdad9b4] Updated the network info_cache for instance _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5376 2015-08-07 17:05:23.729 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:05:23.845 DEBUG oslo_concurrency.lockutils 
[req-5f91acf7-e28c-47e9-a95e-28e4b39d421f tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] Lock "8327722c-cc39-4f7d-acd2-5ffd3f6af05d" acquired by "_locked_do_build_and_run_instance" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:05:23.874 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 3.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:05:24.054 INFO nova.compute.manager [req-5f91acf7-e28c-47e9-a95e-28e4b39d421f tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] [instance: 8327722c-cc39-4f7d-acd2-5ffd3f6af05d] Starting instance... 2015-08-07 17:05:24.470 DEBUG oslo_concurrency.lockutils [req-5f91acf7-e28c-47e9-a95e-28e4b39d421f tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] Lock "compute_resources" acquired by "instance_claim" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:05:24.470 DEBUG nova.compute.resource_tracker [req-5f91acf7-e28c-47e9-a95e-28e4b39d421f tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] Memory overhead for 64 MB instance; 5 MB instance_claim /opt/stack/new/nova/nova/compute/resource_tracker.py:126 2015-08-07 17:05:24.480 INFO nova.compute.claims [req-5f91acf7-e28c-47e9-a95e-28e4b39d421f tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] [instance: 8327722c-cc39-4f7d-acd2-5ffd3f6af05d] Attempting claim: memory 69 MB, disk 0 GB 2015-08-07 17:05:24.481 INFO nova.compute.claims [req-5f91acf7-e28c-47e9-a95e-28e4b39d421f tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] [instance: 8327722c-cc39-4f7d-acd2-5ffd3f6af05d] Total memory: 8187 MB, used: 719.00 MB 2015-08-07 17:05:24.481 INFO nova.compute.claims [req-5f91acf7-e28c-47e9-a95e-28e4b39d421f tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] [instance: 8327722c-cc39-4f7d-acd2-5ffd3f6af05d] memory limit: 12280.50 MB, free: 11561.50 MB 2015-08-07 17:05:24.482 INFO nova.compute.claims [req-5f91acf7-e28c-47e9-a95e-28e4b39d421f tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] [instance: 8327722c-cc39-4f7d-acd2-5ffd3f6af05d] Total disk: 27 GB, used: 0.00 GB 2015-08-07 17:05:24.482 INFO nova.compute.claims [req-5f91acf7-e28c-47e9-a95e-28e4b39d421f tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] [instance: 8327722c-cc39-4f7d-acd2-5ffd3f6af05d] disk limit not specified, defaulting to unlimited 2015-08-07 17:05:24.510 DEBUG nova.compute.resources.vcpu [req-5f91acf7-e28c-47e9-a95e-28e4b39d421f tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] Total CPUs: 8 VCPUs, used: 3.00 VCPUs test /opt/stack/new/nova/nova/compute/resources/vcpu.py:52 2015-08-07 17:05:24.511 DEBUG nova.compute.resources.vcpu [req-5f91acf7-e28c-47e9-a95e-28e4b39d421f tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] CPUs limit not specified, defaulting to unlimited test /opt/stack/new/nova/nova/compute/resources/vcpu.py:56 2015-08-07 17:05:24.511 INFO nova.compute.claims 
[req-5f91acf7-e28c-47e9-a95e-28e4b39d421f tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] [instance: 8327722c-cc39-4f7d-acd2-5ffd3f6af05d] Claim successful 2015-08-07 17:05:25.498 DEBUG oslo_concurrency.lockutils [req-5f91acf7-e28c-47e9-a95e-28e4b39d421f tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] Lock "compute_resources" released by "instance_claim" :: held 1.028s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:05:26.103 DEBUG oslo_concurrency.lockutils [req-5f91acf7-e28c-47e9-a95e-28e4b39d421f tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] Lock "compute_resources" acquired by "update_usage" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:05:26.276 DEBUG oslo_concurrency.lockutils [req-5f91acf7-e28c-47e9-a95e-28e4b39d421f tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] Lock "compute_resources" released by "update_usage" :: held 0.173s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:05:26.277 DEBUG nova.compute.utils [req-5f91acf7-e28c-47e9-a95e-28e4b39d421f tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] Using /dev/xvd instead of None get_next_device_name /opt/stack/new/nova/nova/compute/utils.py:163 2015-08-07 17:05:26.295 13318 DEBUG nova.compute.manager [-] [instance: 8327722c-cc39-4f7d-acd2-5ffd3f6af05d] Allocating IP information in the background. _allocate_network_async /opt/stack/new/nova/nova/compute/manager.py:1513 2015-08-07 17:05:26.296 13318 DEBUG oslo_concurrency.lockutils [-] Acquired semaphore "refresh_cache-8327722c-cc39-4f7d-acd2-5ffd3f6af05d" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:197 2015-08-07 17:05:26.393 DEBUG nova.virt.xenapi.vmops [req-30255723-fc12-4837-9a26-e1606f597187 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: fda3bb4d-ccb2-4500-8c24-8c38815626fa] Updating progress to 20 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:05:26.875 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:05:26.876 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:05:26.876 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:05:26.878 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:05:26.878 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /opt/stack/new/nova/nova/compute/manager.py:5976 2015-08-07 17:05:26.878 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:05:26.917 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Auditing locally available compute resources for node localhost.localdomain 2015-08-07 17:05:26.917 DEBUG nova.virt.xenapi.host [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Updating host stats update_status /opt/stack/new/nova/nova/virt/xenapi/host.py:234 2015-08-07 17:05:27.179 DEBUG nova.virt.xenapi.vmops [req-30255723-fc12-4837-9a26-e1606f597187 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: fda3bb4d-ccb2-4500-8c24-8c38815626fa] Updating progress to 30 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:05:28.335 DEBUG nova.virt.xenapi.vmops [req-5f91acf7-e28c-47e9-a95e-28e4b39d421f tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] [instance: 8327722c-cc39-4f7d-acd2-5ffd3f6af05d] Block device information present: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} spawn /opt/stack/new/nova/nova/virt/xenapi/vmops.py:408 2015-08-07 17:05:28.339 DEBUG nova.virt.xenapi.vmops [req-30255723-fc12-4837-9a26-e1606f597187 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: fda3bb4d-ccb2-4500-8c24-8c38815626fa] Using PV kernel: True _create_vm_record /opt/stack/new/nova/nova/virt/xenapi/vmops.py:679 2015-08-07 17:05:28.657 DEBUG nova.virt.xenapi.vm_utils [req-30255723-fc12-4837-9a26-e1606f597187 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: fda3bb4d-ccb2-4500-8c24-8c38815626fa] Created VM create_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:312 2015-08-07 17:05:28.658 DEBUG nova.virt.xenapi.vmops [req-30255723-fc12-4837-9a26-e1606f597187 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: fda3bb4d-ccb2-4500-8c24-8c38815626fa] Updating progress to 40 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:05:28.768 DEBUG nova.virt.xenapi.vm_utils [req-5f91acf7-e28c-47e9-a95e-28e4b39d421f tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] Detected vhd format for image a69d8d55-1745-492f-be26-cc75d64fc94d determine_disk_image_type /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1633 2015-08-07 17:05:28.769 DEBUG nova.virt.xenapi.vmops [req-5f91acf7-e28c-47e9-a95e-28e4b39d421f tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] [instance: 8327722c-cc39-4f7d-acd2-5ffd3f6af05d] Updating progress to 10 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:05:31.092 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" acquired by "do_scan" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:05:31.093 DEBUG nova.virt.xenapi.vm_utils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Scanning SR OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99 do_scan 
/opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1839 2015-08-07 17:05:31.422 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 7.77 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:05:31.899 DEBUG nova.virt.xenapi.vmops [req-5f91acf7-e28c-47e9-a95e-28e4b39d421f tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] [instance: 8327722c-cc39-4f7d-acd2-5ffd3f6af05d] block device info: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} _connect_cinder_volumes /opt/stack/new/nova/nova/virt/xenapi/vmops.py:376 2015-08-07 17:05:32.217 DEBUG oslo_concurrency.lockutils [req-5f91acf7-e28c-47e9-a95e-28e4b39d421f tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" acquired by "_create_cached_image_impl" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:05:32.312 DEBUG nova.virt.xenapi.vm_utils [req-30255723-fc12-4837-9a26-e1606f597187 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Creating disk-type VBD for VM OpaqueRef:eb72a9be-ff79-3644-a92a-0d888b35f885, VDI OpaqueRef:ecdd1cf2-ba96-8651-ee3e-c54b7669ec3a ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:05:32.342 DEBUG nova.virt.xenapi.vm_utils [req-30255723-fc12-4837-9a26-e1606f597187 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Created VBD OpaqueRef:f82e7702-53f1-c15e-1c2d-15681e7fbdaf for VM OpaqueRef:eb72a9be-ff79-3644-a92a-0d888b35f885, VDI OpaqueRef:ecdd1cf2-ba96-8651-ee3e-c54b7669ec3a. 
create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:05:33.002 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" released by "do_scan" :: held 1.910s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:05:35.217 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor: free VCPUs: 1 _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:524 2015-08-07 17:05:35.217 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor: assignable PCI devices: [] _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:532 2015-08-07 17:05:35.218 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor/Node resource view: name=localhost.localdomain free_ram=855MB free_disk=16GB free_vcpus=1 pci_devices=[] _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:546 2015-08-07 17:05:35.218 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "compute_resources" acquired by "_update_available_resource" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:05:36.166 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Total usable vcpus: 8, total allocated vcpus: 7 2015-08-07 17:05:36.167 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Final resource view: name=localhost.localdomain phys_ram=8187MB used_ram=788MB phys_disk=27GB used_disk=0GB total_vcpus=8 used_vcpus=7 pci_stats=None 2015-08-07 17:05:36.510 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Compute_service record updated for devstack:localhost.localdomain 2015-08-07 17:05:36.511 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "compute_resources" released by "_update_available_resource" :: held 1.293s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:05:36.512 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:05:36.513 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:05:36.712 WARNING nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] While synchronizing instance power states, found 4 instances in the database and 3 instances on the hypervisor. 
2015-08-07 17:05:36.713 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Triggering sync for uuid cf8ff930-deb3-448f-bcd3-f5a84fdad9b4 _sync_power_states /opt/stack/new/nova/nova/compute/manager.py:5780 2015-08-07 17:05:36.714 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Triggering sync for uuid 1045cfe7-db64-498e-a1d4-a956bab07dcf _sync_power_states /opt/stack/new/nova/nova/compute/manager.py:5780 2015-08-07 17:05:36.714 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Triggering sync for uuid fda3bb4d-ccb2-4500-8c24-8c38815626fa _sync_power_states /opt/stack/new/nova/nova/compute/manager.py:5780 2015-08-07 17:05:36.714 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Triggering sync for uuid 8327722c-cc39-4f7d-acd2-5ffd3f6af05d _sync_power_states /opt/stack/new/nova/nova/compute/manager.py:5780 2015-08-07 17:05:36.715 13318 DEBUG oslo_concurrency.lockutils [-] Lock "cf8ff930-deb3-448f-bcd3-f5a84fdad9b4" acquired by "query_driver_power_state_and_sync" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:05:36.717 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:05:36.730 DEBUG nova.virt.xenapi.vmops [req-4e04e90c-7ade-478e-82b6-f1c842536131 tempest-FixedIPsNegativeTestJson-666106413 tempest-FixedIPsNegativeTestJson-807717977] [instance: 1045cfe7-db64-498e-a1d4-a956bab07dcf] Waiting for instance state to become running _wait_for_instance_to_start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:777 2015-08-07 17:05:36.732 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 33.78 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:05:37.112 DEBUG nova.virt.xenapi.vmops [req-4e04e90c-7ade-478e-82b6-f1c842536131 tempest-FixedIPsNegativeTestJson-666106413 tempest-FixedIPsNegativeTestJson-807717977] [instance: 1045cfe7-db64-498e-a1d4-a956bab07dcf] Updating progress to 80 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:05:37.349 13318 DEBUG oslo_concurrency.lockutils [-] Lock "cf8ff930-deb3-448f-bcd3-f5a84fdad9b4" released by "query_driver_power_state_and_sync" :: held 0.633s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:05:38.729 DEBUG nova.virt.xenapi.vmops [req-4e04e90c-7ade-478e-82b6-f1c842536131 tempest-FixedIPsNegativeTestJson-666106413 tempest-FixedIPsNegativeTestJson-807717977] [instance: 1045cfe7-db64-498e-a1d4-a956bab07dcf] Skip agent setup, not enabled. 
_configure_new_instance_with_agent /opt/stack/new/nova/nova/virt/xenapi/vmops.py:788 2015-08-07 17:05:38.730 DEBUG nova.virt.xenapi.vmops [req-4e04e90c-7ade-478e-82b6-f1c842536131 tempest-FixedIPsNegativeTestJson-666106413 tempest-FixedIPsNegativeTestJson-807717977] [instance: 1045cfe7-db64-498e-a1d4-a956bab07dcf] Removing hostname from xenstore _remove_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1949 2015-08-07 17:05:38.730 DEBUG oslo_concurrency.lockutils [req-4e04e90c-7ade-478e-82b6-f1c842536131 tempest-FixedIPsNegativeTestJson-666106413 tempest-FixedIPsNegativeTestJson-807717977] Lock "xenstore-1045cfe7-db64-498e-a1d4-a956bab07dcf" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:05:38.740 DEBUG oslo_concurrency.lockutils [req-4e04e90c-7ade-478e-82b6-f1c842536131 tempest-FixedIPsNegativeTestJson-666106413 tempest-FixedIPsNegativeTestJson-807717977] Lock "xenstore-1045cfe7-db64-498e-a1d4-a956bab07dcf" released by "update_hostname" :: held 0.009s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:05:38.740 DEBUG nova.virt.xenapi.vmops [req-4e04e90c-7ade-478e-82b6-f1c842536131 tempest-FixedIPsNegativeTestJson-666106413 tempest-FixedIPsNegativeTestJson-807717977] [instance: 1045cfe7-db64-498e-a1d4-a956bab07dcf] Updating progress to 90 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:05:39.167 DEBUG nova.virt.xenapi.vmops [req-4e04e90c-7ade-478e-82b6-f1c842536131 tempest-FixedIPsNegativeTestJson-666106413 tempest-FixedIPsNegativeTestJson-807717977] [instance: 1045cfe7-db64-498e-a1d4-a956bab07dcf] Updating progress to 100 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:05:39.231 DEBUG nova.virt.xenapi.vm_utils [req-30255723-fc12-4837-9a26-e1606f597187 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Created VDI OpaqueRef:027dc463-d3a9-7534-b68f-8f40dfbddb5e (config-2, 67108864, False) on OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99. create_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:509 2015-08-07 17:05:39.237 DEBUG nova.virt.xenapi.vm_utils [req-30255723-fc12-4837-9a26-e1606f597187 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Creating disk-type VBD for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:027dc463-d3a9-7534-b68f-8f40dfbddb5e ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:05:39.249 DEBUG nova.virt.xenapi.vm_utils [req-30255723-fc12-4837-9a26-e1606f597187 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Created VBD OpaqueRef:f8a04c22-c285-cc7d-5561-c7f0a2d45302 for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:027dc463-d3a9-7534-b68f-8f40dfbddb5e. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:05:39.250 DEBUG nova.virt.xenapi.vm_utils [req-30255723-fc12-4837-9a26-e1606f597187 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Plugging VBD OpaqueRef:f8a04c22-c285-cc7d-5561-c7f0a2d45302 ... 
vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2191 2015-08-07 17:05:39.251 DEBUG oslo_concurrency.lockutils [req-30255723-fc12-4837-9a26-e1606f597187 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_plug" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:05:39.370 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.82 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:05:39.583 DEBUG nova.compute.manager [req-4e04e90c-7ade-478e-82b6-f1c842536131 tempest-FixedIPsNegativeTestJson-666106413 tempest-FixedIPsNegativeTestJson-807717977] [instance: 1045cfe7-db64-498e-a1d4-a956bab07dcf] Checking state _get_power_state /opt/stack/new/nova/nova/compute/manager.py:1296 2015-08-07 17:05:40.192 13318 DEBUG nova.network.base_api [-] [instance: 8327722c-cc39-4f7d-acd2-5ffd3f6af05d] Updating instance_info_cache with network_info: [VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.4'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:bb:67:67', 'active': False, 'type': u'bridge', 'id': u'e2330632-071d-4ab1-a30e-9e9add87c11d', 'qbg_params': None})] update_instance_cache_with_nw_info /opt/stack/new/nova/nova/network/base_api.py:43 2015-08-07 17:05:40.246 13318 DEBUG oslo_concurrency.lockutils [-] Releasing semaphore "refresh_cache-8327722c-cc39-4f7d-acd2-5ffd3f6af05d" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:210 2015-08-07 17:05:40.247 13318 DEBUG nova.compute.manager [-] [instance: 8327722c-cc39-4f7d-acd2-5ffd3f6af05d] Instance network_info: |[VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.4'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 
'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:bb:67:67', 'active': False, 'type': u'bridge', 'id': u'e2330632-071d-4ab1-a30e-9e9add87c11d', 'qbg_params': None})]| _allocate_network_async /opt/stack/new/nova/nova/compute/manager.py:1531 2015-08-07 17:05:40.273 DEBUG oslo_concurrency.lockutils [req-4e04e90c-7ade-478e-82b6-f1c842536131 tempest-FixedIPsNegativeTestJson-666106413 tempest-FixedIPsNegativeTestJson-807717977] Lock "1045cfe7-db64-498e-a1d4-a956bab07dcf" released by "_locked_do_build_and_run_instance" :: held 114.838s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:05:40.274 13318 DEBUG oslo_concurrency.lockutils [-] Lock "1045cfe7-db64-498e-a1d4-a956bab07dcf" acquired by "query_driver_power_state_and_sync" :: waited 3.558s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:05:40.275 13318 INFO nova.compute.manager [-] [instance: 1045cfe7-db64-498e-a1d4-a956bab07dcf] During sync_power_state the instance has a pending task (spawning). Skip. 2015-08-07 17:05:40.275 13318 DEBUG oslo_concurrency.lockutils [-] Lock "1045cfe7-db64-498e-a1d4-a956bab07dcf" released by "query_driver_power_state_and_sync" :: held 0.001s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:05:42.333 DEBUG oslo_concurrency.lockutils [req-30255723-fc12-4837-9a26-e1606f597187 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_plug" :: held 3.083s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:05:42.334 DEBUG nova.virt.xenapi.vm_utils [req-30255723-fc12-4837-9a26-e1606f597187 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Plugging VBD OpaqueRef:f8a04c22-c285-cc7d-5561-c7f0a2d45302 done. vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2194 2015-08-07 17:05:42.344 DEBUG nova.virt.xenapi.vm_utils [req-30255723-fc12-4837-9a26-e1606f597187 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] VBD OpaqueRef:f8a04c22-c285-cc7d-5561-c7f0a2d45302 plugged as xvdc vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2197 2015-08-07 17:05:42.522 WARNING nova.virt.configdrive [req-30255723-fc12-4837-9a26-e1606f597187 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] The setting "always" will be deprecated in the Liberty version. 
Please use "True" instead 2015-08-07 17:05:42.523 DEBUG nova.objects.instance [req-30255723-fc12-4837-9a26-e1606f597187 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Lazy-loading `ec2_ids' on Instance uuid fda3bb4d-ccb2-4500-8c24-8c38815626fa obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:05:42.591 DEBUG oslo_concurrency.processutils [req-30255723-fc12-4837-9a26-e1606f597187 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Running cmd (subprocess): genisoimage -o /tmp/tmpglCZv8/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmpRNODrP execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:05:42.764 DEBUG oslo_concurrency.processutils [req-30255723-fc12-4837-9a26-e1606f597187 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] CMD "genisoimage -o /tmp/tmpglCZv8/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmpRNODrP" returned: 0 in 0.173s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:05:42.770 DEBUG oslo_concurrency.processutils [req-30255723-fc12-4837-9a26-e1606f597187 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmpglCZv8/configdrive of=/dev/xvdc oflag=direct,sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:05:43.144 DEBUG oslo_concurrency.lockutils [req-b541ed02-5704-42cb-a3d5-b502792d68ca tempest-FixedIPsNegativeTestJson-666106413 tempest-FixedIPsNegativeTestJson-807717977] Lock "1045cfe7-db64-498e-a1d4-a956bab07dcf" acquired by "do_terminate_instance" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:05:43.145 DEBUG oslo_concurrency.lockutils [req-b541ed02-5704-42cb-a3d5-b502792d68ca tempest-FixedIPsNegativeTestJson-666106413 tempest-FixedIPsNegativeTestJson-807717977] Lock "1045cfe7-db64-498e-a1d4-a956bab07dcf-events" acquired by "_clear_events" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:05:43.146 DEBUG oslo_concurrency.lockutils [req-b541ed02-5704-42cb-a3d5-b502792d68ca tempest-FixedIPsNegativeTestJson-666106413 tempest-FixedIPsNegativeTestJson-807717977] Lock "1045cfe7-db64-498e-a1d4-a956bab07dcf-events" released by "_clear_events" :: held 0.001s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:05:43.149 INFO nova.compute.manager [req-b541ed02-5704-42cb-a3d5-b502792d68ca tempest-FixedIPsNegativeTestJson-666106413 tempest-FixedIPsNegativeTestJson-807717977] [instance: 1045cfe7-db64-498e-a1d4-a956bab07dcf] Terminating instance 2015-08-07 17:05:43.151 INFO nova.virt.xenapi.vmops [req-b541ed02-5704-42cb-a3d5-b502792d68ca tempest-FixedIPsNegativeTestJson-666106413 tempest-FixedIPsNegativeTestJson-807717977] [instance: 1045cfe7-db64-498e-a1d4-a956bab07dcf] Destroying VM 2015-08-07 17:05:43.225 DEBUG nova.virt.xenapi.vm_utils [req-b541ed02-5704-42cb-a3d5-b502792d68ca tempest-FixedIPsNegativeTestJson-666106413 tempest-FixedIPsNegativeTestJson-807717977] [instance: 1045cfe7-db64-498e-a1d4-a956bab07dcf] Shutting down VM (hard) hard_shutdown_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:348 
2015-08-07 17:05:51.613 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 7.58 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:05:54.299 DEBUG nova.virt.xenapi.vmops [req-b541ed02-5704-42cb-a3d5-b502792d68ca tempest-FixedIPsNegativeTestJson-666106413 tempest-FixedIPsNegativeTestJson-807717977] [instance: 1045cfe7-db64-498e-a1d4-a956bab07dcf] Destroying VDIs _destroy_vdis /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1428 2015-08-07 17:05:54.324 DEBUG nova.virt.xenapi.vm_utils [req-b541ed02-5704-42cb-a3d5-b502792d68ca tempest-FixedIPsNegativeTestJson-666106413 tempest-FixedIPsNegativeTestJson-807717977] VDI ed080775-9b34-4aa0-956c-fbab999a07e0 is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:05:54.346 DEBUG nova.virt.xenapi.vm_utils [req-b541ed02-5704-42cb-a3d5-b502792d68ca tempest-FixedIPsNegativeTestJson-666106413 tempest-FixedIPsNegativeTestJson-807717977] VDI ba3cab70-7218-462c-aeac-4180b6f7390a is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:05:54.735 DEBUG nova.virt.xenapi.vm_utils [req-5f91acf7-e28c-47e9-a95e-28e4b39d421f tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] Cloned VDI OpaqueRef:7300b200-ee96-045d-5989-c8e9abd1e944 from VDI OpaqueRef:f392c64e-7886-a0cc-d3a0-780674167307 _clone_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:571 2015-08-07 17:05:55.875 DEBUG oslo_concurrency.lockutils [req-5f91acf7-e28c-47e9-a95e-28e4b39d421f tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" released by "_create_cached_image_impl" :: held 23.658s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:05:55.876 INFO nova.virt.xenapi.vm_utils [req-5f91acf7-e28c-47e9-a95e-28e4b39d421f tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] Image creation data, cacheable: True, downloaded: False duration: 23.98 secs for image a69d8d55-1745-492f-be26-cc75d64fc94d 2015-08-07 17:05:58.062 DEBUG nova.virt.xenapi.vmops [req-5f91acf7-e28c-47e9-a95e-28e4b39d421f tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] [instance: 8327722c-cc39-4f7d-acd2-5ffd3f6af05d] Updating progress to 20 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:05:58.282 DEBUG nova.virt.xenapi.vmops [req-b541ed02-5704-42cb-a3d5-b502792d68ca tempest-FixedIPsNegativeTestJson-666106413 tempest-FixedIPsNegativeTestJson-807717977] [instance: 1045cfe7-db64-498e-a1d4-a956bab07dcf] Using RAW or VHD, skipping kernel and ramdisk deletion _destroy_kernel_ramdisk /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1456 2015-08-07 17:05:58.312 DEBUG nova.virt.xenapi.vm_utils [req-b541ed02-5704-42cb-a3d5-b502792d68ca tempest-FixedIPsNegativeTestJson-666106413 tempest-FixedIPsNegativeTestJson-807717977] [instance: 1045cfe7-db64-498e-a1d4-a956bab07dcf] VM destroyed destroy_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:324 2015-08-07 17:05:58.313 DEBUG nova.compute.manager [req-b541ed02-5704-42cb-a3d5-b502792d68ca tempest-FixedIPsNegativeTestJson-666106413 tempest-FixedIPsNegativeTestJson-807717977] [instance: 1045cfe7-db64-498e-a1d4-a956bab07dcf] Deallocating network for instance _deallocate_network 
/opt/stack/new/nova/nova/compute/manager.py:1792 2015-08-07 17:05:58.656 DEBUG nova.virt.xenapi.vmops [req-5f91acf7-e28c-47e9-a95e-28e4b39d421f tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] [instance: 8327722c-cc39-4f7d-acd2-5ffd3f6af05d] Updating progress to 30 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:05:58.978 DEBUG nova.virt.xenapi.vmops [req-5f91acf7-e28c-47e9-a95e-28e4b39d421f tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] [instance: 8327722c-cc39-4f7d-acd2-5ffd3f6af05d] Using PV kernel: True _create_vm_record /opt/stack/new/nova/nova/virt/xenapi/vmops.py:679 2015-08-07 17:05:59.006 DEBUG nova.virt.xenapi.vm_utils [req-5f91acf7-e28c-47e9-a95e-28e4b39d421f tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] [instance: 8327722c-cc39-4f7d-acd2-5ffd3f6af05d] Created VM create_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:312 2015-08-07 17:05:59.008 DEBUG nova.virt.xenapi.vmops [req-5f91acf7-e28c-47e9-a95e-28e4b39d421f tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] [instance: 8327722c-cc39-4f7d-acd2-5ffd3f6af05d] Updating progress to 40 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:05:59.311 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.89 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:05:59.332 DEBUG oslo_concurrency.processutils [req-30255723-fc12-4837-9a26-e1606f597187 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmpglCZv8/configdrive of=/dev/xvdc oflag=direct,sync" returned: 0 in 16.561s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:05:59.334 DEBUG oslo_concurrency.processutils [req-30255723-fc12-4837-9a26-e1606f597187 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:05:59.544 DEBUG nova.virt.xenapi.vm_utils [req-5f91acf7-e28c-47e9-a95e-28e4b39d421f tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] Creating disk-type VBD for VM OpaqueRef:e2abe87d-b080-2f4f-20dc-8fa9ca184830, VDI OpaqueRef:7300b200-ee96-045d-5989-c8e9abd1e944 ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:05:59.555 DEBUG nova.virt.xenapi.vm_utils [req-5f91acf7-e28c-47e9-a95e-28e4b39d421f tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] Created VBD OpaqueRef:003a74f9-6d79-6490-05e3-dc87cac2931f for VM OpaqueRef:e2abe87d-b080-2f4f-20dc-8fa9ca184830, VDI OpaqueRef:7300b200-ee96-045d-5989-c8e9abd1e944. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:06:00.823 DEBUG nova.virt.xenapi.vm_utils [req-5f91acf7-e28c-47e9-a95e-28e4b39d421f tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] Created VDI OpaqueRef:a8fdffa1-4fa1-644d-8dfb-772c990d96b2 (config-2, 67108864, False) on OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99. 
create_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:509 2015-08-07 17:06:00.832 DEBUG nova.virt.xenapi.vm_utils [req-5f91acf7-e28c-47e9-a95e-28e4b39d421f tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] Creating disk-type VBD for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:a8fdffa1-4fa1-644d-8dfb-772c990d96b2 ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:06:00.838 DEBUG oslo_concurrency.processutils [req-30255723-fc12-4837-9a26-e1606f597187 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf sync" returned: 0 in 1.504s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:06:00.839 DEBUG nova.virt.xenapi.vm_utils [req-30255723-fc12-4837-9a26-e1606f597187 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Destroying VBD for VDI OpaqueRef:027dc463-d3a9-7534-b68f-8f40dfbddb5e ... vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2207 2015-08-07 17:06:00.842 DEBUG oslo_concurrency.lockutils [req-30255723-fc12-4837-9a26-e1606f597187 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_unplug" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:06:00.851 DEBUG nova.virt.xenapi.vm_utils [req-5f91acf7-e28c-47e9-a95e-28e4b39d421f tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] Created VBD OpaqueRef:c7ea7835-57ca-81be-8515-cec095d28617 for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:a8fdffa1-4fa1-644d-8dfb-772c990d96b2. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:06:00.852 DEBUG nova.virt.xenapi.vm_utils [req-5f91acf7-e28c-47e9-a95e-28e4b39d421f tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] Plugging VBD OpaqueRef:c7ea7835-57ca-81be-8515-cec095d28617 ... 
vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2191 2015-08-07 17:06:03.103 DEBUG nova.compute.manager [req-b541ed02-5704-42cb-a3d5-b502792d68ca tempest-FixedIPsNegativeTestJson-666106413 tempest-FixedIPsNegativeTestJson-807717977] [instance: 1045cfe7-db64-498e-a1d4-a956bab07dcf] terminating bdm BlockDeviceMapping(boot_index=0,connection_info=None,created_at=2015-08-07T17:03:44Z,delete_on_termination=True,deleted=False,deleted_at=None,destination_type='local',device_name='/dev/xvda',device_type='disk',disk_bus=None,guest_format=None,id=5,image_id='a69d8d55-1745-492f-be26-cc75d64fc94d',instance=,instance_uuid=1045cfe7-db64-498e-a1d4-a956bab07dcf,no_device=False,snapshot_id=None,source_type='image',updated_at=2015-08-07T17:03:53Z,volume_id=None,volume_size=None) _cleanup_volumes /opt/stack/new/nova/nova/compute/manager.py:2272 2015-08-07 17:06:03.693 DEBUG oslo_concurrency.lockutils [req-b541ed02-5704-42cb-a3d5-b502792d68ca tempest-FixedIPsNegativeTestJson-666106413 tempest-FixedIPsNegativeTestJson-807717977] Lock "compute_resources" acquired by "update_usage" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:06:03.694 DEBUG nova.objects.instance [req-b541ed02-5704-42cb-a3d5-b502792d68ca tempest-FixedIPsNegativeTestJson-666106413 tempest-FixedIPsNegativeTestJson-807717977] Lazy-loading `numa_topology' on Instance uuid 1045cfe7-db64-498e-a1d4-a956bab07dcf obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:06:04.164 DEBUG oslo_concurrency.lockutils [req-b541ed02-5704-42cb-a3d5-b502792d68ca tempest-FixedIPsNegativeTestJson-666106413 tempest-FixedIPsNegativeTestJson-807717977] Lock "compute_resources" released by "update_usage" :: held 0.471s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:06:04.774 DEBUG oslo_concurrency.lockutils [req-30255723-fc12-4837-9a26-e1606f597187 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_unplug" :: held 3.933s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:06:04.776 DEBUG oslo_concurrency.lockutils [req-5f91acf7-e28c-47e9-a95e-28e4b39d421f tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_plug" :: waited 3.923s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:06:04.828 DEBUG nova.virt.xenapi.vm_utils [req-30255723-fc12-4837-9a26-e1606f597187 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Destroying VBD for VDI OpaqueRef:027dc463-d3a9-7534-b68f-8f40dfbddb5e done. vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2215 2015-08-07 17:06:04.829 DEBUG nova.virt.xenapi.vm_utils [req-30255723-fc12-4837-9a26-e1606f597187 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Creating disk-type VBD for VM OpaqueRef:eb72a9be-ff79-3644-a92a-0d888b35f885, VDI OpaqueRef:027dc463-d3a9-7534-b68f-8f40dfbddb5e ... 
create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:06:04.880 DEBUG nova.virt.xenapi.vm_utils [req-30255723-fc12-4837-9a26-e1606f597187 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Created VBD OpaqueRef:64f4cb83-0dfa-52f4-ad77-e56bfd318802 for VM OpaqueRef:eb72a9be-ff79-3644-a92a-0d888b35f885, VDI OpaqueRef:027dc463-d3a9-7534-b68f-8f40dfbddb5e. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:06:04.883 DEBUG nova.objects.instance [req-30255723-fc12-4837-9a26-e1606f597187 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Lazy-loading `pci_devices' on Instance uuid fda3bb4d-ccb2-4500-8c24-8c38815626fa obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:06:04.960 DEBUG oslo_concurrency.lockutils [req-b541ed02-5704-42cb-a3d5-b502792d68ca tempest-FixedIPsNegativeTestJson-666106413 tempest-FixedIPsNegativeTestJson-807717977] Lock "1045cfe7-db64-498e-a1d4-a956bab07dcf" released by "do_terminate_instance" :: held 21.816s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:06:05.151 DEBUG nova.virt.xenapi.vmops [req-30255723-fc12-4837-9a26-e1606f597187 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: fda3bb4d-ccb2-4500-8c24-8c38815626fa] Updating progress to 50 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:06:05.682 DEBUG oslo_concurrency.lockutils [req-30255723-fc12-4837-9a26-e1606f597187 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Lock "xenstore-fda3bb4d-ccb2-4500-8c24-8c38815626fa" acquired by "store_meta" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:06:05.683 DEBUG oslo_concurrency.lockutils [req-30255723-fc12-4837-9a26-e1606f597187 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Lock "xenstore-fda3bb4d-ccb2-4500-8c24-8c38815626fa" released by "store_meta" :: held 0.001s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:06:05.683 DEBUG oslo_concurrency.lockutils [req-30255723-fc12-4837-9a26-e1606f597187 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Lock "xenstore-fda3bb4d-ccb2-4500-8c24-8c38815626fa" acquired by "store_auto_disk_config" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:06:05.748 DEBUG oslo_concurrency.lockutils [req-30255723-fc12-4837-9a26-e1606f597187 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Lock "xenstore-fda3bb4d-ccb2-4500-8c24-8c38815626fa" released by "store_auto_disk_config" :: held 0.065s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:06:05.749 DEBUG nova.virt.xenapi.vmops [req-30255723-fc12-4837-9a26-e1606f597187 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: fda3bb4d-ccb2-4500-8c24-8c38815626fa] Injecting hostname (tempest-server-959859648) into xenstore _inject_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1940 2015-08-07 17:06:05.749 DEBUG oslo_concurrency.lockutils [req-30255723-fc12-4837-9a26-e1606f597187 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Lock "xenstore-fda3bb4d-ccb2-4500-8c24-8c38815626fa" acquired by "update_hostname" :: waited 0.000s inner 
/usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:06:05.759 DEBUG oslo_concurrency.lockutils [req-30255723-fc12-4837-9a26-e1606f597187 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Lock "xenstore-fda3bb4d-ccb2-4500-8c24-8c38815626fa" released by "update_hostname" :: held 0.009s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:06:05.764 DEBUG nova.virt.xenapi.vmops [req-30255723-fc12-4837-9a26-e1606f597187 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: fda3bb4d-ccb2-4500-8c24-8c38815626fa] Injecting network info to xenstore inject_network_info /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1869 2015-08-07 17:06:05.765 DEBUG oslo_concurrency.lockutils [req-30255723-fc12-4837-9a26-e1606f597187 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Lock "xenstore-fda3bb4d-ccb2-4500-8c24-8c38815626fa" acquired by "update_nwinfo" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:06:06.518 DEBUG oslo_concurrency.lockutils [req-30255723-fc12-4837-9a26-e1606f597187 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Lock "xenstore-fda3bb4d-ccb2-4500-8c24-8c38815626fa" released by "update_nwinfo" :: held 0.753s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:06:06.519 DEBUG nova.virt.xenapi.vmops [req-30255723-fc12-4837-9a26-e1606f597187 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: fda3bb4d-ccb2-4500-8c24-8c38815626fa] Updating progress to 60 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:06:07.018 DEBUG nova.virt.xenapi.vmops [req-30255723-fc12-4837-9a26-e1606f597187 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: fda3bb4d-ccb2-4500-8c24-8c38815626fa] Creating vifs _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1892 2015-08-07 17:06:07.071 DEBUG nova.virt.xenapi.vmops [req-30255723-fc12-4837-9a26-e1606f597187 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: fda3bb4d-ccb2-4500-8c24-8c38815626fa] Creating VIF for network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1902 2015-08-07 17:06:07.082 DEBUG nova.virt.xenapi.vmops [req-30255723-fc12-4837-9a26-e1606f597187 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: fda3bb4d-ccb2-4500-8c24-8c38815626fa] Created VIF OpaqueRef:1bf256db-57fe-aee2-7591-aaceef4259f6, network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1906 2015-08-07 17:06:07.083 DEBUG nova.virt.xenapi.vmops [req-30255723-fc12-4837-9a26-e1606f597187 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: fda3bb4d-ccb2-4500-8c24-8c38815626fa] Updating progress to 70 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:06:07.574 DEBUG nova.virt.xenapi.vmops [req-30255723-fc12-4837-9a26-e1606f597187 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: fda3bb4d-ccb2-4500-8c24-8c38815626fa] Starting instance _start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:327 2015-08-07 17:06:09.589 13318 DEBUG oslo_service.loopingcall 
[-] Fixed interval looping call > sleeping for 9.61 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:06:09.877 DEBUG oslo_concurrency.lockutils [req-5f91acf7-e28c-47e9-a95e-28e4b39d421f tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_plug" :: held 5.101s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:06:09.878 DEBUG nova.virt.xenapi.vm_utils [req-5f91acf7-e28c-47e9-a95e-28e4b39d421f tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] Plugging VBD OpaqueRef:c7ea7835-57ca-81be-8515-cec095d28617 done. vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2194 2015-08-07 17:06:09.882 DEBUG nova.virt.xenapi.vm_utils [req-5f91acf7-e28c-47e9-a95e-28e4b39d421f tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] VBD OpaqueRef:c7ea7835-57ca-81be-8515-cec095d28617 plugged as xvdd vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2197 2015-08-07 17:06:10.043 WARNING nova.virt.configdrive [req-5f91acf7-e28c-47e9-a95e-28e4b39d421f tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] The setting "always" will be deprecated in the Liberty version. Please use "True" instead 2015-08-07 17:06:10.043 DEBUG nova.objects.instance [req-5f91acf7-e28c-47e9-a95e-28e4b39d421f tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] Lazy-loading `ec2_ids' on Instance uuid 8327722c-cc39-4f7d-acd2-5ffd3f6af05d obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:06:10.146 DEBUG oslo_concurrency.processutils [req-5f91acf7-e28c-47e9-a95e-28e4b39d421f tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] Running cmd (subprocess): genisoimage -o /tmp/tmpC1p7o8/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmprKEsac execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:06:10.510 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:06:10.513 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:06:10.513 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /opt/stack/new/nova/nova/compute/manager.py:5976 2015-08-07 17:06:10.514 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:06:10.515 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:06:10.515 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Starting heal instance info cache _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5315 2015-08-07 17:06:10.515 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Rebuilding the list of instances to heal _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5319 2015-08-07 17:06:10.565 DEBUG oslo_concurrency.processutils [req-5f91acf7-e28c-47e9-a95e-28e4b39d421f tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] CMD "genisoimage -o /tmp/tmpC1p7o8/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmprKEsac" returned: 0 in 0.419s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:06:10.569 DEBUG oslo_concurrency.processutils [req-5f91acf7-e28c-47e9-a95e-28e4b39d421f tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmpC1p7o8/configdrive of=/dev/xvdd oflag=direct,sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:06:11.179 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: fda3bb4d-ccb2-4500-8c24-8c38815626fa] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5329 2015-08-07 17:06:11.181 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: 8327722c-cc39-4f7d-acd2-5ffd3f6af05d] Skipping network cache update for instance because it is Building. 
_heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5329 2015-08-07 17:06:11.183 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Acquired semaphore "refresh_cache-cf8ff930-deb3-448f-bcd3-f5a84fdad9b4" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:197 2015-08-07 17:06:11.184 DEBUG nova.objects.instance [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lazy-loading `flavor' on Instance uuid cf8ff930-deb3-448f-bcd3-f5a84fdad9b4 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:06:11.625 DEBUG nova.network.base_api [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: cf8ff930-deb3-448f-bcd3-f5a84fdad9b4] Updating instance_info_cache with network_info: [VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.2'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:4b:61:5b', 'active': False, 'type': u'bridge', 'id': u'1d3147dc-aab6-4dd3-a93a-a902661ccb6b', 'qbg_params': None})] update_instance_cache_with_nw_info /opt/stack/new/nova/nova/network/base_api.py:43 2015-08-07 17:06:11.749 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Releasing semaphore "refresh_cache-cf8ff930-deb3-448f-bcd3-f5a84fdad9b4" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:210 2015-08-07 17:06:11.750 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: cf8ff930-deb3-448f-bcd3-f5a84fdad9b4] Updated the network info_cache for instance _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5376 2015-08-07 17:06:11.751 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 0.99 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:06:12.746 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:06:12.747 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:06:12.813 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Auditing locally available compute resources for node localhost.localdomain 2015-08-07 17:06:12.831 DEBUG nova.virt.xenapi.host 
[req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Updating host stats update_status /opt/stack/new/nova/nova/virt/xenapi/host.py:234 2015-08-07 17:06:13.439 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" acquired by "do_scan" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:06:13.463 DEBUG nova.virt.xenapi.vm_utils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Scanning SR OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99 do_scan /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1839 2015-08-07 17:06:15.321 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" released by "do_scan" :: held 1.882s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:06:16.023 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor: free VCPUs: 1 _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:524 2015-08-07 17:06:16.024 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor: assignable PCI devices: [] _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:532 2015-08-07 17:06:16.024 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor/Node resource view: name=localhost.localdomain free_ram=855MB free_disk=16GB free_vcpus=1 pci_devices=[] _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:546 2015-08-07 17:06:16.025 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "compute_resources" acquired by "_update_available_resource" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:06:16.387 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Total usable vcpus: 8, total allocated vcpus: 7 2015-08-07 17:06:16.388 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Final resource view: name=localhost.localdomain phys_ram=8187MB used_ram=719MB phys_disk=27GB used_disk=0GB total_vcpus=8 used_vcpus=7 pci_stats=None 2015-08-07 17:06:16.806 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Compute_service record updated for devstack:localhost.localdomain 2015-08-07 17:06:16.807 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "compute_resources" released by "_update_available_resource" :: held 0.782s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:06:16.807 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:06:16.808 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:06:16.808 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping 
for 6.76 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:06:19.435 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.77 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:06:23.574 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:06:23.577 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 44.93 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:06:25.609 DEBUG nova.virt.xenapi.vmops [req-30255723-fc12-4837-9a26-e1606f597187 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: fda3bb4d-ccb2-4500-8c24-8c38815626fa] Waiting for instance state to become running _wait_for_instance_to_start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:777 2015-08-07 17:06:25.758 DEBUG nova.virt.xenapi.vmops [req-30255723-fc12-4837-9a26-e1606f597187 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: fda3bb4d-ccb2-4500-8c24-8c38815626fa] Updating progress to 80 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:06:27.007 DEBUG nova.virt.xenapi.vmops [req-30255723-fc12-4837-9a26-e1606f597187 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: fda3bb4d-ccb2-4500-8c24-8c38815626fa] Skip agent setup, not enabled. _configure_new_instance_with_agent /opt/stack/new/nova/nova/virt/xenapi/vmops.py:788 2015-08-07 17:06:27.007 DEBUG nova.virt.xenapi.vmops [req-30255723-fc12-4837-9a26-e1606f597187 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: fda3bb4d-ccb2-4500-8c24-8c38815626fa] Removing hostname from xenstore _remove_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1949 2015-08-07 17:06:27.008 DEBUG oslo_concurrency.lockutils [req-30255723-fc12-4837-9a26-e1606f597187 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Lock "xenstore-fda3bb4d-ccb2-4500-8c24-8c38815626fa" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:06:27.018 DEBUG oslo_concurrency.lockutils [req-30255723-fc12-4837-9a26-e1606f597187 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Lock "xenstore-fda3bb4d-ccb2-4500-8c24-8c38815626fa" released by "update_hostname" :: held 0.010s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:06:27.019 DEBUG nova.virt.xenapi.vmops [req-30255723-fc12-4837-9a26-e1606f597187 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: fda3bb4d-ccb2-4500-8c24-8c38815626fa] Updating progress to 90 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:06:28.504 DEBUG nova.virt.xenapi.vmops [req-30255723-fc12-4837-9a26-e1606f597187 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: fda3bb4d-ccb2-4500-8c24-8c38815626fa] Updating progress to 100 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:06:28.942 DEBUG 
nova.compute.manager [req-30255723-fc12-4837-9a26-e1606f597187 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: fda3bb4d-ccb2-4500-8c24-8c38815626fa] Checking state _get_power_state /opt/stack/new/nova/nova/compute/manager.py:1296 2015-08-07 17:06:29.287 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.92 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:06:29.420 DEBUG oslo_concurrency.lockutils [req-30255723-fc12-4837-9a26-e1606f597187 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Lock "fda3bb4d-ccb2-4500-8c24-8c38815626fa" released by "_locked_do_build_and_run_instance" :: held 100.553s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:06:29.421 13318 DEBUG oslo_concurrency.lockutils [-] Lock "fda3bb4d-ccb2-4500-8c24-8c38815626fa" acquired by "query_driver_power_state_and_sync" :: waited 52.704s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:06:29.421 13318 INFO nova.compute.manager [-] [instance: fda3bb4d-ccb2-4500-8c24-8c38815626fa] During sync_power_state the instance has a pending task (spawning). Skip. 2015-08-07 17:06:29.421 13318 DEBUG oslo_concurrency.lockutils [-] Lock "fda3bb4d-ccb2-4500-8c24-8c38815626fa" released by "query_driver_power_state_and_sync" :: held 0.001s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:06:30.248 DEBUG oslo_concurrency.processutils [req-5f91acf7-e28c-47e9-a95e-28e4b39d421f tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmpC1p7o8/configdrive of=/dev/xvdd oflag=direct,sync" returned: 0 in 19.678s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:06:30.250 DEBUG oslo_concurrency.processutils [req-5f91acf7-e28c-47e9-a95e-28e4b39d421f tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:06:32.231 DEBUG oslo_concurrency.processutils [req-5f91acf7-e28c-47e9-a95e-28e4b39d421f tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf sync" returned: 0 in 1.981s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:06:32.236 DEBUG nova.virt.xenapi.vm_utils [req-5f91acf7-e28c-47e9-a95e-28e4b39d421f tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] Destroying VBD for VDI OpaqueRef:a8fdffa1-4fa1-644d-8dfb-772c990d96b2 ... 
vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2207 2015-08-07 17:06:32.238 DEBUG oslo_concurrency.lockutils [req-5f91acf7-e28c-47e9-a95e-28e4b39d421f tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_unplug" :: waited 0.001s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:06:32.881 DEBUG oslo_concurrency.lockutils [req-9c4263aa-3206-4954-836d-ca22f5e94a70 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Lock "091727e3-644b-4029-98ad-5a102868d2d5" acquired by "_locked_do_build_and_run_instance" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:06:33.591 INFO nova.compute.manager [req-9c4263aa-3206-4954-836d-ca22f5e94a70 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: 091727e3-644b-4029-98ad-5a102868d2d5] Starting instance... 2015-08-07 17:06:34.099 DEBUG oslo_concurrency.lockutils [req-9c4263aa-3206-4954-836d-ca22f5e94a70 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Lock "compute_resources" acquired by "instance_claim" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:06:34.101 DEBUG nova.compute.resource_tracker [req-9c4263aa-3206-4954-836d-ca22f5e94a70 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Memory overhead for 64 MB instance; 5 MB instance_claim /opt/stack/new/nova/nova/compute/resource_tracker.py:126 2015-08-07 17:06:34.109 INFO nova.compute.claims [req-9c4263aa-3206-4954-836d-ca22f5e94a70 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: 091727e3-644b-4029-98ad-5a102868d2d5] Attempting claim: memory 69 MB, disk 0 GB 2015-08-07 17:06:34.110 INFO nova.compute.claims [req-9c4263aa-3206-4954-836d-ca22f5e94a70 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: 091727e3-644b-4029-98ad-5a102868d2d5] Total memory: 8187 MB, used: 719.00 MB 2015-08-07 17:06:34.111 INFO nova.compute.claims [req-9c4263aa-3206-4954-836d-ca22f5e94a70 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: 091727e3-644b-4029-98ad-5a102868d2d5] memory limit: 12280.50 MB, free: 11561.50 MB 2015-08-07 17:06:34.111 INFO nova.compute.claims [req-9c4263aa-3206-4954-836d-ca22f5e94a70 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: 091727e3-644b-4029-98ad-5a102868d2d5] Total disk: 27 GB, used: 0.00 GB 2015-08-07 17:06:34.112 INFO nova.compute.claims [req-9c4263aa-3206-4954-836d-ca22f5e94a70 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: 091727e3-644b-4029-98ad-5a102868d2d5] disk limit not specified, defaulting to unlimited 2015-08-07 17:06:34.634 DEBUG nova.compute.resources.vcpu [req-9c4263aa-3206-4954-836d-ca22f5e94a70 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Total CPUs: 8 VCPUs, used: 3.00 VCPUs test /opt/stack/new/nova/nova/compute/resources/vcpu.py:52 2015-08-07 17:06:34.634 DEBUG nova.compute.resources.vcpu [req-9c4263aa-3206-4954-836d-ca22f5e94a70 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] CPUs limit not specified, defaulting to unlimited test 
/opt/stack/new/nova/nova/compute/resources/vcpu.py:56 2015-08-07 17:06:34.635 INFO nova.compute.claims [req-9c4263aa-3206-4954-836d-ca22f5e94a70 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: 091727e3-644b-4029-98ad-5a102868d2d5] Claim successful 2015-08-07 17:06:36.022 DEBUG oslo_concurrency.lockutils [req-5f91acf7-e28c-47e9-a95e-28e4b39d421f tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_unplug" :: held 3.784s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:06:36.045 DEBUG nova.virt.xenapi.vm_utils [req-5f91acf7-e28c-47e9-a95e-28e4b39d421f tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] Destroying VBD for VDI OpaqueRef:a8fdffa1-4fa1-644d-8dfb-772c990d96b2 done. vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2215 2015-08-07 17:06:36.118 DEBUG nova.virt.xenapi.vm_utils [req-5f91acf7-e28c-47e9-a95e-28e4b39d421f tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] Creating disk-type VBD for VM OpaqueRef:e2abe87d-b080-2f4f-20dc-8fa9ca184830, VDI OpaqueRef:a8fdffa1-4fa1-644d-8dfb-772c990d96b2 ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:06:36.128 DEBUG nova.virt.xenapi.vm_utils [req-5f91acf7-e28c-47e9-a95e-28e4b39d421f tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] Created VBD OpaqueRef:118ea987-dfa9-c739-ab93-7bed94aaa9b9 for VM OpaqueRef:e2abe87d-b080-2f4f-20dc-8fa9ca184830, VDI OpaqueRef:a8fdffa1-4fa1-644d-8dfb-772c990d96b2. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:06:36.129 DEBUG nova.objects.instance [req-5f91acf7-e28c-47e9-a95e-28e4b39d421f tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] Lazy-loading `pci_devices' on Instance uuid 8327722c-cc39-4f7d-acd2-5ffd3f6af05d obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:06:36.755 DEBUG oslo_concurrency.lockutils [req-9c4263aa-3206-4954-836d-ca22f5e94a70 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Lock "compute_resources" released by "instance_claim" :: held 2.656s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:06:36.774 DEBUG nova.virt.xenapi.vmops [req-5f91acf7-e28c-47e9-a95e-28e4b39d421f tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] [instance: 8327722c-cc39-4f7d-acd2-5ffd3f6af05d] Updating progress to 50 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:06:37.544 DEBUG oslo_concurrency.lockutils [req-5f91acf7-e28c-47e9-a95e-28e4b39d421f tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] Lock "xenstore-8327722c-cc39-4f7d-acd2-5ffd3f6af05d" acquired by "store_meta" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:06:37.544 DEBUG oslo_concurrency.lockutils [req-5f91acf7-e28c-47e9-a95e-28e4b39d421f tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] Lock "xenstore-8327722c-cc39-4f7d-acd2-5ffd3f6af05d" released by "store_meta" :: held 0.001s inner 
/usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:06:37.546 DEBUG oslo_concurrency.lockutils [req-5f91acf7-e28c-47e9-a95e-28e4b39d421f tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] Lock "xenstore-8327722c-cc39-4f7d-acd2-5ffd3f6af05d" acquired by "store_auto_disk_config" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:06:37.578 DEBUG oslo_concurrency.lockutils [req-9c4263aa-3206-4954-836d-ca22f5e94a70 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Lock "compute_resources" acquired by "update_usage" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:06:37.591 DEBUG oslo_concurrency.lockutils [req-5f91acf7-e28c-47e9-a95e-28e4b39d421f tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] Lock "xenstore-8327722c-cc39-4f7d-acd2-5ffd3f6af05d" released by "store_auto_disk_config" :: held 0.045s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:06:37.592 DEBUG nova.virt.xenapi.vmops [req-5f91acf7-e28c-47e9-a95e-28e4b39d421f tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] [instance: 8327722c-cc39-4f7d-acd2-5ffd3f6af05d] Injecting hostname (tempest-server-2091519663) into xenstore _inject_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1940 2015-08-07 17:06:37.592 DEBUG oslo_concurrency.lockutils [req-5f91acf7-e28c-47e9-a95e-28e4b39d421f tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] Lock "xenstore-8327722c-cc39-4f7d-acd2-5ffd3f6af05d" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:06:37.663 DEBUG oslo_concurrency.lockutils [req-5f91acf7-e28c-47e9-a95e-28e4b39d421f tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] Lock "xenstore-8327722c-cc39-4f7d-acd2-5ffd3f6af05d" released by "update_hostname" :: held 0.071s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:06:37.664 DEBUG nova.virt.xenapi.vmops [req-5f91acf7-e28c-47e9-a95e-28e4b39d421f tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] [instance: 8327722c-cc39-4f7d-acd2-5ffd3f6af05d] Injecting network info to xenstore inject_network_info /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1869 2015-08-07 17:06:37.665 DEBUG oslo_concurrency.lockutils [req-5f91acf7-e28c-47e9-a95e-28e4b39d421f tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] Lock "xenstore-8327722c-cc39-4f7d-acd2-5ffd3f6af05d" acquired by "update_nwinfo" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:06:37.909 DEBUG oslo_concurrency.lockutils [req-9c4263aa-3206-4954-836d-ca22f5e94a70 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Lock "compute_resources" released by "update_usage" :: held 0.331s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:06:37.910 DEBUG nova.compute.utils [req-9c4263aa-3206-4954-836d-ca22f5e94a70 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Using /dev/xvd instead of None get_next_device_name 
/opt/stack/new/nova/nova/compute/utils.py:163 2015-08-07 17:06:37.914 13318 DEBUG nova.compute.manager [-] [instance: 091727e3-644b-4029-98ad-5a102868d2d5] Allocating IP information in the background. _allocate_network_async /opt/stack/new/nova/nova/compute/manager.py:1513 2015-08-07 17:06:37.915 13318 DEBUG oslo_concurrency.lockutils [-] Acquired semaphore "refresh_cache-091727e3-644b-4029-98ad-5a102868d2d5" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:197 2015-08-07 17:06:38.995 DEBUG oslo_concurrency.lockutils [req-5f91acf7-e28c-47e9-a95e-28e4b39d421f tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] Lock "xenstore-8327722c-cc39-4f7d-acd2-5ffd3f6af05d" released by "update_nwinfo" :: held 1.331s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:06:38.996 DEBUG nova.virt.xenapi.vmops [req-5f91acf7-e28c-47e9-a95e-28e4b39d421f tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] [instance: 8327722c-cc39-4f7d-acd2-5ffd3f6af05d] Updating progress to 60 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:06:39.490 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.72 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:06:39.984 DEBUG nova.virt.xenapi.vmops [req-5f91acf7-e28c-47e9-a95e-28e4b39d421f tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] [instance: 8327722c-cc39-4f7d-acd2-5ffd3f6af05d] Creating vifs _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1892 2015-08-07 17:06:40.051 DEBUG nova.virt.xenapi.vmops [req-5f91acf7-e28c-47e9-a95e-28e4b39d421f tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] [instance: 8327722c-cc39-4f7d-acd2-5ffd3f6af05d] Creating VIF for network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1902 2015-08-07 17:06:40.072 DEBUG nova.virt.xenapi.vmops [req-5f91acf7-e28c-47e9-a95e-28e4b39d421f tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] [instance: 8327722c-cc39-4f7d-acd2-5ffd3f6af05d] Created VIF OpaqueRef:4d0d7b42-b8a2-ed03-e5c6-e8a8e0ae7aa3, network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1906 2015-08-07 17:06:40.073 DEBUG nova.virt.xenapi.vmops [req-5f91acf7-e28c-47e9-a95e-28e4b39d421f tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] [instance: 8327722c-cc39-4f7d-acd2-5ffd3f6af05d] Updating progress to 70 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:06:40.258 DEBUG nova.virt.xenapi.vmops [req-9c4263aa-3206-4954-836d-ca22f5e94a70 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: 091727e3-644b-4029-98ad-5a102868d2d5] Block device information present: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} spawn /opt/stack/new/nova/nova/virt/xenapi/vmops.py:408 2015-08-07 17:06:40.282 DEBUG nova.virt.xenapi.vm_utils [req-9c4263aa-3206-4954-836d-ca22f5e94a70 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Detected vhd format for image a69d8d55-1745-492f-be26-cc75d64fc94d determine_disk_image_type 
/opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1633 2015-08-07 17:06:40.283 DEBUG nova.virt.xenapi.vmops [req-9c4263aa-3206-4954-836d-ca22f5e94a70 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: 091727e3-644b-4029-98ad-5a102868d2d5] Updating progress to 10 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:06:40.581 DEBUG nova.virt.xenapi.vmops [req-5f91acf7-e28c-47e9-a95e-28e4b39d421f tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] [instance: 8327722c-cc39-4f7d-acd2-5ffd3f6af05d] Starting instance _start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:327 2015-08-07 17:06:40.909 DEBUG nova.virt.xenapi.vmops [req-9c4263aa-3206-4954-836d-ca22f5e94a70 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: 091727e3-644b-4029-98ad-5a102868d2d5] block device info: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} _connect_cinder_volumes /opt/stack/new/nova/nova/virt/xenapi/vmops.py:376 2015-08-07 17:06:40.958 DEBUG oslo_concurrency.lockutils [req-9c4263aa-3206-4954-836d-ca22f5e94a70 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" acquired by "_create_cached_image_impl" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:06:43.872 DEBUG nova.virt.xenapi.vm_utils [req-9c4263aa-3206-4954-836d-ca22f5e94a70 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Cloned VDI OpaqueRef:0073e69a-0c66-cd2d-0271-6988ee11dbfc from VDI OpaqueRef:f392c64e-7886-a0cc-d3a0-780674167307 _clone_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:571 2015-08-07 17:06:45.242 13318 DEBUG nova.network.base_api [-] [instance: 091727e3-644b-4029-98ad-5a102868d2d5] Updating instance_info_cache with network_info: [VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.6'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:1f:57:17', 'active': False, 'type': u'bridge', 'id': u'74506357-e4e2-4a19-b942-18e58c964d1e', 'qbg_params': None})] update_instance_cache_with_nw_info /opt/stack/new/nova/nova/network/base_api.py:43 2015-08-07 17:06:45.298 13318 DEBUG oslo_concurrency.lockutils [-] Releasing semaphore "refresh_cache-091727e3-644b-4029-98ad-5a102868d2d5" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:210 2015-08-07 17:06:45.325 13318 DEBUG nova.compute.manager [-] [instance: 091727e3-644b-4029-98ad-5a102868d2d5] Instance network_info: 
|[VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.6'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:1f:57:17', 'active': False, 'type': u'bridge', 'id': u'74506357-e4e2-4a19-b942-18e58c964d1e', 'qbg_params': None})]| _allocate_network_async /opt/stack/new/nova/nova/compute/manager.py:1531 2015-08-07 17:06:45.963 DEBUG oslo_concurrency.lockutils [req-9c4263aa-3206-4954-836d-ca22f5e94a70 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" released by "_create_cached_image_impl" :: held 5.005s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:06:45.964 INFO nova.virt.xenapi.vm_utils [req-9c4263aa-3206-4954-836d-ca22f5e94a70 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Image creation data, cacheable: True, downloaded: False duration: 5.05 secs for image a69d8d55-1745-492f-be26-cc75d64fc94d 2015-08-07 17:06:47.792 DEBUG nova.virt.xenapi.vmops [req-9c4263aa-3206-4954-836d-ca22f5e94a70 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: 091727e3-644b-4029-98ad-5a102868d2d5] Updating progress to 20 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:06:48.120 DEBUG nova.virt.xenapi.vmops [req-9c4263aa-3206-4954-836d-ca22f5e94a70 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: 091727e3-644b-4029-98ad-5a102868d2d5] Updating progress to 30 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:06:48.484 DEBUG nova.virt.xenapi.vmops [req-9c4263aa-3206-4954-836d-ca22f5e94a70 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: 091727e3-644b-4029-98ad-5a102868d2d5] Using PV kernel: True _create_vm_record /opt/stack/new/nova/nova/virt/xenapi/vmops.py:679 2015-08-07 17:06:48.519 DEBUG nova.virt.xenapi.vm_utils [req-9c4263aa-3206-4954-836d-ca22f5e94a70 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: 091727e3-644b-4029-98ad-5a102868d2d5] Created VM create_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:312 2015-08-07 17:06:48.519 DEBUG nova.virt.xenapi.vmops [req-9c4263aa-3206-4954-836d-ca22f5e94a70 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: 091727e3-644b-4029-98ad-5a102868d2d5] Updating progress to 40 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:06:49.003 DEBUG nova.virt.xenapi.vm_utils [req-9c4263aa-3206-4954-836d-ca22f5e94a70 
tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Creating disk-type VBD for VM OpaqueRef:4fa4c3fe-4ccb-16b9-83d9-e6c30e102f04, VDI OpaqueRef:0073e69a-0c66-cd2d-0271-6988ee11dbfc ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:06:49.031 DEBUG nova.virt.xenapi.vm_utils [req-9c4263aa-3206-4954-836d-ca22f5e94a70 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Created VBD OpaqueRef:995755f0-d0a6-854e-1aad-f9be8cbef796 for VM OpaqueRef:4fa4c3fe-4ccb-16b9-83d9-e6c30e102f04, VDI OpaqueRef:0073e69a-0c66-cd2d-0271-6988ee11dbfc. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:06:49.313 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.89 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:06:49.870 DEBUG nova.virt.xenapi.vm_utils [req-9c4263aa-3206-4954-836d-ca22f5e94a70 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Created VDI OpaqueRef:474d0530-bef9-ac78-07e4-ad5da8105261 (config-2, 67108864, False) on OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99. create_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:509 2015-08-07 17:06:49.880 DEBUG nova.virt.xenapi.vm_utils [req-9c4263aa-3206-4954-836d-ca22f5e94a70 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Creating disk-type VBD for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:474d0530-bef9-ac78-07e4-ad5da8105261 ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:06:49.899 DEBUG nova.virt.xenapi.vm_utils [req-9c4263aa-3206-4954-836d-ca22f5e94a70 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Created VBD OpaqueRef:2a2da227-1d96-15f1-928a-2400c5a9ca7d for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:474d0530-bef9-ac78-07e4-ad5da8105261. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:06:49.900 DEBUG nova.virt.xenapi.vm_utils [req-9c4263aa-3206-4954-836d-ca22f5e94a70 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Plugging VBD OpaqueRef:2a2da227-1d96-15f1-928a-2400c5a9ca7d ... vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2191 2015-08-07 17:06:49.900 DEBUG oslo_concurrency.lockutils [req-9c4263aa-3206-4954-836d-ca22f5e94a70 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_plug" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:06:57.741 DEBUG oslo_concurrency.lockutils [req-9c4263aa-3206-4954-836d-ca22f5e94a70 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_plug" :: held 7.841s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:06:57.742 DEBUG nova.virt.xenapi.vm_utils [req-9c4263aa-3206-4954-836d-ca22f5e94a70 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Plugging VBD OpaqueRef:2a2da227-1d96-15f1-928a-2400c5a9ca7d done. 
vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2194 2015-08-07 17:06:57.745 DEBUG nova.virt.xenapi.vm_utils [req-9c4263aa-3206-4954-836d-ca22f5e94a70 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] VBD OpaqueRef:2a2da227-1d96-15f1-928a-2400c5a9ca7d plugged as xvdc vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2197 2015-08-07 17:06:58.044 WARNING nova.virt.configdrive [req-9c4263aa-3206-4954-836d-ca22f5e94a70 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] The setting "always" will be deprecated in the Liberty version. Please use "True" instead 2015-08-07 17:06:58.052 DEBUG nova.objects.instance [req-9c4263aa-3206-4954-836d-ca22f5e94a70 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Lazy-loading `ec2_ids' on Instance uuid 091727e3-644b-4029-98ad-5a102868d2d5 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:06:58.157 DEBUG oslo_concurrency.processutils [req-9c4263aa-3206-4954-836d-ca22f5e94a70 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Running cmd (subprocess): genisoimage -o /tmp/tmpftqEo4/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmpwnSDrE execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:06:58.458 DEBUG oslo_concurrency.processutils [req-9c4263aa-3206-4954-836d-ca22f5e94a70 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] CMD "genisoimage -o /tmp/tmpftqEo4/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmpwnSDrE" returned: 0 in 0.300s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:06:58.463 DEBUG oslo_concurrency.processutils [req-9c4263aa-3206-4954-836d-ca22f5e94a70 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmpftqEo4/configdrive of=/dev/xvdc oflag=direct,sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:06:59.704 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.50 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:07:03.959 DEBUG nova.virt.xenapi.vmops [req-5f91acf7-e28c-47e9-a95e-28e4b39d421f tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] [instance: 8327722c-cc39-4f7d-acd2-5ffd3f6af05d] Waiting for instance state to become running _wait_for_instance_to_start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:777 2015-08-07 17:07:04.022 DEBUG nova.virt.xenapi.vmops [req-5f91acf7-e28c-47e9-a95e-28e4b39d421f tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] [instance: 8327722c-cc39-4f7d-acd2-5ffd3f6af05d] Updating progress to 80 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:07:05.401 DEBUG nova.virt.xenapi.vmops [req-5f91acf7-e28c-47e9-a95e-28e4b39d421f tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] [instance: 8327722c-cc39-4f7d-acd2-5ffd3f6af05d] Skip agent setup, not enabled. 
_configure_new_instance_with_agent /opt/stack/new/nova/nova/virt/xenapi/vmops.py:788 2015-08-07 17:07:05.402 DEBUG nova.virt.xenapi.vmops [req-5f91acf7-e28c-47e9-a95e-28e4b39d421f tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] [instance: 8327722c-cc39-4f7d-acd2-5ffd3f6af05d] Removing hostname from xenstore _remove_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1949 2015-08-07 17:07:05.403 DEBUG oslo_concurrency.lockutils [req-5f91acf7-e28c-47e9-a95e-28e4b39d421f tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] Lock "xenstore-8327722c-cc39-4f7d-acd2-5ffd3f6af05d" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:07:05.411 DEBUG oslo_concurrency.lockutils [req-5f91acf7-e28c-47e9-a95e-28e4b39d421f tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] Lock "xenstore-8327722c-cc39-4f7d-acd2-5ffd3f6af05d" released by "update_hostname" :: held 0.008s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:07:05.411 DEBUG nova.virt.xenapi.vmops [req-5f91acf7-e28c-47e9-a95e-28e4b39d421f tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] [instance: 8327722c-cc39-4f7d-acd2-5ffd3f6af05d] Updating progress to 90 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:07:06.901 DEBUG nova.virt.xenapi.vmops [req-5f91acf7-e28c-47e9-a95e-28e4b39d421f tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] [instance: 8327722c-cc39-4f7d-acd2-5ffd3f6af05d] Updating progress to 100 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:07:07.470 DEBUG nova.compute.manager [req-5f91acf7-e28c-47e9-a95e-28e4b39d421f tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] [instance: 8327722c-cc39-4f7d-acd2-5ffd3f6af05d] Checking state _get_power_state /opt/stack/new/nova/nova/compute/manager.py:1296 2015-08-07 17:07:08.291 DEBUG oslo_concurrency.lockutils [req-5f91acf7-e28c-47e9-a95e-28e4b39d421f tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] Lock "8327722c-cc39-4f7d-acd2-5ffd3f6af05d" released by "_locked_do_build_and_run_instance" :: held 104.445s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:07:08.292 13318 DEBUG oslo_concurrency.lockutils [-] Lock "8327722c-cc39-4f7d-acd2-5ffd3f6af05d" acquired by "query_driver_power_state_and_sync" :: waited 91.575s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:07:08.293 13318 INFO nova.compute.manager [-] [instance: 8327722c-cc39-4f7d-acd2-5ffd3f6af05d] During sync_power_state the instance has a pending task (spawning). Skip. 
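A hedged sketch of the config-drive steps logged a few entries above (genisoimage packing the staged metadata into an ISO9660 volume labelled config-2, then dd writing it to the plugged xvdc device through nova-rootwrap), re-expressed with oslo.concurrency's processutils. This is an illustration only, not Nova's actual code path; the paths and device name are copied from the log, and the root helper string is the one the log shows being used.

    from oslo_concurrency import processutils

    # Build the ISO9660 config drive from the staged metadata directory
    # (temporary paths are the ones shown in the log entries above).
    processutils.execute(
        'genisoimage', '-o', '/tmp/tmpftqEo4/configdrive',
        '-ldots', '-allow-lowercase', '-allow-multidot', '-l',
        '-publisher', 'OpenStack Nova 12.0.0', '-quiet', '-J', '-r',
        '-V', 'config-2', '/tmp/tmpwnSDrE')

    # Copy the image onto the plugged VBD (xvdc) with direct, synchronous I/O.
    # run_as_root requires a configured root helper; the log shows this step
    # going through "sudo nova-rootwrap /etc/nova/rootwrap.conf".
    processutils.execute(
        'dd', 'if=/tmp/tmpftqEo4/configdrive', 'of=/dev/xvdc',
        'oflag=direct,sync',
        run_as_root=True,
        root_helper='sudo nova-rootwrap /etc/nova/rootwrap.conf')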
2015-08-07 17:07:08.293 13318 DEBUG oslo_concurrency.lockutils [-] Lock "8327722c-cc39-4f7d-acd2-5ffd3f6af05d" released by "query_driver_power_state_and_sync" :: held 0.001s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:07:08.509 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:07:08.511 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 2.99 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:07:09.319 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.89 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:07:11.503 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:07:11.504 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:07:11.619 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 0.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:07:11.625 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:07:11.627 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:07:11.628 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Starting heal instance info cache _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5315 2015-08-07 17:07:11.628 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Rebuilding the list of instances to heal _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5319 2015-08-07 17:07:11.776 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: 091727e3-644b-4029-98ad-5a102868d2d5] Skipping network cache update for instance because it is Building. 
_heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5329 2015-08-07 17:07:11.908 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Acquired semaphore "refresh_cache-cf8ff930-deb3-448f-bcd3-f5a84fdad9b4" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:197 2015-08-07 17:07:11.908 DEBUG nova.objects.instance [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lazy-loading `flavor' on Instance uuid cf8ff930-deb3-448f-bcd3-f5a84fdad9b4 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:07:12.422 DEBUG nova.network.base_api [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: cf8ff930-deb3-448f-bcd3-f5a84fdad9b4] Updating instance_info_cache with network_info: [VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.2'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:4b:61:5b', 'active': False, 'type': u'bridge', 'id': u'1d3147dc-aab6-4dd3-a93a-a902661ccb6b', 'qbg_params': None})] update_instance_cache_with_nw_info /opt/stack/new/nova/nova/network/base_api.py:43 2015-08-07 17:07:12.452 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Releasing semaphore "refresh_cache-cf8ff930-deb3-448f-bcd3-f5a84fdad9b4" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:210 2015-08-07 17:07:12.453 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: cf8ff930-deb3-448f-bcd3-f5a84fdad9b4] Updated the network info_cache for instance _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5376 2015-08-07 17:07:12.453 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 0.88 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:07:13.322 DEBUG oslo_concurrency.lockutils [req-50704f19-c032-4719-b10a-df7d8c176dd6 tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] Lock "e8fe58be-dd05-484e-8c50-aa4c6aafb4b5" acquired by "_locked_do_build_and_run_instance" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:07:13.338 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:07:13.353 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] CONF.reclaim_instance_interval <= 0, 
skipping... _reclaim_queued_deletes /opt/stack/new/nova/nova/compute/manager.py:5976 2015-08-07 17:07:13.358 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:07:13.409 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Auditing locally available compute resources for node localhost.localdomain 2015-08-07 17:07:13.410 DEBUG nova.virt.xenapi.host [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Updating host stats update_status /opt/stack/new/nova/nova/virt/xenapi/host.py:234 2015-08-07 17:07:13.546 INFO nova.compute.manager [req-50704f19-c032-4719-b10a-df7d8c176dd6 tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] [instance: e8fe58be-dd05-484e-8c50-aa4c6aafb4b5] Starting instance... 2015-08-07 17:07:13.851 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" acquired by "do_scan" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:07:13.852 DEBUG nova.virt.xenapi.vm_utils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Scanning SR OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99 do_scan /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1839 2015-08-07 17:07:14.064 DEBUG oslo_concurrency.lockutils [req-50704f19-c032-4719-b10a-df7d8c176dd6 tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] Lock "compute_resources" acquired by "instance_claim" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:07:14.065 DEBUG nova.compute.resource_tracker [req-50704f19-c032-4719-b10a-df7d8c176dd6 tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] Memory overhead for 64 MB instance; 5 MB instance_claim /opt/stack/new/nova/nova/compute/resource_tracker.py:126 2015-08-07 17:07:14.073 INFO nova.compute.claims [req-50704f19-c032-4719-b10a-df7d8c176dd6 tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] [instance: e8fe58be-dd05-484e-8c50-aa4c6aafb4b5] Attempting claim: memory 69 MB, disk 0 GB 2015-08-07 17:07:14.073 INFO nova.compute.claims [req-50704f19-c032-4719-b10a-df7d8c176dd6 tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] [instance: e8fe58be-dd05-484e-8c50-aa4c6aafb4b5] Total memory: 8187 MB, used: 788.00 MB 2015-08-07 17:07:14.074 INFO nova.compute.claims [req-50704f19-c032-4719-b10a-df7d8c176dd6 tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] [instance: e8fe58be-dd05-484e-8c50-aa4c6aafb4b5] memory limit: 12280.50 MB, free: 11492.50 MB 2015-08-07 17:07:14.074 INFO nova.compute.claims [req-50704f19-c032-4719-b10a-df7d8c176dd6 tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] [instance: e8fe58be-dd05-484e-8c50-aa4c6aafb4b5] Total disk: 27 GB, used: 0.00 GB 2015-08-07 17:07:14.075 INFO nova.compute.claims [req-50704f19-c032-4719-b10a-df7d8c176dd6 tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] [instance: e8fe58be-dd05-484e-8c50-aa4c6aafb4b5] disk limit not specified, defaulting to unlimited 
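The memory figures in the claim entries above are internally consistent; the following is a small sketch of how they fit together. Variable names are illustrative, not Nova's, and the 1.5 factor is inferred from the numbers (it matches the default ram_allocation_ratio).

    flavor_ram_mb = 64                         # 64 MB flavor
    overhead_mb = 5                            # "Memory overhead for 64 MB instance; 5 MB"
    claim_mb = flavor_ram_mb + overhead_mb     # 69 -> "Attempting claim: memory 69 MB"

    total_mb = 8187                            # "Total memory: 8187 MB"
    used_mb = 788.0                            # "used: 788.00 MB"
    limit_mb = total_mb * 1.5                  # 12280.5 -> "memory limit: 12280.50 MB"
    free_mb = limit_mb - used_mb               # 11492.5 -> "free: 11492.50 MB"

    assert claim_mb <= free_mb                 # claim succeeds
    used_after_mb = used_mb + claim_mb         # 857.0 -> "used: 857.00 MB" in the next claim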
2015-08-07 17:07:14.116 DEBUG nova.compute.resources.vcpu [req-50704f19-c032-4719-b10a-df7d8c176dd6 tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] Total CPUs: 8 VCPUs, used: 4.00 VCPUs test /opt/stack/new/nova/nova/compute/resources/vcpu.py:52 2015-08-07 17:07:14.117 DEBUG nova.compute.resources.vcpu [req-50704f19-c032-4719-b10a-df7d8c176dd6 tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] CPUs limit not specified, defaulting to unlimited test /opt/stack/new/nova/nova/compute/resources/vcpu.py:56 2015-08-07 17:07:14.117 INFO nova.compute.claims [req-50704f19-c032-4719-b10a-df7d8c176dd6 tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] [instance: e8fe58be-dd05-484e-8c50-aa4c6aafb4b5] Claim successful 2015-08-07 17:07:14.756 DEBUG oslo_concurrency.lockutils [req-50704f19-c032-4719-b10a-df7d8c176dd6 tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] Lock "compute_resources" released by "instance_claim" :: held 0.692s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:07:15.075 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" released by "do_scan" :: held 1.224s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:07:15.132 DEBUG oslo_concurrency.lockutils [req-50704f19-c032-4719-b10a-df7d8c176dd6 tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] Lock "compute_resources" acquired by "update_usage" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:07:15.328 DEBUG oslo_concurrency.lockutils [req-50704f19-c032-4719-b10a-df7d8c176dd6 tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] Lock "compute_resources" released by "update_usage" :: held 0.197s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:07:15.330 DEBUG nova.compute.utils [req-50704f19-c032-4719-b10a-df7d8c176dd6 tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] Using /dev/xvd instead of None get_next_device_name /opt/stack/new/nova/nova/compute/utils.py:163 2015-08-07 17:07:15.334 13318 DEBUG nova.compute.manager [-] [instance: e8fe58be-dd05-484e-8c50-aa4c6aafb4b5] Allocating IP information in the background. 
_allocate_network_async /opt/stack/new/nova/nova/compute/manager.py:1513 2015-08-07 17:07:15.335 13318 DEBUG oslo_concurrency.lockutils [-] Acquired semaphore "refresh_cache-e8fe58be-dd05-484e-8c50-aa4c6aafb4b5" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:197 2015-08-07 17:07:15.481 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor: free VCPUs: -1 _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:524 2015-08-07 17:07:15.482 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor: assignable PCI devices: [] _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:532 2015-08-07 17:07:15.482 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor/Node resource view: name=localhost.localdomain free_ram=788MB free_disk=16GB free_vcpus=-1 pci_devices=[] _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:546 2015-08-07 17:07:15.483 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "compute_resources" acquired by "_update_available_resource" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:07:15.894 DEBUG oslo_concurrency.processutils [req-9c4263aa-3206-4954-836d-ca22f5e94a70 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmpftqEo4/configdrive of=/dev/xvdc oflag=direct,sync" returned: 0 in 17.430s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:07:15.896 DEBUG oslo_concurrency.processutils [req-9c4263aa-3206-4954-836d-ca22f5e94a70 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:07:16.597 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Total usable vcpus: 8, total allocated vcpus: 9 2015-08-07 17:07:16.607 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Final resource view: name=localhost.localdomain phys_ram=8187MB used_ram=857MB phys_disk=27GB used_disk=0GB total_vcpus=8 used_vcpus=9 pci_stats=None 2015-08-07 17:07:16.724 DEBUG nova.virt.xenapi.vmops [req-50704f19-c032-4719-b10a-df7d8c176dd6 tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] [instance: e8fe58be-dd05-484e-8c50-aa4c6aafb4b5] Block device information present: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} spawn /opt/stack/new/nova/nova/virt/xenapi/vmops.py:408 2015-08-07 17:07:16.730 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Compute_service record updated for devstack:localhost.localdomain 2015-08-07 17:07:16.731 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "compute_resources" released by "_update_available_resource" :: held 1.248s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:07:16.733 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task 
ComputeManager._poll_rescued_instances run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:07:16.756 DEBUG nova.virt.xenapi.vm_utils [req-50704f19-c032-4719-b10a-df7d8c176dd6 tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] Detected vhd format for image a69d8d55-1745-492f-be26-cc75d64fc94d determine_disk_image_type /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1633 2015-08-07 17:07:16.757 DEBUG nova.virt.xenapi.vmops [req-50704f19-c032-4719-b10a-df7d8c176dd6 tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] [instance: e8fe58be-dd05-484e-8c50-aa4c6aafb4b5] Updating progress to 10 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:07:16.774 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 0.17 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:07:16.776 DEBUG oslo_concurrency.processutils [req-9c4263aa-3206-4954-836d-ca22f5e94a70 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf sync" returned: 0 in 0.881s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:07:16.777 DEBUG nova.virt.xenapi.vm_utils [req-9c4263aa-3206-4954-836d-ca22f5e94a70 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Destroying VBD for VDI OpaqueRef:474d0530-bef9-ac78-07e4-ad5da8105261 ... vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2207 2015-08-07 17:07:16.778 DEBUG oslo_concurrency.lockutils [req-9c4263aa-3206-4954-836d-ca22f5e94a70 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_unplug" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:07:16.952 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:07:16.953 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 3.56 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:07:17.604 DEBUG nova.virt.xenapi.vmops [req-50704f19-c032-4719-b10a-df7d8c176dd6 tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] [instance: e8fe58be-dd05-484e-8c50-aa4c6aafb4b5] block device info: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} _connect_cinder_volumes /opt/stack/new/nova/nova/virt/xenapi/vmops.py:376 2015-08-07 17:07:17.628 DEBUG oslo_concurrency.lockutils [req-50704f19-c032-4719-b10a-df7d8c176dd6 tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" acquired by "_create_cached_image_impl" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:07:19.317 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.89 seconds 
_run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:07:19.651 13318 DEBUG nova.network.base_api [-] [instance: e8fe58be-dd05-484e-8c50-aa4c6aafb4b5] Updating instance_info_cache with network_info: [VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.7'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:20:90:a0', 'active': False, 'type': u'bridge', 'id': u'da93b290-efc9-4d9b-83a5-135670790ec5', 'qbg_params': None})] update_instance_cache_with_nw_info /opt/stack/new/nova/nova/network/base_api.py:43 2015-08-07 17:07:19.726 13318 DEBUG oslo_concurrency.lockutils [-] Releasing semaphore "refresh_cache-e8fe58be-dd05-484e-8c50-aa4c6aafb4b5" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:210 2015-08-07 17:07:19.726 13318 DEBUG nova.compute.manager [-] [instance: e8fe58be-dd05-484e-8c50-aa4c6aafb4b5] Instance network_info: |[VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.7'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:20:90:a0', 'active': False, 'type': u'bridge', 'id': u'da93b290-efc9-4d9b-83a5-135670790ec5', 'qbg_params': None})]| _allocate_network_async /opt/stack/new/nova/nova/compute/manager.py:1531 2015-08-07 17:07:19.741 DEBUG oslo_concurrency.lockutils [req-9c4263aa-3206-4954-836d-ca22f5e94a70 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_unplug" :: held 2.963s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:07:19.757 DEBUG nova.virt.xenapi.vm_utils [req-9c4263aa-3206-4954-836d-ca22f5e94a70 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Destroying VBD for VDI 
OpaqueRef:474d0530-bef9-ac78-07e4-ad5da8105261 done. vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2215 2015-08-07 17:07:19.758 DEBUG nova.virt.xenapi.vm_utils [req-9c4263aa-3206-4954-836d-ca22f5e94a70 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Creating disk-type VBD for VM OpaqueRef:4fa4c3fe-4ccb-16b9-83d9-e6c30e102f04, VDI OpaqueRef:474d0530-bef9-ac78-07e4-ad5da8105261 ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:07:19.768 DEBUG nova.virt.xenapi.vm_utils [req-9c4263aa-3206-4954-836d-ca22f5e94a70 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Created VBD OpaqueRef:d16927f8-26a4-4573-8e7c-0a4190f75471 for VM OpaqueRef:4fa4c3fe-4ccb-16b9-83d9-e6c30e102f04, VDI OpaqueRef:474d0530-bef9-ac78-07e4-ad5da8105261. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:07:19.769 DEBUG nova.objects.instance [req-9c4263aa-3206-4954-836d-ca22f5e94a70 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Lazy-loading `pci_devices' on Instance uuid 091727e3-644b-4029-98ad-5a102868d2d5 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:07:19.907 DEBUG nova.virt.xenapi.vmops [req-9c4263aa-3206-4954-836d-ca22f5e94a70 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: 091727e3-644b-4029-98ad-5a102868d2d5] Updating progress to 50 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:07:20.154 DEBUG nova.virt.xenapi.vm_utils [req-50704f19-c032-4719-b10a-df7d8c176dd6 tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] Cloned VDI OpaqueRef:fe197f03-1702-d85e-ece0-fc2b47cf1762 from VDI OpaqueRef:f392c64e-7886-a0cc-d3a0-780674167307 _clone_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:571 2015-08-07 17:07:20.256 DEBUG oslo_concurrency.lockutils [req-9c4263aa-3206-4954-836d-ca22f5e94a70 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Lock "xenstore-091727e3-644b-4029-98ad-5a102868d2d5" acquired by "store_meta" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:07:20.257 DEBUG oslo_concurrency.lockutils [req-9c4263aa-3206-4954-836d-ca22f5e94a70 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Lock "xenstore-091727e3-644b-4029-98ad-5a102868d2d5" released by "store_meta" :: held 0.001s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:07:20.258 DEBUG oslo_concurrency.lockutils [req-9c4263aa-3206-4954-836d-ca22f5e94a70 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Lock "xenstore-091727e3-644b-4029-98ad-5a102868d2d5" acquired by "store_auto_disk_config" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:07:20.267 DEBUG oslo_concurrency.lockutils [req-9c4263aa-3206-4954-836d-ca22f5e94a70 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Lock "xenstore-091727e3-644b-4029-98ad-5a102868d2d5" released by "store_auto_disk_config" :: held 0.009s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:07:20.272 DEBUG nova.virt.xenapi.vmops [req-9c4263aa-3206-4954-836d-ca22f5e94a70 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: 
091727e3-644b-4029-98ad-5a102868d2d5] Injecting hostname (tempest.common.compute-instance-1582059346) into xenstore _inject_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1940 2015-08-07 17:07:20.273 DEBUG oslo_concurrency.lockutils [req-9c4263aa-3206-4954-836d-ca22f5e94a70 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Lock "xenstore-091727e3-644b-4029-98ad-5a102868d2d5" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:07:20.284 DEBUG oslo_concurrency.lockutils [req-9c4263aa-3206-4954-836d-ca22f5e94a70 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Lock "xenstore-091727e3-644b-4029-98ad-5a102868d2d5" released by "update_hostname" :: held 0.011s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:07:20.285 DEBUG nova.virt.xenapi.vmops [req-9c4263aa-3206-4954-836d-ca22f5e94a70 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: 091727e3-644b-4029-98ad-5a102868d2d5] Injecting network info to xenstore inject_network_info /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1869 2015-08-07 17:07:20.286 DEBUG oslo_concurrency.lockutils [req-9c4263aa-3206-4954-836d-ca22f5e94a70 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Lock "xenstore-091727e3-644b-4029-98ad-5a102868d2d5" acquired by "update_nwinfo" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:07:20.518 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:07:20.520 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 48.99 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:07:20.626 DEBUG oslo_concurrency.lockutils [req-9c4263aa-3206-4954-836d-ca22f5e94a70 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Lock "xenstore-091727e3-644b-4029-98ad-5a102868d2d5" released by "update_nwinfo" :: held 0.340s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:07:20.627 DEBUG nova.virt.xenapi.vmops [req-9c4263aa-3206-4954-836d-ca22f5e94a70 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: 091727e3-644b-4029-98ad-5a102868d2d5] Updating progress to 60 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:07:21.066 DEBUG oslo_concurrency.lockutils [req-50704f19-c032-4719-b10a-df7d8c176dd6 tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" released by "_create_cached_image_impl" :: held 3.439s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:07:21.069 INFO nova.virt.xenapi.vm_utils [req-50704f19-c032-4719-b10a-df7d8c176dd6 tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] Image creation data, cacheable: True, downloaded: False duration: 3.46 secs for image a69d8d55-1745-492f-be26-cc75d64fc94d 2015-08-07 17:07:21.109 DEBUG nova.virt.xenapi.vmops 
[req-9c4263aa-3206-4954-836d-ca22f5e94a70 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: 091727e3-644b-4029-98ad-5a102868d2d5] Creating vifs _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1892 2015-08-07 17:07:21.123 DEBUG nova.virt.xenapi.vmops [req-9c4263aa-3206-4954-836d-ca22f5e94a70 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: 091727e3-644b-4029-98ad-5a102868d2d5] Creating VIF for network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1902 2015-08-07 17:07:21.184 DEBUG nova.virt.xenapi.vmops [req-9c4263aa-3206-4954-836d-ca22f5e94a70 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: 091727e3-644b-4029-98ad-5a102868d2d5] Created VIF OpaqueRef:dd392d50-e002-8e24-fd85-b7c56d27ca87, network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1906 2015-08-07 17:07:21.185 DEBUG nova.virt.xenapi.vmops [req-9c4263aa-3206-4954-836d-ca22f5e94a70 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: 091727e3-644b-4029-98ad-5a102868d2d5] Updating progress to 70 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:07:21.519 DEBUG nova.virt.xenapi.vmops [req-9c4263aa-3206-4954-836d-ca22f5e94a70 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: 091727e3-644b-4029-98ad-5a102868d2d5] Starting instance _start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:327 2015-08-07 17:07:23.064 DEBUG nova.virt.xenapi.vmops [req-50704f19-c032-4719-b10a-df7d8c176dd6 tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] [instance: e8fe58be-dd05-484e-8c50-aa4c6aafb4b5] Updating progress to 20 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:07:23.471 DEBUG nova.virt.xenapi.vmops [req-50704f19-c032-4719-b10a-df7d8c176dd6 tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] [instance: e8fe58be-dd05-484e-8c50-aa4c6aafb4b5] Updating progress to 30 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:07:23.824 DEBUG nova.virt.xenapi.vmops [req-50704f19-c032-4719-b10a-df7d8c176dd6 tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] [instance: e8fe58be-dd05-484e-8c50-aa4c6aafb4b5] Using PV kernel: True _create_vm_record /opt/stack/new/nova/nova/virt/xenapi/vmops.py:679 2015-08-07 17:07:23.845 DEBUG nova.virt.xenapi.vm_utils [req-50704f19-c032-4719-b10a-df7d8c176dd6 tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] [instance: e8fe58be-dd05-484e-8c50-aa4c6aafb4b5] Created VM create_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:312 2015-08-07 17:07:23.846 DEBUG nova.virt.xenapi.vmops [req-50704f19-c032-4719-b10a-df7d8c176dd6 tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] [instance: e8fe58be-dd05-484e-8c50-aa4c6aafb4b5] Updating progress to 40 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:07:24.287 DEBUG nova.virt.xenapi.vm_utils [req-50704f19-c032-4719-b10a-df7d8c176dd6 tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] Creating disk-type VBD for VM 
OpaqueRef:14dc4ac4-3c30-45d8-7e7c-6024e859c198, VDI OpaqueRef:fe197f03-1702-d85e-ece0-fc2b47cf1762 ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:07:24.298 DEBUG nova.virt.xenapi.vm_utils [req-50704f19-c032-4719-b10a-df7d8c176dd6 tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] Created VBD OpaqueRef:8bf8001a-222e-5038-e304-dce6460edacd for VM OpaqueRef:14dc4ac4-3c30-45d8-7e7c-6024e859c198, VDI OpaqueRef:fe197f03-1702-d85e-ece0-fc2b47cf1762. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:07:25.241 DEBUG nova.virt.xenapi.vm_utils [req-50704f19-c032-4719-b10a-df7d8c176dd6 tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] Created VDI OpaqueRef:f10c0b11-0787-06ee-7287-1bd2da4a8f99 (config-2, 67108864, False) on OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99. create_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:509 2015-08-07 17:07:25.250 DEBUG nova.virt.xenapi.vm_utils [req-50704f19-c032-4719-b10a-df7d8c176dd6 tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] Creating disk-type VBD for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:f10c0b11-0787-06ee-7287-1bd2da4a8f99 ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:07:25.264 DEBUG nova.virt.xenapi.vm_utils [req-50704f19-c032-4719-b10a-df7d8c176dd6 tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] Created VBD OpaqueRef:5dd9749d-625d-2abb-a858-7ee3342f515f for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:f10c0b11-0787-06ee-7287-1bd2da4a8f99. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:07:25.266 DEBUG nova.virt.xenapi.vm_utils [req-50704f19-c032-4719-b10a-df7d8c176dd6 tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] Plugging VBD OpaqueRef:5dd9749d-625d-2abb-a858-7ee3342f515f ... vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2191 2015-08-07 17:07:25.266 DEBUG oslo_concurrency.lockutils [req-50704f19-c032-4719-b10a-df7d8c176dd6 tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_plug" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:07:30.296 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 8.91 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:07:30.357 DEBUG oslo_concurrency.lockutils [req-50704f19-c032-4719-b10a-df7d8c176dd6 tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_plug" :: held 5.091s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:07:30.358 DEBUG nova.virt.xenapi.vm_utils [req-50704f19-c032-4719-b10a-df7d8c176dd6 tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] Plugging VBD OpaqueRef:5dd9749d-625d-2abb-a858-7ee3342f515f done. 
vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2194 2015-08-07 17:07:30.371 DEBUG nova.virt.xenapi.vm_utils [req-50704f19-c032-4719-b10a-df7d8c176dd6 tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] VBD OpaqueRef:5dd9749d-625d-2abb-a858-7ee3342f515f plugged as xvdc vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2197 2015-08-07 17:07:30.483 WARNING nova.virt.configdrive [req-50704f19-c032-4719-b10a-df7d8c176dd6 tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] The setting "always" will be deprecated in the Liberty version. Please use "True" instead 2015-08-07 17:07:30.484 DEBUG nova.objects.instance [req-50704f19-c032-4719-b10a-df7d8c176dd6 tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] Lazy-loading `ec2_ids' on Instance uuid e8fe58be-dd05-484e-8c50-aa4c6aafb4b5 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:07:30.594 DEBUG oslo_concurrency.processutils [req-50704f19-c032-4719-b10a-df7d8c176dd6 tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] Running cmd (subprocess): genisoimage -o /tmp/tmpitOZut/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmprefR1u execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:07:30.791 DEBUG oslo_concurrency.processutils [req-50704f19-c032-4719-b10a-df7d8c176dd6 tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] CMD "genisoimage -o /tmp/tmpitOZut/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmprefR1u" returned: 0 in 0.196s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:07:30.798 DEBUG oslo_concurrency.processutils [req-50704f19-c032-4719-b10a-df7d8c176dd6 tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmpitOZut/configdrive of=/dev/xvdc oflag=direct,sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:07:38.184 DEBUG nova.virt.xenapi.vmops [req-9c4263aa-3206-4954-836d-ca22f5e94a70 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: 091727e3-644b-4029-98ad-5a102868d2d5] Waiting for instance state to become running _wait_for_instance_to_start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:777 2015-08-07 17:07:38.237 DEBUG nova.virt.xenapi.vmops [req-9c4263aa-3206-4954-836d-ca22f5e94a70 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: 091727e3-644b-4029-98ad-5a102868d2d5] Updating progress to 80 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:07:38.995 DEBUG nova.virt.xenapi.vmops [req-9c4263aa-3206-4954-836d-ca22f5e94a70 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: 091727e3-644b-4029-98ad-5a102868d2d5] Skip agent setup, not enabled. 
_configure_new_instance_with_agent /opt/stack/new/nova/nova/virt/xenapi/vmops.py:788 2015-08-07 17:07:38.996 DEBUG nova.virt.xenapi.vmops [req-9c4263aa-3206-4954-836d-ca22f5e94a70 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: 091727e3-644b-4029-98ad-5a102868d2d5] Removing hostname from xenstore _remove_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1949 2015-08-07 17:07:38.997 DEBUG oslo_concurrency.lockutils [req-9c4263aa-3206-4954-836d-ca22f5e94a70 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Lock "xenstore-091727e3-644b-4029-98ad-5a102868d2d5" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:07:39.003 DEBUG oslo_concurrency.lockutils [req-9c4263aa-3206-4954-836d-ca22f5e94a70 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Lock "xenstore-091727e3-644b-4029-98ad-5a102868d2d5" released by "update_hostname" :: held 0.007s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:07:39.004 DEBUG nova.virt.xenapi.vmops [req-9c4263aa-3206-4954-836d-ca22f5e94a70 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: 091727e3-644b-4029-98ad-5a102868d2d5] Updating progress to 90 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:07:39.304 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.91 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:07:39.585 DEBUG nova.virt.xenapi.vmops [req-9c4263aa-3206-4954-836d-ca22f5e94a70 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: 091727e3-644b-4029-98ad-5a102868d2d5] Updating progress to 100 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:07:40.129 DEBUG nova.compute.manager [req-9c4263aa-3206-4954-836d-ca22f5e94a70 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: 091727e3-644b-4029-98ad-5a102868d2d5] Checking state _get_power_state /opt/stack/new/nova/nova/compute/manager.py:1296 2015-08-07 17:07:40.839 DEBUG oslo_concurrency.lockutils [req-9c4263aa-3206-4954-836d-ca22f5e94a70 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Lock "091727e3-644b-4029-98ad-5a102868d2d5" released by "_locked_do_build_and_run_instance" :: held 67.957s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:07:44.213 DEBUG oslo_concurrency.processutils [req-50704f19-c032-4719-b10a-df7d8c176dd6 tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmpitOZut/configdrive of=/dev/xvdc oflag=direct,sync" returned: 0 in 13.415s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:07:44.224 DEBUG oslo_concurrency.processutils [req-50704f19-c032-4719-b10a-df7d8c176dd6 tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:07:45.343 DEBUG oslo_concurrency.processutils [req-50704f19-c032-4719-b10a-df7d8c176dd6 
tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf sync" returned: 0 in 1.116s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:07:45.347 DEBUG nova.virt.xenapi.vm_utils [req-50704f19-c032-4719-b10a-df7d8c176dd6 tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] Destroying VBD for VDI OpaqueRef:f10c0b11-0787-06ee-7287-1bd2da4a8f99 ... vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2207 2015-08-07 17:07:45.348 DEBUG oslo_concurrency.lockutils [req-50704f19-c032-4719-b10a-df7d8c176dd6 tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_unplug" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:07:46.745 DEBUG oslo_concurrency.lockutils [req-b2890484-4995-41d7-810b-34156f3eb046 tempest-TenantUsagesTestJSON-1123076320 tempest-TenantUsagesTestJSON-1570943165] Lock "d2671615-627a-4730-bfd6-887d04047dda" acquired by "_locked_do_build_and_run_instance" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:07:46.774 DEBUG oslo_concurrency.lockutils [req-50704f19-c032-4719-b10a-df7d8c176dd6 tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_unplug" :: held 1.426s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:07:46.785 DEBUG nova.virt.xenapi.vm_utils [req-50704f19-c032-4719-b10a-df7d8c176dd6 tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] Destroying VBD for VDI OpaqueRef:f10c0b11-0787-06ee-7287-1bd2da4a8f99 done. vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2215 2015-08-07 17:07:46.799 DEBUG nova.virt.xenapi.vm_utils [req-50704f19-c032-4719-b10a-df7d8c176dd6 tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] Creating disk-type VBD for VM OpaqueRef:14dc4ac4-3c30-45d8-7e7c-6024e859c198, VDI OpaqueRef:f10c0b11-0787-06ee-7287-1bd2da4a8f99 ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:07:46.812 DEBUG nova.virt.xenapi.vm_utils [req-50704f19-c032-4719-b10a-df7d8c176dd6 tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] Created VBD OpaqueRef:7d8c6913-126d-5714-1811-a41fcbc91375 for VM OpaqueRef:14dc4ac4-3c30-45d8-7e7c-6024e859c198, VDI OpaqueRef:f10c0b11-0787-06ee-7287-1bd2da4a8f99. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:07:46.927 DEBUG nova.objects.instance [req-50704f19-c032-4719-b10a-df7d8c176dd6 tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] Lazy-loading `pci_devices' on Instance uuid e8fe58be-dd05-484e-8c50-aa4c6aafb4b5 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:07:46.947 INFO nova.compute.manager [req-b2890484-4995-41d7-810b-34156f3eb046 tempest-TenantUsagesTestJSON-1123076320 tempest-TenantUsagesTestJSON-1570943165] [instance: d2671615-627a-4730-bfd6-887d04047dda] Starting instance... 
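The paired "acquired by ... :: waited" / "released by ... :: held" lines throughout this log come from oslo.concurrency's lockutils wrapper (the inner function at lockutils.py:251/262). Below is a minimal sketch of that pattern, assuming a function decorated per-VBD the way the synchronized_plug/synchronized_unplug entries suggest; the lock name is hard-coded from the log only to keep the sketch short, and the body is a placeholder, not the XenAPI driver's code.

    from oslo_concurrency import lockutils

    # In Nova the lock name is built from the VBD's OpaqueRef; the literal
    # value here is the one appearing in the entries above.
    @lockutils.synchronized('xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9')
    def synchronized_unplug():
        # Unplug the VBD; concurrent callers targeting the same VM serialize
        # on this lock, which is what produces the "waited"/"held" timings.
        pass

    synchronized_unplug()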
2015-08-07 17:07:47.057 DEBUG nova.virt.xenapi.vmops [req-50704f19-c032-4719-b10a-df7d8c176dd6 tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] [instance: e8fe58be-dd05-484e-8c50-aa4c6aafb4b5] Updating progress to 50 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:07:47.250 DEBUG oslo_concurrency.lockutils [req-b2890484-4995-41d7-810b-34156f3eb046 tempest-TenantUsagesTestJSON-1123076320 tempest-TenantUsagesTestJSON-1570943165] Lock "compute_resources" acquired by "instance_claim" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:07:47.251 DEBUG nova.compute.resource_tracker [req-b2890484-4995-41d7-810b-34156f3eb046 tempest-TenantUsagesTestJSON-1123076320 tempest-TenantUsagesTestJSON-1570943165] Memory overhead for 64 MB instance; 5 MB instance_claim /opt/stack/new/nova/nova/compute/resource_tracker.py:126 2015-08-07 17:07:47.260 INFO nova.compute.claims [req-b2890484-4995-41d7-810b-34156f3eb046 tempest-TenantUsagesTestJSON-1123076320 tempest-TenantUsagesTestJSON-1570943165] [instance: d2671615-627a-4730-bfd6-887d04047dda] Attempting claim: memory 69 MB, disk 0 GB 2015-08-07 17:07:47.261 INFO nova.compute.claims [req-b2890484-4995-41d7-810b-34156f3eb046 tempest-TenantUsagesTestJSON-1123076320 tempest-TenantUsagesTestJSON-1570943165] [instance: d2671615-627a-4730-bfd6-887d04047dda] Total memory: 8187 MB, used: 857.00 MB 2015-08-07 17:07:47.261 INFO nova.compute.claims [req-b2890484-4995-41d7-810b-34156f3eb046 tempest-TenantUsagesTestJSON-1123076320 tempest-TenantUsagesTestJSON-1570943165] [instance: d2671615-627a-4730-bfd6-887d04047dda] memory limit: 12280.50 MB, free: 11423.50 MB 2015-08-07 17:07:47.262 INFO nova.compute.claims [req-b2890484-4995-41d7-810b-34156f3eb046 tempest-TenantUsagesTestJSON-1123076320 tempest-TenantUsagesTestJSON-1570943165] [instance: d2671615-627a-4730-bfd6-887d04047dda] Total disk: 27 GB, used: 0.00 GB 2015-08-07 17:07:47.262 INFO nova.compute.claims [req-b2890484-4995-41d7-810b-34156f3eb046 tempest-TenantUsagesTestJSON-1123076320 tempest-TenantUsagesTestJSON-1570943165] [instance: d2671615-627a-4730-bfd6-887d04047dda] disk limit not specified, defaulting to unlimited 2015-08-07 17:07:47.294 DEBUG nova.compute.resources.vcpu [req-b2890484-4995-41d7-810b-34156f3eb046 tempest-TenantUsagesTestJSON-1123076320 tempest-TenantUsagesTestJSON-1570943165] Total CPUs: 8 VCPUs, used: 5.00 VCPUs test /opt/stack/new/nova/nova/compute/resources/vcpu.py:52 2015-08-07 17:07:47.302 DEBUG nova.compute.resources.vcpu [req-b2890484-4995-41d7-810b-34156f3eb046 tempest-TenantUsagesTestJSON-1123076320 tempest-TenantUsagesTestJSON-1570943165] CPUs limit not specified, defaulting to unlimited test /opt/stack/new/nova/nova/compute/resources/vcpu.py:56 2015-08-07 17:07:47.303 INFO nova.compute.claims [req-b2890484-4995-41d7-810b-34156f3eb046 tempest-TenantUsagesTestJSON-1123076320 tempest-TenantUsagesTestJSON-1570943165] [instance: d2671615-627a-4730-bfd6-887d04047dda] Claim successful 2015-08-07 17:07:47.445 DEBUG oslo_concurrency.lockutils [req-50704f19-c032-4719-b10a-df7d8c176dd6 tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] Lock "xenstore-e8fe58be-dd05-484e-8c50-aa4c6aafb4b5" acquired by "store_meta" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:07:47.446 DEBUG oslo_concurrency.lockutils [req-50704f19-c032-4719-b10a-df7d8c176dd6 
tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] Lock "xenstore-e8fe58be-dd05-484e-8c50-aa4c6aafb4b5" released by "store_meta" :: held 0.001s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:07:47.447 DEBUG oslo_concurrency.lockutils [req-50704f19-c032-4719-b10a-df7d8c176dd6 tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] Lock "xenstore-e8fe58be-dd05-484e-8c50-aa4c6aafb4b5" acquired by "store_auto_disk_config" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:07:47.456 DEBUG oslo_concurrency.lockutils [req-50704f19-c032-4719-b10a-df7d8c176dd6 tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] Lock "xenstore-e8fe58be-dd05-484e-8c50-aa4c6aafb4b5" released by "store_auto_disk_config" :: held 0.009s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:07:47.457 DEBUG nova.virt.xenapi.vmops [req-50704f19-c032-4719-b10a-df7d8c176dd6 tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] [instance: e8fe58be-dd05-484e-8c50-aa4c6aafb4b5] Injecting hostname (tempest.common.compute-instance-435950911) into xenstore _inject_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1940 2015-08-07 17:07:47.458 DEBUG oslo_concurrency.lockutils [req-50704f19-c032-4719-b10a-df7d8c176dd6 tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] Lock "xenstore-e8fe58be-dd05-484e-8c50-aa4c6aafb4b5" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:07:47.491 DEBUG oslo_concurrency.lockutils [req-50704f19-c032-4719-b10a-df7d8c176dd6 tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] Lock "xenstore-e8fe58be-dd05-484e-8c50-aa4c6aafb4b5" released by "update_hostname" :: held 0.033s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:07:47.496 DEBUG nova.virt.xenapi.vmops [req-50704f19-c032-4719-b10a-df7d8c176dd6 tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] [instance: e8fe58be-dd05-484e-8c50-aa4c6aafb4b5] Injecting network info to xenstore inject_network_info /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1869 2015-08-07 17:07:47.497 DEBUG oslo_concurrency.lockutils [req-50704f19-c032-4719-b10a-df7d8c176dd6 tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] Lock "xenstore-e8fe58be-dd05-484e-8c50-aa4c6aafb4b5" acquired by "update_nwinfo" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:07:47.772 DEBUG oslo_concurrency.lockutils [req-b2890484-4995-41d7-810b-34156f3eb046 tempest-TenantUsagesTestJSON-1123076320 tempest-TenantUsagesTestJSON-1570943165] Lock "compute_resources" released by "instance_claim" :: held 0.522s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:07:47.870 DEBUG oslo_concurrency.lockutils [req-50704f19-c032-4719-b10a-df7d8c176dd6 tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] Lock "xenstore-e8fe58be-dd05-484e-8c50-aa4c6aafb4b5" released by "update_nwinfo" :: held 0.373s inner 
/usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:07:47.870 DEBUG nova.virt.xenapi.vmops [req-50704f19-c032-4719-b10a-df7d8c176dd6 tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] [instance: e8fe58be-dd05-484e-8c50-aa4c6aafb4b5] Updating progress to 60 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:07:48.040 DEBUG oslo_concurrency.lockutils [req-b2890484-4995-41d7-810b-34156f3eb046 tempest-TenantUsagesTestJSON-1123076320 tempest-TenantUsagesTestJSON-1570943165] Lock "compute_resources" acquired by "update_usage" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:07:48.202 DEBUG nova.virt.xenapi.vmops [req-50704f19-c032-4719-b10a-df7d8c176dd6 tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] [instance: e8fe58be-dd05-484e-8c50-aa4c6aafb4b5] Creating vifs _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1892 2015-08-07 17:07:48.231 DEBUG nova.virt.xenapi.vmops [req-50704f19-c032-4719-b10a-df7d8c176dd6 tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] [instance: e8fe58be-dd05-484e-8c50-aa4c6aafb4b5] Creating VIF for network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1902 2015-08-07 17:07:48.245 DEBUG nova.virt.xenapi.vmops [req-50704f19-c032-4719-b10a-df7d8c176dd6 tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] [instance: e8fe58be-dd05-484e-8c50-aa4c6aafb4b5] Created VIF OpaqueRef:406b860d-f38b-d247-3308-69ce810eb465, network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1906 2015-08-07 17:07:48.247 DEBUG nova.virt.xenapi.vmops [req-50704f19-c032-4719-b10a-df7d8c176dd6 tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] [instance: e8fe58be-dd05-484e-8c50-aa4c6aafb4b5] Updating progress to 70 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:07:48.261 DEBUG oslo_concurrency.lockutils [req-b2890484-4995-41d7-810b-34156f3eb046 tempest-TenantUsagesTestJSON-1123076320 tempest-TenantUsagesTestJSON-1570943165] Lock "compute_resources" released by "update_usage" :: held 0.222s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:07:48.262 DEBUG nova.compute.utils [req-b2890484-4995-41d7-810b-34156f3eb046 tempest-TenantUsagesTestJSON-1123076320 tempest-TenantUsagesTestJSON-1570943165] Using /dev/xvd instead of None get_next_device_name /opt/stack/new/nova/nova/compute/utils.py:163 2015-08-07 17:07:48.271 13318 DEBUG nova.compute.manager [-] [instance: d2671615-627a-4730-bfd6-887d04047dda] Allocating IP information in the background. 
_allocate_network_async /opt/stack/new/nova/nova/compute/manager.py:1513 2015-08-07 17:07:48.272 13318 DEBUG oslo_concurrency.lockutils [-] Acquired semaphore "refresh_cache-d2671615-627a-4730-bfd6-887d04047dda" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:197 2015-08-07 17:07:48.285 13318 INFO oslo_messaging._drivers.impl_rabbit [-] Connecting to AMQP server on 192.168.33.1:5672 2015-08-07 17:07:48.329 13318 INFO oslo_messaging._drivers.impl_rabbit [-] Connected to AMQP server on 192.168.33.1:5672 2015-08-07 17:07:48.591 DEBUG nova.virt.xenapi.vmops [req-50704f19-c032-4719-b10a-df7d8c176dd6 tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] [instance: e8fe58be-dd05-484e-8c50-aa4c6aafb4b5] Starting instance _start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:327 2015-08-07 17:07:49.044 DEBUG oslo_concurrency.lockutils [req-0e3edac7-a714-4340-8f29-ce262b0a7c30 tempest-ServersAdminTestJSON-1163270417 tempest-ServersAdminTestJSON-1577712347] Lock "abd853db-1da7-45ad-a21f-323100b6d158" acquired by "_locked_do_build_and_run_instance" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:07:49.114 INFO nova.compute.manager [req-0e3edac7-a714-4340-8f29-ce262b0a7c30 tempest-ServersAdminTestJSON-1163270417 tempest-ServersAdminTestJSON-1577712347] [instance: abd853db-1da7-45ad-a21f-323100b6d158] Starting instance... 2015-08-07 17:07:49.347 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.86 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:07:49.452 DEBUG nova.virt.xenapi.vmops [req-b2890484-4995-41d7-810b-34156f3eb046 tempest-TenantUsagesTestJSON-1123076320 tempest-TenantUsagesTestJSON-1570943165] [instance: d2671615-627a-4730-bfd6-887d04047dda] Block device information present: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} spawn /opt/stack/new/nova/nova/virt/xenapi/vmops.py:408 2015-08-07 17:07:49.478 DEBUG nova.virt.xenapi.vm_utils [req-b2890484-4995-41d7-810b-34156f3eb046 tempest-TenantUsagesTestJSON-1123076320 tempest-TenantUsagesTestJSON-1570943165] Detected vhd format for image a69d8d55-1745-492f-be26-cc75d64fc94d determine_disk_image_type /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1633 2015-08-07 17:07:49.485 DEBUG nova.virt.xenapi.vmops [req-b2890484-4995-41d7-810b-34156f3eb046 tempest-TenantUsagesTestJSON-1123076320 tempest-TenantUsagesTestJSON-1570943165] [instance: d2671615-627a-4730-bfd6-887d04047dda] Updating progress to 10 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:07:49.499 DEBUG oslo_concurrency.lockutils [req-0e3edac7-a714-4340-8f29-ce262b0a7c30 tempest-ServersAdminTestJSON-1163270417 tempest-ServersAdminTestJSON-1577712347] Lock "compute_resources" acquired by "instance_claim" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:07:49.550 DEBUG nova.compute.resource_tracker [req-0e3edac7-a714-4340-8f29-ce262b0a7c30 tempest-ServersAdminTestJSON-1163270417 tempest-ServersAdminTestJSON-1577712347] Memory overhead for 64 MB instance; 5 MB instance_claim /opt/stack/new/nova/nova/compute/resource_tracker.py:126 2015-08-07 17:07:49.558 INFO nova.compute.claims [req-0e3edac7-a714-4340-8f29-ce262b0a7c30 tempest-ServersAdminTestJSON-1163270417 tempest-ServersAdminTestJSON-1577712347] [instance: 
abd853db-1da7-45ad-a21f-323100b6d158] Attempting claim: memory 69 MB, disk 0 GB 2015-08-07 17:07:49.558 INFO nova.compute.claims [req-0e3edac7-a714-4340-8f29-ce262b0a7c30 tempest-ServersAdminTestJSON-1163270417 tempest-ServersAdminTestJSON-1577712347] [instance: abd853db-1da7-45ad-a21f-323100b6d158] Total memory: 8187 MB, used: 926.00 MB 2015-08-07 17:07:49.559 INFO nova.compute.claims [req-0e3edac7-a714-4340-8f29-ce262b0a7c30 tempest-ServersAdminTestJSON-1163270417 tempest-ServersAdminTestJSON-1577712347] [instance: abd853db-1da7-45ad-a21f-323100b6d158] memory limit: 12280.50 MB, free: 11354.50 MB 2015-08-07 17:07:49.559 INFO nova.compute.claims [req-0e3edac7-a714-4340-8f29-ce262b0a7c30 tempest-ServersAdminTestJSON-1163270417 tempest-ServersAdminTestJSON-1577712347] [instance: abd853db-1da7-45ad-a21f-323100b6d158] Total disk: 27 GB, used: 0.00 GB 2015-08-07 17:07:49.560 INFO nova.compute.claims [req-0e3edac7-a714-4340-8f29-ce262b0a7c30 tempest-ServersAdminTestJSON-1163270417 tempest-ServersAdminTestJSON-1577712347] [instance: abd853db-1da7-45ad-a21f-323100b6d158] disk limit not specified, defaulting to unlimited 2015-08-07 17:07:49.585 DEBUG nova.compute.resources.vcpu [req-0e3edac7-a714-4340-8f29-ce262b0a7c30 tempest-ServersAdminTestJSON-1163270417 tempest-ServersAdminTestJSON-1577712347] Total CPUs: 8 VCPUs, used: 6.00 VCPUs test /opt/stack/new/nova/nova/compute/resources/vcpu.py:52 2015-08-07 17:07:49.586 DEBUG nova.compute.resources.vcpu [req-0e3edac7-a714-4340-8f29-ce262b0a7c30 tempest-ServersAdminTestJSON-1163270417 tempest-ServersAdminTestJSON-1577712347] CPUs limit not specified, defaulting to unlimited test /opt/stack/new/nova/nova/compute/resources/vcpu.py:56 2015-08-07 17:07:49.586 INFO nova.compute.claims [req-0e3edac7-a714-4340-8f29-ce262b0a7c30 tempest-ServersAdminTestJSON-1163270417 tempest-ServersAdminTestJSON-1577712347] [instance: abd853db-1da7-45ad-a21f-323100b6d158] Claim successful 2015-08-07 17:07:49.950 DEBUG nova.virt.xenapi.vmops [req-b2890484-4995-41d7-810b-34156f3eb046 tempest-TenantUsagesTestJSON-1123076320 tempest-TenantUsagesTestJSON-1570943165] [instance: d2671615-627a-4730-bfd6-887d04047dda] block device info: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} _connect_cinder_volumes /opt/stack/new/nova/nova/virt/xenapi/vmops.py:376 2015-08-07 17:07:49.967 DEBUG oslo_concurrency.lockutils [req-b2890484-4995-41d7-810b-34156f3eb046 tempest-TenantUsagesTestJSON-1123076320 tempest-TenantUsagesTestJSON-1570943165] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" acquired by "_create_cached_image_impl" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:07:50.108 DEBUG oslo_concurrency.lockutils [req-0e3edac7-a714-4340-8f29-ce262b0a7c30 tempest-ServersAdminTestJSON-1163270417 tempest-ServersAdminTestJSON-1577712347] Lock "compute_resources" released by "instance_claim" :: held 0.609s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:07:50.720 DEBUG oslo_concurrency.lockutils [req-0e3edac7-a714-4340-8f29-ce262b0a7c30 tempest-ServersAdminTestJSON-1163270417 tempest-ServersAdminTestJSON-1577712347] Lock "compute_resources" acquired by "update_usage" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:07:50.950 DEBUG oslo_concurrency.lockutils [req-0e3edac7-a714-4340-8f29-ce262b0a7c30 tempest-ServersAdminTestJSON-1163270417 
tempest-ServersAdminTestJSON-1577712347] Lock "compute_resources" released by "update_usage" :: held 0.230s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:07:50.951 DEBUG nova.compute.utils [req-0e3edac7-a714-4340-8f29-ce262b0a7c30 tempest-ServersAdminTestJSON-1163270417 tempest-ServersAdminTestJSON-1577712347] Using /dev/xvd instead of None get_next_device_name /opt/stack/new/nova/nova/compute/utils.py:163 2015-08-07 17:07:50.955 13318 DEBUG nova.compute.manager [-] [instance: abd853db-1da7-45ad-a21f-323100b6d158] Allocating IP information in the background. _allocate_network_async /opt/stack/new/nova/nova/compute/manager.py:1513 2015-08-07 17:07:50.956 13318 DEBUG oslo_concurrency.lockutils [-] Acquired semaphore "refresh_cache-abd853db-1da7-45ad-a21f-323100b6d158" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:197 2015-08-07 17:07:51.852 DEBUG nova.virt.xenapi.vmops [req-0e3edac7-a714-4340-8f29-ce262b0a7c30 tempest-ServersAdminTestJSON-1163270417 tempest-ServersAdminTestJSON-1577712347] [instance: abd853db-1da7-45ad-a21f-323100b6d158] Block device information present: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} spawn /opt/stack/new/nova/nova/virt/xenapi/vmops.py:408 2015-08-07 17:07:51.931 DEBUG nova.virt.xenapi.vm_utils [req-0e3edac7-a714-4340-8f29-ce262b0a7c30 tempest-ServersAdminTestJSON-1163270417 tempest-ServersAdminTestJSON-1577712347] Detected vhd format for image a69d8d55-1745-492f-be26-cc75d64fc94d determine_disk_image_type /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1633 2015-08-07 17:07:51.972 DEBUG nova.virt.xenapi.vmops [req-0e3edac7-a714-4340-8f29-ce262b0a7c30 tempest-ServersAdminTestJSON-1163270417 tempest-ServersAdminTestJSON-1577712347] [instance: abd853db-1da7-45ad-a21f-323100b6d158] Updating progress to 10 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:07:52.689 DEBUG nova.virt.xenapi.vmops [req-0e3edac7-a714-4340-8f29-ce262b0a7c30 tempest-ServersAdminTestJSON-1163270417 tempest-ServersAdminTestJSON-1577712347] [instance: abd853db-1da7-45ad-a21f-323100b6d158] block device info: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} _connect_cinder_volumes /opt/stack/new/nova/nova/virt/xenapi/vmops.py:376 2015-08-07 17:07:52.894 DEBUG nova.virt.xenapi.vm_utils [req-b2890484-4995-41d7-810b-34156f3eb046 tempest-TenantUsagesTestJSON-1123076320 tempest-TenantUsagesTestJSON-1570943165] Cloned VDI OpaqueRef:b5afc2bb-5791-04a3-7dea-15c01bd335f3 from VDI OpaqueRef:f392c64e-7886-a0cc-d3a0-780674167307 _clone_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:571 2015-08-07 17:07:53.056 13318 DEBUG nova.network.base_api [-] [instance: d2671615-627a-4730-bfd6-887d04047dda] Updating instance_info_cache with network_info: [VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.8'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 
'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:1f:1b:d0', 'active': False, 'type': u'bridge', 'id': u'9dfd4d35-f2c6-45d4-a588-ad9e7289ae8a', 'qbg_params': None})] update_instance_cache_with_nw_info /opt/stack/new/nova/nova/network/base_api.py:43 2015-08-07 17:07:53.089 13318 DEBUG oslo_concurrency.lockutils [-] Releasing semaphore "refresh_cache-d2671615-627a-4730-bfd6-887d04047dda" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:210 2015-08-07 17:07:53.089 13318 DEBUG nova.compute.manager [-] [instance: d2671615-627a-4730-bfd6-887d04047dda] Instance network_info: |[VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.8'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:1f:1b:d0', 'active': False, 'type': u'bridge', 'id': u'9dfd4d35-f2c6-45d4-a588-ad9e7289ae8a', 'qbg_params': None})]| _allocate_network_async /opt/stack/new/nova/nova/compute/manager.py:1531 2015-08-07 17:07:54.606 DEBUG oslo_concurrency.lockutils [req-b2890484-4995-41d7-810b-34156f3eb046 tempest-TenantUsagesTestJSON-1123076320 tempest-TenantUsagesTestJSON-1570943165] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" released by "_create_cached_image_impl" :: held 4.640s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:07:54.607 INFO nova.virt.xenapi.vm_utils [req-b2890484-4995-41d7-810b-34156f3eb046 tempest-TenantUsagesTestJSON-1123076320 tempest-TenantUsagesTestJSON-1570943165] Image creation data, cacheable: True, downloaded: False duration: 4.66 secs for image a69d8d55-1745-492f-be26-cc75d64fc94d 2015-08-07 17:07:54.608 DEBUG oslo_concurrency.lockutils [req-0e3edac7-a714-4340-8f29-ce262b0a7c30 tempest-ServersAdminTestJSON-1163270417 tempest-ServersAdminTestJSON-1577712347] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" acquired by "_create_cached_image_impl" :: waited 1.872s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:07:56.722 13318 DEBUG nova.network.base_api [-] [instance: abd853db-1da7-45ad-a21f-323100b6d158] Updating instance_info_cache with network_info: [VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.9'})], 'version': 4, 'meta': 
{u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:32:58:02', 'active': False, 'type': u'bridge', 'id': u'd653f7e1-96d9-48e8-8e4d-7c9e1ee0e290', 'qbg_params': None})] update_instance_cache_with_nw_info /opt/stack/new/nova/nova/network/base_api.py:43 2015-08-07 17:07:56.782 13318 DEBUG oslo_concurrency.lockutils [-] Releasing semaphore "refresh_cache-abd853db-1da7-45ad-a21f-323100b6d158" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:210 2015-08-07 17:07:56.782 13318 DEBUG nova.compute.manager [-] [instance: abd853db-1da7-45ad-a21f-323100b6d158] Instance network_info: |[VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.9'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:32:58:02', 'active': False, 'type': u'bridge', 'id': u'd653f7e1-96d9-48e8-8e4d-7c9e1ee0e290', 'qbg_params': None})]| _allocate_network_async /opt/stack/new/nova/nova/compute/manager.py:1531 2015-08-07 17:07:57.014 DEBUG nova.virt.xenapi.vm_utils [req-0e3edac7-a714-4340-8f29-ce262b0a7c30 tempest-ServersAdminTestJSON-1163270417 tempest-ServersAdminTestJSON-1577712347] Cloned VDI OpaqueRef:f9e53fcf-2dbb-1d83-4eac-e86dae9b3f92 from VDI OpaqueRef:f392c64e-7886-a0cc-d3a0-780674167307 _clone_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:571 2015-08-07 17:07:57.563 DEBUG nova.virt.xenapi.vmops [req-b2890484-4995-41d7-810b-34156f3eb046 tempest-TenantUsagesTestJSON-1123076320 tempest-TenantUsagesTestJSON-1570943165] [instance: d2671615-627a-4730-bfd6-887d04047dda] Updating progress to 20 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:07:57.940 DEBUG nova.virt.xenapi.vmops [req-b2890484-4995-41d7-810b-34156f3eb046 tempest-TenantUsagesTestJSON-1123076320 tempest-TenantUsagesTestJSON-1570943165] [instance: d2671615-627a-4730-bfd6-887d04047dda] Updating progress to 30 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:07:58.226 DEBUG nova.virt.xenapi.vmops 
[req-b2890484-4995-41d7-810b-34156f3eb046 tempest-TenantUsagesTestJSON-1123076320 tempest-TenantUsagesTestJSON-1570943165] [instance: d2671615-627a-4730-bfd6-887d04047dda] Using PV kernel: True _create_vm_record /opt/stack/new/nova/nova/virt/xenapi/vmops.py:679 2015-08-07 17:07:58.249 DEBUG nova.virt.xenapi.vm_utils [req-b2890484-4995-41d7-810b-34156f3eb046 tempest-TenantUsagesTestJSON-1123076320 tempest-TenantUsagesTestJSON-1570943165] [instance: d2671615-627a-4730-bfd6-887d04047dda] Created VM create_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:312 2015-08-07 17:07:58.250 DEBUG nova.virt.xenapi.vmops [req-b2890484-4995-41d7-810b-34156f3eb046 tempest-TenantUsagesTestJSON-1123076320 tempest-TenantUsagesTestJSON-1570943165] [instance: d2671615-627a-4730-bfd6-887d04047dda] Updating progress to 40 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:07:58.307 DEBUG oslo_concurrency.lockutils [req-0e3edac7-a714-4340-8f29-ce262b0a7c30 tempest-ServersAdminTestJSON-1163270417 tempest-ServersAdminTestJSON-1577712347] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" released by "_create_cached_image_impl" :: held 3.699s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:07:58.308 INFO nova.virt.xenapi.vm_utils [req-0e3edac7-a714-4340-8f29-ce262b0a7c30 tempest-ServersAdminTestJSON-1163270417 tempest-ServersAdminTestJSON-1577712347] Image creation data, cacheable: True, downloaded: False duration: 5.62 secs for image a69d8d55-1745-492f-be26-cc75d64fc94d 2015-08-07 17:07:58.967 DEBUG nova.virt.xenapi.vm_utils [req-b2890484-4995-41d7-810b-34156f3eb046 tempest-TenantUsagesTestJSON-1123076320 tempest-TenantUsagesTestJSON-1570943165] Creating disk-type VBD for VM OpaqueRef:a75c2219-9c8b-3a50-5c25-ef61451a755a, VDI OpaqueRef:b5afc2bb-5791-04a3-7dea-15c01bd335f3 ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:07:58.979 DEBUG nova.virt.xenapi.vm_utils [req-b2890484-4995-41d7-810b-34156f3eb046 tempest-TenantUsagesTestJSON-1123076320 tempest-TenantUsagesTestJSON-1570943165] Created VBD OpaqueRef:7cac00bd-6d3a-4216-5ef3-ecbc5491cc9f for VM OpaqueRef:a75c2219-9c8b-3a50-5c25-ef61451a755a, VDI OpaqueRef:b5afc2bb-5791-04a3-7dea-15c01bd335f3. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:07:59.344 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.87 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:07:59.513 DEBUG nova.virt.xenapi.vmops [req-0e3edac7-a714-4340-8f29-ce262b0a7c30 tempest-ServersAdminTestJSON-1163270417 tempest-ServersAdminTestJSON-1577712347] [instance: abd853db-1da7-45ad-a21f-323100b6d158] Updating progress to 20 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:07:59.560 DEBUG nova.virt.xenapi.vm_utils [req-b2890484-4995-41d7-810b-34156f3eb046 tempest-TenantUsagesTestJSON-1123076320 tempest-TenantUsagesTestJSON-1570943165] Created VDI OpaqueRef:ae9da926-e762-93ce-e997-1b88c7f4fe91 (config-2, 67108864, False) on OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99. 
create_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:509 2015-08-07 17:07:59.574 DEBUG nova.virt.xenapi.vm_utils [req-b2890484-4995-41d7-810b-34156f3eb046 tempest-TenantUsagesTestJSON-1123076320 tempest-TenantUsagesTestJSON-1570943165] Creating disk-type VBD for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:ae9da926-e762-93ce-e997-1b88c7f4fe91 ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:07:59.594 DEBUG nova.virt.xenapi.vm_utils [req-b2890484-4995-41d7-810b-34156f3eb046 tempest-TenantUsagesTestJSON-1123076320 tempest-TenantUsagesTestJSON-1570943165] Created VBD OpaqueRef:d0105562-ae1b-a7c0-099b-c7b11bb179ae for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:ae9da926-e762-93ce-e997-1b88c7f4fe91. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:07:59.595 DEBUG nova.virt.xenapi.vm_utils [req-b2890484-4995-41d7-810b-34156f3eb046 tempest-TenantUsagesTestJSON-1123076320 tempest-TenantUsagesTestJSON-1570943165] Plugging VBD OpaqueRef:d0105562-ae1b-a7c0-099b-c7b11bb179ae ... vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2191 2015-08-07 17:07:59.596 DEBUG oslo_concurrency.lockutils [req-b2890484-4995-41d7-810b-34156f3eb046 tempest-TenantUsagesTestJSON-1123076320 tempest-TenantUsagesTestJSON-1570943165] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_plug" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:07:59.858 DEBUG nova.virt.xenapi.vmops [req-0e3edac7-a714-4340-8f29-ce262b0a7c30 tempest-ServersAdminTestJSON-1163270417 tempest-ServersAdminTestJSON-1577712347] [instance: abd853db-1da7-45ad-a21f-323100b6d158] Updating progress to 30 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:08:00.191 DEBUG nova.virt.xenapi.vmops [req-0e3edac7-a714-4340-8f29-ce262b0a7c30 tempest-ServersAdminTestJSON-1163270417 tempest-ServersAdminTestJSON-1577712347] [instance: abd853db-1da7-45ad-a21f-323100b6d158] Using PV kernel: True _create_vm_record /opt/stack/new/nova/nova/virt/xenapi/vmops.py:679 2015-08-07 17:08:00.205 DEBUG nova.virt.xenapi.vm_utils [req-0e3edac7-a714-4340-8f29-ce262b0a7c30 tempest-ServersAdminTestJSON-1163270417 tempest-ServersAdminTestJSON-1577712347] [instance: abd853db-1da7-45ad-a21f-323100b6d158] Created VM create_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:312 2015-08-07 17:08:00.206 DEBUG nova.virt.xenapi.vmops [req-0e3edac7-a714-4340-8f29-ce262b0a7c30 tempest-ServersAdminTestJSON-1163270417 tempest-ServersAdminTestJSON-1577712347] [instance: abd853db-1da7-45ad-a21f-323100b6d158] Updating progress to 40 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:08:00.562 DEBUG nova.virt.xenapi.vm_utils [req-0e3edac7-a714-4340-8f29-ce262b0a7c30 tempest-ServersAdminTestJSON-1163270417 tempest-ServersAdminTestJSON-1577712347] Creating disk-type VBD for VM OpaqueRef:2cd5921b-9983-0159-522f-33ad86fd6d27, VDI OpaqueRef:f9e53fcf-2dbb-1d83-4eac-e86dae9b3f92 ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:08:00.571 DEBUG nova.virt.xenapi.vm_utils [req-0e3edac7-a714-4340-8f29-ce262b0a7c30 tempest-ServersAdminTestJSON-1163270417 tempest-ServersAdminTestJSON-1577712347] Created VBD OpaqueRef:b3d10187-7658-db3f-01d3-8194bd5ec97e for VM OpaqueRef:2cd5921b-9983-0159-522f-33ad86fd6d27, VDI OpaqueRef:f9e53fcf-2dbb-1d83-4eac-e86dae9b3f92. 
create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:08:03.125 DEBUG nova.virt.xenapi.vm_utils [req-0e3edac7-a714-4340-8f29-ce262b0a7c30 tempest-ServersAdminTestJSON-1163270417 tempest-ServersAdminTestJSON-1577712347] Created VDI OpaqueRef:5584a842-d3ec-0f2a-9fb6-0ec99c6d72e1 (config-2, 67108864, False) on OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99. create_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:509 2015-08-07 17:08:03.140 DEBUG nova.virt.xenapi.vm_utils [req-0e3edac7-a714-4340-8f29-ce262b0a7c30 tempest-ServersAdminTestJSON-1163270417 tempest-ServersAdminTestJSON-1577712347] Creating disk-type VBD for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:5584a842-d3ec-0f2a-9fb6-0ec99c6d72e1 ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:08:03.166 DEBUG nova.virt.xenapi.vm_utils [req-0e3edac7-a714-4340-8f29-ce262b0a7c30 tempest-ServersAdminTestJSON-1163270417 tempest-ServersAdminTestJSON-1577712347] Created VBD OpaqueRef:b8364d18-06b4-d519-b520-8d55081726dc for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:5584a842-d3ec-0f2a-9fb6-0ec99c6d72e1. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:08:03.166 DEBUG nova.virt.xenapi.vm_utils [req-0e3edac7-a714-4340-8f29-ce262b0a7c30 tempest-ServersAdminTestJSON-1163270417 tempest-ServersAdminTestJSON-1577712347] Plugging VBD OpaqueRef:b8364d18-06b4-d519-b520-8d55081726dc ... vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2191 2015-08-07 17:08:05.489 DEBUG oslo_concurrency.lockutils [req-b2890484-4995-41d7-810b-34156f3eb046 tempest-TenantUsagesTestJSON-1123076320 tempest-TenantUsagesTestJSON-1570943165] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_plug" :: held 5.894s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:08:05.490 DEBUG nova.virt.xenapi.vm_utils [req-b2890484-4995-41d7-810b-34156f3eb046 tempest-TenantUsagesTestJSON-1123076320 tempest-TenantUsagesTestJSON-1570943165] Plugging VBD OpaqueRef:d0105562-ae1b-a7c0-099b-c7b11bb179ae done. vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2194 2015-08-07 17:08:05.492 DEBUG oslo_concurrency.lockutils [req-0e3edac7-a714-4340-8f29-ce262b0a7c30 tempest-ServersAdminTestJSON-1163270417 tempest-ServersAdminTestJSON-1577712347] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_plug" :: waited 2.325s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:08:05.496 DEBUG nova.virt.xenapi.vm_utils [req-b2890484-4995-41d7-810b-34156f3eb046 tempest-TenantUsagesTestJSON-1123076320 tempest-TenantUsagesTestJSON-1570943165] VBD OpaqueRef:d0105562-ae1b-a7c0-099b-c7b11bb179ae plugged as xvdc vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2197 2015-08-07 17:08:05.641 WARNING nova.virt.configdrive [req-b2890484-4995-41d7-810b-34156f3eb046 tempest-TenantUsagesTestJSON-1123076320 tempest-TenantUsagesTestJSON-1570943165] The setting "always" will be deprecated in the Liberty version. 
Please use "True" instead 2015-08-07 17:08:05.642 DEBUG nova.objects.instance [req-b2890484-4995-41d7-810b-34156f3eb046 tempest-TenantUsagesTestJSON-1123076320 tempest-TenantUsagesTestJSON-1570943165] Lazy-loading `ec2_ids' on Instance uuid d2671615-627a-4730-bfd6-887d04047dda obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:08:05.689 DEBUG oslo_concurrency.processutils [req-b2890484-4995-41d7-810b-34156f3eb046 tempest-TenantUsagesTestJSON-1123076320 tempest-TenantUsagesTestJSON-1570943165] Running cmd (subprocess): genisoimage -o /tmp/tmp7jeHOV/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmp7hPAEM execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:08:06.000 DEBUG oslo_concurrency.processutils [req-b2890484-4995-41d7-810b-34156f3eb046 tempest-TenantUsagesTestJSON-1123076320 tempest-TenantUsagesTestJSON-1570943165] CMD "genisoimage -o /tmp/tmp7jeHOV/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmp7hPAEM" returned: 0 in 0.311s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:08:06.006 DEBUG oslo_concurrency.processutils [req-b2890484-4995-41d7-810b-34156f3eb046 tempest-TenantUsagesTestJSON-1123076320 tempest-TenantUsagesTestJSON-1570943165] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmp7jeHOV/configdrive of=/dev/xvdc oflag=direct,sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:08:08.474 DEBUG nova.virt.xenapi.vmops [req-50704f19-c032-4719-b10a-df7d8c176dd6 tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] [instance: e8fe58be-dd05-484e-8c50-aa4c6aafb4b5] Waiting for instance state to become running _wait_for_instance_to_start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:777 2015-08-07 17:08:08.725 DEBUG nova.virt.xenapi.vmops [req-50704f19-c032-4719-b10a-df7d8c176dd6 tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] [instance: e8fe58be-dd05-484e-8c50-aa4c6aafb4b5] Updating progress to 80 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:08:09.510 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:08:09.512 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 3.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:08:09.684 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.53 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:08:10.042 DEBUG nova.virt.xenapi.vmops [req-50704f19-c032-4719-b10a-df7d8c176dd6 tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] [instance: e8fe58be-dd05-484e-8c50-aa4c6aafb4b5] Skip agent setup, not enabled. 
_configure_new_instance_with_agent /opt/stack/new/nova/nova/virt/xenapi/vmops.py:788 2015-08-07 17:08:10.042 DEBUG nova.virt.xenapi.vmops [req-50704f19-c032-4719-b10a-df7d8c176dd6 tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] [instance: e8fe58be-dd05-484e-8c50-aa4c6aafb4b5] Removing hostname from xenstore _remove_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1949 2015-08-07 17:08:10.043 DEBUG oslo_concurrency.lockutils [req-50704f19-c032-4719-b10a-df7d8c176dd6 tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] Lock "xenstore-e8fe58be-dd05-484e-8c50-aa4c6aafb4b5" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:08:10.051 DEBUG oslo_concurrency.lockutils [req-50704f19-c032-4719-b10a-df7d8c176dd6 tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] Lock "xenstore-e8fe58be-dd05-484e-8c50-aa4c6aafb4b5" released by "update_hostname" :: held 0.007s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:08:10.052 DEBUG nova.virt.xenapi.vmops [req-50704f19-c032-4719-b10a-df7d8c176dd6 tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] [instance: e8fe58be-dd05-484e-8c50-aa4c6aafb4b5] Updating progress to 90 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:08:10.631 DEBUG nova.virt.xenapi.vmops [req-50704f19-c032-4719-b10a-df7d8c176dd6 tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] [instance: e8fe58be-dd05-484e-8c50-aa4c6aafb4b5] Updating progress to 100 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:08:10.858 DEBUG oslo_concurrency.lockutils [req-0e3edac7-a714-4340-8f29-ce262b0a7c30 tempest-ServersAdminTestJSON-1163270417 tempest-ServersAdminTestJSON-1577712347] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_plug" :: held 5.366s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:08:10.858 DEBUG nova.virt.xenapi.vm_utils [req-0e3edac7-a714-4340-8f29-ce262b0a7c30 tempest-ServersAdminTestJSON-1163270417 tempest-ServersAdminTestJSON-1577712347] Plugging VBD OpaqueRef:b8364d18-06b4-d519-b520-8d55081726dc done. vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2194 2015-08-07 17:08:10.865 DEBUG nova.virt.xenapi.vm_utils [req-0e3edac7-a714-4340-8f29-ce262b0a7c30 tempest-ServersAdminTestJSON-1163270417 tempest-ServersAdminTestJSON-1577712347] VBD OpaqueRef:b8364d18-06b4-d519-b520-8d55081726dc plugged as xvdd vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2197 2015-08-07 17:08:10.967 WARNING nova.virt.configdrive [req-0e3edac7-a714-4340-8f29-ce262b0a7c30 tempest-ServersAdminTestJSON-1163270417 tempest-ServersAdminTestJSON-1577712347] The setting "always" will be deprecated in the Liberty version. 
Please use "True" instead 2015-08-07 17:08:10.968 DEBUG nova.objects.instance [req-0e3edac7-a714-4340-8f29-ce262b0a7c30 tempest-ServersAdminTestJSON-1163270417 tempest-ServersAdminTestJSON-1577712347] Lazy-loading `ec2_ids' on Instance uuid abd853db-1da7-45ad-a21f-323100b6d158 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:08:11.019 DEBUG oslo_concurrency.processutils [req-0e3edac7-a714-4340-8f29-ce262b0a7c30 tempest-ServersAdminTestJSON-1163270417 tempest-ServersAdminTestJSON-1577712347] Running cmd (subprocess): genisoimage -o /tmp/tmp2wqL8A/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmpbFSY74 execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:08:11.201 DEBUG nova.compute.manager [req-50704f19-c032-4719-b10a-df7d8c176dd6 tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] [instance: e8fe58be-dd05-484e-8c50-aa4c6aafb4b5] Checking state _get_power_state /opt/stack/new/nova/nova/compute/manager.py:1296 2015-08-07 17:08:11.272 DEBUG oslo_concurrency.processutils [req-0e3edac7-a714-4340-8f29-ce262b0a7c30 tempest-ServersAdminTestJSON-1163270417 tempest-ServersAdminTestJSON-1577712347] CMD "genisoimage -o /tmp/tmp2wqL8A/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmpbFSY74" returned: 0 in 0.253s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:08:11.276 DEBUG oslo_concurrency.processutils [req-0e3edac7-a714-4340-8f29-ce262b0a7c30 tempest-ServersAdminTestJSON-1163270417 tempest-ServersAdminTestJSON-1577712347] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmp2wqL8A/configdrive of=/dev/xvdd oflag=direct,sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:08:11.970 DEBUG oslo_concurrency.lockutils [req-50704f19-c032-4719-b10a-df7d8c176dd6 tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] Lock "e8fe58be-dd05-484e-8c50-aa4c6aafb4b5" released by "_locked_do_build_and_run_instance" :: held 58.647s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:08:12.511 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:08:12.512 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /opt/stack/new/nova/nova/compute/manager.py:5976 2015-08-07 17:08:12.512 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:08:12.513 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 0.99 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:08:13.503 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:08:13.504 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 0.01 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:08:13.510 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:08:13.511 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Starting heal instance info cache _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5315 2015-08-07 17:08:13.754 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Acquired semaphore "refresh_cache-fda3bb4d-ccb2-4500-8c24-8c38815626fa" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:197 2015-08-07 17:08:14.030 DEBUG nova.network.base_api [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: fda3bb4d-ccb2-4500-8c24-8c38815626fa] Updating instance_info_cache with network_info: [VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.5'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:46:1f:75', 'active': False, 'type': u'bridge', 'id': u'56b2e3c5-379b-4990-a2f4-5ae5440c9c97', 'qbg_params': None})] update_instance_cache_with_nw_info /opt/stack/new/nova/nova/network/base_api.py:43 2015-08-07 17:08:14.067 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Releasing semaphore "refresh_cache-fda3bb4d-ccb2-4500-8c24-8c38815626fa" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:210 2015-08-07 17:08:14.068 DEBUG nova.compute.manager 
[req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: fda3bb4d-ccb2-4500-8c24-8c38815626fa] Updated the network info_cache for instance _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5376 2015-08-07 17:08:14.069 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 1.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:08:15.070 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:08:15.071 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:08:15.108 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Auditing locally available compute resources for node localhost.localdomain 2015-08-07 17:08:15.109 DEBUG nova.virt.xenapi.host [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Updating host stats update_status /opt/stack/new/nova/nova/virt/xenapi/host.py:234 2015-08-07 17:08:15.609 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" acquired by "do_scan" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:08:15.687 DEBUG nova.virt.xenapi.vm_utils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Scanning SR OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99 do_scan /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1839 2015-08-07 17:08:17.721 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" released by "do_scan" :: held 2.111s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:08:18.518 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor: free VCPUs: -3 _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:524 2015-08-07 17:08:18.519 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor: assignable PCI devices: [] _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:532 2015-08-07 17:08:18.519 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor/Node resource view: name=localhost.localdomain free_ram=654MB free_disk=16GB free_vcpus=-3 pci_devices=[] _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:546 2015-08-07 17:08:18.520 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "compute_resources" acquired by "_update_available_resource" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:08:19.457 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Total usable vcpus: 8, total allocated vcpus: 11 2015-08-07 17:08:19.458 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Final resource view: 
name=localhost.localdomain phys_ram=8187MB used_ram=995MB phys_disk=27GB used_disk=0GB total_vcpus=8 used_vcpus=11 pci_stats=None 2015-08-07 17:08:19.807 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.47 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:08:19.809 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Compute_service record updated for devstack:localhost.localdomain 2015-08-07 17:08:19.810 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "compute_resources" released by "_update_available_resource" :: held 1.290s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:08:19.811 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:08:19.811 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 7.44 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:08:27.251 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:08:27.253 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 42.26 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:08:29.425 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.85 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:08:36.963 DEBUG oslo_concurrency.processutils [req-b2890484-4995-41d7-810b-34156f3eb046 tempest-TenantUsagesTestJSON-1123076320 tempest-TenantUsagesTestJSON-1570943165] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmp7jeHOV/configdrive of=/dev/xvdc oflag=direct,sync" returned: 0 in 30.957s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:08:36.965 DEBUG oslo_concurrency.processutils [req-b2890484-4995-41d7-810b-34156f3eb046 tempest-TenantUsagesTestJSON-1123076320 tempest-TenantUsagesTestJSON-1570943165] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:08:40.140 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.14 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:08:40.398 DEBUG oslo_concurrency.processutils [req-0e3edac7-a714-4340-8f29-ce262b0a7c30 tempest-ServersAdminTestJSON-1163270417 tempest-ServersAdminTestJSON-1577712347] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmp2wqL8A/configdrive of=/dev/xvdd oflag=direct,sync" returned: 0 in 29.122s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:08:40.418 DEBUG oslo_concurrency.processutils [req-0e3edac7-a714-4340-8f29-ce262b0a7c30 tempest-ServersAdminTestJSON-1163270417 tempest-ServersAdminTestJSON-1577712347] Running cmd 
(subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:08:40.623 DEBUG oslo_concurrency.processutils [req-b2890484-4995-41d7-810b-34156f3eb046 tempest-TenantUsagesTestJSON-1123076320 tempest-TenantUsagesTestJSON-1570943165] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf sync" returned: 0 in 3.658s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:08:40.628 DEBUG nova.virt.xenapi.vm_utils [req-b2890484-4995-41d7-810b-34156f3eb046 tempest-TenantUsagesTestJSON-1123076320 tempest-TenantUsagesTestJSON-1570943165] Destroying VBD for VDI OpaqueRef:ae9da926-e762-93ce-e997-1b88c7f4fe91 ... vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2207 2015-08-07 17:08:40.631 DEBUG oslo_concurrency.lockutils [req-b2890484-4995-41d7-810b-34156f3eb046 tempest-TenantUsagesTestJSON-1123076320 tempest-TenantUsagesTestJSON-1570943165] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_unplug" :: waited 0.002s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:08:42.738 DEBUG oslo_concurrency.processutils [req-0e3edac7-a714-4340-8f29-ce262b0a7c30 tempest-ServersAdminTestJSON-1163270417 tempest-ServersAdminTestJSON-1577712347] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf sync" returned: 0 in 2.319s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:08:42.739 DEBUG nova.virt.xenapi.vm_utils [req-0e3edac7-a714-4340-8f29-ce262b0a7c30 tempest-ServersAdminTestJSON-1163270417 tempest-ServersAdminTestJSON-1577712347] Destroying VBD for VDI OpaqueRef:5584a842-d3ec-0f2a-9fb6-0ec99c6d72e1 ... vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2207 2015-08-07 17:08:42.847 DEBUG nova.compute.manager [req-f1803777-70b9-494c-a3e5-ed866f7e538f tempest-ServersAdminNegativeTestJSON-218174575 tempest-ServersAdminNegativeTestJSON-66463127] [instance: e8fe58be-dd05-484e-8c50-aa4c6aafb4b5] Checking state _get_power_state /opt/stack/new/nova/nova/compute/manager.py:1296 2015-08-07 17:08:44.150 DEBUG oslo_concurrency.lockutils [req-b2890484-4995-41d7-810b-34156f3eb046 tempest-TenantUsagesTestJSON-1123076320 tempest-TenantUsagesTestJSON-1570943165] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_unplug" :: held 3.519s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:08:44.153 DEBUG oslo_concurrency.lockutils [req-0e3edac7-a714-4340-8f29-ce262b0a7c30 tempest-ServersAdminTestJSON-1163270417 tempest-ServersAdminTestJSON-1577712347] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_unplug" :: waited 1.414s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:08:44.236 DEBUG nova.virt.xenapi.vm_utils [req-b2890484-4995-41d7-810b-34156f3eb046 tempest-TenantUsagesTestJSON-1123076320 tempest-TenantUsagesTestJSON-1570943165] Destroying VBD for VDI OpaqueRef:ae9da926-e762-93ce-e997-1b88c7f4fe91 done. 
vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2215 2015-08-07 17:08:44.238 DEBUG nova.virt.xenapi.vm_utils [req-b2890484-4995-41d7-810b-34156f3eb046 tempest-TenantUsagesTestJSON-1123076320 tempest-TenantUsagesTestJSON-1570943165] Creating disk-type VBD for VM OpaqueRef:a75c2219-9c8b-3a50-5c25-ef61451a755a, VDI OpaqueRef:ae9da926-e762-93ce-e997-1b88c7f4fe91 ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:08:44.249 DEBUG nova.virt.xenapi.vm_utils [req-b2890484-4995-41d7-810b-34156f3eb046 tempest-TenantUsagesTestJSON-1123076320 tempest-TenantUsagesTestJSON-1570943165] Created VBD OpaqueRef:ffab08f4-fcf1-54f9-940b-73eb12d24fcb for VM OpaqueRef:a75c2219-9c8b-3a50-5c25-ef61451a755a, VDI OpaqueRef:ae9da926-e762-93ce-e997-1b88c7f4fe91. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:08:44.250 DEBUG nova.objects.instance [req-b2890484-4995-41d7-810b-34156f3eb046 tempest-TenantUsagesTestJSON-1123076320 tempest-TenantUsagesTestJSON-1570943165] Lazy-loading `pci_devices' on Instance uuid d2671615-627a-4730-bfd6-887d04047dda obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:08:44.430 DEBUG nova.virt.xenapi.vmops [req-b2890484-4995-41d7-810b-34156f3eb046 tempest-TenantUsagesTestJSON-1123076320 tempest-TenantUsagesTestJSON-1570943165] [instance: d2671615-627a-4730-bfd6-887d04047dda] Updating progress to 50 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:08:45.187 DEBUG oslo_concurrency.lockutils [req-b2890484-4995-41d7-810b-34156f3eb046 tempest-TenantUsagesTestJSON-1123076320 tempest-TenantUsagesTestJSON-1570943165] Lock "xenstore-d2671615-627a-4730-bfd6-887d04047dda" acquired by "store_meta" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:08:45.187 DEBUG oslo_concurrency.lockutils [req-b2890484-4995-41d7-810b-34156f3eb046 tempest-TenantUsagesTestJSON-1123076320 tempest-TenantUsagesTestJSON-1570943165] Lock "xenstore-d2671615-627a-4730-bfd6-887d04047dda" released by "store_meta" :: held 0.001s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:08:45.188 DEBUG oslo_concurrency.lockutils [req-b2890484-4995-41d7-810b-34156f3eb046 tempest-TenantUsagesTestJSON-1123076320 tempest-TenantUsagesTestJSON-1570943165] Lock "xenstore-d2671615-627a-4730-bfd6-887d04047dda" acquired by "store_auto_disk_config" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:08:45.209 DEBUG oslo_concurrency.lockutils [req-b2890484-4995-41d7-810b-34156f3eb046 tempest-TenantUsagesTestJSON-1123076320 tempest-TenantUsagesTestJSON-1570943165] Lock "xenstore-d2671615-627a-4730-bfd6-887d04047dda" released by "store_auto_disk_config" :: held 0.021s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:08:45.210 DEBUG nova.virt.xenapi.vmops [req-b2890484-4995-41d7-810b-34156f3eb046 tempest-TenantUsagesTestJSON-1123076320 tempest-TenantUsagesTestJSON-1570943165] [instance: d2671615-627a-4730-bfd6-887d04047dda] Injecting hostname (tempest.common.compute-instance-564359990) into xenstore _inject_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1940 2015-08-07 17:08:45.211 DEBUG oslo_concurrency.lockutils [req-b2890484-4995-41d7-810b-34156f3eb046 tempest-TenantUsagesTestJSON-1123076320 tempest-TenantUsagesTestJSON-1570943165] Lock "xenstore-d2671615-627a-4730-bfd6-887d04047dda" acquired by 
"update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:08:45.221 DEBUG oslo_concurrency.lockutils [req-b2890484-4995-41d7-810b-34156f3eb046 tempest-TenantUsagesTestJSON-1123076320 tempest-TenantUsagesTestJSON-1570943165] Lock "xenstore-d2671615-627a-4730-bfd6-887d04047dda" released by "update_hostname" :: held 0.010s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:08:45.222 DEBUG nova.virt.xenapi.vmops [req-b2890484-4995-41d7-810b-34156f3eb046 tempest-TenantUsagesTestJSON-1123076320 tempest-TenantUsagesTestJSON-1570943165] [instance: d2671615-627a-4730-bfd6-887d04047dda] Injecting network info to xenstore inject_network_info /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1869 2015-08-07 17:08:45.222 DEBUG oslo_concurrency.lockutils [req-b2890484-4995-41d7-810b-34156f3eb046 tempest-TenantUsagesTestJSON-1123076320 tempest-TenantUsagesTestJSON-1570943165] Lock "xenstore-d2671615-627a-4730-bfd6-887d04047dda" acquired by "update_nwinfo" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:08:45.720 DEBUG oslo_concurrency.lockutils [req-b2890484-4995-41d7-810b-34156f3eb046 tempest-TenantUsagesTestJSON-1123076320 tempest-TenantUsagesTestJSON-1570943165] Lock "xenstore-d2671615-627a-4730-bfd6-887d04047dda" released by "update_nwinfo" :: held 0.497s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:08:45.720 DEBUG nova.virt.xenapi.vmops [req-b2890484-4995-41d7-810b-34156f3eb046 tempest-TenantUsagesTestJSON-1123076320 tempest-TenantUsagesTestJSON-1570943165] [instance: d2671615-627a-4730-bfd6-887d04047dda] Updating progress to 60 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:08:46.060 DEBUG nova.virt.xenapi.vmops [req-b2890484-4995-41d7-810b-34156f3eb046 tempest-TenantUsagesTestJSON-1123076320 tempest-TenantUsagesTestJSON-1570943165] [instance: d2671615-627a-4730-bfd6-887d04047dda] Creating vifs _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1892 2015-08-07 17:08:46.086 DEBUG nova.virt.xenapi.vmops [req-b2890484-4995-41d7-810b-34156f3eb046 tempest-TenantUsagesTestJSON-1123076320 tempest-TenantUsagesTestJSON-1570943165] [instance: d2671615-627a-4730-bfd6-887d04047dda] Creating VIF for network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1902 2015-08-07 17:08:46.111 DEBUG nova.virt.xenapi.vmops [req-b2890484-4995-41d7-810b-34156f3eb046 tempest-TenantUsagesTestJSON-1123076320 tempest-TenantUsagesTestJSON-1570943165] [instance: d2671615-627a-4730-bfd6-887d04047dda] Created VIF OpaqueRef:e09dcba7-771d-6d5f-2e4f-c9fa9a58a61f, network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1906 2015-08-07 17:08:46.112 DEBUG nova.virt.xenapi.vmops [req-b2890484-4995-41d7-810b-34156f3eb046 tempest-TenantUsagesTestJSON-1123076320 tempest-TenantUsagesTestJSON-1570943165] [instance: d2671615-627a-4730-bfd6-887d04047dda] Updating progress to 70 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:08:46.336 DEBUG oslo_concurrency.lockutils [req-0e3edac7-a714-4340-8f29-ce262b0a7c30 tempest-ServersAdminTestJSON-1163270417 tempest-ServersAdminTestJSON-1577712347] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_unplug" :: held 2.183s inner 
/usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:08:46.344 DEBUG nova.virt.xenapi.vm_utils [req-0e3edac7-a714-4340-8f29-ce262b0a7c30 tempest-ServersAdminTestJSON-1163270417 tempest-ServersAdminTestJSON-1577712347] Destroying VBD for VDI OpaqueRef:5584a842-d3ec-0f2a-9fb6-0ec99c6d72e1 done. vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2215 2015-08-07 17:08:46.356 DEBUG nova.virt.xenapi.vm_utils [req-0e3edac7-a714-4340-8f29-ce262b0a7c30 tempest-ServersAdminTestJSON-1163270417 tempest-ServersAdminTestJSON-1577712347] Creating disk-type VBD for VM OpaqueRef:2cd5921b-9983-0159-522f-33ad86fd6d27, VDI OpaqueRef:5584a842-d3ec-0f2a-9fb6-0ec99c6d72e1 ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:08:46.367 DEBUG nova.virt.xenapi.vm_utils [req-0e3edac7-a714-4340-8f29-ce262b0a7c30 tempest-ServersAdminTestJSON-1163270417 tempest-ServersAdminTestJSON-1577712347] Created VBD OpaqueRef:01369f9a-fef3-8253-b5ae-0898d2afb86e for VM OpaqueRef:2cd5921b-9983-0159-522f-33ad86fd6d27, VDI OpaqueRef:5584a842-d3ec-0f2a-9fb6-0ec99c6d72e1. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:08:46.368 DEBUG nova.objects.instance [req-0e3edac7-a714-4340-8f29-ce262b0a7c30 tempest-ServersAdminTestJSON-1163270417 tempest-ServersAdminTestJSON-1577712347] Lazy-loading `pci_devices' on Instance uuid abd853db-1da7-45ad-a21f-323100b6d158 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:08:46.388 DEBUG oslo_concurrency.lockutils [req-03ab7f25-1970-457b-aaad-261039327bcf tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] Lock "8327722c-cc39-4f7d-acd2-5ffd3f6af05d" acquired by "do_terminate_instance" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:08:46.389 DEBUG oslo_concurrency.lockutils [req-03ab7f25-1970-457b-aaad-261039327bcf tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] Lock "8327722c-cc39-4f7d-acd2-5ffd3f6af05d-events" acquired by "_clear_events" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:08:46.389 DEBUG oslo_concurrency.lockutils [req-03ab7f25-1970-457b-aaad-261039327bcf tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] Lock "8327722c-cc39-4f7d-acd2-5ffd3f6af05d-events" released by "_clear_events" :: held 0.001s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:08:46.391 INFO nova.compute.manager [req-03ab7f25-1970-457b-aaad-261039327bcf tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] [instance: 8327722c-cc39-4f7d-acd2-5ffd3f6af05d] Terminating instance 2015-08-07 17:08:46.394 INFO nova.virt.xenapi.vmops [req-03ab7f25-1970-457b-aaad-261039327bcf tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] [instance: 8327722c-cc39-4f7d-acd2-5ffd3f6af05d] Destroying VM 2015-08-07 17:08:46.420 DEBUG nova.virt.xenapi.vm_utils [req-03ab7f25-1970-457b-aaad-261039327bcf tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] [instance: 8327722c-cc39-4f7d-acd2-5ffd3f6af05d] Shutting down VM (hard) hard_shutdown_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:348 2015-08-07 17:08:46.622 DEBUG nova.virt.xenapi.vmops 
[req-0e3edac7-a714-4340-8f29-ce262b0a7c30 tempest-ServersAdminTestJSON-1163270417 tempest-ServersAdminTestJSON-1577712347] [instance: abd853db-1da7-45ad-a21f-323100b6d158] Updating progress to 50 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:08:46.920 DEBUG nova.virt.xenapi.vmops [req-b2890484-4995-41d7-810b-34156f3eb046 tempest-TenantUsagesTestJSON-1123076320 tempest-TenantUsagesTestJSON-1570943165] [instance: d2671615-627a-4730-bfd6-887d04047dda] Starting instance _start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:327 2015-08-07 17:08:47.676 DEBUG oslo_concurrency.lockutils [req-0e3edac7-a714-4340-8f29-ce262b0a7c30 tempest-ServersAdminTestJSON-1163270417 tempest-ServersAdminTestJSON-1577712347] Lock "xenstore-abd853db-1da7-45ad-a21f-323100b6d158" acquired by "store_meta" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:08:47.677 DEBUG oslo_concurrency.lockutils [req-0e3edac7-a714-4340-8f29-ce262b0a7c30 tempest-ServersAdminTestJSON-1163270417 tempest-ServersAdminTestJSON-1577712347] Lock "xenstore-abd853db-1da7-45ad-a21f-323100b6d158" released by "store_meta" :: held 0.001s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:08:47.678 DEBUG oslo_concurrency.lockutils [req-0e3edac7-a714-4340-8f29-ce262b0a7c30 tempest-ServersAdminTestJSON-1163270417 tempest-ServersAdminTestJSON-1577712347] Lock "xenstore-abd853db-1da7-45ad-a21f-323100b6d158" acquired by "store_auto_disk_config" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:08:47.699 DEBUG oslo_concurrency.lockutils [req-0e3edac7-a714-4340-8f29-ce262b0a7c30 tempest-ServersAdminTestJSON-1163270417 tempest-ServersAdminTestJSON-1577712347] Lock "xenstore-abd853db-1da7-45ad-a21f-323100b6d158" released by "store_auto_disk_config" :: held 0.021s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:08:47.700 DEBUG nova.virt.xenapi.vmops [req-0e3edac7-a714-4340-8f29-ce262b0a7c30 tempest-ServersAdminTestJSON-1163270417 tempest-ServersAdminTestJSON-1577712347] [instance: abd853db-1da7-45ad-a21f-323100b6d158] Injecting hostname (tempest-server-1973360572) into xenstore _inject_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1940 2015-08-07 17:08:47.700 DEBUG oslo_concurrency.lockutils [req-0e3edac7-a714-4340-8f29-ce262b0a7c30 tempest-ServersAdminTestJSON-1163270417 tempest-ServersAdminTestJSON-1577712347] Lock "xenstore-abd853db-1da7-45ad-a21f-323100b6d158" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:08:47.770 DEBUG oslo_concurrency.lockutils [req-0e3edac7-a714-4340-8f29-ce262b0a7c30 tempest-ServersAdminTestJSON-1163270417 tempest-ServersAdminTestJSON-1577712347] Lock "xenstore-abd853db-1da7-45ad-a21f-323100b6d158" released by "update_hostname" :: held 0.069s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:08:47.771 DEBUG nova.virt.xenapi.vmops [req-0e3edac7-a714-4340-8f29-ce262b0a7c30 tempest-ServersAdminTestJSON-1163270417 tempest-ServersAdminTestJSON-1577712347] [instance: abd853db-1da7-45ad-a21f-323100b6d158] Injecting network info to xenstore inject_network_info /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1869 2015-08-07 17:08:47.771 DEBUG oslo_concurrency.lockutils [req-0e3edac7-a714-4340-8f29-ce262b0a7c30 tempest-ServersAdminTestJSON-1163270417 
tempest-ServersAdminTestJSON-1577712347] Lock "xenstore-abd853db-1da7-45ad-a21f-323100b6d158" acquired by "update_nwinfo" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:08:48.024 DEBUG oslo_concurrency.lockutils [req-a4cf94bd-9abc-4eda-80ed-b75ead9b0353 tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] Lock "e8fe58be-dd05-484e-8c50-aa4c6aafb4b5" acquired by "do_terminate_instance" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:08:48.025 DEBUG oslo_concurrency.lockutils [req-a4cf94bd-9abc-4eda-80ed-b75ead9b0353 tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] Lock "e8fe58be-dd05-484e-8c50-aa4c6aafb4b5-events" acquired by "_clear_events" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:08:48.025 DEBUG oslo_concurrency.lockutils [req-a4cf94bd-9abc-4eda-80ed-b75ead9b0353 tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] Lock "e8fe58be-dd05-484e-8c50-aa4c6aafb4b5-events" released by "_clear_events" :: held 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:08:48.028 INFO nova.compute.manager [req-a4cf94bd-9abc-4eda-80ed-b75ead9b0353 tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] [instance: e8fe58be-dd05-484e-8c50-aa4c6aafb4b5] Terminating instance 2015-08-07 17:08:48.030 INFO nova.virt.xenapi.vmops [req-a4cf94bd-9abc-4eda-80ed-b75ead9b0353 tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] [instance: e8fe58be-dd05-484e-8c50-aa4c6aafb4b5] Destroying VM 2015-08-07 17:08:48.083 DEBUG nova.virt.xenapi.vm_utils [req-a4cf94bd-9abc-4eda-80ed-b75ead9b0353 tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] [instance: e8fe58be-dd05-484e-8c50-aa4c6aafb4b5] Shutting down VM (hard) hard_shutdown_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:348 2015-08-07 17:08:49.011 DEBUG oslo_concurrency.lockutils [req-0e3edac7-a714-4340-8f29-ce262b0a7c30 tempest-ServersAdminTestJSON-1163270417 tempest-ServersAdminTestJSON-1577712347] Lock "xenstore-abd853db-1da7-45ad-a21f-323100b6d158" released by "update_nwinfo" :: held 1.240s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:08:49.012 DEBUG nova.virt.xenapi.vmops [req-0e3edac7-a714-4340-8f29-ce262b0a7c30 tempest-ServersAdminTestJSON-1163270417 tempest-ServersAdminTestJSON-1577712347] [instance: abd853db-1da7-45ad-a21f-323100b6d158] Updating progress to 60 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:08:49.642 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.64 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:08:49.904 DEBUG nova.virt.xenapi.vmops [req-0e3edac7-a714-4340-8f29-ce262b0a7c30 tempest-ServersAdminTestJSON-1163270417 tempest-ServersAdminTestJSON-1577712347] [instance: abd853db-1da7-45ad-a21f-323100b6d158] Creating vifs _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1892 2015-08-07 17:08:49.922 DEBUG nova.virt.xenapi.vmops [req-0e3edac7-a714-4340-8f29-ce262b0a7c30 tempest-ServersAdminTestJSON-1163270417 tempest-ServersAdminTestJSON-1577712347] [instance: 
abd853db-1da7-45ad-a21f-323100b6d158] Creating VIF for network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1902 2015-08-07 17:08:49.934 DEBUG nova.virt.xenapi.vmops [req-0e3edac7-a714-4340-8f29-ce262b0a7c30 tempest-ServersAdminTestJSON-1163270417 tempest-ServersAdminTestJSON-1577712347] [instance: abd853db-1da7-45ad-a21f-323100b6d158] Created VIF OpaqueRef:46d3f8bc-b865-2ed3-925a-5f1fef7bae46, network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1906 2015-08-07 17:08:49.935 DEBUG nova.virt.xenapi.vmops [req-0e3edac7-a714-4340-8f29-ce262b0a7c30 tempest-ServersAdminTestJSON-1163270417 tempest-ServersAdminTestJSON-1577712347] [instance: abd853db-1da7-45ad-a21f-323100b6d158] Updating progress to 70 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:08:50.333 DEBUG nova.virt.xenapi.vmops [req-a4cf94bd-9abc-4eda-80ed-b75ead9b0353 tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] [instance: e8fe58be-dd05-484e-8c50-aa4c6aafb4b5] Destroying VDIs _destroy_vdis /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1428 2015-08-07 17:08:50.420 DEBUG nova.virt.xenapi.vm_utils [req-a4cf94bd-9abc-4eda-80ed-b75ead9b0353 tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] VDI 64c3a24a-1f3d-4b57-8dac-1467a28ab860 is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:08:50.452 DEBUG nova.virt.xenapi.vm_utils [req-a4cf94bd-9abc-4eda-80ed-b75ead9b0353 tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] VDI 200953ee-2202-4d43-a794-a8de4e10beb5 is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:08:50.519 DEBUG nova.virt.xenapi.vmops [req-0e3edac7-a714-4340-8f29-ce262b0a7c30 tempest-ServersAdminTestJSON-1163270417 tempest-ServersAdminTestJSON-1577712347] [instance: abd853db-1da7-45ad-a21f-323100b6d158] Starting instance _start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:327 2015-08-07 17:08:55.279 DEBUG nova.virt.xenapi.vmops [req-a4cf94bd-9abc-4eda-80ed-b75ead9b0353 tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] [instance: e8fe58be-dd05-484e-8c50-aa4c6aafb4b5] Using RAW or VHD, skipping kernel and ramdisk deletion _destroy_kernel_ramdisk /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1456 2015-08-07 17:08:55.327 DEBUG nova.virt.xenapi.vm_utils [req-a4cf94bd-9abc-4eda-80ed-b75ead9b0353 tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] [instance: e8fe58be-dd05-484e-8c50-aa4c6aafb4b5] VM destroyed destroy_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:324 2015-08-07 17:08:55.328 DEBUG nova.compute.manager [req-a4cf94bd-9abc-4eda-80ed-b75ead9b0353 tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] [instance: e8fe58be-dd05-484e-8c50-aa4c6aafb4b5] Deallocating network for instance _deallocate_network /opt/stack/new/nova/nova/compute/manager.py:1792 2015-08-07 17:09:02.299 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 6.98 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:09:09.519 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task 
ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:09:09.520 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 2.98 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:09:11.703 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 7.60 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:09:12.503 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:09:12.658 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 0.01 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:09:12.667 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:09:12.668 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 0.84 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:09:13.199 DEBUG nova.compute.manager [req-a4cf94bd-9abc-4eda-80ed-b75ead9b0353 tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] [instance: e8fe58be-dd05-484e-8c50-aa4c6aafb4b5] terminating bdm BlockDeviceMapping(boot_index=0,connection_info=None,created_at=2015-08-07T17:07:12Z,delete_on_termination=True,deleted=False,deleted_at=None,destination_type='local',device_name='/dev/xvda',device_type='disk',disk_bus=None,guest_format=None,id=9,image_id='a69d8d55-1745-492f-be26-cc75d64fc94d',instance=,instance_uuid=e8fe58be-dd05-484e-8c50-aa4c6aafb4b5,no_device=False,snapshot_id=None,source_type='image',updated_at=2015-08-07T17:07:15Z,volume_id=None,volume_size=None) _cleanup_volumes /opt/stack/new/nova/nova/compute/manager.py:2272 2015-08-07 17:09:13.511 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:09:13.512 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Starting heal instance info cache _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5315 2015-08-07 17:09:13.732 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: 8327722c-cc39-4f7d-acd2-5ffd3f6af05d] Skipping network cache update for instance because it is being deleted. _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5364 2015-08-07 17:09:13.734 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Didn't find any instances for network info cache update. 
_heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5386 2015-08-07 17:09:13.735 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 1.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:09:13.823 DEBUG oslo_concurrency.lockutils [req-a4cf94bd-9abc-4eda-80ed-b75ead9b0353 tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] Lock "compute_resources" acquired by "update_usage" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:09:13.825 DEBUG nova.objects.instance [req-a4cf94bd-9abc-4eda-80ed-b75ead9b0353 tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] Lazy-loading `numa_topology' on Instance uuid e8fe58be-dd05-484e-8c50-aa4c6aafb4b5 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:09:14.706 DEBUG oslo_concurrency.lockutils [req-a4cf94bd-9abc-4eda-80ed-b75ead9b0353 tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] Lock "compute_resources" released by "update_usage" :: held 0.882s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:09:14.735 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:09:14.736 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:09:14.736 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /opt/stack/new/nova/nova/compute/manager.py:5976 2015-08-07 17:09:14.737 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:09:14.818 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Auditing locally available compute resources for node localhost.localdomain 2015-08-07 17:09:14.818 DEBUG nova.virt.xenapi.host [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Updating host stats update_status /opt/stack/new/nova/nova/virt/xenapi/host.py:234 2015-08-07 17:09:15.262 DEBUG nova.virt.xenapi.vmops [req-03ab7f25-1970-457b-aaad-261039327bcf tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] [instance: 8327722c-cc39-4f7d-acd2-5ffd3f6af05d] Destroying VDIs _destroy_vdis /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1428 2015-08-07 17:09:15.617 DEBUG nova.virt.xenapi.vm_utils [req-03ab7f25-1970-457b-aaad-261039327bcf tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] VDI ed68af0c-ddea-48c0-b4b4-03411246f961 is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:09:15.761 DEBUG nova.virt.xenapi.vm_utils [req-03ab7f25-1970-457b-aaad-261039327bcf tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] VDI d77ce013-a432-40b1-9d59-3662760e3737 is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:09:16.298 DEBUG oslo_concurrency.lockutils [req-a4cf94bd-9abc-4eda-80ed-b75ead9b0353 tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] Lock "e8fe58be-dd05-484e-8c50-aa4c6aafb4b5" released by "do_terminate_instance" :: held 28.274s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:09:18.634 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" acquired by "do_scan" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:09:18.635 DEBUG nova.virt.xenapi.vm_utils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Scanning SR OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99 do_scan /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1839 2015-08-07 17:09:19.621 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.68 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:09:22.863 DEBUG nova.virt.xenapi.vmops [req-03ab7f25-1970-457b-aaad-261039327bcf tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] [instance: 8327722c-cc39-4f7d-acd2-5ffd3f6af05d] Using RAW or VHD, skipping kernel and ramdisk deletion _destroy_kernel_ramdisk /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1456 2015-08-07 17:09:22.905 DEBUG nova.virt.xenapi.vm_utils [req-03ab7f25-1970-457b-aaad-261039327bcf tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] [instance: 8327722c-cc39-4f7d-acd2-5ffd3f6af05d] VM destroyed destroy_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:324 2015-08-07 17:09:22.906 DEBUG nova.compute.manager 
[req-03ab7f25-1970-457b-aaad-261039327bcf tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] [instance: 8327722c-cc39-4f7d-acd2-5ffd3f6af05d] Deallocating network for instance _deallocate_network /opt/stack/new/nova/nova/compute/manager.py:1792 2015-08-07 17:09:28.233 DEBUG nova.compute.manager [req-03ab7f25-1970-457b-aaad-261039327bcf tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] [instance: 8327722c-cc39-4f7d-acd2-5ffd3f6af05d] terminating bdm BlockDeviceMapping(boot_index=0,connection_info=None,created_at=2015-08-07T17:05:21Z,delete_on_termination=True,deleted=False,deleted_at=None,destination_type='local',device_name='/dev/xvda',device_type='disk',disk_bus=None,guest_format=None,id=7,image_id='a69d8d55-1745-492f-be26-cc75d64fc94d',instance=,instance_uuid=8327722c-cc39-4f7d-acd2-5ffd3f6af05d,no_device=False,snapshot_id=None,source_type='image',updated_at=2015-08-07T17:05:26Z,volume_id=None,volume_size=None) _cleanup_volumes /opt/stack/new/nova/nova/compute/manager.py:2272 2015-08-07 17:09:28.598 DEBUG oslo_concurrency.lockutils [req-03ab7f25-1970-457b-aaad-261039327bcf tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] Lock "compute_resources" acquired by "update_usage" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:09:28.599 DEBUG nova.objects.instance [req-03ab7f25-1970-457b-aaad-261039327bcf tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] Lazy-loading `numa_topology' on Instance uuid 8327722c-cc39-4f7d-acd2-5ffd3f6af05d obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:09:28.806 DEBUG oslo_concurrency.lockutils [req-03ab7f25-1970-457b-aaad-261039327bcf tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] Lock "compute_resources" released by "update_usage" :: held 0.208s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:09:29.359 DEBUG oslo_concurrency.lockutils [req-03ab7f25-1970-457b-aaad-261039327bcf tempest-ServersAdminNegativeTestJSON-1661477185 tempest-ServersAdminNegativeTestJSON-151564255] Lock "8327722c-cc39-4f7d-acd2-5ffd3f6af05d" released by "do_terminate_instance" :: held 42.971s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:09:29.527 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.78 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:09:44.671 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 4.66 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:09:47.479 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" released by "do_scan" :: held 28.844s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:09:49.237 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor: free VCPUs: -1 _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:524 2015-08-07 17:09:49.238 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor: assignable PCI devices: [] 
_report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:532 2015-08-07 17:09:49.238 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor/Node resource view: name=localhost.localdomain free_ram=654MB free_disk=16GB free_vcpus=-1 pci_devices=[] _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:546 2015-08-07 17:09:49.239 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "compute_resources" acquired by "_update_available_resource" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:09:50.275 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.05 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:09:50.343 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Total usable vcpus: 8, total allocated vcpus: 9 2015-08-07 17:09:50.343 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Final resource view: name=localhost.localdomain phys_ram=8187MB used_ram=857MB phys_disk=27GB used_disk=0GB total_vcpus=8 used_vcpus=9 pci_stats=None 2015-08-07 17:09:50.573 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Compute_service record updated for devstack:localhost.localdomain 2015-08-07 17:09:50.573 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "compute_resources" released by "_update_available_resource" :: held 1.334s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:09:50.574 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:09:50.575 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:09:50.575 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 7.77 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:09:51.710 DEBUG nova.virt.xenapi.vmops [req-b2890484-4995-41d7-810b-34156f3eb046 tempest-TenantUsagesTestJSON-1123076320 tempest-TenantUsagesTestJSON-1570943165] [instance: d2671615-627a-4730-bfd6-887d04047dda] Waiting for instance state to become running _wait_for_instance_to_start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:777 2015-08-07 17:09:51.768 DEBUG nova.virt.xenapi.vmops [req-b2890484-4995-41d7-810b-34156f3eb046 tempest-TenantUsagesTestJSON-1123076320 tempest-TenantUsagesTestJSON-1570943165] [instance: d2671615-627a-4730-bfd6-887d04047dda] Updating progress to 80 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:09:53.298 DEBUG nova.virt.xenapi.vmops [req-b2890484-4995-41d7-810b-34156f3eb046 tempest-TenantUsagesTestJSON-1123076320 tempest-TenantUsagesTestJSON-1570943165] [instance: d2671615-627a-4730-bfd6-887d04047dda] Skip agent setup, not enabled. 
_configure_new_instance_with_agent /opt/stack/new/nova/nova/virt/xenapi/vmops.py:788 2015-08-07 17:09:53.299 DEBUG nova.virt.xenapi.vmops [req-b2890484-4995-41d7-810b-34156f3eb046 tempest-TenantUsagesTestJSON-1123076320 tempest-TenantUsagesTestJSON-1570943165] [instance: d2671615-627a-4730-bfd6-887d04047dda] Removing hostname from xenstore _remove_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1949 2015-08-07 17:09:53.300 DEBUG oslo_concurrency.lockutils [req-b2890484-4995-41d7-810b-34156f3eb046 tempest-TenantUsagesTestJSON-1123076320 tempest-TenantUsagesTestJSON-1570943165] Lock "xenstore-d2671615-627a-4730-bfd6-887d04047dda" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:09:53.306 DEBUG oslo_concurrency.lockutils [req-b2890484-4995-41d7-810b-34156f3eb046 tempest-TenantUsagesTestJSON-1123076320 tempest-TenantUsagesTestJSON-1570943165] Lock "xenstore-d2671615-627a-4730-bfd6-887d04047dda" released by "update_hostname" :: held 0.006s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:09:53.307 DEBUG nova.virt.xenapi.vmops [req-b2890484-4995-41d7-810b-34156f3eb046 tempest-TenantUsagesTestJSON-1123076320 tempest-TenantUsagesTestJSON-1570943165] [instance: d2671615-627a-4730-bfd6-887d04047dda] Updating progress to 90 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:09:53.544 DEBUG nova.virt.xenapi.vmops [req-0e3edac7-a714-4340-8f29-ce262b0a7c30 tempest-ServersAdminTestJSON-1163270417 tempest-ServersAdminTestJSON-1577712347] [instance: abd853db-1da7-45ad-a21f-323100b6d158] Waiting for instance state to become running _wait_for_instance_to_start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:777 2015-08-07 17:09:53.731 DEBUG nova.virt.xenapi.vmops [req-0e3edac7-a714-4340-8f29-ce262b0a7c30 tempest-ServersAdminTestJSON-1163270417 tempest-ServersAdminTestJSON-1577712347] [instance: abd853db-1da7-45ad-a21f-323100b6d158] Updating progress to 80 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:09:53.910 DEBUG nova.virt.xenapi.vmops [req-b2890484-4995-41d7-810b-34156f3eb046 tempest-TenantUsagesTestJSON-1123076320 tempest-TenantUsagesTestJSON-1570943165] [instance: d2671615-627a-4730-bfd6-887d04047dda] Updating progress to 100 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:09:54.141 DEBUG nova.virt.xenapi.vmops [req-0e3edac7-a714-4340-8f29-ce262b0a7c30 tempest-ServersAdminTestJSON-1163270417 tempest-ServersAdminTestJSON-1577712347] [instance: abd853db-1da7-45ad-a21f-323100b6d158] Skip agent setup, not enabled. 
_configure_new_instance_with_agent /opt/stack/new/nova/nova/virt/xenapi/vmops.py:788 2015-08-07 17:09:54.141 DEBUG nova.virt.xenapi.vmops [req-0e3edac7-a714-4340-8f29-ce262b0a7c30 tempest-ServersAdminTestJSON-1163270417 tempest-ServersAdminTestJSON-1577712347] [instance: abd853db-1da7-45ad-a21f-323100b6d158] Removing hostname from xenstore _remove_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1949 2015-08-07 17:09:54.142 DEBUG oslo_concurrency.lockutils [req-0e3edac7-a714-4340-8f29-ce262b0a7c30 tempest-ServersAdminTestJSON-1163270417 tempest-ServersAdminTestJSON-1577712347] Lock "xenstore-abd853db-1da7-45ad-a21f-323100b6d158" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:09:54.155 DEBUG oslo_concurrency.lockutils [req-0e3edac7-a714-4340-8f29-ce262b0a7c30 tempest-ServersAdminTestJSON-1163270417 tempest-ServersAdminTestJSON-1577712347] Lock "xenstore-abd853db-1da7-45ad-a21f-323100b6d158" released by "update_hostname" :: held 0.013s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:09:54.155 DEBUG nova.virt.xenapi.vmops [req-0e3edac7-a714-4340-8f29-ce262b0a7c30 tempest-ServersAdminTestJSON-1163270417 tempest-ServersAdminTestJSON-1577712347] [instance: abd853db-1da7-45ad-a21f-323100b6d158] Updating progress to 90 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:09:54.374 DEBUG nova.compute.manager [req-b2890484-4995-41d7-810b-34156f3eb046 tempest-TenantUsagesTestJSON-1123076320 tempest-TenantUsagesTestJSON-1570943165] [instance: d2671615-627a-4730-bfd6-887d04047dda] Checking state _get_power_state /opt/stack/new/nova/nova/compute/manager.py:1296 2015-08-07 17:09:54.832 DEBUG nova.virt.xenapi.vmops [req-0e3edac7-a714-4340-8f29-ce262b0a7c30 tempest-ServersAdminTestJSON-1163270417 tempest-ServersAdminTestJSON-1577712347] [instance: abd853db-1da7-45ad-a21f-323100b6d158] Updating progress to 100 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:09:55.164 DEBUG oslo_concurrency.lockutils [req-b2890484-4995-41d7-810b-34156f3eb046 tempest-TenantUsagesTestJSON-1123076320 tempest-TenantUsagesTestJSON-1570943165] Lock "d2671615-627a-4730-bfd6-887d04047dda" released by "_locked_do_build_and_run_instance" :: held 128.418s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:09:55.430 DEBUG nova.compute.manager [req-0e3edac7-a714-4340-8f29-ce262b0a7c30 tempest-ServersAdminTestJSON-1163270417 tempest-ServersAdminTestJSON-1577712347] [instance: abd853db-1da7-45ad-a21f-323100b6d158] Checking state _get_power_state /opt/stack/new/nova/nova/compute/manager.py:1296 2015-08-07 17:09:56.098 DEBUG oslo_concurrency.lockutils [req-0e3edac7-a714-4340-8f29-ce262b0a7c30 tempest-ServersAdminTestJSON-1163270417 tempest-ServersAdminTestJSON-1577712347] Lock "abd853db-1da7-45ad-a21f-323100b6d158" released by "_locked_do_build_and_run_instance" :: held 127.053s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:09:57.439 DEBUG oslo_concurrency.lockutils [req-904bc3f8-4ac9-4123-86de-cd1b94431d0e tempest-ServersAdminTestJSON-1163270417 tempest-ServersAdminTestJSON-1577712347] Lock "abd853db-1da7-45ad-a21f-323100b6d158" acquired by "do_terminate_instance" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:09:57.439 DEBUG oslo_concurrency.lockutils 
[req-904bc3f8-4ac9-4123-86de-cd1b94431d0e tempest-ServersAdminTestJSON-1163270417 tempest-ServersAdminTestJSON-1577712347] Lock "abd853db-1da7-45ad-a21f-323100b6d158-events" acquired by "_clear_events" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:09:57.440 DEBUG oslo_concurrency.lockutils [req-904bc3f8-4ac9-4123-86de-cd1b94431d0e tempest-ServersAdminTestJSON-1163270417 tempest-ServersAdminTestJSON-1577712347] Lock "abd853db-1da7-45ad-a21f-323100b6d158-events" released by "_clear_events" :: held 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:09:57.442 INFO nova.compute.manager [req-904bc3f8-4ac9-4123-86de-cd1b94431d0e tempest-ServersAdminTestJSON-1163270417 tempest-ServersAdminTestJSON-1577712347] [instance: abd853db-1da7-45ad-a21f-323100b6d158] Terminating instance 2015-08-07 17:09:57.443 INFO nova.virt.xenapi.vmops [req-904bc3f8-4ac9-4123-86de-cd1b94431d0e tempest-ServersAdminTestJSON-1163270417 tempest-ServersAdminTestJSON-1577712347] [instance: abd853db-1da7-45ad-a21f-323100b6d158] Destroying VM 2015-08-07 17:09:57.454 DEBUG nova.virt.xenapi.vm_utils [req-904bc3f8-4ac9-4123-86de-cd1b94431d0e tempest-ServersAdminTestJSON-1163270417 tempest-ServersAdminTestJSON-1577712347] [instance: abd853db-1da7-45ad-a21f-323100b6d158] Shutting down VM (hard) hard_shutdown_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:348 2015-08-07 17:09:58.351 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:09:58.352 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 12.16 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:09:59.667 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.66 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:10:00.436 DEBUG oslo_concurrency.lockutils [req-765294ae-368d-4c1f-a835-9ecb2899217d tempest-TenantUsagesTestJSON-1123076320 tempest-TenantUsagesTestJSON-1570943165] Lock "d2671615-627a-4730-bfd6-887d04047dda" acquired by "do_terminate_instance" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:10:00.440 DEBUG oslo_concurrency.lockutils [req-765294ae-368d-4c1f-a835-9ecb2899217d tempest-TenantUsagesTestJSON-1123076320 tempest-TenantUsagesTestJSON-1570943165] Lock "d2671615-627a-4730-bfd6-887d04047dda-events" acquired by "_clear_events" :: waited 0.002s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:10:00.441 DEBUG oslo_concurrency.lockutils [req-765294ae-368d-4c1f-a835-9ecb2899217d tempest-TenantUsagesTestJSON-1123076320 tempest-TenantUsagesTestJSON-1570943165] Lock "d2671615-627a-4730-bfd6-887d04047dda-events" released by "_clear_events" :: held 0.001s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:10:00.443 INFO nova.compute.manager [req-765294ae-368d-4c1f-a835-9ecb2899217d tempest-TenantUsagesTestJSON-1123076320 tempest-TenantUsagesTestJSON-1570943165] [instance: d2671615-627a-4730-bfd6-887d04047dda] Terminating instance 2015-08-07 17:10:00.445 INFO nova.virt.xenapi.vmops 
[req-765294ae-368d-4c1f-a835-9ecb2899217d tempest-TenantUsagesTestJSON-1123076320 tempest-TenantUsagesTestJSON-1570943165] [instance: d2671615-627a-4730-bfd6-887d04047dda] Destroying VM 2015-08-07 17:10:00.544 DEBUG nova.virt.xenapi.vm_utils [req-765294ae-368d-4c1f-a835-9ecb2899217d tempest-TenantUsagesTestJSON-1123076320 tempest-TenantUsagesTestJSON-1570943165] [instance: d2671615-627a-4730-bfd6-887d04047dda] Shutting down VM (hard) hard_shutdown_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:348 2015-08-07 17:10:07.814 INFO nova.compute.manager [req-1f5c1b86-d30b-4859-a6d6-d48a0890856f tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: cf8ff930-deb3-448f-bcd3-f5a84fdad9b4] Rebuilding instance 2015-08-07 17:10:08.198 DEBUG nova.compute.manager [req-1f5c1b86-d30b-4859-a6d6-d48a0890856f tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: cf8ff930-deb3-448f-bcd3-f5a84fdad9b4] Checking state _get_power_state /opt/stack/new/nova/nova/compute/manager.py:1296 2015-08-07 17:10:08.672 INFO nova.virt.xenapi.vmops [req-1f5c1b86-d30b-4859-a6d6-d48a0890856f tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: cf8ff930-deb3-448f-bcd3-f5a84fdad9b4] Destroying VM 2015-08-07 17:10:08.723 DEBUG nova.virt.xenapi.vm_utils [req-1f5c1b86-d30b-4859-a6d6-d48a0890856f tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: cf8ff930-deb3-448f-bcd3-f5a84fdad9b4] Shutting down VM (hard) hard_shutdown_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:348 2015-08-07 17:10:10.008 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.32 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:10:10.512 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:10:10.513 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 3.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:10:13.511 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:10:13.512 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 1.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:10:14.512 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:10:14.513 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 1.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:10:15.517 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks 
/usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:10:15.518 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /opt/stack/new/nova/nova/compute/manager.py:5976 2015-08-07 17:10:15.518 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:10:15.707 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Auditing locally available compute resources for node localhost.localdomain 2015-08-07 17:10:15.708 DEBUG nova.virt.xenapi.host [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Updating host stats update_status /opt/stack/new/nova/nova/virt/xenapi/host.py:234 2015-08-07 17:10:17.852 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" acquired by "do_scan" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:10:17.857 DEBUG nova.virt.xenapi.vm_utils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Scanning SR OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99 do_scan /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1839 2015-08-07 17:10:21.674 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 7.66 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:10:22.851 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" released by "do_scan" :: held 4.999s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:10:24.062 DEBUG nova.virt.xenapi.vmops [req-904bc3f8-4ac9-4123-86de-cd1b94431d0e tempest-ServersAdminTestJSON-1163270417 tempest-ServersAdminTestJSON-1577712347] [instance: abd853db-1da7-45ad-a21f-323100b6d158] Destroying VDIs _destroy_vdis /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1428 2015-08-07 17:10:24.174 DEBUG nova.virt.xenapi.vm_utils [req-904bc3f8-4ac9-4123-86de-cd1b94431d0e tempest-ServersAdminTestJSON-1163270417 tempest-ServersAdminTestJSON-1577712347] VDI fcd25943-d265-43c5-aff6-4322d91e190d is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:10:24.196 DEBUG nova.virt.xenapi.vm_utils [req-904bc3f8-4ac9-4123-86de-cd1b94431d0e tempest-ServersAdminTestJSON-1163270417 tempest-ServersAdminTestJSON-1577712347] VDI 31af1a37-6924-4951-a06c-225676db61ba is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:10:25.274 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor: free VCPUs: -3 _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:524 2015-08-07 17:10:25.274 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor: assignable PCI devices: [] _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:532 2015-08-07 17:10:25.275 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor/Node resource view: name=localhost.localdomain free_ram=654MB free_disk=16GB 
free_vcpus=-3 pci_devices=[] _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:546 2015-08-07 17:10:25.275 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "compute_resources" acquired by "_update_available_resource" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:10:26.918 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Total usable vcpus: 8, total allocated vcpus: 11 2015-08-07 17:10:26.918 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Final resource view: name=localhost.localdomain phys_ram=8187MB used_ram=857MB phys_disk=27GB used_disk=0GB total_vcpus=8 used_vcpus=11 pci_stats=None 2015-08-07 17:10:27.704 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Compute_service record updated for devstack:localhost.localdomain 2015-08-07 17:10:27.705 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "compute_resources" released by "_update_available_resource" :: held 2.429s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:10:27.705 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:10:27.706 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:10:27.706 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Cleaning up deleted instances _run_pending_deletes /opt/stack/new/nova/nova/compute/manager.py:6276 2015-08-07 17:10:27.929 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] There are 3 instances to clean _run_pending_deletes /opt/stack/new/nova/nova/compute/manager.py:6285 2015-08-07 17:10:27.943 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: e8fe58be-dd05-484e-8c50-aa4c6aafb4b5] Instance has had 0 of 5 cleanup attempts _run_pending_deletes /opt/stack/new/nova/nova/compute/manager.py:6293 2015-08-07 17:10:28.259 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: 8327722c-cc39-4f7d-acd2-5ffd3f6af05d] Instance has had 0 of 5 cleanup attempts _run_pending_deletes /opt/stack/new/nova/nova/compute/manager.py:6293 2015-08-07 17:10:28.845 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: 1045cfe7-db64-498e-a1d4-a956bab07dcf] Instance has had 0 of 5 cleanup attempts _run_pending_deletes /opt/stack/new/nova/nova/compute/manager.py:6293 2015-08-07 17:10:30.259 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.13 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:10:30.325 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:10:30.326 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 
None None] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:10:30.326 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Starting heal instance info cache _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5315 2015-08-07 17:10:30.327 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Rebuilding the list of instances to heal _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5319 2015-08-07 17:10:30.424 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: d2671615-627a-4730-bfd6-887d04047dda] Skipping network cache update for instance because it is being deleted. _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5333 2015-08-07 17:10:30.425 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: abd853db-1da7-45ad-a21f-323100b6d158] Skipping network cache update for instance because it is being deleted. _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5333 2015-08-07 17:10:30.426 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Acquired semaphore "refresh_cache-cf8ff930-deb3-448f-bcd3-f5a84fdad9b4" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:197 2015-08-07 17:10:30.427 DEBUG nova.objects.instance [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lazy-loading `flavor' on Instance uuid cf8ff930-deb3-448f-bcd3-f5a84fdad9b4 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:10:31.189 DEBUG nova.network.base_api [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: cf8ff930-deb3-448f-bcd3-f5a84fdad9b4] Updating instance_info_cache with network_info: [VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.2'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:4b:61:5b', 'active': False, 'type': u'bridge', 'id': u'1d3147dc-aab6-4dd3-a93a-a902661ccb6b', 'qbg_params': None})] update_instance_cache_with_nw_info /opt/stack/new/nova/nova/network/base_api.py:43 2015-08-07 17:10:31.222 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Releasing semaphore "refresh_cache-cf8ff930-deb3-448f-bcd3-f5a84fdad9b4" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:210 2015-08-07 17:10:31.222 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: cf8ff930-deb3-448f-bcd3-f5a84fdad9b4] Updated the network info_cache for 
instance _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5376 2015-08-07 17:10:31.223 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 7.99 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:10:32.222 DEBUG nova.virt.xenapi.vmops [req-904bc3f8-4ac9-4123-86de-cd1b94431d0e tempest-ServersAdminTestJSON-1163270417 tempest-ServersAdminTestJSON-1577712347] [instance: abd853db-1da7-45ad-a21f-323100b6d158] Using RAW or VHD, skipping kernel and ramdisk deletion _destroy_kernel_ramdisk /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1456 2015-08-07 17:10:32.247 DEBUG nova.virt.xenapi.vm_utils [req-904bc3f8-4ac9-4123-86de-cd1b94431d0e tempest-ServersAdminTestJSON-1163270417 tempest-ServersAdminTestJSON-1577712347] [instance: abd853db-1da7-45ad-a21f-323100b6d158] VM destroyed destroy_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:324 2015-08-07 17:10:32.247 DEBUG nova.compute.manager [req-904bc3f8-4ac9-4123-86de-cd1b94431d0e tempest-ServersAdminTestJSON-1163270417 tempest-ServersAdminTestJSON-1577712347] [instance: abd853db-1da7-45ad-a21f-323100b6d158] Deallocating network for instance _deallocate_network /opt/stack/new/nova/nova/compute/manager.py:1792 2015-08-07 17:10:34.875 DEBUG nova.virt.xenapi.vmops [req-765294ae-368d-4c1f-a835-9ecb2899217d tempest-TenantUsagesTestJSON-1123076320 tempest-TenantUsagesTestJSON-1570943165] [instance: d2671615-627a-4730-bfd6-887d04047dda] Destroying VDIs _destroy_vdis /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1428 2015-08-07 17:10:34.915 DEBUG nova.virt.xenapi.vm_utils [req-765294ae-368d-4c1f-a835-9ecb2899217d tempest-TenantUsagesTestJSON-1123076320 tempest-TenantUsagesTestJSON-1570943165] VDI cd21a757-82c0-4976-b295-a5774ccc8816 is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:10:34.969 DEBUG nova.virt.xenapi.vm_utils [req-765294ae-368d-4c1f-a835-9ecb2899217d tempest-TenantUsagesTestJSON-1123076320 tempest-TenantUsagesTestJSON-1570943165] VDI e9530445-eeba-4f32-add6-d318f18b5ad7 is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:10:39.650 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:10:39.654 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 31.86 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:10:40.891 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 8.76 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:10:47.281 DEBUG nova.compute.manager [req-904bc3f8-4ac9-4123-86de-cd1b94431d0e tempest-ServersAdminTestJSON-1163270417 tempest-ServersAdminTestJSON-1577712347] [instance: abd853db-1da7-45ad-a21f-323100b6d158] terminating bdm 
BlockDeviceMapping(boot_index=0,connection_info=None,created_at=2015-08-07T17:07:47Z,delete_on_termination=True,deleted=False,deleted_at=None,destination_type='local',device_name='/dev/xvda',device_type='disk',disk_bus=None,guest_format=None,id=11,image_id='a69d8d55-1745-492f-be26-cc75d64fc94d',instance=,instance_uuid=abd853db-1da7-45ad-a21f-323100b6d158,no_device=False,snapshot_id=None,source_type='image',updated_at=2015-08-07T17:07:50Z,volume_id=None,volume_size=None) _cleanup_volumes /opt/stack/new/nova/nova/compute/manager.py:2272 2015-08-07 17:10:47.486 DEBUG nova.virt.xenapi.vmops [req-765294ae-368d-4c1f-a835-9ecb2899217d tempest-TenantUsagesTestJSON-1123076320 tempest-TenantUsagesTestJSON-1570943165] [instance: d2671615-627a-4730-bfd6-887d04047dda] Using RAW or VHD, skipping kernel and ramdisk deletion _destroy_kernel_ramdisk /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1456 2015-08-07 17:10:47.506 DEBUG nova.virt.xenapi.vm_utils [req-765294ae-368d-4c1f-a835-9ecb2899217d tempest-TenantUsagesTestJSON-1123076320 tempest-TenantUsagesTestJSON-1570943165] [instance: d2671615-627a-4730-bfd6-887d04047dda] VM destroyed destroy_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:324 2015-08-07 17:10:47.507 DEBUG nova.compute.manager [req-765294ae-368d-4c1f-a835-9ecb2899217d tempest-TenantUsagesTestJSON-1123076320 tempest-TenantUsagesTestJSON-1570943165] [instance: d2671615-627a-4730-bfd6-887d04047dda] Deallocating network for instance _deallocate_network /opt/stack/new/nova/nova/compute/manager.py:1792 2015-08-07 17:10:47.710 DEBUG oslo_concurrency.lockutils [req-904bc3f8-4ac9-4123-86de-cd1b94431d0e tempest-ServersAdminTestJSON-1163270417 tempest-ServersAdminTestJSON-1577712347] Lock "compute_resources" acquired by "update_usage" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:10:47.712 DEBUG nova.objects.instance [req-904bc3f8-4ac9-4123-86de-cd1b94431d0e tempest-ServersAdminTestJSON-1163270417 tempest-ServersAdminTestJSON-1577712347] Lazy-loading `numa_topology' on Instance uuid abd853db-1da7-45ad-a21f-323100b6d158 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:10:47.959 DEBUG oslo_concurrency.lockutils [req-904bc3f8-4ac9-4123-86de-cd1b94431d0e tempest-ServersAdminTestJSON-1163270417 tempest-ServersAdminTestJSON-1577712347] Lock "compute_resources" released by "update_usage" :: held 0.249s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:10:48.720 DEBUG oslo_concurrency.lockutils [req-904bc3f8-4ac9-4123-86de-cd1b94431d0e tempest-ServersAdminTestJSON-1163270417 tempest-ServersAdminTestJSON-1577712347] Lock "abd853db-1da7-45ad-a21f-323100b6d158" released by "do_terminate_instance" :: held 51.281s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:10:49.712 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.94 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:10:53.034 DEBUG nova.compute.manager [req-765294ae-368d-4c1f-a835-9ecb2899217d tempest-TenantUsagesTestJSON-1123076320 tempest-TenantUsagesTestJSON-1570943165] [instance: d2671615-627a-4730-bfd6-887d04047dda] terminating bdm 
BlockDeviceMapping(boot_index=0,connection_info=None,created_at=2015-08-07T17:07:46Z,delete_on_termination=True,deleted=False,deleted_at=None,destination_type='local',device_name='/dev/xvda',device_type='disk',disk_bus=None,guest_format=None,id=10,image_id='a69d8d55-1745-492f-be26-cc75d64fc94d',instance=,instance_uuid=d2671615-627a-4730-bfd6-887d04047dda,no_device=False,snapshot_id=None,source_type='image',updated_at=2015-08-07T17:07:48Z,volume_id=None,volume_size=None) _cleanup_volumes /opt/stack/new/nova/nova/compute/manager.py:2272 2015-08-07 17:10:53.333 DEBUG nova.virt.xenapi.vmops [req-1f5c1b86-d30b-4859-a6d6-d48a0890856f tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: cf8ff930-deb3-448f-bcd3-f5a84fdad9b4] Destroying VDIs _destroy_vdis /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1428 2015-08-07 17:10:53.354 DEBUG nova.virt.xenapi.vm_utils [req-1f5c1b86-d30b-4859-a6d6-d48a0890856f tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] VDI 7d0b4cc3-db2a-4ab4-b2f5-7e72a28fd2b2 is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:10:53.378 DEBUG nova.virt.xenapi.vm_utils [req-1f5c1b86-d30b-4859-a6d6-d48a0890856f tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] VDI 2c09930a-a6da-45f3-9ae0-f04fcd42bbd2 is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:10:53.497 DEBUG oslo_concurrency.lockutils [req-765294ae-368d-4c1f-a835-9ecb2899217d tempest-TenantUsagesTestJSON-1123076320 tempest-TenantUsagesTestJSON-1570943165] Lock "compute_resources" acquired by "update_usage" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:10:53.499 DEBUG nova.objects.instance [req-765294ae-368d-4c1f-a835-9ecb2899217d tempest-TenantUsagesTestJSON-1123076320 tempest-TenantUsagesTestJSON-1570943165] Lazy-loading `numa_topology' on Instance uuid d2671615-627a-4730-bfd6-887d04047dda obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:10:54.246 DEBUG oslo_concurrency.lockutils [req-765294ae-368d-4c1f-a835-9ecb2899217d tempest-TenantUsagesTestJSON-1123076320 tempest-TenantUsagesTestJSON-1570943165] Lock "compute_resources" released by "update_usage" :: held 0.748s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:10:55.412 DEBUG oslo_concurrency.lockutils [req-765294ae-368d-4c1f-a835-9ecb2899217d tempest-TenantUsagesTestJSON-1123076320 tempest-TenantUsagesTestJSON-1570943165] Lock "d2671615-627a-4730-bfd6-887d04047dda" released by "do_terminate_instance" :: held 54.976s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:10:58.756 DEBUG nova.virt.xenapi.vmops [req-1f5c1b86-d30b-4859-a6d6-d48a0890856f tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: cf8ff930-deb3-448f-bcd3-f5a84fdad9b4] Using RAW or VHD, skipping kernel and ramdisk deletion _destroy_kernel_ramdisk /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1456 2015-08-07 17:10:58.798 DEBUG nova.virt.xenapi.vm_utils [req-1f5c1b86-d30b-4859-a6d6-d48a0890856f tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: cf8ff930-deb3-448f-bcd3-f5a84fdad9b4] VM destroyed destroy_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:324 2015-08-07 17:10:58.918 DEBUG oslo_concurrency.lockutils [req-cc5f5472-a52a-4293-ad58-513e3b523655 
tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] Lock "2af2868e-6009-4d8b-b3ee-90b4d32a3ddd" acquired by "_locked_do_build_and_run_instance" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:10:59.007 INFO nova.compute.manager [req-cc5f5472-a52a-4293-ad58-513e3b523655 tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] [instance: 2af2868e-6009-4d8b-b3ee-90b4d32a3ddd] Starting instance... 2015-08-07 17:11:00.029 DEBUG oslo_concurrency.lockutils [req-cc5f5472-a52a-4293-ad58-513e3b523655 tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] Lock "compute_resources" acquired by "instance_claim" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:11:00.029 DEBUG nova.compute.resource_tracker [req-cc5f5472-a52a-4293-ad58-513e3b523655 tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] Memory overhead for 64 MB instance; 5 MB instance_claim /opt/stack/new/nova/nova/compute/resource_tracker.py:126 2015-08-07 17:11:00.037 INFO nova.compute.claims [req-cc5f5472-a52a-4293-ad58-513e3b523655 tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] [instance: 2af2868e-6009-4d8b-b3ee-90b4d32a3ddd] Attempting claim: memory 69 MB, disk 0 GB 2015-08-07 17:11:00.038 INFO nova.compute.claims [req-cc5f5472-a52a-4293-ad58-513e3b523655 tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] [instance: 2af2868e-6009-4d8b-b3ee-90b4d32a3ddd] Total memory: 8187 MB, used: 719.00 MB 2015-08-07 17:11:00.039 INFO nova.compute.claims [req-cc5f5472-a52a-4293-ad58-513e3b523655 tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] [instance: 2af2868e-6009-4d8b-b3ee-90b4d32a3ddd] memory limit: 12280.50 MB, free: 11561.50 MB 2015-08-07 17:11:00.039 INFO nova.compute.claims [req-cc5f5472-a52a-4293-ad58-513e3b523655 tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] [instance: 2af2868e-6009-4d8b-b3ee-90b4d32a3ddd] Total disk: 27 GB, used: 0.00 GB 2015-08-07 17:11:00.039 INFO nova.compute.claims [req-cc5f5472-a52a-4293-ad58-513e3b523655 tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] [instance: 2af2868e-6009-4d8b-b3ee-90b4d32a3ddd] disk limit not specified, defaulting to unlimited 2015-08-07 17:11:00.060 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.59 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:11:00.107 DEBUG nova.compute.resources.vcpu [req-cc5f5472-a52a-4293-ad58-513e3b523655 tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] Total CPUs: 8 VCPUs, used: 3.00 VCPUs test /opt/stack/new/nova/nova/compute/resources/vcpu.py:52 2015-08-07 17:11:00.107 DEBUG nova.compute.resources.vcpu [req-cc5f5472-a52a-4293-ad58-513e3b523655 tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] CPUs limit not specified, defaulting to unlimited test /opt/stack/new/nova/nova/compute/resources/vcpu.py:56 2015-08-07 17:11:00.108 INFO nova.compute.claims [req-cc5f5472-a52a-4293-ad58-513e3b523655 tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] [instance: 2af2868e-6009-4d8b-b3ee-90b4d32a3ddd] Claim successful 2015-08-07 17:11:00.138 DEBUG nova.virt.xenapi.vmops [req-1f5c1b86-d30b-4859-a6d6-d48a0890856f 
tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: cf8ff930-deb3-448f-bcd3-f5a84fdad9b4] Block device information present: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} spawn /opt/stack/new/nova/nova/virt/xenapi/vmops.py:408 2015-08-07 17:11:00.181 DEBUG nova.virt.xenapi.vm_utils [req-1f5c1b86-d30b-4859-a6d6-d48a0890856f tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Detected vhd format for image a69d8d55-1745-492f-be26-cc75d64fc94d determine_disk_image_type /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1633 2015-08-07 17:11:00.181 DEBUG nova.virt.xenapi.vmops [req-1f5c1b86-d30b-4859-a6d6-d48a0890856f tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: cf8ff930-deb3-448f-bcd3-f5a84fdad9b4] Updating progress to 10 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:11:00.530 DEBUG nova.virt.xenapi.vmops [req-1f5c1b86-d30b-4859-a6d6-d48a0890856f tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: cf8ff930-deb3-448f-bcd3-f5a84fdad9b4] block device info: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} _connect_cinder_volumes /opt/stack/new/nova/nova/virt/xenapi/vmops.py:376 2015-08-07 17:11:00.560 DEBUG oslo_concurrency.lockutils [req-1f5c1b86-d30b-4859-a6d6-d48a0890856f tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" acquired by "_create_cached_image_impl" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:11:00.576 DEBUG oslo_concurrency.lockutils [req-cc5f5472-a52a-4293-ad58-513e3b523655 tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] Lock "compute_resources" released by "instance_claim" :: held 0.547s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:11:00.974 DEBUG oslo_concurrency.lockutils [req-cc5f5472-a52a-4293-ad58-513e3b523655 tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] Lock "compute_resources" acquired by "update_usage" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:11:01.186 DEBUG oslo_concurrency.lockutils [req-cc5f5472-a52a-4293-ad58-513e3b523655 tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] Lock "compute_resources" released by "update_usage" :: held 0.212s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:11:01.188 DEBUG nova.compute.utils [req-cc5f5472-a52a-4293-ad58-513e3b523655 tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] Using /dev/xvd instead of None get_next_device_name /opt/stack/new/nova/nova/compute/utils.py:163 2015-08-07 17:11:01.192 13318 DEBUG nova.compute.manager [-] [instance: 2af2868e-6009-4d8b-b3ee-90b4d32a3ddd] Allocating IP information in the background. 
_allocate_network_async /opt/stack/new/nova/nova/compute/manager.py:1513 2015-08-07 17:11:01.194 13318 DEBUG oslo_concurrency.lockutils [-] Acquired semaphore "refresh_cache-2af2868e-6009-4d8b-b3ee-90b4d32a3ddd" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:197 2015-08-07 17:11:03.203 DEBUG nova.virt.xenapi.vmops [req-cc5f5472-a52a-4293-ad58-513e3b523655 tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] [instance: 2af2868e-6009-4d8b-b3ee-90b4d32a3ddd] Block device information present: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} spawn /opt/stack/new/nova/nova/virt/xenapi/vmops.py:408 2015-08-07 17:11:03.272 DEBUG nova.virt.xenapi.vm_utils [req-cc5f5472-a52a-4293-ad58-513e3b523655 tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] Detected vhd format for image a69d8d55-1745-492f-be26-cc75d64fc94d determine_disk_image_type /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1633 2015-08-07 17:11:03.273 DEBUG nova.virt.xenapi.vmops [req-cc5f5472-a52a-4293-ad58-513e3b523655 tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] [instance: 2af2868e-6009-4d8b-b3ee-90b4d32a3ddd] Updating progress to 10 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:11:03.595 DEBUG nova.virt.xenapi.vmops [req-cc5f5472-a52a-4293-ad58-513e3b523655 tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] [instance: 2af2868e-6009-4d8b-b3ee-90b4d32a3ddd] block device info: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} _connect_cinder_volumes /opt/stack/new/nova/nova/virt/xenapi/vmops.py:376 2015-08-07 17:11:04.287 DEBUG nova.virt.xenapi.vm_utils [req-1f5c1b86-d30b-4859-a6d6-d48a0890856f tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Cloned VDI OpaqueRef:4790b447-d82a-fa28-ec90-992cf93d6868 from VDI OpaqueRef:f392c64e-7886-a0cc-d3a0-780674167307 _clone_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:571 2015-08-07 17:11:06.473 DEBUG oslo_concurrency.lockutils [req-1f5c1b86-d30b-4859-a6d6-d48a0890856f tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" released by "_create_cached_image_impl" :: held 5.913s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:11:06.474 INFO nova.virt.xenapi.vm_utils [req-1f5c1b86-d30b-4859-a6d6-d48a0890856f tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Image creation data, cacheable: True, downloaded: False duration: 5.94 secs for image a69d8d55-1745-492f-be26-cc75d64fc94d 2015-08-07 17:11:06.475 DEBUG oslo_concurrency.lockutils [req-cc5f5472-a52a-4293-ad58-513e3b523655 tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" acquired by "_create_cached_image_impl" :: waited 2.826s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:11:08.837 DEBUG oslo_concurrency.lockutils [req-f3c241ec-9501-4b8d-9afe-a07e6a0abecb tempest-FloatingIPsNegativeTestJSON-45780709 tempest-FloatingIPsNegativeTestJSON-486082406] Lock "573af27d-723a-48c5-9a9a-6a818abbbfc4" acquired by "_locked_do_build_and_run_instance" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 
17:11:09.467 INFO nova.compute.manager [req-f3c241ec-9501-4b8d-9afe-a07e6a0abecb tempest-FloatingIPsNegativeTestJSON-45780709 tempest-FloatingIPsNegativeTestJSON-486082406] [instance: 573af27d-723a-48c5-9a9a-6a818abbbfc4] Starting instance... 2015-08-07 17:11:09.776 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.88 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:11:10.107 DEBUG oslo_concurrency.lockutils [req-f3c241ec-9501-4b8d-9afe-a07e6a0abecb tempest-FloatingIPsNegativeTestJSON-45780709 tempest-FloatingIPsNegativeTestJSON-486082406] Lock "compute_resources" acquired by "instance_claim" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:11:10.108 DEBUG nova.compute.resource_tracker [req-f3c241ec-9501-4b8d-9afe-a07e6a0abecb tempest-FloatingIPsNegativeTestJSON-45780709 tempest-FloatingIPsNegativeTestJSON-486082406] Memory overhead for 64 MB instance; 5 MB instance_claim /opt/stack/new/nova/nova/compute/resource_tracker.py:126 2015-08-07 17:11:10.120 INFO nova.compute.claims [req-f3c241ec-9501-4b8d-9afe-a07e6a0abecb tempest-FloatingIPsNegativeTestJSON-45780709 tempest-FloatingIPsNegativeTestJSON-486082406] [instance: 573af27d-723a-48c5-9a9a-6a818abbbfc4] Attempting claim: memory 69 MB, disk 0 GB 2015-08-07 17:11:10.121 INFO nova.compute.claims [req-f3c241ec-9501-4b8d-9afe-a07e6a0abecb tempest-FloatingIPsNegativeTestJSON-45780709 tempest-FloatingIPsNegativeTestJSON-486082406] [instance: 573af27d-723a-48c5-9a9a-6a818abbbfc4] Total memory: 8187 MB, used: 788.00 MB 2015-08-07 17:11:10.121 INFO nova.compute.claims [req-f3c241ec-9501-4b8d-9afe-a07e6a0abecb tempest-FloatingIPsNegativeTestJSON-45780709 tempest-FloatingIPsNegativeTestJSON-486082406] [instance: 573af27d-723a-48c5-9a9a-6a818abbbfc4] memory limit: 12280.50 MB, free: 11492.50 MB 2015-08-07 17:11:10.122 INFO nova.compute.claims [req-f3c241ec-9501-4b8d-9afe-a07e6a0abecb tempest-FloatingIPsNegativeTestJSON-45780709 tempest-FloatingIPsNegativeTestJSON-486082406] [instance: 573af27d-723a-48c5-9a9a-6a818abbbfc4] Total disk: 27 GB, used: 0.00 GB 2015-08-07 17:11:10.122 INFO nova.compute.claims [req-f3c241ec-9501-4b8d-9afe-a07e6a0abecb tempest-FloatingIPsNegativeTestJSON-45780709 tempest-FloatingIPsNegativeTestJSON-486082406] [instance: 573af27d-723a-48c5-9a9a-6a818abbbfc4] disk limit not specified, defaulting to unlimited 2015-08-07 17:11:10.206 DEBUG nova.compute.resources.vcpu [req-f3c241ec-9501-4b8d-9afe-a07e6a0abecb tempest-FloatingIPsNegativeTestJSON-45780709 tempest-FloatingIPsNegativeTestJSON-486082406] Total CPUs: 8 VCPUs, used: 4.00 VCPUs test /opt/stack/new/nova/nova/compute/resources/vcpu.py:52 2015-08-07 17:11:10.206 DEBUG nova.compute.resources.vcpu [req-f3c241ec-9501-4b8d-9afe-a07e6a0abecb tempest-FloatingIPsNegativeTestJSON-45780709 tempest-FloatingIPsNegativeTestJSON-486082406] CPUs limit not specified, defaulting to unlimited test /opt/stack/new/nova/nova/compute/resources/vcpu.py:56 2015-08-07 17:11:10.207 INFO nova.compute.claims [req-f3c241ec-9501-4b8d-9afe-a07e6a0abecb tempest-FloatingIPsNegativeTestJSON-45780709 tempest-FloatingIPsNegativeTestJSON-486082406] [instance: 573af27d-723a-48c5-9a9a-6a818abbbfc4] Claim successful 2015-08-07 17:11:11.205 DEBUG nova.virt.xenapi.vmops [req-1f5c1b86-d30b-4859-a6d6-d48a0890856f tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: cf8ff930-deb3-448f-bcd3-f5a84fdad9b4] Updating progress to 20 
_update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:11:11.511 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:11:11.512 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 2.99 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:11:11.569 DEBUG nova.virt.xenapi.vm_utils [req-cc5f5472-a52a-4293-ad58-513e3b523655 tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] Cloned VDI OpaqueRef:7a39fdc4-04b9-0568-4124-ddef0d566621 from VDI OpaqueRef:f392c64e-7886-a0cc-d3a0-780674167307 _clone_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:571 2015-08-07 17:11:11.815 DEBUG nova.virt.xenapi.vmops [req-1f5c1b86-d30b-4859-a6d6-d48a0890856f tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: cf8ff930-deb3-448f-bcd3-f5a84fdad9b4] Updating progress to 30 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:11:11.822 DEBUG oslo_concurrency.lockutils [req-f3c241ec-9501-4b8d-9afe-a07e6a0abecb tempest-FloatingIPsNegativeTestJSON-45780709 tempest-FloatingIPsNegativeTestJSON-486082406] Lock "compute_resources" released by "instance_claim" :: held 1.715s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:11:12.198 13318 DEBUG nova.network.base_api [-] [instance: 2af2868e-6009-4d8b-b3ee-90b4d32a3ddd] Updating instance_info_cache with network_info: [VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.7'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:e0:ca:df', 'active': False, 'type': u'bridge', 'id': u'3b0bca93-d9b8-4771-a84a-cc726fe3e3be', 'qbg_params': None})] update_instance_cache_with_nw_info /opt/stack/new/nova/nova/network/base_api.py:43 2015-08-07 17:11:12.205 DEBUG nova.virt.xenapi.vmops [req-1f5c1b86-d30b-4859-a6d6-d48a0890856f tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: cf8ff930-deb3-448f-bcd3-f5a84fdad9b4] Using PV kernel: True _create_vm_record /opt/stack/new/nova/nova/virt/xenapi/vmops.py:679 2015-08-07 17:11:12.237 13318 DEBUG oslo_concurrency.lockutils [-] Releasing semaphore "refresh_cache-2af2868e-6009-4d8b-b3ee-90b4d32a3ddd" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:210 2015-08-07 17:11:12.238 13318 DEBUG nova.compute.manager [-] 
[instance: 2af2868e-6009-4d8b-b3ee-90b4d32a3ddd] Instance network_info: |[VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.7'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:e0:ca:df', 'active': False, 'type': u'bridge', 'id': u'3b0bca93-d9b8-4771-a84a-cc726fe3e3be', 'qbg_params': None})]| _allocate_network_async /opt/stack/new/nova/nova/compute/manager.py:1531 2015-08-07 17:11:12.251 DEBUG nova.virt.xenapi.vm_utils [req-1f5c1b86-d30b-4859-a6d6-d48a0890856f tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: cf8ff930-deb3-448f-bcd3-f5a84fdad9b4] Created VM create_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:312 2015-08-07 17:11:12.252 DEBUG nova.virt.xenapi.vmops [req-1f5c1b86-d30b-4859-a6d6-d48a0890856f tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: cf8ff930-deb3-448f-bcd3-f5a84fdad9b4] Updating progress to 40 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:11:12.307 DEBUG oslo_concurrency.lockutils [req-f3c241ec-9501-4b8d-9afe-a07e6a0abecb tempest-FloatingIPsNegativeTestJSON-45780709 tempest-FloatingIPsNegativeTestJSON-486082406] Lock "compute_resources" acquired by "update_usage" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:11:12.427 DEBUG oslo_concurrency.lockutils [req-f3c241ec-9501-4b8d-9afe-a07e6a0abecb tempest-FloatingIPsNegativeTestJSON-45780709 tempest-FloatingIPsNegativeTestJSON-486082406] Lock "compute_resources" released by "update_usage" :: held 0.119s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:11:12.429 DEBUG nova.compute.utils [req-f3c241ec-9501-4b8d-9afe-a07e6a0abecb tempest-FloatingIPsNegativeTestJSON-45780709 tempest-FloatingIPsNegativeTestJSON-486082406] Using /dev/xvd instead of None get_next_device_name /opt/stack/new/nova/nova/compute/utils.py:163 2015-08-07 17:11:12.435 13318 DEBUG nova.compute.manager [-] [instance: 573af27d-723a-48c5-9a9a-6a818abbbfc4] Allocating IP information in the background. 
_allocate_network_async /opt/stack/new/nova/nova/compute/manager.py:1513 2015-08-07 17:11:12.437 13318 DEBUG oslo_concurrency.lockutils [-] Acquired semaphore "refresh_cache-573af27d-723a-48c5-9a9a-6a818abbbfc4" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:197 2015-08-07 17:11:12.786 DEBUG nova.virt.xenapi.vm_utils [req-1f5c1b86-d30b-4859-a6d6-d48a0890856f tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Creating disk-type VBD for VM OpaqueRef:83d31aa2-49e6-94a5-9ec6-b378cb4e4022, VDI OpaqueRef:4790b447-d82a-fa28-ec90-992cf93d6868 ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:11:12.816 DEBUG nova.virt.xenapi.vm_utils [req-1f5c1b86-d30b-4859-a6d6-d48a0890856f tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Created VBD OpaqueRef:3f3ad970-b703-8fd4-049f-a6a0b5c7e5a0 for VM OpaqueRef:83d31aa2-49e6-94a5-9ec6-b378cb4e4022, VDI OpaqueRef:4790b447-d82a-fa28-ec90-992cf93d6868. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:11:13.258 DEBUG oslo_concurrency.lockutils [req-cc5f5472-a52a-4293-ad58-513e3b523655 tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" released by "_create_cached_image_impl" :: held 6.782s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:11:13.259 INFO nova.virt.xenapi.vm_utils [req-cc5f5472-a52a-4293-ad58-513e3b523655 tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] Image creation data, cacheable: True, downloaded: False duration: 9.66 secs for image a69d8d55-1745-492f-be26-cc75d64fc94d 2015-08-07 17:11:13.423 DEBUG nova.virt.xenapi.vmops [req-f3c241ec-9501-4b8d-9afe-a07e6a0abecb tempest-FloatingIPsNegativeTestJSON-45780709 tempest-FloatingIPsNegativeTestJSON-486082406] [instance: 573af27d-723a-48c5-9a9a-6a818abbbfc4] Block device information present: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} spawn /opt/stack/new/nova/nova/virt/xenapi/vmops.py:408 2015-08-07 17:11:13.459 DEBUG nova.virt.xenapi.vm_utils [req-f3c241ec-9501-4b8d-9afe-a07e6a0abecb tempest-FloatingIPsNegativeTestJSON-45780709 tempest-FloatingIPsNegativeTestJSON-486082406] Detected vhd format for image a69d8d55-1745-492f-be26-cc75d64fc94d determine_disk_image_type /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1633 2015-08-07 17:11:13.460 DEBUG nova.virt.xenapi.vmops [req-f3c241ec-9501-4b8d-9afe-a07e6a0abecb tempest-FloatingIPsNegativeTestJSON-45780709 tempest-FloatingIPsNegativeTestJSON-486082406] [instance: 573af27d-723a-48c5-9a9a-6a818abbbfc4] Updating progress to 10 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:11:13.859 DEBUG nova.virt.xenapi.vmops [req-f3c241ec-9501-4b8d-9afe-a07e6a0abecb tempest-FloatingIPsNegativeTestJSON-45780709 tempest-FloatingIPsNegativeTestJSON-486082406] [instance: 573af27d-723a-48c5-9a9a-6a818abbbfc4] block device info: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} _connect_cinder_volumes /opt/stack/new/nova/nova/virt/xenapi/vmops.py:376 2015-08-07 17:11:13.912 DEBUG oslo_concurrency.lockutils [req-f3c241ec-9501-4b8d-9afe-a07e6a0abecb tempest-FloatingIPsNegativeTestJSON-45780709 tempest-FloatingIPsNegativeTestJSON-486082406] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" acquired by 
"_create_cached_image_impl" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:11:13.986 DEBUG nova.virt.xenapi.vm_utils [req-1f5c1b86-d30b-4859-a6d6-d48a0890856f tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Created VDI OpaqueRef:a0675d3d-9f78-99d6-105a-530634f6f74a (config-2, 67108864, False) on OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99. create_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:509 2015-08-07 17:11:13.991 DEBUG nova.virt.xenapi.vm_utils [req-1f5c1b86-d30b-4859-a6d6-d48a0890856f tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Creating disk-type VBD for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:a0675d3d-9f78-99d6-105a-530634f6f74a ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:11:14.065 DEBUG nova.virt.xenapi.vm_utils [req-1f5c1b86-d30b-4859-a6d6-d48a0890856f tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Created VBD OpaqueRef:cb57453b-a063-d545-2b99-f0a7f9e2330e for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:a0675d3d-9f78-99d6-105a-530634f6f74a. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:11:14.066 DEBUG nova.virt.xenapi.vm_utils [req-1f5c1b86-d30b-4859-a6d6-d48a0890856f tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Plugging VBD OpaqueRef:cb57453b-a063-d545-2b99-f0a7f9e2330e ... vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2191 2015-08-07 17:11:14.068 DEBUG oslo_concurrency.lockutils [req-1f5c1b86-d30b-4859-a6d6-d48a0890856f tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_plug" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:11:14.504 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:11:14.777 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 1.01 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:11:15.536 DEBUG nova.virt.xenapi.vmops [req-cc5f5472-a52a-4293-ad58-513e3b523655 tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] [instance: 2af2868e-6009-4d8b-b3ee-90b4d32a3ddd] Updating progress to 20 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:11:15.787 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:11:15.788 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:11:15.789 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Starting heal instance info cache _heal_instance_info_cache 
/opt/stack/new/nova/nova/compute/manager.py:5315 2015-08-07 17:11:16.000 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Acquired semaphore "refresh_cache-fda3bb4d-ccb2-4500-8c24-8c38815626fa" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:197 2015-08-07 17:11:16.401 DEBUG nova.network.base_api [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: fda3bb4d-ccb2-4500-8c24-8c38815626fa] Updating instance_info_cache with network_info: [VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.5'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:46:1f:75', 'active': False, 'type': u'bridge', 'id': u'56b2e3c5-379b-4990-a2f4-5ae5440c9c97', 'qbg_params': None})] update_instance_cache_with_nw_info /opt/stack/new/nova/nova/network/base_api.py:43 2015-08-07 17:11:16.431 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Releasing semaphore "refresh_cache-fda3bb4d-ccb2-4500-8c24-8c38815626fa" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:210 2015-08-07 17:11:16.431 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: fda3bb4d-ccb2-4500-8c24-8c38815626fa] Updated the network info_cache for instance _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5376 2015-08-07 17:11:16.432 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 0.72 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:11:16.552 DEBUG nova.virt.xenapi.vmops [req-cc5f5472-a52a-4293-ad58-513e3b523655 tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] [instance: 2af2868e-6009-4d8b-b3ee-90b4d32a3ddd] Updating progress to 30 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:11:17.156 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:11:17.157 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:11:17.174 DEBUG nova.virt.xenapi.vmops [req-cc5f5472-a52a-4293-ad58-513e3b523655 tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] [instance: 2af2868e-6009-4d8b-b3ee-90b4d32a3ddd] 
Using PV kernel: True _create_vm_record /opt/stack/new/nova/nova/virt/xenapi/vmops.py:679 2015-08-07 17:11:17.188 DEBUG nova.virt.xenapi.vm_utils [req-cc5f5472-a52a-4293-ad58-513e3b523655 tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] [instance: 2af2868e-6009-4d8b-b3ee-90b4d32a3ddd] Created VM create_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:312 2015-08-07 17:11:17.189 DEBUG nova.virt.xenapi.vmops [req-cc5f5472-a52a-4293-ad58-513e3b523655 tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] [instance: 2af2868e-6009-4d8b-b3ee-90b4d32a3ddd] Updating progress to 40 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:11:17.203 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Auditing locally available compute resources for node localhost.localdomain 2015-08-07 17:11:17.204 DEBUG nova.virt.xenapi.host [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Updating host stats update_status /opt/stack/new/nova/nova/virt/xenapi/host.py:234 2015-08-07 17:11:18.798 DEBUG nova.virt.xenapi.vm_utils [req-cc5f5472-a52a-4293-ad58-513e3b523655 tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] Creating disk-type VBD for VM OpaqueRef:427e9846-4db6-2d5e-2171-2aca976e093d, VDI OpaqueRef:7a39fdc4-04b9-0568-4124-ddef0d566621 ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:11:18.904 DEBUG nova.virt.xenapi.vm_utils [req-cc5f5472-a52a-4293-ad58-513e3b523655 tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] Created VBD OpaqueRef:5691fd5b-e86a-b578-a7d4-10bd8f86f6e0 for VM OpaqueRef:427e9846-4db6-2d5e-2171-2aca976e093d, VDI OpaqueRef:7a39fdc4-04b9-0568-4124-ddef0d566621. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:11:19.555 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" acquired by "do_scan" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:11:19.556 DEBUG nova.virt.xenapi.vm_utils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Scanning SR OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99 do_scan /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1839 2015-08-07 17:11:21.722 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 8.27 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:11:24.933 DEBUG nova.virt.xenapi.vm_utils [req-cc5f5472-a52a-4293-ad58-513e3b523655 tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] Created VDI OpaqueRef:decd98cc-c3fd-af82-fb7c-18330e9ba8e0 (config-2, 67108864, False) on OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99. create_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:509 2015-08-07 17:11:25.001 DEBUG nova.virt.xenapi.vm_utils [req-cc5f5472-a52a-4293-ad58-513e3b523655 tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] Creating disk-type VBD for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:decd98cc-c3fd-af82-fb7c-18330e9ba8e0 ... 
create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:11:25.024 DEBUG nova.virt.xenapi.vm_utils [req-cc5f5472-a52a-4293-ad58-513e3b523655 tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] Created VBD OpaqueRef:4c3a431a-6a83-4429-887a-a9dd57e54ffa for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:decd98cc-c3fd-af82-fb7c-18330e9ba8e0. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:11:25.025 DEBUG nova.virt.xenapi.vm_utils [req-cc5f5472-a52a-4293-ad58-513e3b523655 tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] Plugging VBD OpaqueRef:4c3a431a-6a83-4429-887a-a9dd57e54ffa ... vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2191 2015-08-07 17:11:27.215 DEBUG oslo_concurrency.lockutils [req-1f5c1b86-d30b-4859-a6d6-d48a0890856f tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_plug" :: held 13.147s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:11:27.216 DEBUG nova.virt.xenapi.vm_utils [req-1f5c1b86-d30b-4859-a6d6-d48a0890856f tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Plugging VBD OpaqueRef:cb57453b-a063-d545-2b99-f0a7f9e2330e done. vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2194 2015-08-07 17:11:27.217 DEBUG oslo_concurrency.lockutils [req-cc5f5472-a52a-4293-ad58-513e3b523655 tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_plug" :: waited 2.192s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:11:27.224 DEBUG nova.virt.xenapi.vm_utils [req-1f5c1b86-d30b-4859-a6d6-d48a0890856f tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] VBD OpaqueRef:cb57453b-a063-d545-2b99-f0a7f9e2330e plugged as xvdc vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2197 2015-08-07 17:11:27.343 WARNING nova.virt.configdrive [req-1f5c1b86-d30b-4859-a6d6-d48a0890856f tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] The setting "always" will be deprecated in the Liberty version. 
Please use "True" instead 2015-08-07 17:11:27.344 DEBUG nova.objects.instance [req-1f5c1b86-d30b-4859-a6d6-d48a0890856f tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Lazy-loading `ec2_ids' on Instance uuid cf8ff930-deb3-448f-bcd3-f5a84fdad9b4 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:11:27.388 13318 DEBUG nova.network.base_api [-] [instance: 573af27d-723a-48c5-9a9a-6a818abbbfc4] Updating instance_info_cache with network_info: [VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.8'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:c8:7e:ca', 'active': False, 'type': u'bridge', 'id': u'73614a83-7655-49b3-9dd8-d6efef9f4186', 'qbg_params': None})] update_instance_cache_with_nw_info /opt/stack/new/nova/nova/network/base_api.py:43 2015-08-07 17:11:27.445 DEBUG oslo_concurrency.processutils [req-1f5c1b86-d30b-4859-a6d6-d48a0890856f tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Running cmd (subprocess): genisoimage -o /tmp/tmp6T9vJN/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmpK1HrrU execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:11:27.596 13318 DEBUG oslo_concurrency.lockutils [-] Releasing semaphore "refresh_cache-573af27d-723a-48c5-9a9a-6a818abbbfc4" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:210 2015-08-07 17:11:27.598 13318 DEBUG nova.compute.manager [-] [instance: 573af27d-723a-48c5-9a9a-6a818abbbfc4] Instance network_info: |[VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.8'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:c8:7e:ca', 'active': False, 'type': u'bridge', 'id': 
u'73614a83-7655-49b3-9dd8-d6efef9f4186', 'qbg_params': None})]| _allocate_network_async /opt/stack/new/nova/nova/compute/manager.py:1531 2015-08-07 17:11:27.616 DEBUG oslo_concurrency.processutils [req-1f5c1b86-d30b-4859-a6d6-d48a0890856f tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] CMD "genisoimage -o /tmp/tmp6T9vJN/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmpK1HrrU" returned: 0 in 0.171s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:11:27.621 DEBUG oslo_concurrency.processutils [req-1f5c1b86-d30b-4859-a6d6-d48a0890856f tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmp6T9vJN/configdrive of=/dev/xvdc oflag=direct,sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:11:28.046 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" released by "do_scan" :: held 8.491s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:11:28.553 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor: free VCPUs: 0 _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:524 2015-08-07 17:11:28.556 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor: assignable PCI devices: [] _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:532 2015-08-07 17:11:28.556 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor/Node resource view: name=localhost.localdomain free_ram=855MB free_disk=16GB free_vcpus=0 pci_devices=[] _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:546 2015-08-07 17:11:28.557 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "compute_resources" acquired by "_update_available_resource" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:11:29.020 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Total usable vcpus: 8, total allocated vcpus: 8 2015-08-07 17:11:29.021 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Final resource view: name=localhost.localdomain phys_ram=8187MB used_ram=857MB phys_disk=27GB used_disk=0GB total_vcpus=8 used_vcpus=8 pci_stats=None 2015-08-07 17:11:29.460 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Compute_service record updated for devstack:localhost.localdomain 2015-08-07 17:11:29.461 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "compute_resources" released by "_update_available_resource" :: held 0.904s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:11:29.462 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:11:29.462 DEBUG oslo_service.periodic_task 
[req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:11:29.463 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 0.35 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:11:29.837 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:11:29.838 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:11:29.838 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /opt/stack/new/nova/nova/compute/manager.py:5976 2015-08-07 17:11:29.839 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 42.67 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:11:30.146 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.84 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:11:31.061 DEBUG oslo_concurrency.lockutils [req-cc5f5472-a52a-4293-ad58-513e3b523655 tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_plug" :: held 3.843s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:11:31.061 DEBUG nova.virt.xenapi.vm_utils [req-cc5f5472-a52a-4293-ad58-513e3b523655 tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] Plugging VBD OpaqueRef:4c3a431a-6a83-4429-887a-a9dd57e54ffa done. vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2194 2015-08-07 17:11:31.066 DEBUG nova.virt.xenapi.vm_utils [req-cc5f5472-a52a-4293-ad58-513e3b523655 tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] VBD OpaqueRef:4c3a431a-6a83-4429-887a-a9dd57e54ffa plugged as xvdd vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2197 2015-08-07 17:11:31.212 WARNING nova.virt.configdrive [req-cc5f5472-a52a-4293-ad58-513e3b523655 tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] The setting "always" will be deprecated in the Liberty version. 
Please use "True" instead 2015-08-07 17:11:31.213 DEBUG nova.objects.instance [req-cc5f5472-a52a-4293-ad58-513e3b523655 tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] Lazy-loading `ec2_ids' on Instance uuid 2af2868e-6009-4d8b-b3ee-90b4d32a3ddd obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:11:31.256 DEBUG oslo_concurrency.processutils [req-cc5f5472-a52a-4293-ad58-513e3b523655 tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] Running cmd (subprocess): genisoimage -o /tmp/tmpK_cFQO/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmpCYA6pq execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:11:31.472 DEBUG oslo_concurrency.processutils [req-cc5f5472-a52a-4293-ad58-513e3b523655 tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] CMD "genisoimage -o /tmp/tmpK_cFQO/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmpCYA6pq" returned: 0 in 0.216s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:11:31.477 DEBUG oslo_concurrency.processutils [req-cc5f5472-a52a-4293-ad58-513e3b523655 tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmpK_cFQO/configdrive of=/dev/xvdd oflag=direct,sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:11:37.510 DEBUG nova.virt.xenapi.vm_utils [req-f3c241ec-9501-4b8d-9afe-a07e6a0abecb tempest-FloatingIPsNegativeTestJSON-45780709 tempest-FloatingIPsNegativeTestJSON-486082406] Cloned VDI OpaqueRef:c8017ecb-5a56-3182-7669-43cb32b42d9e from VDI OpaqueRef:f392c64e-7886-a0cc-d3a0-780674167307 _clone_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:571 2015-08-07 17:11:39.729 DEBUG oslo_concurrency.lockutils [req-f3c241ec-9501-4b8d-9afe-a07e6a0abecb tempest-FloatingIPsNegativeTestJSON-45780709 tempest-FloatingIPsNegativeTestJSON-486082406] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" released by "_create_cached_image_impl" :: held 25.817s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:11:39.730 INFO nova.virt.xenapi.vm_utils [req-f3c241ec-9501-4b8d-9afe-a07e6a0abecb tempest-FloatingIPsNegativeTestJSON-45780709 tempest-FloatingIPsNegativeTestJSON-486082406] Image creation data, cacheable: True, downloaded: False duration: 25.87 secs for image a69d8d55-1745-492f-be26-cc75d64fc94d 2015-08-07 17:11:40.286 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.70 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:11:41.178 DEBUG nova.virt.xenapi.vmops [req-f3c241ec-9501-4b8d-9afe-a07e6a0abecb tempest-FloatingIPsNegativeTestJSON-45780709 tempest-FloatingIPsNegativeTestJSON-486082406] [instance: 573af27d-723a-48c5-9a9a-6a818abbbfc4] Updating progress to 20 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:11:41.758 DEBUG nova.virt.xenapi.vmops [req-f3c241ec-9501-4b8d-9afe-a07e6a0abecb tempest-FloatingIPsNegativeTestJSON-45780709 tempest-FloatingIPsNegativeTestJSON-486082406] [instance: 573af27d-723a-48c5-9a9a-6a818abbbfc4] Updating progress to 30 _update_instance_progress 
/opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:11:42.293 DEBUG nova.virt.xenapi.vmops [req-f3c241ec-9501-4b8d-9afe-a07e6a0abecb tempest-FloatingIPsNegativeTestJSON-45780709 tempest-FloatingIPsNegativeTestJSON-486082406] [instance: 573af27d-723a-48c5-9a9a-6a818abbbfc4] Using PV kernel: True _create_vm_record /opt/stack/new/nova/nova/virt/xenapi/vmops.py:679 2015-08-07 17:11:42.305 DEBUG nova.virt.xenapi.vm_utils [req-f3c241ec-9501-4b8d-9afe-a07e6a0abecb tempest-FloatingIPsNegativeTestJSON-45780709 tempest-FloatingIPsNegativeTestJSON-486082406] [instance: 573af27d-723a-48c5-9a9a-6a818abbbfc4] Created VM create_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:312 2015-08-07 17:11:42.306 DEBUG nova.virt.xenapi.vmops [req-f3c241ec-9501-4b8d-9afe-a07e6a0abecb tempest-FloatingIPsNegativeTestJSON-45780709 tempest-FloatingIPsNegativeTestJSON-486082406] [instance: 573af27d-723a-48c5-9a9a-6a818abbbfc4] Updating progress to 40 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:11:42.656 DEBUG nova.virt.xenapi.vm_utils [req-f3c241ec-9501-4b8d-9afe-a07e6a0abecb tempest-FloatingIPsNegativeTestJSON-45780709 tempest-FloatingIPsNegativeTestJSON-486082406] Creating disk-type VBD for VM OpaqueRef:ecd36add-349e-2b17-bbc3-2a72ba6c7242, VDI OpaqueRef:c8017ecb-5a56-3182-7669-43cb32b42d9e ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:11:42.686 DEBUG nova.virt.xenapi.vm_utils [req-f3c241ec-9501-4b8d-9afe-a07e6a0abecb tempest-FloatingIPsNegativeTestJSON-45780709 tempest-FloatingIPsNegativeTestJSON-486082406] Created VBD OpaqueRef:a879413a-7247-9bda-7a4d-f25891a7a5c7 for VM OpaqueRef:ecd36add-349e-2b17-bbc3-2a72ba6c7242, VDI OpaqueRef:c8017ecb-5a56-3182-7669-43cb32b42d9e. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:11:42.834 DEBUG oslo_concurrency.processutils [req-1f5c1b86-d30b-4859-a6d6-d48a0890856f tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmp6T9vJN/configdrive of=/dev/xvdc oflag=direct,sync" returned: 0 in 15.214s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:11:42.836 DEBUG oslo_concurrency.processutils [req-1f5c1b86-d30b-4859-a6d6-d48a0890856f tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:11:44.302 DEBUG nova.virt.xenapi.vm_utils [req-f3c241ec-9501-4b8d-9afe-a07e6a0abecb tempest-FloatingIPsNegativeTestJSON-45780709 tempest-FloatingIPsNegativeTestJSON-486082406] Created VDI OpaqueRef:42e38493-de1f-9e37-cc09-a06a092b0945 (config-2, 67108864, False) on OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99. create_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:509 2015-08-07 17:11:44.381 DEBUG nova.virt.xenapi.vm_utils [req-f3c241ec-9501-4b8d-9afe-a07e6a0abecb tempest-FloatingIPsNegativeTestJSON-45780709 tempest-FloatingIPsNegativeTestJSON-486082406] Creating disk-type VBD for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:42e38493-de1f-9e37-cc09-a06a092b0945 ... 
create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:11:44.424 DEBUG nova.virt.xenapi.vm_utils [req-f3c241ec-9501-4b8d-9afe-a07e6a0abecb tempest-FloatingIPsNegativeTestJSON-45780709 tempest-FloatingIPsNegativeTestJSON-486082406] Created VBD OpaqueRef:d5a5630f-9b69-5316-1c07-6c8620868c18 for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:42e38493-de1f-9e37-cc09-a06a092b0945. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:11:44.425 DEBUG nova.virt.xenapi.vm_utils [req-f3c241ec-9501-4b8d-9afe-a07e6a0abecb tempest-FloatingIPsNegativeTestJSON-45780709 tempest-FloatingIPsNegativeTestJSON-486082406] Plugging VBD OpaqueRef:d5a5630f-9b69-5316-1c07-6c8620868c18 ... vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2191 2015-08-07 17:11:44.426 DEBUG oslo_concurrency.lockutils [req-f3c241ec-9501-4b8d-9afe-a07e6a0abecb tempest-FloatingIPsNegativeTestJSON-45780709 tempest-FloatingIPsNegativeTestJSON-486082406] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_plug" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:11:44.493 DEBUG oslo_concurrency.processutils [req-1f5c1b86-d30b-4859-a6d6-d48a0890856f tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf sync" returned: 0 in 1.657s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:11:44.494 DEBUG nova.virt.xenapi.vm_utils [req-1f5c1b86-d30b-4859-a6d6-d48a0890856f tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Destroying VBD for VDI OpaqueRef:a0675d3d-9f78-99d6-105a-530634f6f74a ... vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2207 2015-08-07 17:11:48.215 DEBUG oslo_concurrency.lockutils [req-f3c241ec-9501-4b8d-9afe-a07e6a0abecb tempest-FloatingIPsNegativeTestJSON-45780709 tempest-FloatingIPsNegativeTestJSON-486082406] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_plug" :: held 3.789s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:11:48.216 DEBUG nova.virt.xenapi.vm_utils [req-f3c241ec-9501-4b8d-9afe-a07e6a0abecb tempest-FloatingIPsNegativeTestJSON-45780709 tempest-FloatingIPsNegativeTestJSON-486082406] Plugging VBD OpaqueRef:d5a5630f-9b69-5316-1c07-6c8620868c18 done. vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2194 2015-08-07 17:11:48.217 DEBUG oslo_concurrency.lockutils [req-1f5c1b86-d30b-4859-a6d6-d48a0890856f tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_unplug" :: waited 3.722s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:11:48.223 DEBUG nova.virt.xenapi.vm_utils [req-f3c241ec-9501-4b8d-9afe-a07e6a0abecb tempest-FloatingIPsNegativeTestJSON-45780709 tempest-FloatingIPsNegativeTestJSON-486082406] VBD OpaqueRef:d5a5630f-9b69-5316-1c07-6c8620868c18 plugged as xvde vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2197 2015-08-07 17:11:48.344 WARNING nova.virt.configdrive [req-f3c241ec-9501-4b8d-9afe-a07e6a0abecb tempest-FloatingIPsNegativeTestJSON-45780709 tempest-FloatingIPsNegativeTestJSON-486082406] The setting "always" will be deprecated in the Liberty version. 
Please use "True" instead 2015-08-07 17:11:48.345 DEBUG nova.objects.instance [req-f3c241ec-9501-4b8d-9afe-a07e6a0abecb tempest-FloatingIPsNegativeTestJSON-45780709 tempest-FloatingIPsNegativeTestJSON-486082406] Lazy-loading `ec2_ids' on Instance uuid 573af27d-723a-48c5-9a9a-6a818abbbfc4 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:11:48.388 DEBUG oslo_concurrency.processutils [req-f3c241ec-9501-4b8d-9afe-a07e6a0abecb tempest-FloatingIPsNegativeTestJSON-45780709 tempest-FloatingIPsNegativeTestJSON-486082406] Running cmd (subprocess): genisoimage -o /tmp/tmpjVip7G/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmpGVtMVZ execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:11:48.953 DEBUG oslo_concurrency.processutils [req-f3c241ec-9501-4b8d-9afe-a07e6a0abecb tempest-FloatingIPsNegativeTestJSON-45780709 tempest-FloatingIPsNegativeTestJSON-486082406] CMD "genisoimage -o /tmp/tmpjVip7G/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmpGVtMVZ" returned: 0 in 0.564s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:11:48.959 DEBUG oslo_concurrency.processutils [req-f3c241ec-9501-4b8d-9afe-a07e6a0abecb tempest-FloatingIPsNegativeTestJSON-45780709 tempest-FloatingIPsNegativeTestJSON-486082406] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmpjVip7G/configdrive of=/dev/xvde oflag=direct,sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:11:50.247 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.77 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:11:50.592 DEBUG oslo_concurrency.processutils [req-cc5f5472-a52a-4293-ad58-513e3b523655 tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmpK_cFQO/configdrive of=/dev/xvdd oflag=direct,sync" returned: 0 in 19.115s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:11:50.595 DEBUG oslo_concurrency.processutils [req-cc5f5472-a52a-4293-ad58-513e3b523655 tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:11:51.105 DEBUG oslo_concurrency.lockutils [req-1f5c1b86-d30b-4859-a6d6-d48a0890856f tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_unplug" :: held 2.888s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:11:51.119 DEBUG nova.virt.xenapi.vm_utils [req-1f5c1b86-d30b-4859-a6d6-d48a0890856f tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Destroying VBD for VDI OpaqueRef:a0675d3d-9f78-99d6-105a-530634f6f74a done. 
vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2215 2015-08-07 17:11:51.121 DEBUG nova.virt.xenapi.vm_utils [req-1f5c1b86-d30b-4859-a6d6-d48a0890856f tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Creating disk-type VBD for VM OpaqueRef:83d31aa2-49e6-94a5-9ec6-b378cb4e4022, VDI OpaqueRef:a0675d3d-9f78-99d6-105a-530634f6f74a ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:11:51.135 DEBUG nova.virt.xenapi.vm_utils [req-1f5c1b86-d30b-4859-a6d6-d48a0890856f tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Created VBD OpaqueRef:1ab340ad-2926-06f7-7b7d-3c65326fa882 for VM OpaqueRef:83d31aa2-49e6-94a5-9ec6-b378cb4e4022, VDI OpaqueRef:a0675d3d-9f78-99d6-105a-530634f6f74a. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:11:51.137 DEBUG nova.objects.instance [req-1f5c1b86-d30b-4859-a6d6-d48a0890856f tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Lazy-loading `pci_devices' on Instance uuid cf8ff930-deb3-448f-bcd3-f5a84fdad9b4 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:11:51.330 DEBUG nova.virt.xenapi.vmops [req-1f5c1b86-d30b-4859-a6d6-d48a0890856f tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: cf8ff930-deb3-448f-bcd3-f5a84fdad9b4] Updating progress to 50 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:11:51.595 DEBUG oslo_concurrency.processutils [req-cc5f5472-a52a-4293-ad58-513e3b523655 tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf sync" returned: 0 in 1.001s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:11:51.610 DEBUG nova.virt.xenapi.vm_utils [req-cc5f5472-a52a-4293-ad58-513e3b523655 tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] Destroying VBD for VDI OpaqueRef:decd98cc-c3fd-af82-fb7c-18330e9ba8e0 ... 
vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2207 2015-08-07 17:11:51.617 DEBUG oslo_concurrency.lockutils [req-cc5f5472-a52a-4293-ad58-513e3b523655 tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_unplug" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:11:52.016 DEBUG oslo_concurrency.lockutils [req-1f5c1b86-d30b-4859-a6d6-d48a0890856f tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Lock "xenstore-cf8ff930-deb3-448f-bcd3-f5a84fdad9b4" acquired by "store_meta" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:11:52.016 DEBUG oslo_concurrency.lockutils [req-1f5c1b86-d30b-4859-a6d6-d48a0890856f tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Lock "xenstore-cf8ff930-deb3-448f-bcd3-f5a84fdad9b4" released by "store_meta" :: held 0.001s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:11:52.019 DEBUG oslo_concurrency.lockutils [req-1f5c1b86-d30b-4859-a6d6-d48a0890856f tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Lock "xenstore-cf8ff930-deb3-448f-bcd3-f5a84fdad9b4" acquired by "store_auto_disk_config" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:11:52.029 DEBUG oslo_concurrency.lockutils [req-1f5c1b86-d30b-4859-a6d6-d48a0890856f tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Lock "xenstore-cf8ff930-deb3-448f-bcd3-f5a84fdad9b4" released by "store_auto_disk_config" :: held 0.010s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:11:52.030 DEBUG nova.virt.xenapi.vmops [req-1f5c1b86-d30b-4859-a6d6-d48a0890856f tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: cf8ff930-deb3-448f-bcd3-f5a84fdad9b4] Injecting hostname (tempest-server-395672701) into xenstore _inject_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1940 2015-08-07 17:11:52.031 DEBUG oslo_concurrency.lockutils [req-1f5c1b86-d30b-4859-a6d6-d48a0890856f tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Lock "xenstore-cf8ff930-deb3-448f-bcd3-f5a84fdad9b4" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:11:52.061 DEBUG oslo_concurrency.lockutils [req-1f5c1b86-d30b-4859-a6d6-d48a0890856f tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Lock "xenstore-cf8ff930-deb3-448f-bcd3-f5a84fdad9b4" released by "update_hostname" :: held 0.030s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:11:52.062 DEBUG nova.virt.xenapi.vmops [req-1f5c1b86-d30b-4859-a6d6-d48a0890856f tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: cf8ff930-deb3-448f-bcd3-f5a84fdad9b4] Injecting network info to xenstore inject_network_info /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1869 2015-08-07 17:11:52.063 DEBUG oslo_concurrency.lockutils [req-1f5c1b86-d30b-4859-a6d6-d48a0890856f tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Lock "xenstore-cf8ff930-deb3-448f-bcd3-f5a84fdad9b4" acquired by "update_nwinfo" :: waited 
0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:11:52.715 DEBUG oslo_concurrency.lockutils [req-1f5c1b86-d30b-4859-a6d6-d48a0890856f tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Lock "xenstore-cf8ff930-deb3-448f-bcd3-f5a84fdad9b4" released by "update_nwinfo" :: held 0.652s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:11:52.715 DEBUG nova.virt.xenapi.vmops [req-1f5c1b86-d30b-4859-a6d6-d48a0890856f tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: cf8ff930-deb3-448f-bcd3-f5a84fdad9b4] Updating progress to 60 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:11:53.308 DEBUG nova.virt.xenapi.vmops [req-1f5c1b86-d30b-4859-a6d6-d48a0890856f tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: cf8ff930-deb3-448f-bcd3-f5a84fdad9b4] Creating vifs _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1892 2015-08-07 17:11:53.339 DEBUG nova.virt.xenapi.vmops [req-1f5c1b86-d30b-4859-a6d6-d48a0890856f tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: cf8ff930-deb3-448f-bcd3-f5a84fdad9b4] Creating VIF for network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1902 2015-08-07 17:11:53.347 DEBUG nova.virt.xenapi.vmops [req-1f5c1b86-d30b-4859-a6d6-d48a0890856f tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: cf8ff930-deb3-448f-bcd3-f5a84fdad9b4] Created VIF OpaqueRef:aeaf77b2-ed2a-08e6-2744-444f95296166, network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1906 2015-08-07 17:11:53.348 DEBUG nova.virt.xenapi.vmops [req-1f5c1b86-d30b-4859-a6d6-d48a0890856f tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: cf8ff930-deb3-448f-bcd3-f5a84fdad9b4] Updating progress to 70 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:11:53.680 DEBUG nova.virt.xenapi.vmops [req-1f5c1b86-d30b-4859-a6d6-d48a0890856f tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: cf8ff930-deb3-448f-bcd3-f5a84fdad9b4] Starting instance _start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:327 2015-08-07 17:11:53.862 DEBUG oslo_concurrency.lockutils [req-cc5f5472-a52a-4293-ad58-513e3b523655 tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_unplug" :: held 2.245s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:11:53.877 DEBUG nova.virt.xenapi.vm_utils [req-cc5f5472-a52a-4293-ad58-513e3b523655 tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] Destroying VBD for VDI OpaqueRef:decd98cc-c3fd-af82-fb7c-18330e9ba8e0 done. vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2215 2015-08-07 17:11:53.889 DEBUG nova.virt.xenapi.vm_utils [req-cc5f5472-a52a-4293-ad58-513e3b523655 tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] Creating disk-type VBD for VM OpaqueRef:427e9846-4db6-2d5e-2171-2aca976e093d, VDI OpaqueRef:decd98cc-c3fd-af82-fb7c-18330e9ba8e0 ... 
create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:11:53.926 DEBUG nova.virt.xenapi.vm_utils [req-cc5f5472-a52a-4293-ad58-513e3b523655 tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] Created VBD OpaqueRef:4c109cbc-cea3-3cbe-a400-853ccb413c84 for VM OpaqueRef:427e9846-4db6-2d5e-2171-2aca976e093d, VDI OpaqueRef:decd98cc-c3fd-af82-fb7c-18330e9ba8e0. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:11:53.927 DEBUG nova.objects.instance [req-cc5f5472-a52a-4293-ad58-513e3b523655 tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] Lazy-loading `pci_devices' on Instance uuid 2af2868e-6009-4d8b-b3ee-90b4d32a3ddd obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:11:54.173 DEBUG nova.virt.xenapi.vmops [req-cc5f5472-a52a-4293-ad58-513e3b523655 tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] [instance: 2af2868e-6009-4d8b-b3ee-90b4d32a3ddd] Updating progress to 50 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:11:54.558 DEBUG oslo_concurrency.lockutils [req-cc5f5472-a52a-4293-ad58-513e3b523655 tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] Lock "xenstore-2af2868e-6009-4d8b-b3ee-90b4d32a3ddd" acquired by "store_meta" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:11:54.559 DEBUG oslo_concurrency.lockutils [req-cc5f5472-a52a-4293-ad58-513e3b523655 tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] Lock "xenstore-2af2868e-6009-4d8b-b3ee-90b4d32a3ddd" released by "store_meta" :: held 0.001s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:11:54.559 DEBUG oslo_concurrency.lockutils [req-cc5f5472-a52a-4293-ad58-513e3b523655 tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] Lock "xenstore-2af2868e-6009-4d8b-b3ee-90b4d32a3ddd" acquired by "store_auto_disk_config" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:11:54.572 DEBUG oslo_concurrency.lockutils [req-cc5f5472-a52a-4293-ad58-513e3b523655 tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] Lock "xenstore-2af2868e-6009-4d8b-b3ee-90b4d32a3ddd" released by "store_auto_disk_config" :: held 0.012s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:11:54.572 DEBUG nova.virt.xenapi.vmops [req-cc5f5472-a52a-4293-ad58-513e3b523655 tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] [instance: 2af2868e-6009-4d8b-b3ee-90b4d32a3ddd] Injecting hostname (tempest.common.compute-instance-361571357) into xenstore _inject_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1940 2015-08-07 17:11:54.573 DEBUG oslo_concurrency.lockutils [req-cc5f5472-a52a-4293-ad58-513e3b523655 tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] Lock "xenstore-2af2868e-6009-4d8b-b3ee-90b4d32a3ddd" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:11:54.589 DEBUG oslo_concurrency.lockutils [req-cc5f5472-a52a-4293-ad58-513e3b523655 tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] Lock "xenstore-2af2868e-6009-4d8b-b3ee-90b4d32a3ddd" released by "update_hostname" :: held 0.016s inner 
/usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:11:54.590 DEBUG nova.virt.xenapi.vmops [req-cc5f5472-a52a-4293-ad58-513e3b523655 tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] [instance: 2af2868e-6009-4d8b-b3ee-90b4d32a3ddd] Injecting network info to xenstore inject_network_info /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1869 2015-08-07 17:11:54.591 DEBUG oslo_concurrency.lockutils [req-cc5f5472-a52a-4293-ad58-513e3b523655 tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] Lock "xenstore-2af2868e-6009-4d8b-b3ee-90b4d32a3ddd" acquired by "update_nwinfo" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:11:55.385 DEBUG oslo_concurrency.lockutils [req-cc5f5472-a52a-4293-ad58-513e3b523655 tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] Lock "xenstore-2af2868e-6009-4d8b-b3ee-90b4d32a3ddd" released by "update_nwinfo" :: held 0.794s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:11:55.386 DEBUG nova.virt.xenapi.vmops [req-cc5f5472-a52a-4293-ad58-513e3b523655 tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] [instance: 2af2868e-6009-4d8b-b3ee-90b4d32a3ddd] Updating progress to 60 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:11:55.723 DEBUG nova.virt.xenapi.vmops [req-cc5f5472-a52a-4293-ad58-513e3b523655 tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] [instance: 2af2868e-6009-4d8b-b3ee-90b4d32a3ddd] Creating vifs _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1892 2015-08-07 17:11:55.757 DEBUG nova.virt.xenapi.vmops [req-cc5f5472-a52a-4293-ad58-513e3b523655 tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] [instance: 2af2868e-6009-4d8b-b3ee-90b4d32a3ddd] Creating VIF for network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1902 2015-08-07 17:11:55.776 DEBUG nova.virt.xenapi.vmops [req-cc5f5472-a52a-4293-ad58-513e3b523655 tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] [instance: 2af2868e-6009-4d8b-b3ee-90b4d32a3ddd] Created VIF OpaqueRef:b71eb014-0982-0657-a831-c45460a78bf7, network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1906 2015-08-07 17:11:55.777 DEBUG nova.virt.xenapi.vmops [req-cc5f5472-a52a-4293-ad58-513e3b523655 tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] [instance: 2af2868e-6009-4d8b-b3ee-90b4d32a3ddd] Updating progress to 70 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:11:56.179 DEBUG nova.virt.xenapi.vmops [req-cc5f5472-a52a-4293-ad58-513e3b523655 tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] [instance: 2af2868e-6009-4d8b-b3ee-90b4d32a3ddd] Starting instance _start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:327 2015-08-07 17:12:00.480 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.54 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:12:11.262 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 8.92 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:12:12.513 DEBUG 
oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:12:12.514 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 3.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:12:15.513 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:12:15.515 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 1.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:12:16.514 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:12:16.547 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Auditing locally available compute resources for node localhost.localdomain 2015-08-07 17:12:16.547 DEBUG nova.virt.xenapi.host [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Updating host stats update_status /opt/stack/new/nova/nova/virt/xenapi/host.py:234 2015-08-07 17:12:18.566 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" acquired by "do_scan" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:12:18.567 DEBUG nova.virt.xenapi.vm_utils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Scanning SR OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99 do_scan /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1839 2015-08-07 17:12:21.315 DEBUG oslo_concurrency.processutils [req-f3c241ec-9501-4b8d-9afe-a07e6a0abecb tempest-FloatingIPsNegativeTestJSON-45780709 tempest-FloatingIPsNegativeTestJSON-486082406] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmpjVip7G/configdrive of=/dev/xvde oflag=direct,sync" returned: 0 in 32.356s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:12:21.678 DEBUG oslo_concurrency.processutils [req-f3c241ec-9501-4b8d-9afe-a07e6a0abecb tempest-FloatingIPsNegativeTestJSON-45780709 tempest-FloatingIPsNegativeTestJSON-486082406] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:12:22.406 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 7.77 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:12:23.071 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" released by "do_scan" :: held 4.505s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:12:23.486 DEBUG oslo_concurrency.processutils [req-f3c241ec-9501-4b8d-9afe-a07e6a0abecb 
tempest-FloatingIPsNegativeTestJSON-45780709 tempest-FloatingIPsNegativeTestJSON-486082406] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf sync" returned: 0 in 1.808s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:12:23.487 DEBUG nova.virt.xenapi.vm_utils [req-f3c241ec-9501-4b8d-9afe-a07e6a0abecb tempest-FloatingIPsNegativeTestJSON-45780709 tempest-FloatingIPsNegativeTestJSON-486082406] Destroying VBD for VDI OpaqueRef:42e38493-de1f-9e37-cc09-a06a092b0945 ... vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2207 2015-08-07 17:12:23.488 DEBUG oslo_concurrency.lockutils [req-f3c241ec-9501-4b8d-9afe-a07e6a0abecb tempest-FloatingIPsNegativeTestJSON-45780709 tempest-FloatingIPsNegativeTestJSON-486082406] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_unplug" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:12:24.536 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor: free VCPUs: 0 _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:524 2015-08-07 17:12:24.537 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor: assignable PCI devices: [] _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:532 2015-08-07 17:12:24.537 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor/Node resource view: name=localhost.localdomain free_ram=721MB free_disk=16GB free_vcpus=0 pci_devices=[] _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:546 2015-08-07 17:12:24.538 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "compute_resources" acquired by "_update_available_resource" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:12:25.347 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Total usable vcpus: 8, total allocated vcpus: 8 2015-08-07 17:12:25.348 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Final resource view: name=localhost.localdomain phys_ram=8187MB used_ram=857MB phys_disk=27GB used_disk=0GB total_vcpus=8 used_vcpus=8 pci_stats=None 2015-08-07 17:12:25.858 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Compute_service record updated for devstack:localhost.localdomain 2015-08-07 17:12:25.859 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "compute_resources" released by "_update_available_resource" :: held 1.321s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:12:25.859 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:12:25.860 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:12:25.861 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running 
periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:12:25.861 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Starting heal instance info cache _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5315 2015-08-07 17:12:26.018 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Acquired semaphore "refresh_cache-091727e3-644b-4029-98ad-5a102868d2d5" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:197 2015-08-07 17:12:26.373 DEBUG nova.network.base_api [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: 091727e3-644b-4029-98ad-5a102868d2d5] Updating instance_info_cache with network_info: [VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.6'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:1f:57:17', 'active': False, 'type': u'bridge', 'id': u'74506357-e4e2-4a19-b942-18e58c964d1e', 'qbg_params': None})] update_instance_cache_with_nw_info /opt/stack/new/nova/nova/network/base_api.py:43 2015-08-07 17:12:26.405 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Releasing semaphore "refresh_cache-091727e3-644b-4029-98ad-5a102868d2d5" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:210 2015-08-07 17:12:26.405 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: 091727e3-644b-4029-98ad-5a102868d2d5] Updated the network info_cache for instance _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5376 2015-08-07 17:12:26.406 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 1.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:12:27.404 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:12:27.405 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:12:27.406 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks 
/usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:12:27.406 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /opt/stack/new/nova/nova/compute/manager.py:5976 2015-08-07 17:12:27.411 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 45.11 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:12:28.662 DEBUG oslo_concurrency.lockutils [req-f3c241ec-9501-4b8d-9afe-a07e6a0abecb tempest-FloatingIPsNegativeTestJSON-45780709 tempest-FloatingIPsNegativeTestJSON-486082406] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_unplug" :: held 5.174s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:12:28.752 DEBUG nova.virt.xenapi.vm_utils [req-f3c241ec-9501-4b8d-9afe-a07e6a0abecb tempest-FloatingIPsNegativeTestJSON-45780709 tempest-FloatingIPsNegativeTestJSON-486082406] Destroying VBD for VDI OpaqueRef:42e38493-de1f-9e37-cc09-a06a092b0945 done. vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2215 2015-08-07 17:12:28.753 DEBUG nova.virt.xenapi.vm_utils [req-f3c241ec-9501-4b8d-9afe-a07e6a0abecb tempest-FloatingIPsNegativeTestJSON-45780709 tempest-FloatingIPsNegativeTestJSON-486082406] Creating disk-type VBD for VM OpaqueRef:ecd36add-349e-2b17-bbc3-2a72ba6c7242, VDI OpaqueRef:42e38493-de1f-9e37-cc09-a06a092b0945 ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:12:28.767 DEBUG nova.virt.xenapi.vm_utils [req-f3c241ec-9501-4b8d-9afe-a07e6a0abecb tempest-FloatingIPsNegativeTestJSON-45780709 tempest-FloatingIPsNegativeTestJSON-486082406] Created VBD OpaqueRef:387cecc0-72f3-c442-a2f8-421fa877fac5 for VM OpaqueRef:ecd36add-349e-2b17-bbc3-2a72ba6c7242, VDI OpaqueRef:42e38493-de1f-9e37-cc09-a06a092b0945. 
create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:12:28.768 DEBUG nova.objects.instance [req-f3c241ec-9501-4b8d-9afe-a07e6a0abecb tempest-FloatingIPsNegativeTestJSON-45780709 tempest-FloatingIPsNegativeTestJSON-486082406] Lazy-loading `pci_devices' on Instance uuid 573af27d-723a-48c5-9a9a-6a818abbbfc4 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:12:29.015 DEBUG nova.virt.xenapi.vmops [req-f3c241ec-9501-4b8d-9afe-a07e6a0abecb tempest-FloatingIPsNegativeTestJSON-45780709 tempest-FloatingIPsNegativeTestJSON-486082406] [instance: 573af27d-723a-48c5-9a9a-6a818abbbfc4] Updating progress to 50 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:12:29.906 DEBUG oslo_concurrency.lockutils [req-f3c241ec-9501-4b8d-9afe-a07e6a0abecb tempest-FloatingIPsNegativeTestJSON-45780709 tempest-FloatingIPsNegativeTestJSON-486082406] Lock "xenstore-573af27d-723a-48c5-9a9a-6a818abbbfc4" acquired by "store_meta" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:12:29.907 DEBUG oslo_concurrency.lockutils [req-f3c241ec-9501-4b8d-9afe-a07e6a0abecb tempest-FloatingIPsNegativeTestJSON-45780709 tempest-FloatingIPsNegativeTestJSON-486082406] Lock "xenstore-573af27d-723a-48c5-9a9a-6a818abbbfc4" released by "store_meta" :: held 0.001s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:12:29.907 DEBUG oslo_concurrency.lockutils [req-f3c241ec-9501-4b8d-9afe-a07e6a0abecb tempest-FloatingIPsNegativeTestJSON-45780709 tempest-FloatingIPsNegativeTestJSON-486082406] Lock "xenstore-573af27d-723a-48c5-9a9a-6a818abbbfc4" acquired by "store_auto_disk_config" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:12:29.941 DEBUG oslo_concurrency.lockutils [req-f3c241ec-9501-4b8d-9afe-a07e6a0abecb tempest-FloatingIPsNegativeTestJSON-45780709 tempest-FloatingIPsNegativeTestJSON-486082406] Lock "xenstore-573af27d-723a-48c5-9a9a-6a818abbbfc4" released by "store_auto_disk_config" :: held 0.034s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:12:29.942 DEBUG nova.virt.xenapi.vmops [req-f3c241ec-9501-4b8d-9afe-a07e6a0abecb tempest-FloatingIPsNegativeTestJSON-45780709 tempest-FloatingIPsNegativeTestJSON-486082406] [instance: 573af27d-723a-48c5-9a9a-6a818abbbfc4] Injecting hostname (tempest.common.compute-instance-1350473151) into xenstore _inject_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1940 2015-08-07 17:12:29.943 DEBUG oslo_concurrency.lockutils [req-f3c241ec-9501-4b8d-9afe-a07e6a0abecb tempest-FloatingIPsNegativeTestJSON-45780709 tempest-FloatingIPsNegativeTestJSON-486082406] Lock "xenstore-573af27d-723a-48c5-9a9a-6a818abbbfc4" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:12:29.965 DEBUG oslo_concurrency.lockutils [req-f3c241ec-9501-4b8d-9afe-a07e6a0abecb tempest-FloatingIPsNegativeTestJSON-45780709 tempest-FloatingIPsNegativeTestJSON-486082406] Lock "xenstore-573af27d-723a-48c5-9a9a-6a818abbbfc4" released by "update_hostname" :: held 0.022s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:12:29.965 DEBUG nova.virt.xenapi.vmops [req-f3c241ec-9501-4b8d-9afe-a07e6a0abecb tempest-FloatingIPsNegativeTestJSON-45780709 tempest-FloatingIPsNegativeTestJSON-486082406] [instance: 
573af27d-723a-48c5-9a9a-6a818abbbfc4] Injecting network info to xenstore inject_network_info /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1869 2015-08-07 17:12:29.966 DEBUG oslo_concurrency.lockutils [req-f3c241ec-9501-4b8d-9afe-a07e6a0abecb tempest-FloatingIPsNegativeTestJSON-45780709 tempest-FloatingIPsNegativeTestJSON-486082406] Lock "xenstore-573af27d-723a-48c5-9a9a-6a818abbbfc4" acquired by "update_nwinfo" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:12:30.517 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.67 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:12:31.107 DEBUG oslo_concurrency.lockutils [req-f3c241ec-9501-4b8d-9afe-a07e6a0abecb tempest-FloatingIPsNegativeTestJSON-45780709 tempest-FloatingIPsNegativeTestJSON-486082406] Lock "xenstore-573af27d-723a-48c5-9a9a-6a818abbbfc4" released by "update_nwinfo" :: held 1.141s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:12:31.108 DEBUG nova.virt.xenapi.vmops [req-f3c241ec-9501-4b8d-9afe-a07e6a0abecb tempest-FloatingIPsNegativeTestJSON-45780709 tempest-FloatingIPsNegativeTestJSON-486082406] [instance: 573af27d-723a-48c5-9a9a-6a818abbbfc4] Updating progress to 60 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:12:31.511 DEBUG nova.virt.xenapi.vmops [req-f3c241ec-9501-4b8d-9afe-a07e6a0abecb tempest-FloatingIPsNegativeTestJSON-45780709 tempest-FloatingIPsNegativeTestJSON-486082406] [instance: 573af27d-723a-48c5-9a9a-6a818abbbfc4] Creating vifs _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1892 2015-08-07 17:12:31.587 DEBUG nova.virt.xenapi.vmops [req-f3c241ec-9501-4b8d-9afe-a07e6a0abecb tempest-FloatingIPsNegativeTestJSON-45780709 tempest-FloatingIPsNegativeTestJSON-486082406] [instance: 573af27d-723a-48c5-9a9a-6a818abbbfc4] Creating VIF for network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1902 2015-08-07 17:12:31.605 DEBUG nova.virt.xenapi.vmops [req-f3c241ec-9501-4b8d-9afe-a07e6a0abecb tempest-FloatingIPsNegativeTestJSON-45780709 tempest-FloatingIPsNegativeTestJSON-486082406] [instance: 573af27d-723a-48c5-9a9a-6a818abbbfc4] Created VIF OpaqueRef:dc022c37-4495-14a0-2968-713bfdc7300d, network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1906 2015-08-07 17:12:31.606 DEBUG nova.virt.xenapi.vmops [req-f3c241ec-9501-4b8d-9afe-a07e6a0abecb tempest-FloatingIPsNegativeTestJSON-45780709 tempest-FloatingIPsNegativeTestJSON-486082406] [instance: 573af27d-723a-48c5-9a9a-6a818abbbfc4] Updating progress to 70 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:12:32.019 DEBUG nova.virt.xenapi.vmops [req-f3c241ec-9501-4b8d-9afe-a07e6a0abecb tempest-FloatingIPsNegativeTestJSON-45780709 tempest-FloatingIPsNegativeTestJSON-486082406] [instance: 573af27d-723a-48c5-9a9a-6a818abbbfc4] Starting instance _start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:327 2015-08-07 17:12:42.848 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 7.43 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:12:44.532 DEBUG nova.virt.xenapi.vmops [req-1f5c1b86-d30b-4859-a6d6-d48a0890856f tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: 
cf8ff930-deb3-448f-bcd3-f5a84fdad9b4] Waiting for instance state to become running _wait_for_instance_to_start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:777 2015-08-07 17:12:44.670 DEBUG nova.virt.xenapi.vmops [req-1f5c1b86-d30b-4859-a6d6-d48a0890856f tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: cf8ff930-deb3-448f-bcd3-f5a84fdad9b4] Updating progress to 80 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:12:45.665 DEBUG nova.virt.xenapi.vmops [req-1f5c1b86-d30b-4859-a6d6-d48a0890856f tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: cf8ff930-deb3-448f-bcd3-f5a84fdad9b4] Skip agent setup, not enabled. _configure_new_instance_with_agent /opt/stack/new/nova/nova/virt/xenapi/vmops.py:788 2015-08-07 17:12:45.665 DEBUG nova.virt.xenapi.vmops [req-1f5c1b86-d30b-4859-a6d6-d48a0890856f tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: cf8ff930-deb3-448f-bcd3-f5a84fdad9b4] Removing hostname from xenstore _remove_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1949 2015-08-07 17:12:45.666 DEBUG oslo_concurrency.lockutils [req-1f5c1b86-d30b-4859-a6d6-d48a0890856f tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Lock "xenstore-cf8ff930-deb3-448f-bcd3-f5a84fdad9b4" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:12:45.712 DEBUG oslo_concurrency.lockutils [req-1f5c1b86-d30b-4859-a6d6-d48a0890856f tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Lock "xenstore-cf8ff930-deb3-448f-bcd3-f5a84fdad9b4" released by "update_hostname" :: held 0.045s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:12:45.712 DEBUG nova.virt.xenapi.vmops [req-1f5c1b86-d30b-4859-a6d6-d48a0890856f tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: cf8ff930-deb3-448f-bcd3-f5a84fdad9b4] Updating progress to 90 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:12:46.099 DEBUG nova.virt.xenapi.vmops [req-1f5c1b86-d30b-4859-a6d6-d48a0890856f tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: cf8ff930-deb3-448f-bcd3-f5a84fdad9b4] Updating progress to 100 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:12:46.476 DEBUG nova.compute.manager [req-1f5c1b86-d30b-4859-a6d6-d48a0890856f tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: cf8ff930-deb3-448f-bcd3-f5a84fdad9b4] Checking state _get_power_state /opt/stack/new/nova/nova/compute/manager.py:1296 2015-08-07 17:12:50.078 INFO nova.compute.manager [req-bd17772b-a4a6-46cb-97cb-6248f899fb43 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: cf8ff930-deb3-448f-bcd3-f5a84fdad9b4] Rebuilding instance 2015-08-07 17:12:50.870 DEBUG nova.compute.manager [req-bd17772b-a4a6-46cb-97cb-6248f899fb43 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: cf8ff930-deb3-448f-bcd3-f5a84fdad9b4] Checking state _get_power_state /opt/stack/new/nova/nova/compute/manager.py:1296 2015-08-07 17:12:52.026 INFO nova.virt.xenapi.vmops [req-bd17772b-a4a6-46cb-97cb-6248f899fb43 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: 
cf8ff930-deb3-448f-bcd3-f5a84fdad9b4] Destroying VM 2015-08-07 17:12:52.028 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 8.25 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:12:52.389 DEBUG nova.virt.xenapi.vm_utils [req-bd17772b-a4a6-46cb-97cb-6248f899fb43 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: cf8ff930-deb3-448f-bcd3-f5a84fdad9b4] Shutting down VM (hard) hard_shutdown_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:348 2015-08-07 17:12:56.898 DEBUG nova.virt.xenapi.vmops [req-cc5f5472-a52a-4293-ad58-513e3b523655 tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] [instance: 2af2868e-6009-4d8b-b3ee-90b4d32a3ddd] Waiting for instance state to become running _wait_for_instance_to_start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:777 2015-08-07 17:12:57.231 DEBUG nova.virt.xenapi.vmops [req-cc5f5472-a52a-4293-ad58-513e3b523655 tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] [instance: 2af2868e-6009-4d8b-b3ee-90b4d32a3ddd] Updating progress to 80 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:12:59.079 DEBUG nova.virt.xenapi.vmops [req-cc5f5472-a52a-4293-ad58-513e3b523655 tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] [instance: 2af2868e-6009-4d8b-b3ee-90b4d32a3ddd] Skip agent setup, not enabled. _configure_new_instance_with_agent /opt/stack/new/nova/nova/virt/xenapi/vmops.py:788 2015-08-07 17:12:59.080 DEBUG nova.virt.xenapi.vmops [req-cc5f5472-a52a-4293-ad58-513e3b523655 tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] [instance: 2af2868e-6009-4d8b-b3ee-90b4d32a3ddd] Removing hostname from xenstore _remove_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1949 2015-08-07 17:12:59.080 DEBUG oslo_concurrency.lockutils [req-cc5f5472-a52a-4293-ad58-513e3b523655 tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] Lock "xenstore-2af2868e-6009-4d8b-b3ee-90b4d32a3ddd" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:12:59.146 DEBUG oslo_concurrency.lockutils [req-cc5f5472-a52a-4293-ad58-513e3b523655 tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] Lock "xenstore-2af2868e-6009-4d8b-b3ee-90b4d32a3ddd" released by "update_hostname" :: held 0.066s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:12:59.300 DEBUG nova.virt.xenapi.vmops [req-cc5f5472-a52a-4293-ad58-513e3b523655 tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] [instance: 2af2868e-6009-4d8b-b3ee-90b4d32a3ddd] Updating progress to 90 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:13:00.114 DEBUG nova.virt.xenapi.vmops [req-cc5f5472-a52a-4293-ad58-513e3b523655 tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] [instance: 2af2868e-6009-4d8b-b3ee-90b4d32a3ddd] Updating progress to 100 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:13:00.363 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.91 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:13:00.506 DEBUG nova.compute.manager [req-cc5f5472-a52a-4293-ad58-513e3b523655 
tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] [instance: 2af2868e-6009-4d8b-b3ee-90b4d32a3ddd] Checking state _get_power_state /opt/stack/new/nova/nova/compute/manager.py:1296 2015-08-07 17:13:01.208 DEBUG oslo_concurrency.lockutils [req-cc5f5472-a52a-4293-ad58-513e3b523655 tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] Lock "2af2868e-6009-4d8b-b3ee-90b4d32a3ddd" released by "_locked_do_build_and_run_instance" :: held 122.289s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:13:08.815 DEBUG nova.virt.xenapi.vmops [req-bd17772b-a4a6-46cb-97cb-6248f899fb43 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: cf8ff930-deb3-448f-bcd3-f5a84fdad9b4] Destroying VDIs _destroy_vdis /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1428 2015-08-07 17:13:09.014 DEBUG nova.virt.xenapi.vm_utils [req-bd17772b-a4a6-46cb-97cb-6248f899fb43 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] VDI f6870d0a-b43a-496c-83b4-619204317b67 is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:13:09.082 DEBUG nova.virt.xenapi.vm_utils [req-bd17772b-a4a6-46cb-97cb-6248f899fb43 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] VDI 5146533d-cbff-4759-8b20-b57b735ef00f is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:13:09.162 DEBUG oslo_concurrency.lockutils [req-997cb2a8-d266-4f61-b877-adf05fb001d8 tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] Lock "696c794e-e4d8-4bd9-8b55-447342cd9b8f" acquired by "_locked_do_build_and_run_instance" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:13:09.721 INFO nova.compute.manager [req-997cb2a8-d266-4f61-b877-adf05fb001d8 tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] [instance: 696c794e-e4d8-4bd9-8b55-447342cd9b8f] Starting instance... 
2015-08-07 17:13:10.949 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.33 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:13:10.967 DEBUG oslo_concurrency.lockutils [req-997cb2a8-d266-4f61-b877-adf05fb001d8 tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] Lock "compute_resources" acquired by "instance_claim" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:13:10.968 DEBUG nova.compute.resource_tracker [req-997cb2a8-d266-4f61-b877-adf05fb001d8 tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] Memory overhead for 64 MB instance; 5 MB instance_claim /opt/stack/new/nova/nova/compute/resource_tracker.py:126 2015-08-07 17:13:10.976 INFO nova.compute.claims [req-997cb2a8-d266-4f61-b877-adf05fb001d8 tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] [instance: 696c794e-e4d8-4bd9-8b55-447342cd9b8f] Attempting claim: memory 69 MB, disk 0 GB 2015-08-07 17:13:10.976 INFO nova.compute.claims [req-997cb2a8-d266-4f61-b877-adf05fb001d8 tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] [instance: 696c794e-e4d8-4bd9-8b55-447342cd9b8f] Total memory: 8187 MB, used: 857.00 MB 2015-08-07 17:13:10.977 INFO nova.compute.claims [req-997cb2a8-d266-4f61-b877-adf05fb001d8 tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] [instance: 696c794e-e4d8-4bd9-8b55-447342cd9b8f] memory limit: 12280.50 MB, free: 11423.50 MB 2015-08-07 17:13:10.977 INFO nova.compute.claims [req-997cb2a8-d266-4f61-b877-adf05fb001d8 tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] [instance: 696c794e-e4d8-4bd9-8b55-447342cd9b8f] Total disk: 27 GB, used: 0.00 GB 2015-08-07 17:13:10.978 INFO nova.compute.claims [req-997cb2a8-d266-4f61-b877-adf05fb001d8 tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] [instance: 696c794e-e4d8-4bd9-8b55-447342cd9b8f] disk limit not specified, defaulting to unlimited 2015-08-07 17:13:11.005 DEBUG nova.compute.resources.vcpu [req-997cb2a8-d266-4f61-b877-adf05fb001d8 tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] Total CPUs: 8 VCPUs, used: 5.00 VCPUs test /opt/stack/new/nova/nova/compute/resources/vcpu.py:52 2015-08-07 17:13:11.006 DEBUG nova.compute.resources.vcpu [req-997cb2a8-d266-4f61-b877-adf05fb001d8 tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] CPUs limit not specified, defaulting to unlimited test /opt/stack/new/nova/nova/compute/resources/vcpu.py:56 2015-08-07 17:13:11.006 INFO nova.compute.claims [req-997cb2a8-d266-4f61-b877-adf05fb001d8 tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] [instance: 696c794e-e4d8-4bd9-8b55-447342cd9b8f] Claim successful 2015-08-07 17:13:12.304 DEBUG oslo_concurrency.lockutils [req-997cb2a8-d266-4f61-b877-adf05fb001d8 tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] Lock "compute_resources" released by "instance_claim" :: held 1.337s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:13:12.518 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:13:12.519 DEBUG 
oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 2.98 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:13:13.038 DEBUG oslo_concurrency.lockutils [req-997cb2a8-d266-4f61-b877-adf05fb001d8 tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] Lock "compute_resources" acquired by "update_usage" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:13:13.200 DEBUG oslo_concurrency.lockutils [req-997cb2a8-d266-4f61-b877-adf05fb001d8 tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] Lock "compute_resources" released by "update_usage" :: held 0.162s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:13:13.201 DEBUG nova.compute.utils [req-997cb2a8-d266-4f61-b877-adf05fb001d8 tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] Using /dev/xvd instead of None get_next_device_name /opt/stack/new/nova/nova/compute/utils.py:163 2015-08-07 17:13:13.205 13318 DEBUG nova.compute.manager [-] [instance: 696c794e-e4d8-4bd9-8b55-447342cd9b8f] Allocating IP information in the background. _allocate_network_async /opt/stack/new/nova/nova/compute/manager.py:1513 2015-08-07 17:13:13.206 13318 DEBUG oslo_concurrency.lockutils [-] Acquired semaphore "refresh_cache-696c794e-e4d8-4bd9-8b55-447342cd9b8f" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:197 2015-08-07 17:13:14.412 DEBUG nova.virt.xenapi.vmops [req-bd17772b-a4a6-46cb-97cb-6248f899fb43 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: cf8ff930-deb3-448f-bcd3-f5a84fdad9b4] Using RAW or VHD, skipping kernel and ramdisk deletion _destroy_kernel_ramdisk /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1456 2015-08-07 17:13:14.541 DEBUG nova.virt.xenapi.vm_utils [req-bd17772b-a4a6-46cb-97cb-6248f899fb43 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: cf8ff930-deb3-448f-bcd3-f5a84fdad9b4] VM destroyed destroy_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:324 2015-08-07 17:13:15.512 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:13:15.513 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:13:15.697 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 2.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:13:15.916 DEBUG nova.virt.xenapi.vmops [req-997cb2a8-d266-4f61-b877-adf05fb001d8 tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] [instance: 696c794e-e4d8-4bd9-8b55-447342cd9b8f] Block device information present: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} spawn /opt/stack/new/nova/nova/virt/xenapi/vmops.py:408 2015-08-07 17:13:15.920 DEBUG nova.virt.xenapi.vmops [req-bd17772b-a4a6-46cb-97cb-6248f899fb43 
tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: cf8ff930-deb3-448f-bcd3-f5a84fdad9b4] Block device information present: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} spawn /opt/stack/new/nova/nova/virt/xenapi/vmops.py:408 2015-08-07 17:13:16.042 DEBUG nova.virt.xenapi.vm_utils [req-997cb2a8-d266-4f61-b877-adf05fb001d8 tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] Detected vhd format for image a69d8d55-1745-492f-be26-cc75d64fc94d determine_disk_image_type /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1633 2015-08-07 17:13:16.042 DEBUG nova.virt.xenapi.vmops [req-997cb2a8-d266-4f61-b877-adf05fb001d8 tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] [instance: 696c794e-e4d8-4bd9-8b55-447342cd9b8f] Updating progress to 10 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:13:16.051 DEBUG nova.virt.xenapi.vm_utils [req-bd17772b-a4a6-46cb-97cb-6248f899fb43 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Detected vhd format for image a69d8d55-1745-492f-be26-cc75d64fc94d determine_disk_image_type /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1633 2015-08-07 17:13:16.052 DEBUG nova.virt.xenapi.vmops [req-bd17772b-a4a6-46cb-97cb-6248f899fb43 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: cf8ff930-deb3-448f-bcd3-f5a84fdad9b4] Updating progress to 10 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:13:16.550 DEBUG nova.virt.xenapi.vmops [req-bd17772b-a4a6-46cb-97cb-6248f899fb43 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: cf8ff930-deb3-448f-bcd3-f5a84fdad9b4] block device info: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} _connect_cinder_volumes /opt/stack/new/nova/nova/virt/xenapi/vmops.py:376 2015-08-07 17:13:16.686 DEBUG nova.virt.xenapi.vmops [req-997cb2a8-d266-4f61-b877-adf05fb001d8 tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] [instance: 696c794e-e4d8-4bd9-8b55-447342cd9b8f] block device info: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} _connect_cinder_volumes /opt/stack/new/nova/nova/virt/xenapi/vmops.py:376 2015-08-07 17:13:16.695 DEBUG oslo_concurrency.lockutils [req-bd17772b-a4a6-46cb-97cb-6248f899fb43 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" acquired by "_create_cached_image_impl" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:13:17.705 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:13:17.706 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Starting heal instance info cache _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5315 2015-08-07 17:13:17.707 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Rebuilding the list of instances to heal _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5319 2015-08-07 17:13:18.402 DEBUG 
nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: 573af27d-723a-48c5-9a9a-6a818abbbfc4] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5329 2015-08-07 17:13:18.403 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: 696c794e-e4d8-4bd9-8b55-447342cd9b8f] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5329 2015-08-07 17:13:18.403 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Acquired semaphore "refresh_cache-cf8ff930-deb3-448f-bcd3-f5a84fdad9b4" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:197 2015-08-07 17:13:18.404 DEBUG nova.objects.instance [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lazy-loading `flavor' on Instance uuid cf8ff930-deb3-448f-bcd3-f5a84fdad9b4 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:13:20.584 DEBUG nova.network.base_api [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: cf8ff930-deb3-448f-bcd3-f5a84fdad9b4] Updating instance_info_cache with network_info: [VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.2'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:4b:61:5b', 'active': False, 'type': u'bridge', 'id': u'1d3147dc-aab6-4dd3-a93a-a902661ccb6b', 'qbg_params': None})] update_instance_cache_with_nw_info /opt/stack/new/nova/nova/network/base_api.py:43 2015-08-07 17:13:20.710 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Releasing semaphore "refresh_cache-cf8ff930-deb3-448f-bcd3-f5a84fdad9b4" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:210 2015-08-07 17:13:20.710 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: cf8ff930-deb3-448f-bcd3-f5a84fdad9b4] Updated the network info_cache for instance _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5376 2015-08-07 17:13:20.711 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 0.81 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:13:21.545 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:13:21.546 DEBUG 
oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:13:21.546 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /opt/stack/new/nova/nova/compute/manager.py:5976 2015-08-07 17:13:21.547 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:13:21.578 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Auditing locally available compute resources for node localhost.localdomain 2015-08-07 17:13:21.579 DEBUG nova.virt.xenapi.host [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Updating host stats update_status /opt/stack/new/nova/nova/virt/xenapi/host.py:234 2015-08-07 17:13:22.800 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 7.48 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:13:24.419 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" acquired by "do_scan" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:13:24.420 DEBUG nova.virt.xenapi.vm_utils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Scanning SR OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99 do_scan /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1839 2015-08-07 17:13:28.609 DEBUG nova.virt.xenapi.vm_utils [req-bd17772b-a4a6-46cb-97cb-6248f899fb43 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Cloned VDI OpaqueRef:81cbfe45-8a45-75c6-89f4-bd49e5c32cae from VDI OpaqueRef:f392c64e-7886-a0cc-d3a0-780674167307 _clone_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:571 2015-08-07 17:13:29.688 13318 DEBUG nova.network.base_api [-] [instance: 696c794e-e4d8-4bd9-8b55-447342cd9b8f] Updating instance_info_cache with network_info: [VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.4'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:7a:43:2d', 'active': False, 'type': u'bridge', 'id': u'0273154a-6ab5-4430-8767-952ab6f4586e', 'qbg_params': None})] update_instance_cache_with_nw_info /opt/stack/new/nova/nova/network/base_api.py:43 2015-08-07 
17:13:29.716 13318 DEBUG oslo_concurrency.lockutils [-] Releasing semaphore "refresh_cache-696c794e-e4d8-4bd9-8b55-447342cd9b8f" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:210 2015-08-07 17:13:29.846 13318 DEBUG nova.compute.manager [-] [instance: 696c794e-e4d8-4bd9-8b55-447342cd9b8f] Instance network_info: |[VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.4'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:7a:43:2d', 'active': False, 'type': u'bridge', 'id': u'0273154a-6ab5-4430-8767-952ab6f4586e', 'qbg_params': None})]| _allocate_network_async /opt/stack/new/nova/nova/compute/manager.py:1531 2015-08-07 17:13:30.021 DEBUG nova.virt.xenapi.vmops [req-f3c241ec-9501-4b8d-9afe-a07e6a0abecb tempest-FloatingIPsNegativeTestJSON-45780709 tempest-FloatingIPsNegativeTestJSON-486082406] [instance: 573af27d-723a-48c5-9a9a-6a818abbbfc4] Waiting for instance state to become running _wait_for_instance_to_start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:777 2015-08-07 17:13:30.210 DEBUG nova.virt.xenapi.vmops [req-f3c241ec-9501-4b8d-9afe-a07e6a0abecb tempest-FloatingIPsNegativeTestJSON-45780709 tempest-FloatingIPsNegativeTestJSON-486082406] [instance: 573af27d-723a-48c5-9a9a-6a818abbbfc4] Updating progress to 80 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:13:30.336 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.95 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:13:30.584 DEBUG nova.virt.xenapi.vmops [req-f3c241ec-9501-4b8d-9afe-a07e6a0abecb tempest-FloatingIPsNegativeTestJSON-45780709 tempest-FloatingIPsNegativeTestJSON-486082406] [instance: 573af27d-723a-48c5-9a9a-6a818abbbfc4] Skip agent setup, not enabled. 
_configure_new_instance_with_agent /opt/stack/new/nova/nova/virt/xenapi/vmops.py:788 2015-08-07 17:13:30.585 DEBUG nova.virt.xenapi.vmops [req-f3c241ec-9501-4b8d-9afe-a07e6a0abecb tempest-FloatingIPsNegativeTestJSON-45780709 tempest-FloatingIPsNegativeTestJSON-486082406] [instance: 573af27d-723a-48c5-9a9a-6a818abbbfc4] Removing hostname from xenstore _remove_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1949 2015-08-07 17:13:30.585 DEBUG oslo_concurrency.lockutils [req-f3c241ec-9501-4b8d-9afe-a07e6a0abecb tempest-FloatingIPsNegativeTestJSON-45780709 tempest-FloatingIPsNegativeTestJSON-486082406] Lock "xenstore-573af27d-723a-48c5-9a9a-6a818abbbfc4" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:13:30.634 DEBUG oslo_concurrency.lockutils [req-f3c241ec-9501-4b8d-9afe-a07e6a0abecb tempest-FloatingIPsNegativeTestJSON-45780709 tempest-FloatingIPsNegativeTestJSON-486082406] Lock "xenstore-573af27d-723a-48c5-9a9a-6a818abbbfc4" released by "update_hostname" :: held 0.049s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:13:30.635 DEBUG nova.virt.xenapi.vmops [req-f3c241ec-9501-4b8d-9afe-a07e6a0abecb tempest-FloatingIPsNegativeTestJSON-45780709 tempest-FloatingIPsNegativeTestJSON-486082406] [instance: 573af27d-723a-48c5-9a9a-6a818abbbfc4] Updating progress to 90 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:13:31.200 DEBUG nova.virt.xenapi.vmops [req-f3c241ec-9501-4b8d-9afe-a07e6a0abecb tempest-FloatingIPsNegativeTestJSON-45780709 tempest-FloatingIPsNegativeTestJSON-486082406] [instance: 573af27d-723a-48c5-9a9a-6a818abbbfc4] Updating progress to 100 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:13:31.636 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" released by "do_scan" :: held 7.217s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:13:31.726 DEBUG nova.compute.manager [req-f3c241ec-9501-4b8d-9afe-a07e6a0abecb tempest-FloatingIPsNegativeTestJSON-45780709 tempest-FloatingIPsNegativeTestJSON-486082406] [instance: 573af27d-723a-48c5-9a9a-6a818abbbfc4] Checking state _get_power_state /opt/stack/new/nova/nova/compute/manager.py:1296 2015-08-07 17:13:31.971 DEBUG oslo_concurrency.lockutils [req-bd17772b-a4a6-46cb-97cb-6248f899fb43 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" released by "_create_cached_image_impl" :: held 15.276s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:13:31.972 INFO nova.virt.xenapi.vm_utils [req-bd17772b-a4a6-46cb-97cb-6248f899fb43 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Image creation data, cacheable: True, downloaded: False duration: 15.42 secs for image a69d8d55-1745-492f-be26-cc75d64fc94d 2015-08-07 17:13:31.973 DEBUG oslo_concurrency.lockutils [req-997cb2a8-d266-4f61-b877-adf05fb001d8 tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" acquired by "_create_cached_image_impl" :: waited 15.150s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:13:32.324 DEBUG nova.compute.resource_tracker 
[req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor: free VCPUs: -2 _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:524 2015-08-07 17:13:32.325 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor: assignable PCI devices: [] _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:532 2015-08-07 17:13:32.325 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor/Node resource view: name=localhost.localdomain free_ram=721MB free_disk=16GB free_vcpus=-2 pci_devices=[] _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:546 2015-08-07 17:13:32.326 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "compute_resources" acquired by "_update_available_resource" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:13:32.520 DEBUG oslo_concurrency.lockutils [req-f3c241ec-9501-4b8d-9afe-a07e6a0abecb tempest-FloatingIPsNegativeTestJSON-45780709 tempest-FloatingIPsNegativeTestJSON-486082406] Lock "573af27d-723a-48c5-9a9a-6a818abbbfc4" released by "_locked_do_build_and_run_instance" :: held 143.683s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:13:33.155 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Total usable vcpus: 8, total allocated vcpus: 10 2015-08-07 17:13:33.156 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Final resource view: name=localhost.localdomain phys_ram=8187MB used_ram=926MB phys_disk=27GB used_disk=0GB total_vcpus=8 used_vcpus=10 pci_stats=None 2015-08-07 17:13:33.377 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Compute_service record updated for devstack:localhost.localdomain 2015-08-07 17:13:33.377 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "compute_resources" released by "_update_available_resource" :: held 1.051s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:13:33.378 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:13:33.378 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:13:33.379 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 5.97 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:13:34.542 DEBUG oslo_concurrency.lockutils [req-9f87b795-cfd8-455f-894d-1b84153b0a84 tempest-FloatingIPsNegativeTestJSON-45780709 tempest-FloatingIPsNegativeTestJSON-486082406] Lock "573af27d-723a-48c5-9a9a-6a818abbbfc4" acquired by "do_terminate_instance" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:13:34.543 DEBUG oslo_concurrency.lockutils [req-9f87b795-cfd8-455f-894d-1b84153b0a84 tempest-FloatingIPsNegativeTestJSON-45780709 
tempest-FloatingIPsNegativeTestJSON-486082406] Lock "573af27d-723a-48c5-9a9a-6a818abbbfc4-events" acquired by "_clear_events" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:13:34.543 DEBUG oslo_concurrency.lockutils [req-9f87b795-cfd8-455f-894d-1b84153b0a84 tempest-FloatingIPsNegativeTestJSON-45780709 tempest-FloatingIPsNegativeTestJSON-486082406] Lock "573af27d-723a-48c5-9a9a-6a818abbbfc4-events" released by "_clear_events" :: held 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:13:34.545 INFO nova.compute.manager [req-9f87b795-cfd8-455f-894d-1b84153b0a84 tempest-FloatingIPsNegativeTestJSON-45780709 tempest-FloatingIPsNegativeTestJSON-486082406] [instance: 573af27d-723a-48c5-9a9a-6a818abbbfc4] Terminating instance 2015-08-07 17:13:34.554 INFO nova.virt.xenapi.vmops [req-9f87b795-cfd8-455f-894d-1b84153b0a84 tempest-FloatingIPsNegativeTestJSON-45780709 tempest-FloatingIPsNegativeTestJSON-486082406] [instance: 573af27d-723a-48c5-9a9a-6a818abbbfc4] Destroying VM 2015-08-07 17:13:34.609 DEBUG nova.virt.xenapi.vm_utils [req-9f87b795-cfd8-455f-894d-1b84153b0a84 tempest-FloatingIPsNegativeTestJSON-45780709 tempest-FloatingIPsNegativeTestJSON-486082406] [instance: 573af27d-723a-48c5-9a9a-6a818abbbfc4] Shutting down VM (hard) hard_shutdown_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:348 2015-08-07 17:13:35.962 DEBUG nova.virt.xenapi.vm_utils [req-997cb2a8-d266-4f61-b877-adf05fb001d8 tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] Cloned VDI OpaqueRef:28168be7-e7f6-babf-aea2-048e681b2dd3 from VDI OpaqueRef:f392c64e-7886-a0cc-d3a0-780674167307 _clone_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:571 2015-08-07 17:13:37.131 DEBUG nova.virt.xenapi.vmops [req-bd17772b-a4a6-46cb-97cb-6248f899fb43 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: cf8ff930-deb3-448f-bcd3-f5a84fdad9b4] Updating progress to 20 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:13:38.217 DEBUG nova.virt.xenapi.vmops [req-bd17772b-a4a6-46cb-97cb-6248f899fb43 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: cf8ff930-deb3-448f-bcd3-f5a84fdad9b4] Updating progress to 30 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:13:38.925 DEBUG nova.virt.xenapi.vmops [req-bd17772b-a4a6-46cb-97cb-6248f899fb43 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: cf8ff930-deb3-448f-bcd3-f5a84fdad9b4] Using PV kernel: True _create_vm_record /opt/stack/new/nova/nova/virt/xenapi/vmops.py:679 2015-08-07 17:13:38.972 DEBUG nova.virt.xenapi.vm_utils [req-bd17772b-a4a6-46cb-97cb-6248f899fb43 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: cf8ff930-deb3-448f-bcd3-f5a84fdad9b4] Created VM create_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:312 2015-08-07 17:13:38.973 DEBUG nova.virt.xenapi.vmops [req-bd17772b-a4a6-46cb-97cb-6248f899fb43 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: cf8ff930-deb3-448f-bcd3-f5a84fdad9b4] Updating progress to 40 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:13:39.357 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._instance_usage_audit 
run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:13:39.358 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 34.15 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:13:39.667 DEBUG nova.virt.xenapi.vm_utils [req-bd17772b-a4a6-46cb-97cb-6248f899fb43 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Creating disk-type VBD for VM OpaqueRef:e3cb5220-bff1-0e5f-26a5-7b3702331090, VDI OpaqueRef:81cbfe45-8a45-75c6-89f4-bd49e5c32cae ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:13:39.696 DEBUG nova.virt.xenapi.vm_utils [req-bd17772b-a4a6-46cb-97cb-6248f899fb43 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Created VBD OpaqueRef:3a516331-2e99-51c3-b0d7-76bd02d83a86 for VM OpaqueRef:e3cb5220-bff1-0e5f-26a5-7b3702331090, VDI OpaqueRef:81cbfe45-8a45-75c6-89f4-bd49e5c32cae. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:13:39.731 DEBUG oslo_concurrency.lockutils [req-997cb2a8-d266-4f61-b877-adf05fb001d8 tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" released by "_create_cached_image_impl" :: held 7.758s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:13:39.732 INFO nova.virt.xenapi.vm_utils [req-997cb2a8-d266-4f61-b877-adf05fb001d8 tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] Image creation data, cacheable: True, downloaded: False duration: 23.05 secs for image a69d8d55-1745-492f-be26-cc75d64fc94d 2015-08-07 17:13:41.027 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.26 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:13:41.582 DEBUG nova.virt.xenapi.vm_utils [req-bd17772b-a4a6-46cb-97cb-6248f899fb43 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Created VDI OpaqueRef:f5fb33b9-3f48-c5be-1d81-84195ffcee25 (config-2, 67108864, False) on OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99. create_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:509 2015-08-07 17:13:41.654 DEBUG nova.virt.xenapi.vm_utils [req-bd17772b-a4a6-46cb-97cb-6248f899fb43 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Creating disk-type VBD for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:f5fb33b9-3f48-c5be-1d81-84195ffcee25 ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:13:41.743 DEBUG nova.virt.xenapi.vm_utils [req-bd17772b-a4a6-46cb-97cb-6248f899fb43 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Created VBD OpaqueRef:d236d593-ada2-c9ab-7a65-c46e622f0afc for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:f5fb33b9-3f48-c5be-1d81-84195ffcee25. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:13:41.744 DEBUG nova.virt.xenapi.vm_utils [req-bd17772b-a4a6-46cb-97cb-6248f899fb43 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Plugging VBD OpaqueRef:d236d593-ada2-c9ab-7a65-c46e622f0afc ... 
vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2191 2015-08-07 17:13:41.744 DEBUG oslo_concurrency.lockutils [req-bd17772b-a4a6-46cb-97cb-6248f899fb43 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_plug" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:13:44.581 DEBUG nova.virt.xenapi.vmops [req-997cb2a8-d266-4f61-b877-adf05fb001d8 tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] [instance: 696c794e-e4d8-4bd9-8b55-447342cd9b8f] Updating progress to 20 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:13:44.963 DEBUG nova.virt.xenapi.vmops [req-997cb2a8-d266-4f61-b877-adf05fb001d8 tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] [instance: 696c794e-e4d8-4bd9-8b55-447342cd9b8f] Updating progress to 30 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:13:45.521 DEBUG nova.virt.xenapi.vmops [req-997cb2a8-d266-4f61-b877-adf05fb001d8 tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] [instance: 696c794e-e4d8-4bd9-8b55-447342cd9b8f] Using PV kernel: True _create_vm_record /opt/stack/new/nova/nova/virt/xenapi/vmops.py:679 2015-08-07 17:13:45.549 DEBUG nova.virt.xenapi.vm_utils [req-997cb2a8-d266-4f61-b877-adf05fb001d8 tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] [instance: 696c794e-e4d8-4bd9-8b55-447342cd9b8f] Created VM create_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:312 2015-08-07 17:13:45.552 DEBUG nova.virt.xenapi.vmops [req-997cb2a8-d266-4f61-b877-adf05fb001d8 tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] [instance: 696c794e-e4d8-4bd9-8b55-447342cd9b8f] Updating progress to 40 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:13:45.977 DEBUG nova.virt.xenapi.vm_utils [req-997cb2a8-d266-4f61-b877-adf05fb001d8 tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] Creating disk-type VBD for VM OpaqueRef:90151901-ded6-1339-aa04-40a60cfe195f, VDI OpaqueRef:28168be7-e7f6-babf-aea2-048e681b2dd3 ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:13:46.139 DEBUG nova.virt.xenapi.vm_utils [req-997cb2a8-d266-4f61-b877-adf05fb001d8 tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] Created VBD OpaqueRef:1bb26d25-3ec0-7563-7d5b-3b2db236da35 for VM OpaqueRef:90151901-ded6-1339-aa04-40a60cfe195f, VDI OpaqueRef:28168be7-e7f6-babf-aea2-048e681b2dd3. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:13:48.292 DEBUG nova.virt.xenapi.vm_utils [req-997cb2a8-d266-4f61-b877-adf05fb001d8 tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] Created VDI OpaqueRef:b13adf5b-b639-5e5d-d282-b40ecc7cc61f (config-2, 67108864, False) on OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99. create_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:509 2015-08-07 17:13:48.297 DEBUG nova.virt.xenapi.vm_utils [req-997cb2a8-d266-4f61-b877-adf05fb001d8 tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] Creating disk-type VBD for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:b13adf5b-b639-5e5d-d282-b40ecc7cc61f ... 
create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:13:48.311 DEBUG nova.virt.xenapi.vm_utils [req-997cb2a8-d266-4f61-b877-adf05fb001d8 tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] Created VBD OpaqueRef:a638dfb1-e892-d3cc-69bc-d2b10f0e156e for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:b13adf5b-b639-5e5d-d282-b40ecc7cc61f. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:13:48.313 DEBUG nova.virt.xenapi.vm_utils [req-997cb2a8-d266-4f61-b877-adf05fb001d8 tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] Plugging VBD OpaqueRef:a638dfb1-e892-d3cc-69bc-d2b10f0e156e ... vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2191 2015-08-07 17:13:48.832 DEBUG oslo_concurrency.lockutils [req-bd17772b-a4a6-46cb-97cb-6248f899fb43 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_plug" :: held 7.088s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:13:48.833 DEBUG nova.virt.xenapi.vm_utils [req-bd17772b-a4a6-46cb-97cb-6248f899fb43 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Plugging VBD OpaqueRef:d236d593-ada2-c9ab-7a65-c46e622f0afc done. vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2194 2015-08-07 17:13:48.835 DEBUG oslo_concurrency.lockutils [req-997cb2a8-d266-4f61-b877-adf05fb001d8 tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_plug" :: waited 0.521s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:13:48.891 DEBUG nova.virt.xenapi.vm_utils [req-bd17772b-a4a6-46cb-97cb-6248f899fb43 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] VBD OpaqueRef:d236d593-ada2-c9ab-7a65-c46e622f0afc plugged as xvdc vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2197 2015-08-07 17:13:49.002 WARNING nova.virt.configdrive [req-bd17772b-a4a6-46cb-97cb-6248f899fb43 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] The setting "always" will be deprecated in the Liberty version. 
Please use "True" instead 2015-08-07 17:13:49.003 DEBUG nova.objects.instance [req-bd17772b-a4a6-46cb-97cb-6248f899fb43 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Lazy-loading `ec2_ids' on Instance uuid cf8ff930-deb3-448f-bcd3-f5a84fdad9b4 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:13:49.063 DEBUG oslo_concurrency.processutils [req-bd17772b-a4a6-46cb-97cb-6248f899fb43 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Running cmd (subprocess): genisoimage -o /tmp/tmpJjO3fF/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmpy1VKIe execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:13:49.536 DEBUG oslo_concurrency.processutils [req-bd17772b-a4a6-46cb-97cb-6248f899fb43 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] CMD "genisoimage -o /tmp/tmpJjO3fF/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmpy1VKIe" returned: 0 in 0.472s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:13:49.542 DEBUG oslo_concurrency.processutils [req-bd17772b-a4a6-46cb-97cb-6248f899fb43 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmpJjO3fF/configdrive of=/dev/xvdc oflag=direct,sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:13:50.445 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.84 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:13:52.633 DEBUG nova.virt.xenapi.vmops [req-9f87b795-cfd8-455f-894d-1b84153b0a84 tempest-FloatingIPsNegativeTestJSON-45780709 tempest-FloatingIPsNegativeTestJSON-486082406] [instance: 573af27d-723a-48c5-9a9a-6a818abbbfc4] Destroying VDIs _destroy_vdis /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1428 2015-08-07 17:13:52.671 DEBUG nova.virt.xenapi.vm_utils [req-9f87b795-cfd8-455f-894d-1b84153b0a84 tempest-FloatingIPsNegativeTestJSON-45780709 tempest-FloatingIPsNegativeTestJSON-486082406] VDI 72861e9a-4df7-49f5-a376-9bf00102ec1e is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:13:52.686 DEBUG nova.virt.xenapi.vm_utils [req-9f87b795-cfd8-455f-894d-1b84153b0a84 tempest-FloatingIPsNegativeTestJSON-45780709 tempest-FloatingIPsNegativeTestJSON-486082406] VDI aceac30d-9e9d-475f-b70e-72b088fe490c is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:13:54.355 DEBUG oslo_concurrency.lockutils [req-997cb2a8-d266-4f61-b877-adf05fb001d8 tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_plug" :: held 5.520s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:13:54.356 DEBUG nova.virt.xenapi.vm_utils [req-997cb2a8-d266-4f61-b877-adf05fb001d8 tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] Plugging VBD OpaqueRef:a638dfb1-e892-d3cc-69bc-d2b10f0e156e done. 
vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2194 2015-08-07 17:13:54.360 DEBUG nova.virt.xenapi.vm_utils [req-997cb2a8-d266-4f61-b877-adf05fb001d8 tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] VBD OpaqueRef:a638dfb1-e892-d3cc-69bc-d2b10f0e156e plugged as xvdd vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2197 2015-08-07 17:13:54.543 WARNING nova.virt.configdrive [req-997cb2a8-d266-4f61-b877-adf05fb001d8 tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] The setting "always" will be deprecated in the Liberty version. Please use "True" instead 2015-08-07 17:13:54.544 DEBUG nova.objects.instance [req-997cb2a8-d266-4f61-b877-adf05fb001d8 tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] Lazy-loading `ec2_ids' on Instance uuid 696c794e-e4d8-4bd9-8b55-447342cd9b8f obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:13:54.604 DEBUG oslo_concurrency.processutils [req-997cb2a8-d266-4f61-b877-adf05fb001d8 tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] Running cmd (subprocess): genisoimage -o /tmp/tmpvRcIlj/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmpxZC_k9 execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:13:55.048 DEBUG oslo_concurrency.processutils [req-997cb2a8-d266-4f61-b877-adf05fb001d8 tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] CMD "genisoimage -o /tmp/tmpvRcIlj/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmpxZC_k9" returned: 0 in 0.444s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:13:55.066 DEBUG oslo_concurrency.processutils [req-997cb2a8-d266-4f61-b877-adf05fb001d8 tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmpvRcIlj/configdrive of=/dev/xvdd oflag=direct,sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:13:56.445 DEBUG nova.virt.xenapi.vmops [req-9f87b795-cfd8-455f-894d-1b84153b0a84 tempest-FloatingIPsNegativeTestJSON-45780709 tempest-FloatingIPsNegativeTestJSON-486082406] [instance: 573af27d-723a-48c5-9a9a-6a818abbbfc4] Using RAW or VHD, skipping kernel and ramdisk deletion _destroy_kernel_ramdisk /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1456 2015-08-07 17:13:56.492 DEBUG nova.virt.xenapi.vm_utils [req-9f87b795-cfd8-455f-894d-1b84153b0a84 tempest-FloatingIPsNegativeTestJSON-45780709 tempest-FloatingIPsNegativeTestJSON-486082406] [instance: 573af27d-723a-48c5-9a9a-6a818abbbfc4] VM destroyed destroy_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:324 2015-08-07 17:13:56.493 DEBUG nova.compute.manager [req-9f87b795-cfd8-455f-894d-1b84153b0a84 tempest-FloatingIPsNegativeTestJSON-45780709 tempest-FloatingIPsNegativeTestJSON-486082406] [instance: 573af27d-723a-48c5-9a9a-6a818abbbfc4] Deallocating network for instance _deallocate_network /opt/stack/new/nova/nova/compute/manager.py:1792 2015-08-07 17:13:59.764 DEBUG nova.compute.manager [req-9f87b795-cfd8-455f-894d-1b84153b0a84 tempest-FloatingIPsNegativeTestJSON-45780709 tempest-FloatingIPsNegativeTestJSON-486082406] [instance: 573af27d-723a-48c5-9a9a-6a818abbbfc4] terminating bdm 
BlockDeviceMapping(boot_index=0,connection_info=None,created_at=2015-08-07T17:11:08Z,delete_on_termination=True,deleted=False,deleted_at=None,destination_type='local',device_name='/dev/xvda',device_type='disk',disk_bus=None,guest_format=None,id=13,image_id='a69d8d55-1745-492f-be26-cc75d64fc94d',instance=,instance_uuid=573af27d-723a-48c5-9a9a-6a818abbbfc4,no_device=False,snapshot_id=None,source_type='image',updated_at=2015-08-07T17:11:12Z,volume_id=None,volume_size=None) _cleanup_volumes /opt/stack/new/nova/nova/compute/manager.py:2272 2015-08-07 17:14:00.061 DEBUG oslo_concurrency.lockutils [req-9f87b795-cfd8-455f-894d-1b84153b0a84 tempest-FloatingIPsNegativeTestJSON-45780709 tempest-FloatingIPsNegativeTestJSON-486082406] Lock "compute_resources" acquired by "update_usage" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:14:00.062 DEBUG nova.objects.instance [req-9f87b795-cfd8-455f-894d-1b84153b0a84 tempest-FloatingIPsNegativeTestJSON-45780709 tempest-FloatingIPsNegativeTestJSON-486082406] Lazy-loading `numa_topology' on Instance uuid 573af27d-723a-48c5-9a9a-6a818abbbfc4 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:14:00.279 DEBUG oslo_concurrency.lockutils [req-9f87b795-cfd8-455f-894d-1b84153b0a84 tempest-FloatingIPsNegativeTestJSON-45780709 tempest-FloatingIPsNegativeTestJSON-486082406] Lock "compute_resources" released by "update_usage" :: held 0.218s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:14:00.494 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.80 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:14:01.197 DEBUG oslo_concurrency.lockutils [req-9f87b795-cfd8-455f-894d-1b84153b0a84 tempest-FloatingIPsNegativeTestJSON-45780709 tempest-FloatingIPsNegativeTestJSON-486082406] Lock "573af27d-723a-48c5-9a9a-6a818abbbfc4" released by "do_terminate_instance" :: held 26.655s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:14:10.100 DEBUG oslo_concurrency.processutils [req-bd17772b-a4a6-46cb-97cb-6248f899fb43 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmpJjO3fF/configdrive of=/dev/xvdc oflag=direct,sync" returned: 0 in 20.559s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:14:10.102 DEBUG oslo_concurrency.processutils [req-bd17772b-a4a6-46cb-97cb-6248f899fb43 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:14:10.473 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.91 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:14:11.186 DEBUG oslo_concurrency.processutils [req-bd17772b-a4a6-46cb-97cb-6248f899fb43 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf sync" returned: 0 in 1.084s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:14:11.188 DEBUG nova.virt.xenapi.vm_utils [req-bd17772b-a4a6-46cb-97cb-6248f899fb43 tempest-ServersAdminTestJSON-1589181083 
tempest-ServersAdminTestJSON-1030271038] Destroying VBD for VDI OpaqueRef:f5fb33b9-3f48-c5be-1d81-84195ffcee25 ... vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2207 2015-08-07 17:14:11.189 DEBUG oslo_concurrency.lockutils [req-bd17772b-a4a6-46cb-97cb-6248f899fb43 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_unplug" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:14:13.520 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:14:13.521 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 3.99 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:14:14.047 DEBUG oslo_concurrency.lockutils [req-bd17772b-a4a6-46cb-97cb-6248f899fb43 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_unplug" :: held 2.858s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:14:14.055 DEBUG nova.virt.xenapi.vm_utils [req-bd17772b-a4a6-46cb-97cb-6248f899fb43 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Destroying VBD for VDI OpaqueRef:f5fb33b9-3f48-c5be-1d81-84195ffcee25 done. vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2215 2015-08-07 17:14:14.056 DEBUG nova.virt.xenapi.vm_utils [req-bd17772b-a4a6-46cb-97cb-6248f899fb43 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Creating disk-type VBD for VM OpaqueRef:e3cb5220-bff1-0e5f-26a5-7b3702331090, VDI OpaqueRef:f5fb33b9-3f48-c5be-1d81-84195ffcee25 ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:14:14.067 DEBUG nova.virt.xenapi.vm_utils [req-bd17772b-a4a6-46cb-97cb-6248f899fb43 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Created VBD OpaqueRef:b8bc9cb0-f9a2-02b3-7a82-6ce5f8e7e830 for VM OpaqueRef:e3cb5220-bff1-0e5f-26a5-7b3702331090, VDI OpaqueRef:f5fb33b9-3f48-c5be-1d81-84195ffcee25. 
create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:14:14.069 DEBUG nova.objects.instance [req-bd17772b-a4a6-46cb-97cb-6248f899fb43 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Lazy-loading `pci_devices' on Instance uuid cf8ff930-deb3-448f-bcd3-f5a84fdad9b4 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:14:14.233 DEBUG nova.virt.xenapi.vmops [req-bd17772b-a4a6-46cb-97cb-6248f899fb43 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: cf8ff930-deb3-448f-bcd3-f5a84fdad9b4] Updating progress to 50 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:14:14.620 DEBUG oslo_concurrency.lockutils [req-bd17772b-a4a6-46cb-97cb-6248f899fb43 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Lock "xenstore-cf8ff930-deb3-448f-bcd3-f5a84fdad9b4" acquired by "store_meta" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:14:14.625 DEBUG oslo_concurrency.lockutils [req-bd17772b-a4a6-46cb-97cb-6248f899fb43 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Lock "xenstore-cf8ff930-deb3-448f-bcd3-f5a84fdad9b4" released by "store_meta" :: held 0.005s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:14:14.626 DEBUG oslo_concurrency.lockutils [req-bd17772b-a4a6-46cb-97cb-6248f899fb43 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Lock "xenstore-cf8ff930-deb3-448f-bcd3-f5a84fdad9b4" acquired by "store_auto_disk_config" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:14:14.654 DEBUG oslo_concurrency.lockutils [req-bd17772b-a4a6-46cb-97cb-6248f899fb43 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Lock "xenstore-cf8ff930-deb3-448f-bcd3-f5a84fdad9b4" released by "store_auto_disk_config" :: held 0.029s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:14:14.655 DEBUG nova.virt.xenapi.vmops [req-bd17772b-a4a6-46cb-97cb-6248f899fb43 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: cf8ff930-deb3-448f-bcd3-f5a84fdad9b4] Injecting hostname (tempest-server-395672701) into xenstore _inject_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1940 2015-08-07 17:14:14.656 DEBUG oslo_concurrency.lockutils [req-bd17772b-a4a6-46cb-97cb-6248f899fb43 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Lock "xenstore-cf8ff930-deb3-448f-bcd3-f5a84fdad9b4" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:14:14.670 DEBUG oslo_concurrency.lockutils [req-bd17772b-a4a6-46cb-97cb-6248f899fb43 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Lock "xenstore-cf8ff930-deb3-448f-bcd3-f5a84fdad9b4" released by "update_hostname" :: held 0.014s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:14:14.671 DEBUG nova.virt.xenapi.vmops [req-bd17772b-a4a6-46cb-97cb-6248f899fb43 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: cf8ff930-deb3-448f-bcd3-f5a84fdad9b4] Injecting network info to xenstore inject_network_info /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1869 2015-08-07 
17:14:14.672 DEBUG oslo_concurrency.lockutils [req-bd17772b-a4a6-46cb-97cb-6248f899fb43 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Lock "xenstore-cf8ff930-deb3-448f-bcd3-f5a84fdad9b4" acquired by "update_nwinfo" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:14:15.070 DEBUG oslo_concurrency.lockutils [req-bd17772b-a4a6-46cb-97cb-6248f899fb43 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Lock "xenstore-cf8ff930-deb3-448f-bcd3-f5a84fdad9b4" released by "update_nwinfo" :: held 0.398s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:14:15.088 DEBUG nova.virt.xenapi.vmops [req-bd17772b-a4a6-46cb-97cb-6248f899fb43 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: cf8ff930-deb3-448f-bcd3-f5a84fdad9b4] Updating progress to 60 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:14:15.771 DEBUG nova.virt.xenapi.vmops [req-bd17772b-a4a6-46cb-97cb-6248f899fb43 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: cf8ff930-deb3-448f-bcd3-f5a84fdad9b4] Creating vifs _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1892 2015-08-07 17:14:15.824 DEBUG nova.virt.xenapi.vmops [req-bd17772b-a4a6-46cb-97cb-6248f899fb43 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: cf8ff930-deb3-448f-bcd3-f5a84fdad9b4] Creating VIF for network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1902 2015-08-07 17:14:15.835 DEBUG nova.virt.xenapi.vmops [req-bd17772b-a4a6-46cb-97cb-6248f899fb43 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: cf8ff930-deb3-448f-bcd3-f5a84fdad9b4] Created VIF OpaqueRef:f7720c7f-1764-a7d5-a5a9-7d0cbe5335a5, network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1906 2015-08-07 17:14:15.836 DEBUG nova.virt.xenapi.vmops [req-bd17772b-a4a6-46cb-97cb-6248f899fb43 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: cf8ff930-deb3-448f-bcd3-f5a84fdad9b4] Updating progress to 70 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:14:16.017 DEBUG oslo_concurrency.processutils [req-997cb2a8-d266-4f61-b877-adf05fb001d8 tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmpvRcIlj/configdrive of=/dev/xvdd oflag=direct,sync" returned: 0 in 20.951s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:14:16.019 DEBUG oslo_concurrency.processutils [req-997cb2a8-d266-4f61-b877-adf05fb001d8 tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:14:16.558 DEBUG nova.virt.xenapi.vmops [req-bd17772b-a4a6-46cb-97cb-6248f899fb43 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: cf8ff930-deb3-448f-bcd3-f5a84fdad9b4] Starting instance _start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:327 2015-08-07 17:14:17.513 DEBUG oslo_service.periodic_task 
[req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:14:17.514 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:14:17.514 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Starting heal instance info cache _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5315 2015-08-07 17:14:17.923 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Acquired semaphore "refresh_cache-fda3bb4d-ccb2-4500-8c24-8c38815626fa" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:197 2015-08-07 17:14:18.150 DEBUG nova.network.base_api [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: fda3bb4d-ccb2-4500-8c24-8c38815626fa] Updating instance_info_cache with network_info: [VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.5'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:46:1f:75', 'active': False, 'type': u'bridge', 'id': u'56b2e3c5-379b-4990-a2f4-5ae5440c9c97', 'qbg_params': None})] update_instance_cache_with_nw_info /opt/stack/new/nova/nova/network/base_api.py:43 2015-08-07 17:14:18.187 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Releasing semaphore "refresh_cache-fda3bb4d-ccb2-4500-8c24-8c38815626fa" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:210 2015-08-07 17:14:18.188 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: fda3bb4d-ccb2-4500-8c24-8c38815626fa] Updated the network info_cache for instance _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5376 2015-08-07 17:14:18.188 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 1.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:14:18.321 DEBUG oslo_concurrency.processutils [req-997cb2a8-d266-4f61-b877-adf05fb001d8 tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf sync" returned: 0 in 2.302s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:14:18.322 DEBUG nova.virt.xenapi.vm_utils 
[req-997cb2a8-d266-4f61-b877-adf05fb001d8 tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] Destroying VBD for VDI OpaqueRef:b13adf5b-b639-5e5d-d282-b40ecc7cc61f ... vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2207 2015-08-07 17:14:18.323 DEBUG oslo_concurrency.lockutils [req-997cb2a8-d266-4f61-b877-adf05fb001d8 tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_unplug" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:14:19.189 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:14:19.212 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 0.30 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:14:19.520 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:14:19.521 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /opt/stack/new/nova/nova/compute/manager.py:5976 2015-08-07 17:14:19.521 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:14:19.555 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Auditing locally available compute resources for node localhost.localdomain 2015-08-07 17:14:19.556 DEBUG nova.virt.xenapi.host [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Updating host stats update_status /opt/stack/new/nova/nova/virt/xenapi/host.py:234 2015-08-07 17:14:20.391 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" acquired by "do_scan" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:14:20.391 DEBUG nova.virt.xenapi.vm_utils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Scanning SR OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99 do_scan /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1839 2015-08-07 17:14:20.929 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.45 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:14:22.293 DEBUG oslo_concurrency.lockutils [req-997cb2a8-d266-4f61-b877-adf05fb001d8 tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_unplug" :: held 3.970s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:14:22.302 DEBUG nova.virt.xenapi.vm_utils [req-997cb2a8-d266-4f61-b877-adf05fb001d8 tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] Destroying 
VBD for VDI OpaqueRef:b13adf5b-b639-5e5d-d282-b40ecc7cc61f done. vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2215 2015-08-07 17:14:22.303 DEBUG nova.virt.xenapi.vm_utils [req-997cb2a8-d266-4f61-b877-adf05fb001d8 tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] Creating disk-type VBD for VM OpaqueRef:90151901-ded6-1339-aa04-40a60cfe195f, VDI OpaqueRef:b13adf5b-b639-5e5d-d282-b40ecc7cc61f ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:14:22.504 DEBUG nova.virt.xenapi.vm_utils [req-997cb2a8-d266-4f61-b877-adf05fb001d8 tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] Created VBD OpaqueRef:c8fe23bd-0e93-b22c-a494-6a9f54e21f46 for VM OpaqueRef:90151901-ded6-1339-aa04-40a60cfe195f, VDI OpaqueRef:b13adf5b-b639-5e5d-d282-b40ecc7cc61f. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:14:22.505 DEBUG nova.objects.instance [req-997cb2a8-d266-4f61-b877-adf05fb001d8 tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] Lazy-loading `pci_devices' on Instance uuid 696c794e-e4d8-4bd9-8b55-447342cd9b8f obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:14:22.695 DEBUG nova.virt.xenapi.vmops [req-997cb2a8-d266-4f61-b877-adf05fb001d8 tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] [instance: 696c794e-e4d8-4bd9-8b55-447342cd9b8f] Updating progress to 50 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:14:24.027 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" released by "do_scan" :: held 3.636s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:14:25.566 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor: free VCPUs: -1 _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:524 2015-08-07 17:14:25.566 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor: assignable PCI devices: [] _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:532 2015-08-07 17:14:25.567 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor/Node resource view: name=localhost.localdomain free_ram=721MB free_disk=16GB free_vcpus=-1 pci_devices=[] _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:546 2015-08-07 17:14:25.567 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "compute_resources" acquired by "_update_available_resource" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:14:25.900 DEBUG oslo_concurrency.lockutils [req-997cb2a8-d266-4f61-b877-adf05fb001d8 tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] Lock "xenstore-696c794e-e4d8-4bd9-8b55-447342cd9b8f" acquired by "store_meta" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:14:25.901 DEBUG oslo_concurrency.lockutils [req-997cb2a8-d266-4f61-b877-adf05fb001d8 tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] Lock "xenstore-696c794e-e4d8-4bd9-8b55-447342cd9b8f" released by "store_meta" :: held 0.001s inner 
/usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:14:25.901 DEBUG oslo_concurrency.lockutils [req-997cb2a8-d266-4f61-b877-adf05fb001d8 tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] Lock "xenstore-696c794e-e4d8-4bd9-8b55-447342cd9b8f" acquired by "store_auto_disk_config" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:14:25.961 DEBUG oslo_concurrency.lockutils [req-997cb2a8-d266-4f61-b877-adf05fb001d8 tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] Lock "xenstore-696c794e-e4d8-4bd9-8b55-447342cd9b8f" released by "store_auto_disk_config" :: held 0.060s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:14:25.962 DEBUG nova.virt.xenapi.vmops [req-997cb2a8-d266-4f61-b877-adf05fb001d8 tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] [instance: 696c794e-e4d8-4bd9-8b55-447342cd9b8f] Injecting hostname (tempest-floating-server-45636288) into xenstore _inject_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1940 2015-08-07 17:14:25.963 DEBUG oslo_concurrency.lockutils [req-997cb2a8-d266-4f61-b877-adf05fb001d8 tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] Lock "xenstore-696c794e-e4d8-4bd9-8b55-447342cd9b8f" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:14:25.991 DEBUG oslo_concurrency.lockutils [req-997cb2a8-d266-4f61-b877-adf05fb001d8 tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] Lock "xenstore-696c794e-e4d8-4bd9-8b55-447342cd9b8f" released by "update_hostname" :: held 0.028s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:14:25.992 DEBUG nova.virt.xenapi.vmops [req-997cb2a8-d266-4f61-b877-adf05fb001d8 tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] [instance: 696c794e-e4d8-4bd9-8b55-447342cd9b8f] Injecting network info to xenstore inject_network_info /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1869 2015-08-07 17:14:25.992 DEBUG oslo_concurrency.lockutils [req-997cb2a8-d266-4f61-b877-adf05fb001d8 tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] Lock "xenstore-696c794e-e4d8-4bd9-8b55-447342cd9b8f" acquired by "update_nwinfo" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:14:26.301 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Total usable vcpus: 8, total allocated vcpus: 9 2015-08-07 17:14:26.302 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Final resource view: name=localhost.localdomain phys_ram=8187MB used_ram=857MB phys_disk=27GB used_disk=0GB total_vcpus=8 used_vcpus=9 pci_stats=None 2015-08-07 17:14:26.850 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Compute_service record updated for devstack:localhost.localdomain 2015-08-07 17:14:26.851 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "compute_resources" released by "_update_available_resource" :: held 1.283s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:14:26.851 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task 
ComputeManager._poll_rescued_instances run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:14:26.852 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:14:26.853 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 8.99 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:14:27.121 DEBUG oslo_concurrency.lockutils [req-997cb2a8-d266-4f61-b877-adf05fb001d8 tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] Lock "xenstore-696c794e-e4d8-4bd9-8b55-447342cd9b8f" released by "update_nwinfo" :: held 1.128s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:14:27.172 DEBUG nova.virt.xenapi.vmops [req-997cb2a8-d266-4f61-b877-adf05fb001d8 tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] [instance: 696c794e-e4d8-4bd9-8b55-447342cd9b8f] Updating progress to 60 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:14:27.774 DEBUG nova.virt.xenapi.vmops [req-997cb2a8-d266-4f61-b877-adf05fb001d8 tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] [instance: 696c794e-e4d8-4bd9-8b55-447342cd9b8f] Creating vifs _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1892 2015-08-07 17:14:27.783 DEBUG nova.virt.xenapi.vmops [req-997cb2a8-d266-4f61-b877-adf05fb001d8 tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] [instance: 696c794e-e4d8-4bd9-8b55-447342cd9b8f] Creating VIF for network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1902 2015-08-07 17:14:27.820 DEBUG nova.virt.xenapi.vmops [req-997cb2a8-d266-4f61-b877-adf05fb001d8 tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] [instance: 696c794e-e4d8-4bd9-8b55-447342cd9b8f] Created VIF OpaqueRef:34d5b407-cfef-53bb-744a-ab311a76d6a3, network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1906 2015-08-07 17:14:27.821 DEBUG nova.virt.xenapi.vmops [req-997cb2a8-d266-4f61-b877-adf05fb001d8 tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] [instance: 696c794e-e4d8-4bd9-8b55-447342cd9b8f] Updating progress to 70 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:14:27.829 DEBUG oslo_concurrency.lockutils [req-a37011a3-c199-4a6c-94e5-0eca9e14fba0 tempest-ImagesTestJSON-351376176 tempest-ImagesTestJSON-223628044] Lock "2d2f2690-ae7c-4e46-a28c-771da8b3c2b7" acquired by "_locked_do_build_and_run_instance" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:14:28.277 INFO nova.compute.manager [req-a37011a3-c199-4a6c-94e5-0eca9e14fba0 tempest-ImagesTestJSON-351376176 tempest-ImagesTestJSON-223628044] [instance: 2d2f2690-ae7c-4e46-a28c-771da8b3c2b7] Starting instance... 
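A short note on the resource-claim arithmetic the tracker logs just below for instance 2d2f2690-ae7c-4e46-a28c-771da8b3c2b7: the claimed memory is the 64 MB flavor plus the 5 MB per-instance overhead (69 MB), and the 12280.50 MB memory limit is consistent with the 8187 MB of physical RAM scaled by a ram_allocation_ratio of 1.5 (an inference from 8187 x 1.5 = 12280.5; the ratio itself is not printed in this excerpt), with 857 MB already used leaving 11423.50 MB free. A minimal Python sketch of that arithmetic, using only numbers taken from the log lines below:

phys_ram_mb = 8187.0          # "Total memory: 8187 MB" from the claim lines
used_ram_mb = 857.0           # "used: 857.00 MB"
ram_allocation_ratio = 1.5    # assumed; inferred from 8187 * 1.5 == 12280.5
flavor_ram_mb = 64            # "64 MB instance"
memory_overhead_mb = 5        # "Memory overhead ...; 5 MB"

claim_mb = flavor_ram_mb + memory_overhead_mb          # 69 MB claimed
memory_limit_mb = phys_ram_mb * ram_allocation_ratio   # 12280.5 MB limit
free_mb = memory_limit_mb - used_ram_mb                # 11423.5 MB free

print(claim_mb, memory_limit_mb, free_mb)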
2015-08-07 17:14:28.559 DEBUG nova.virt.xenapi.vmops [req-997cb2a8-d266-4f61-b877-adf05fb001d8 tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] [instance: 696c794e-e4d8-4bd9-8b55-447342cd9b8f] Starting instance _start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:327 2015-08-07 17:14:28.950 DEBUG oslo_concurrency.lockutils [req-a37011a3-c199-4a6c-94e5-0eca9e14fba0 tempest-ImagesTestJSON-351376176 tempest-ImagesTestJSON-223628044] Lock "compute_resources" acquired by "instance_claim" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:14:28.956 DEBUG nova.compute.resource_tracker [req-a37011a3-c199-4a6c-94e5-0eca9e14fba0 tempest-ImagesTestJSON-351376176 tempest-ImagesTestJSON-223628044] Memory overhead for 64 MB instance; 5 MB instance_claim /opt/stack/new/nova/nova/compute/resource_tracker.py:126 2015-08-07 17:14:28.972 INFO nova.compute.claims [req-a37011a3-c199-4a6c-94e5-0eca9e14fba0 tempest-ImagesTestJSON-351376176 tempest-ImagesTestJSON-223628044] [instance: 2d2f2690-ae7c-4e46-a28c-771da8b3c2b7] Attempting claim: memory 69 MB, disk 0 GB 2015-08-07 17:14:28.973 INFO nova.compute.claims [req-a37011a3-c199-4a6c-94e5-0eca9e14fba0 tempest-ImagesTestJSON-351376176 tempest-ImagesTestJSON-223628044] [instance: 2d2f2690-ae7c-4e46-a28c-771da8b3c2b7] Total memory: 8187 MB, used: 857.00 MB 2015-08-07 17:14:28.978 INFO nova.compute.claims [req-a37011a3-c199-4a6c-94e5-0eca9e14fba0 tempest-ImagesTestJSON-351376176 tempest-ImagesTestJSON-223628044] [instance: 2d2f2690-ae7c-4e46-a28c-771da8b3c2b7] memory limit: 12280.50 MB, free: 11423.50 MB 2015-08-07 17:14:28.979 INFO nova.compute.claims [req-a37011a3-c199-4a6c-94e5-0eca9e14fba0 tempest-ImagesTestJSON-351376176 tempest-ImagesTestJSON-223628044] [instance: 2d2f2690-ae7c-4e46-a28c-771da8b3c2b7] Total disk: 27 GB, used: 0.00 GB 2015-08-07 17:14:28.979 INFO nova.compute.claims [req-a37011a3-c199-4a6c-94e5-0eca9e14fba0 tempest-ImagesTestJSON-351376176 tempest-ImagesTestJSON-223628044] [instance: 2d2f2690-ae7c-4e46-a28c-771da8b3c2b7] disk limit not specified, defaulting to unlimited 2015-08-07 17:14:29.017 DEBUG nova.compute.resources.vcpu [req-a37011a3-c199-4a6c-94e5-0eca9e14fba0 tempest-ImagesTestJSON-351376176 tempest-ImagesTestJSON-223628044] Total CPUs: 8 VCPUs, used: 5.00 VCPUs test /opt/stack/new/nova/nova/compute/resources/vcpu.py:52 2015-08-07 17:14:29.018 DEBUG nova.compute.resources.vcpu [req-a37011a3-c199-4a6c-94e5-0eca9e14fba0 tempest-ImagesTestJSON-351376176 tempest-ImagesTestJSON-223628044] CPUs limit not specified, defaulting to unlimited test /opt/stack/new/nova/nova/compute/resources/vcpu.py:56 2015-08-07 17:14:29.018 INFO nova.compute.claims [req-a37011a3-c199-4a6c-94e5-0eca9e14fba0 tempest-ImagesTestJSON-351376176 tempest-ImagesTestJSON-223628044] [instance: 2d2f2690-ae7c-4e46-a28c-771da8b3c2b7] Claim successful 2015-08-07 17:14:30.049 DEBUG oslo_concurrency.lockutils [req-a37011a3-c199-4a6c-94e5-0eca9e14fba0 tempest-ImagesTestJSON-351376176 tempest-ImagesTestJSON-223628044] Lock "compute_resources" released by "instance_claim" :: held 1.099s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:14:30.384 DEBUG oslo_concurrency.lockutils [req-a37011a3-c199-4a6c-94e5-0eca9e14fba0 tempest-ImagesTestJSON-351376176 tempest-ImagesTestJSON-223628044] Lock "compute_resources" acquired by "update_usage" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:14:30.507 
13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.88 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:14:30.593 DEBUG oslo_concurrency.lockutils [req-a37011a3-c199-4a6c-94e5-0eca9e14fba0 tempest-ImagesTestJSON-351376176 tempest-ImagesTestJSON-223628044] Lock "compute_resources" released by "update_usage" :: held 0.208s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:14:30.594 DEBUG nova.compute.utils [req-a37011a3-c199-4a6c-94e5-0eca9e14fba0 tempest-ImagesTestJSON-351376176 tempest-ImagesTestJSON-223628044] Using /dev/xvd instead of None get_next_device_name /opt/stack/new/nova/nova/compute/utils.py:163 2015-08-07 17:14:30.598 13318 DEBUG nova.compute.manager [-] [instance: 2d2f2690-ae7c-4e46-a28c-771da8b3c2b7] Allocating IP information in the background. _allocate_network_async /opt/stack/new/nova/nova/compute/manager.py:1513 2015-08-07 17:14:30.599 13318 DEBUG oslo_concurrency.lockutils [-] Acquired semaphore "refresh_cache-2d2f2690-ae7c-4e46-a28c-771da8b3c2b7" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:197 2015-08-07 17:14:31.680 DEBUG nova.virt.xenapi.vmops [req-a37011a3-c199-4a6c-94e5-0eca9e14fba0 tempest-ImagesTestJSON-351376176 tempest-ImagesTestJSON-223628044] [instance: 2d2f2690-ae7c-4e46-a28c-771da8b3c2b7] Block device information present: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} spawn /opt/stack/new/nova/nova/virt/xenapi/vmops.py:408 2015-08-07 17:14:31.741 DEBUG nova.virt.xenapi.vm_utils [req-a37011a3-c199-4a6c-94e5-0eca9e14fba0 tempest-ImagesTestJSON-351376176 tempest-ImagesTestJSON-223628044] Detected vhd format for image a69d8d55-1745-492f-be26-cc75d64fc94d determine_disk_image_type /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1633 2015-08-07 17:14:31.742 DEBUG nova.virt.xenapi.vmops [req-a37011a3-c199-4a6c-94e5-0eca9e14fba0 tempest-ImagesTestJSON-351376176 tempest-ImagesTestJSON-223628044] [instance: 2d2f2690-ae7c-4e46-a28c-771da8b3c2b7] Updating progress to 10 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:14:32.531 DEBUG nova.virt.xenapi.vmops [req-a37011a3-c199-4a6c-94e5-0eca9e14fba0 tempest-ImagesTestJSON-351376176 tempest-ImagesTestJSON-223628044] [instance: 2d2f2690-ae7c-4e46-a28c-771da8b3c2b7] block device info: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} _connect_cinder_volumes /opt/stack/new/nova/nova/virt/xenapi/vmops.py:376 2015-08-07 17:14:32.618 DEBUG oslo_concurrency.lockutils [req-a37011a3-c199-4a6c-94e5-0eca9e14fba0 tempest-ImagesTestJSON-351376176 tempest-ImagesTestJSON-223628044] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" acquired by "_create_cached_image_impl" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:14:35.866 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:14:35.867 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 39.64 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:14:51.625 13318 WARNING oslo_service.loopingcall [-] 
Function > run outlasted interval by 1.21 sec 2015-08-07 17:14:51.625 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 0.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:14:58.025 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 3.60 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:14:58.284 DEBUG nova.virt.xenapi.vm_utils [req-a37011a3-c199-4a6c-94e5-0eca9e14fba0 tempest-ImagesTestJSON-351376176 tempest-ImagesTestJSON-223628044] Cloned VDI OpaqueRef:144b4469-2ad9-01c2-3da4-68de31921d41 from VDI OpaqueRef:f392c64e-7886-a0cc-d3a0-780674167307 _clone_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:571 2015-08-07 17:14:58.309 13318 DEBUG nova.network.base_api [-] [instance: 2d2f2690-ae7c-4e46-a28c-771da8b3c2b7] Updating instance_info_cache with network_info: [VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.9'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:c4:98:2e', 'active': False, 'type': u'bridge', 'id': u'f47e1752-8095-421c-89c4-2777a28c48cf', 'qbg_params': None})] update_instance_cache_with_nw_info /opt/stack/new/nova/nova/network/base_api.py:43 2015-08-07 17:14:58.342 13318 DEBUG oslo_concurrency.lockutils [-] Releasing semaphore "refresh_cache-2d2f2690-ae7c-4e46-a28c-771da8b3c2b7" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:210 2015-08-07 17:14:58.342 13318 DEBUG nova.compute.manager [-] [instance: 2d2f2690-ae7c-4e46-a28c-771da8b3c2b7] Instance network_info: |[VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.9'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:c4:98:2e', 'active': False, 'type': u'bridge', 'id': 
u'f47e1752-8095-421c-89c4-2777a28c48cf', 'qbg_params': None})]| _allocate_network_async /opt/stack/new/nova/nova/compute/manager.py:1531 2015-08-07 17:15:02.708 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 8.92 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:15:03.684 DEBUG oslo_concurrency.lockutils [req-a37011a3-c199-4a6c-94e5-0eca9e14fba0 tempest-ImagesTestJSON-351376176 tempest-ImagesTestJSON-223628044] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" released by "_create_cached_image_impl" :: held 31.065s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:15:03.684 INFO nova.virt.xenapi.vm_utils [req-a37011a3-c199-4a6c-94e5-0eca9e14fba0 tempest-ImagesTestJSON-351376176 tempest-ImagesTestJSON-223628044] Image creation data, cacheable: True, downloaded: False duration: 31.15 secs for image a69d8d55-1745-492f-be26-cc75d64fc94d 2015-08-07 17:15:14.751 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 6.88 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:15:15.551 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:15:15.552 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 1.96 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:15:16.618 DEBUG nova.virt.xenapi.vmops [req-a37011a3-c199-4a6c-94e5-0eca9e14fba0 tempest-ImagesTestJSON-351376176 tempest-ImagesTestJSON-223628044] [instance: 2d2f2690-ae7c-4e46-a28c-771da8b3c2b7] Updating progress to 20 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:15:17.050 DEBUG nova.virt.xenapi.vmops [req-a37011a3-c199-4a6c-94e5-0eca9e14fba0 tempest-ImagesTestJSON-351376176 tempest-ImagesTestJSON-223628044] [instance: 2d2f2690-ae7c-4e46-a28c-771da8b3c2b7] Updating progress to 30 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:15:17.392 DEBUG nova.virt.xenapi.vmops [req-a37011a3-c199-4a6c-94e5-0eca9e14fba0 tempest-ImagesTestJSON-351376176 tempest-ImagesTestJSON-223628044] [instance: 2d2f2690-ae7c-4e46-a28c-771da8b3c2b7] Using PV kernel: True _create_vm_record /opt/stack/new/nova/nova/virt/xenapi/vmops.py:679 2015-08-07 17:15:17.404 DEBUG nova.virt.xenapi.vm_utils [req-a37011a3-c199-4a6c-94e5-0eca9e14fba0 tempest-ImagesTestJSON-351376176 tempest-ImagesTestJSON-223628044] [instance: 2d2f2690-ae7c-4e46-a28c-771da8b3c2b7] Created VM create_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:312 2015-08-07 17:15:17.405 DEBUG nova.virt.xenapi.vmops [req-a37011a3-c199-4a6c-94e5-0eca9e14fba0 tempest-ImagesTestJSON-351376176 tempest-ImagesTestJSON-223628044] [instance: 2d2f2690-ae7c-4e46-a28c-771da8b3c2b7] Updating progress to 40 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:15:17.513 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:15:17.624 DEBUG oslo_service.loopingcall 
[req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 0.88 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:15:17.883 DEBUG nova.virt.xenapi.vm_utils [req-a37011a3-c199-4a6c-94e5-0eca9e14fba0 tempest-ImagesTestJSON-351376176 tempest-ImagesTestJSON-223628044] Creating disk-type VBD for VM OpaqueRef:c0d0bc63-982f-0116-d5f4-80ba47173e4b, VDI OpaqueRef:144b4469-2ad9-01c2-3da4-68de31921d41 ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:15:17.923 DEBUG nova.virt.xenapi.vm_utils [req-a37011a3-c199-4a6c-94e5-0eca9e14fba0 tempest-ImagesTestJSON-351376176 tempest-ImagesTestJSON-223628044] Created VBD OpaqueRef:b385b644-d6ab-a6e3-ac7f-7e20f165bcca for VM OpaqueRef:c0d0bc63-982f-0116-d5f4-80ba47173e4b, VDI OpaqueRef:144b4469-2ad9-01c2-3da4-68de31921d41. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:15:18.506 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:15:18.618 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 0.01 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:15:18.626 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:15:18.627 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 0.88 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:15:19.513 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:15:19.513 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Starting heal instance info cache _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5315 2015-08-07 17:15:19.697 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Acquired semaphore "refresh_cache-091727e3-644b-4029-98ad-5a102868d2d5" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:197 2015-08-07 17:15:20.116 DEBUG nova.network.base_api [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: 091727e3-644b-4029-98ad-5a102868d2d5] Updating instance_info_cache with network_info: [VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.6'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': 
{}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:1f:57:17', 'active': False, 'type': u'bridge', 'id': u'74506357-e4e2-4a19-b942-18e58c964d1e', 'qbg_params': None})] update_instance_cache_with_nw_info /opt/stack/new/nova/nova/network/base_api.py:43 2015-08-07 17:15:20.289 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Releasing semaphore "refresh_cache-091727e3-644b-4029-98ad-5a102868d2d5" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:210 2015-08-07 17:15:20.290 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: 091727e3-644b-4029-98ad-5a102868d2d5] Updated the network info_cache for instance _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5376 2015-08-07 17:15:20.290 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 1.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:15:21.088 DEBUG nova.virt.xenapi.vm_utils [req-a37011a3-c199-4a6c-94e5-0eca9e14fba0 tempest-ImagesTestJSON-351376176 tempest-ImagesTestJSON-223628044] Created VDI OpaqueRef:a59084b2-5cf0-d43d-504a-f03a4ed4232f (config-2, 67108864, False) on OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99. create_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:509 2015-08-07 17:15:21.104 DEBUG nova.virt.xenapi.vm_utils [req-a37011a3-c199-4a6c-94e5-0eca9e14fba0 tempest-ImagesTestJSON-351376176 tempest-ImagesTestJSON-223628044] Creating disk-type VBD for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:a59084b2-5cf0-d43d-504a-f03a4ed4232f ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:15:21.306 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:15:21.307 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Cleaning up deleted instances _run_pending_deletes /opt/stack/new/nova/nova/compute/manager.py:6276 2015-08-07 17:15:21.385 DEBUG nova.virt.xenapi.vm_utils [req-a37011a3-c199-4a6c-94e5-0eca9e14fba0 tempest-ImagesTestJSON-351376176 tempest-ImagesTestJSON-223628044] Created VBD OpaqueRef:c13c999d-c3b3-2b42-c2d5-b47cc24eaad1 for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:a59084b2-5cf0-d43d-504a-f03a4ed4232f. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:15:21.386 DEBUG nova.virt.xenapi.vm_utils [req-a37011a3-c199-4a6c-94e5-0eca9e14fba0 tempest-ImagesTestJSON-351376176 tempest-ImagesTestJSON-223628044] Plugging VBD OpaqueRef:c13c999d-c3b3-2b42-c2d5-b47cc24eaad1 ... 
vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2191 2015-08-07 17:15:21.387 DEBUG oslo_concurrency.lockutils [req-a37011a3-c199-4a6c-94e5-0eca9e14fba0 tempest-ImagesTestJSON-351376176 tempest-ImagesTestJSON-223628044] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_plug" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:15:21.915 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] There are 3 instances to clean _run_pending_deletes /opt/stack/new/nova/nova/compute/manager.py:6285 2015-08-07 17:15:21.916 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: 573af27d-723a-48c5-9a9a-6a818abbbfc4] Instance has had 0 of 5 cleanup attempts _run_pending_deletes /opt/stack/new/nova/nova/compute/manager.py:6293 2015-08-07 17:15:22.417 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.24 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:15:23.044 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: abd853db-1da7-45ad-a21f-323100b6d158] Instance has had 0 of 5 cleanup attempts _run_pending_deletes /opt/stack/new/nova/nova/compute/manager.py:6293 2015-08-07 17:15:24.164 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: d2671615-627a-4730-bfd6-887d04047dda] Instance has had 0 of 5 cleanup attempts _run_pending_deletes /opt/stack/new/nova/nova/compute/manager.py:6293 2015-08-07 17:15:24.763 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:15:24.764 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 0.21 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:15:24.970 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:15:24.971 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /opt/stack/new/nova/nova/compute/manager.py:5976 2015-08-07 17:15:24.972 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:15:25.010 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Auditing locally available compute resources for node localhost.localdomain 2015-08-07 17:15:25.012 DEBUG nova.virt.xenapi.host [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Updating host stats update_status /opt/stack/new/nova/nova/virt/xenapi/host.py:234 2015-08-07 17:15:27.131 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" acquired by "do_scan" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:15:27.132 DEBUG nova.virt.xenapi.vm_utils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Scanning SR OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99 do_scan /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1839 2015-08-07 17:15:30.227 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" released by "do_scan" :: held 3.096s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:15:32.361 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor: free VCPUs: -1 _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:524 2015-08-07 17:15:32.361 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor: assignable PCI devices: [] _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:532 2015-08-07 17:15:32.362 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor/Node resource view: name=localhost.localdomain free_ram=654MB free_disk=16GB free_vcpus=-1 pci_devices=[] _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:546 2015-08-07 17:15:32.362 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "compute_resources" acquired by "_update_available_resource" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:15:34.878 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Total usable vcpus: 8, total allocated vcpus: 9 2015-08-07 17:15:34.878 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Final resource view: name=localhost.localdomain phys_ram=8187MB used_ram=926MB phys_disk=27GB used_disk=0GB total_vcpus=8 used_vcpus=9 pci_stats=None 2015-08-07 17:15:35.584 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 6.61 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:15:36.463 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Compute_service record updated for devstack:localhost.localdomain 2015-08-07 17:15:36.463 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "compute_resources" released by 
"_update_available_resource" :: held 4.101s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:15:36.464 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:15:36.464 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_bandwidth_usage run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:15:36.465 INFO nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Updating bandwidth usage cache 2015-08-07 17:15:49.043 DEBUG oslo_concurrency.lockutils [req-a37011a3-c199-4a6c-94e5-0eca9e14fba0 tempest-ImagesTestJSON-351376176 tempest-ImagesTestJSON-223628044] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_plug" :: held 27.656s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:15:49.044 DEBUG nova.virt.xenapi.vm_utils [req-a37011a3-c199-4a6c-94e5-0eca9e14fba0 tempest-ImagesTestJSON-351376176 tempest-ImagesTestJSON-223628044] Plugging VBD OpaqueRef:c13c999d-c3b3-2b42-c2d5-b47cc24eaad1 done. vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2194 2015-08-07 17:15:49.073 DEBUG nova.virt.xenapi.vm_utils [req-a37011a3-c199-4a6c-94e5-0eca9e14fba0 tempest-ImagesTestJSON-351376176 tempest-ImagesTestJSON-223628044] VBD OpaqueRef:c13c999d-c3b3-2b42-c2d5-b47cc24eaad1 plugged as xvdc vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2197 2015-08-07 17:15:49.315 WARNING nova.virt.configdrive [req-a37011a3-c199-4a6c-94e5-0eca9e14fba0 tempest-ImagesTestJSON-351376176 tempest-ImagesTestJSON-223628044] The setting "always" will be deprecated in the Liberty version. 
Please use "True" instead 2015-08-07 17:15:49.316 DEBUG nova.objects.instance [req-a37011a3-c199-4a6c-94e5-0eca9e14fba0 tempest-ImagesTestJSON-351376176 tempest-ImagesTestJSON-223628044] Lazy-loading `ec2_ids' on Instance uuid 2d2f2690-ae7c-4e46-a28c-771da8b3c2b7 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:15:49.618 DEBUG oslo_concurrency.processutils [req-a37011a3-c199-4a6c-94e5-0eca9e14fba0 tempest-ImagesTestJSON-351376176 tempest-ImagesTestJSON-223628044] Running cmd (subprocess): genisoimage -o /tmp/tmpIJLu3G/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmpwuJFGD execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:15:52.892 ERROR oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Error during ComputeManager._poll_bandwidth_usage 2015-08-07 17:15:52.892 13318 ERROR oslo_service.periodic_task Traceback (most recent call last): 2015-08-07 17:15:52.892 13318 ERROR oslo_service.periodic_task File "/usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py", line 218, in run_periodic_tasks 2015-08-07 17:15:52.892 13318 ERROR oslo_service.periodic_task task(self, context) 2015-08-07 17:15:52.892 13318 ERROR oslo_service.periodic_task File "/opt/stack/new/nova/nova/compute/manager.py", line 5680, in _poll_bandwidth_usage 2015-08-07 17:15:52.892 13318 ERROR oslo_service.periodic_task update_cells=update_cells) 2015-08-07 17:15:52.892 13318 ERROR oslo_service.periodic_task File "/opt/stack/new/nova/nova/objects/base.py", line 493, in wrapper 2015-08-07 17:15:52.892 13318 ERROR oslo_service.periodic_task return fn(obj, *args, **kwargs) 2015-08-07 17:15:52.892 13318 ERROR oslo_service.periodic_task File "/usr/local/lib/python2.7/dist-packages/oslo_versionedobjects/base.py", line 195, in wrapper 2015-08-07 17:15:52.892 13318 ERROR oslo_service.periodic_task ctxt, self, fn.__name__, args, kwargs) 2015-08-07 17:15:52.892 13318 ERROR oslo_service.periodic_task File "/opt/stack/new/nova/nova/conductor/rpcapi.py", line 248, in object_action 2015-08-07 17:15:52.892 13318 ERROR oslo_service.periodic_task objmethod=objmethod, args=args, kwargs=kwargs) 2015-08-07 17:15:52.892 13318 ERROR oslo_service.periodic_task File "/usr/local/lib/python2.7/dist-packages/oslo_messaging/rpc/client.py", line 158, in call 2015-08-07 17:15:52.892 13318 ERROR oslo_service.periodic_task retry=self.retry) 2015-08-07 17:15:52.892 13318 ERROR oslo_service.periodic_task File "/usr/local/lib/python2.7/dist-packages/oslo_messaging/transport.py", line 90, in _send 2015-08-07 17:15:52.892 13318 ERROR oslo_service.periodic_task timeout=timeout, retry=retry) 2015-08-07 17:15:52.892 13318 ERROR oslo_service.periodic_task File "/usr/local/lib/python2.7/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 431, in send 2015-08-07 17:15:52.892 13318 ERROR oslo_service.periodic_task retry=retry) 2015-08-07 17:15:52.892 13318 ERROR oslo_service.periodic_task File "/usr/local/lib/python2.7/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 422, in _send 2015-08-07 17:15:52.892 13318 ERROR oslo_service.periodic_task raise result 2015-08-07 17:15:52.892 13318 ERROR oslo_service.periodic_task TypeError: 'NoneType' object has no attribute '__getitem__' 2015-08-07 17:15:52.892 13318 ERROR oslo_service.periodic_task Traceback (most recent call last): 2015-08-07 17:15:52.892 13318 ERROR oslo_service.periodic_task 2015-08-07 17:15:52.892 13318 ERROR 
oslo_service.periodic_task File "/opt/stack/new/nova/nova/conductor/manager.py", line 442, in _object_dispatch 2015-08-07 17:15:52.892 13318 ERROR oslo_service.periodic_task return getattr(target, method)(*args, **kwargs) 2015-08-07 17:15:52.892 13318 ERROR oslo_service.periodic_task 2015-08-07 17:15:52.892 13318 ERROR oslo_service.periodic_task File "/opt/stack/new/nova/nova/objects/base.py", line 493, in wrapper 2015-08-07 17:15:52.892 13318 ERROR oslo_service.periodic_task return fn(obj, *args, **kwargs) 2015-08-07 17:15:52.892 13318 ERROR oslo_service.periodic_task 2015-08-07 17:15:52.892 13318 ERROR oslo_service.periodic_task File "/usr/local/lib/python2.7/dist-packages/oslo_versionedobjects/base.py", line 211, in wrapper 2015-08-07 17:15:52.892 13318 ERROR oslo_service.periodic_task return fn(self, *args, **kwargs) 2015-08-07 17:15:52.892 13318 ERROR oslo_service.periodic_task 2015-08-07 17:15:52.892 13318 ERROR oslo_service.periodic_task File "/opt/stack/new/nova/nova/objects/bandwidth_usage.py", line 69, in create 2015-08-07 17:15:52.892 13318 ERROR oslo_service.periodic_task self._from_db_object(self._context, self, db_bw_usage) 2015-08-07 17:15:52.892 13318 ERROR oslo_service.periodic_task 2015-08-07 17:15:52.892 13318 ERROR oslo_service.periodic_task File "/opt/stack/new/nova/nova/objects/bandwidth_usage.py", line 42, in _from_db_object 2015-08-07 17:15:52.892 13318 ERROR oslo_service.periodic_task bw_usage[field] = db_bw_usage['uuid'] 2015-08-07 17:15:52.892 13318 ERROR oslo_service.periodic_task 2015-08-07 17:15:52.892 13318 ERROR oslo_service.periodic_task TypeError: 'NoneType' object has no attribute '__getitem__' 2015-08-07 17:15:52.892 13318 ERROR oslo_service.periodic_task 2015-08-07 17:15:52.892 13318 ERROR oslo_service.periodic_task 2015-08-07 17:15:52.914 13318 WARNING oslo_service.loopingcall [-] Function > run outlasted interval by 0.17 sec 2015-08-07 17:15:52.914 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 0.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:15:52.915 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 4.54 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:15:53.600 DEBUG oslo_concurrency.processutils [req-a37011a3-c199-4a6c-94e5-0eca9e14fba0 tempest-ImagesTestJSON-351376176 tempest-ImagesTestJSON-223628044] CMD "genisoimage -o /tmp/tmpIJLu3G/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmpwuJFGD" returned: 0 in 3.982s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:15:53.603 DEBUG oslo_concurrency.processutils [req-a37011a3-c199-4a6c-94e5-0eca9e14fba0 tempest-ImagesTestJSON-351376176 tempest-ImagesTestJSON-223628044] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmpIJLu3G/configdrive of=/dev/xvdc oflag=direct,sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:15:55.635 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 7.28 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:15:57.573 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._instance_usage_audit 
run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:15:57.575 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 7.91 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:16:05.488 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:16:07.978 WARNING nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] While synchronizing instance power states, found 6 instances in the database and 4 instances on the hypervisor. 2015-08-07 17:16:07.979 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Triggering sync for uuid cf8ff930-deb3-448f-bcd3-f5a84fdad9b4 _sync_power_states /opt/stack/new/nova/nova/compute/manager.py:5780 2015-08-07 17:16:07.979 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Triggering sync for uuid fda3bb4d-ccb2-4500-8c24-8c38815626fa _sync_power_states /opt/stack/new/nova/nova/compute/manager.py:5780 2015-08-07 17:16:07.980 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Triggering sync for uuid 091727e3-644b-4029-98ad-5a102868d2d5 _sync_power_states /opt/stack/new/nova/nova/compute/manager.py:5780 2015-08-07 17:16:07.980 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Triggering sync for uuid 2af2868e-6009-4d8b-b3ee-90b4d32a3ddd _sync_power_states /opt/stack/new/nova/nova/compute/manager.py:5780 2015-08-07 17:16:07.980 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Triggering sync for uuid 696c794e-e4d8-4bd9-8b55-447342cd9b8f _sync_power_states /opt/stack/new/nova/nova/compute/manager.py:5780 2015-08-07 17:16:07.981 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Triggering sync for uuid 2d2f2690-ae7c-4e46-a28c-771da8b3c2b7 _sync_power_states /opt/stack/new/nova/nova/compute/manager.py:5780 2015-08-07 17:16:07.982 13318 DEBUG oslo_concurrency.lockutils [-] Lock "cf8ff930-deb3-448f-bcd3-f5a84fdad9b4" acquired by "query_driver_power_state_and_sync" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:16:07.982 13318 INFO nova.compute.manager [-] [instance: cf8ff930-deb3-448f-bcd3-f5a84fdad9b4] During sync_power_state the instance has a pending task (rebuild_spawning). Skip. 
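The _poll_bandwidth_usage traceback above fails on the remote (conductor) side of the call: nova/objects/bandwidth_usage.py line 42 indexes the database row it was given (bw_usage[field] = db_bw_usage['uuid']), but that row is None, and subscripting None on Python 2 raises TypeError: 'NoneType' object has no attribute '__getitem__'; oslo.messaging then re-raises the serialized error on the compute side, which is the "raise result" frame. A minimal sketch of that failure mode with a hypothetical guard; the helper names below are illustrative only, not the actual Nova object code:

    # Sketch only: why indexing a missing DB row produces the TypeError above.
    def from_db_object(bw_usage, db_bw_usage):
        # Equivalent of the 'uuid' assignment at bandwidth_usage.py:42; with
        # db_bw_usage = None this raises the TypeError seen in the log.
        bw_usage['uuid'] = db_bw_usage['uuid']
        return bw_usage

    def from_db_object_guarded(bw_usage, db_bw_usage):
        # Hypothetical defensive variant: surface a clearer error instead of
        # letting the None reach an item lookup.
        if db_bw_usage is None:
            raise ValueError('bandwidth usage create returned no DB row')
        bw_usage['uuid'] = db_bw_usage['uuid']
        return bw_usage

    if __name__ == '__main__':
        try:
            from_db_object({}, None)
        except TypeError as exc:
            # Python 2: 'NoneType' object has no attribute '__getitem__'
            # Python 3: 'NoneType' object is not subscriptable
            print(exc)
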
2015-08-07 17:16:07.983 13318 DEBUG oslo_concurrency.lockutils [-] Lock "cf8ff930-deb3-448f-bcd3-f5a84fdad9b4" released by "query_driver_power_state_and_sync" :: held 0.001s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:16:07.983 13318 DEBUG oslo_concurrency.lockutils [-] Lock "fda3bb4d-ccb2-4500-8c24-8c38815626fa" acquired by "query_driver_power_state_and_sync" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:16:07.986 13318 DEBUG oslo_concurrency.lockutils [-] Lock "091727e3-644b-4029-98ad-5a102868d2d5" acquired by "query_driver_power_state_and_sync" :: waited 0.002s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:16:07.987 13318 DEBUG oslo_concurrency.lockutils [-] Lock "2af2868e-6009-4d8b-b3ee-90b4d32a3ddd" acquired by "query_driver_power_state_and_sync" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:16:07.988 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 11.02 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:16:09.510 13318 DEBUG oslo_concurrency.lockutils [-] Lock "091727e3-644b-4029-98ad-5a102868d2d5" released by "query_driver_power_state_and_sync" :: held 1.523s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:16:09.653 13318 DEBUG oslo_concurrency.lockutils [-] Lock "fda3bb4d-ccb2-4500-8c24-8c38815626fa" released by "query_driver_power_state_and_sync" :: held 1.670s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:16:09.755 13318 DEBUG oslo_concurrency.lockutils [-] Lock "2af2868e-6009-4d8b-b3ee-90b4d32a3ddd" released by "query_driver_power_state_and_sync" :: held 1.767s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:16:15.212 13318 WARNING oslo_service.loopingcall [-] Function > run outlasted interval by 2.29 sec 2015-08-07 17:16:15.213 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 0.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:16:17.273 DEBUG nova.virt.xenapi.vmops [req-bd17772b-a4a6-46cb-97cb-6248f899fb43 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: cf8ff930-deb3-448f-bcd3-f5a84fdad9b4] Waiting for instance state to become running _wait_for_instance_to_start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:777 2015-08-07 17:16:17.857 DEBUG nova.virt.xenapi.vmops [req-bd17772b-a4a6-46cb-97cb-6248f899fb43 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: cf8ff930-deb3-448f-bcd3-f5a84fdad9b4] Updating progress to 80 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:16:18.395 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 6.82 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:16:19.013 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:16:19.014 DEBUG oslo_service.periodic_task 
[req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:16:19.015 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:16:19.095 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 0.42 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:16:19.100 DEBUG nova.virt.xenapi.vmops [req-bd17772b-a4a6-46cb-97cb-6248f899fb43 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: cf8ff930-deb3-448f-bcd3-f5a84fdad9b4] Skip agent setup, not enabled. _configure_new_instance_with_agent /opt/stack/new/nova/nova/virt/xenapi/vmops.py:788 2015-08-07 17:16:19.101 DEBUG nova.virt.xenapi.vmops [req-bd17772b-a4a6-46cb-97cb-6248f899fb43 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: cf8ff930-deb3-448f-bcd3-f5a84fdad9b4] Removing hostname from xenstore _remove_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1949 2015-08-07 17:16:19.102 DEBUG oslo_concurrency.lockutils [req-bd17772b-a4a6-46cb-97cb-6248f899fb43 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Lock "xenstore-cf8ff930-deb3-448f-bcd3-f5a84fdad9b4" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:16:19.116 DEBUG oslo_concurrency.lockutils [req-bd17772b-a4a6-46cb-97cb-6248f899fb43 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Lock "xenstore-cf8ff930-deb3-448f-bcd3-f5a84fdad9b4" released by "update_hostname" :: held 0.014s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:16:19.116 DEBUG nova.virt.xenapi.vmops [req-bd17772b-a4a6-46cb-97cb-6248f899fb43 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: cf8ff930-deb3-448f-bcd3-f5a84fdad9b4] Updating progress to 90 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:16:19.513 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:16:19.514 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Starting heal instance info cache _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5315 2015-08-07 17:16:19.727 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Acquired semaphore "refresh_cache-2af2868e-6009-4d8b-b3ee-90b4d32a3ddd" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:197 2015-08-07 17:16:19.956 DEBUG nova.network.base_api [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: 2af2868e-6009-4d8b-b3ee-90b4d32a3ddd] Updating instance_info_cache with network_info: [VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': 
u'fixed', 'floating_ips': [], 'address': u'10.1.0.7'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:e0:ca:df', 'active': False, 'type': u'bridge', 'id': u'3b0bca93-d9b8-4771-a84a-cc726fe3e3be', 'qbg_params': None})] update_instance_cache_with_nw_info /opt/stack/new/nova/nova/network/base_api.py:43 2015-08-07 17:16:19.989 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Releasing semaphore "refresh_cache-2af2868e-6009-4d8b-b3ee-90b4d32a3ddd" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:210 2015-08-07 17:16:19.990 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: 2af2868e-6009-4d8b-b3ee-90b4d32a3ddd] Updated the network info_cache for instance _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5376 2015-08-07 17:16:19.990 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 2.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:16:20.493 DEBUG nova.virt.xenapi.vmops [req-bd17772b-a4a6-46cb-97cb-6248f899fb43 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: cf8ff930-deb3-448f-bcd3-f5a84fdad9b4] Updating progress to 100 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:16:21.434 DEBUG nova.virt.xenapi.vmops [req-997cb2a8-d266-4f61-b877-adf05fb001d8 tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] [instance: 696c794e-e4d8-4bd9-8b55-447342cd9b8f] Waiting for instance state to become running _wait_for_instance_to_start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:777 2015-08-07 17:16:21.516 DEBUG nova.virt.xenapi.vmops [req-997cb2a8-d266-4f61-b877-adf05fb001d8 tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] [instance: 696c794e-e4d8-4bd9-8b55-447342cd9b8f] Updating progress to 80 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:16:21.905 DEBUG nova.compute.manager [req-bd17772b-a4a6-46cb-97cb-6248f899fb43 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: cf8ff930-deb3-448f-bcd3-f5a84fdad9b4] Checking state _get_power_state /opt/stack/new/nova/nova/compute/manager.py:1296 2015-08-07 17:16:21.990 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:16:21.991 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /opt/stack/new/nova/nova/compute/manager.py:5976 2015-08-07 17:16:21.992 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 0.52 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:16:22.305 DEBUG nova.virt.xenapi.vmops [req-997cb2a8-d266-4f61-b877-adf05fb001d8 tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] [instance: 696c794e-e4d8-4bd9-8b55-447342cd9b8f] Skip agent setup, not enabled. _configure_new_instance_with_agent /opt/stack/new/nova/nova/virt/xenapi/vmops.py:788 2015-08-07 17:16:22.305 DEBUG nova.virt.xenapi.vmops [req-997cb2a8-d266-4f61-b877-adf05fb001d8 tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] [instance: 696c794e-e4d8-4bd9-8b55-447342cd9b8f] Removing hostname from xenstore _remove_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1949 2015-08-07 17:16:22.306 DEBUG oslo_concurrency.lockutils [req-997cb2a8-d266-4f61-b877-adf05fb001d8 tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] Lock "xenstore-696c794e-e4d8-4bd9-8b55-447342cd9b8f" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:16:22.312 DEBUG oslo_concurrency.lockutils [req-997cb2a8-d266-4f61-b877-adf05fb001d8 tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] Lock "xenstore-696c794e-e4d8-4bd9-8b55-447342cd9b8f" released by "update_hostname" :: held 0.006s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:16:22.313 DEBUG nova.virt.xenapi.vmops [req-997cb2a8-d266-4f61-b877-adf05fb001d8 tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] [instance: 696c794e-e4d8-4bd9-8b55-447342cd9b8f] Updating progress to 90 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:16:22.520 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:16:22.594 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Auditing locally available compute resources for node localhost.localdomain 2015-08-07 17:16:22.594 DEBUG nova.virt.xenapi.host [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Updating host stats update_status /opt/stack/new/nova/nova/virt/xenapi/host.py:234 2015-08-07 17:16:22.985 DEBUG oslo_concurrency.lockutils [req-f94b166b-b133-4740-a226-9d488b4261bd tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Lock "bcf8fed4-5825-4f82-a05c-9338adad0cda" acquired by "_locked_do_build_and_run_instance" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:16:23.352 INFO nova.compute.manager [req-f94b166b-b133-4740-a226-9d488b4261bd tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: bcf8fed4-5825-4f82-a05c-9338adad0cda] Starting instance... 
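The recurring Lock "..." acquired by "..." :: waited / released by "..." :: held lines throughout this log (the per-instance UUID locks, "compute_resources", the "xenstore-<uuid>" locks, and the "_locked_do_build_and_run_instance" build path above) are oslo.concurrency's lockutils reporting on its synchronized sections. A minimal sketch of the pattern, assuming only that oslo.concurrency is installed; the function and values below are illustrative, not Nova's actual code:

    import logging

    from oslo_concurrency import lockutils

    logging.basicConfig(level=logging.DEBUG)

    # Illustrative only: serialize callers on the same "compute_resources" lock
    # name seen in the log; lockutils logs how long each caller waited for the
    # lock and how long the decorated function held it.
    @lockutils.synchronized('compute_resources')
    def instance_claim(memory_mb, vcpus):
        return {'memory_mb': memory_mb, 'vcpus': vcpus}

    if __name__ == '__main__':
        instance_claim(69, 1)   # emits acquire/release DEBUG lines for "compute_resources"

The per-instance variant simply uses the instance UUID as the lock name, which is why the hold times reported for "_locked_do_build_and_run_instance" in this log can cover an entire build (for example 195.983s further down).
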
2015-08-07 17:16:23.384 DEBUG nova.virt.xenapi.vmops [req-997cb2a8-d266-4f61-b877-adf05fb001d8 tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] [instance: 696c794e-e4d8-4bd9-8b55-447342cd9b8f] Updating progress to 100 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:16:23.552 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" acquired by "do_scan" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:16:23.573 DEBUG nova.virt.xenapi.vm_utils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Scanning SR OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99 do_scan /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1839 2015-08-07 17:16:23.764 DEBUG oslo_concurrency.lockutils [req-f94b166b-b133-4740-a226-9d488b4261bd tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Lock "compute_resources" acquired by "instance_claim" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:16:23.764 DEBUG nova.compute.resource_tracker [req-f94b166b-b133-4740-a226-9d488b4261bd tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Memory overhead for 64 MB instance; 5 MB instance_claim /opt/stack/new/nova/nova/compute/resource_tracker.py:126 2015-08-07 17:16:23.773 INFO nova.compute.claims [req-f94b166b-b133-4740-a226-9d488b4261bd tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: bcf8fed4-5825-4f82-a05c-9338adad0cda] Attempting claim: memory 69 MB, disk 0 GB 2015-08-07 17:16:23.774 INFO nova.compute.claims [req-f94b166b-b133-4740-a226-9d488b4261bd tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: bcf8fed4-5825-4f82-a05c-9338adad0cda] Total memory: 8187 MB, used: 926.00 MB 2015-08-07 17:16:23.774 INFO nova.compute.claims [req-f94b166b-b133-4740-a226-9d488b4261bd tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: bcf8fed4-5825-4f82-a05c-9338adad0cda] memory limit: 12280.50 MB, free: 11354.50 MB 2015-08-07 17:16:23.775 INFO nova.compute.claims [req-f94b166b-b133-4740-a226-9d488b4261bd tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: bcf8fed4-5825-4f82-a05c-9338adad0cda] Total disk: 27 GB, used: 0.00 GB 2015-08-07 17:16:23.775 INFO nova.compute.claims [req-f94b166b-b133-4740-a226-9d488b4261bd tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: bcf8fed4-5825-4f82-a05c-9338adad0cda] disk limit not specified, defaulting to unlimited 2015-08-07 17:16:23.800 DEBUG nova.compute.resources.vcpu [req-f94b166b-b133-4740-a226-9d488b4261bd tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Total CPUs: 8 VCPUs, used: 6.00 VCPUs test /opt/stack/new/nova/nova/compute/resources/vcpu.py:52 2015-08-07 17:16:23.800 DEBUG nova.compute.resources.vcpu [req-f94b166b-b133-4740-a226-9d488b4261bd tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] CPUs limit not specified, defaulting to unlimited test /opt/stack/new/nova/nova/compute/resources/vcpu.py:56 2015-08-07 17:16:23.801 INFO nova.compute.claims [req-f94b166b-b133-4740-a226-9d488b4261bd tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: 
bcf8fed4-5825-4f82-a05c-9338adad0cda] Claim successful 2015-08-07 17:16:23.946 DEBUG nova.compute.manager [req-997cb2a8-d266-4f61-b877-adf05fb001d8 tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] [instance: 696c794e-e4d8-4bd9-8b55-447342cd9b8f] Checking state _get_power_state /opt/stack/new/nova/nova/compute/manager.py:1296 2015-08-07 17:16:25.135 DEBUG oslo_concurrency.lockutils [req-f94b166b-b133-4740-a226-9d488b4261bd tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Lock "compute_resources" released by "instance_claim" :: held 1.371s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:16:25.146 DEBUG oslo_concurrency.lockutils [req-997cb2a8-d266-4f61-b877-adf05fb001d8 tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] Lock "696c794e-e4d8-4bd9-8b55-447342cd9b8f" released by "_locked_do_build_and_run_instance" :: held 195.983s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:16:25.147 13318 DEBUG oslo_concurrency.lockutils [-] Lock "696c794e-e4d8-4bd9-8b55-447342cd9b8f" acquired by "query_driver_power_state_and_sync" :: waited 17.159s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:16:25.148 13318 INFO nova.compute.manager [-] [instance: 696c794e-e4d8-4bd9-8b55-447342cd9b8f] During sync_power_state the instance has a pending task (spawning). Skip. 2015-08-07 17:16:25.148 13318 DEBUG oslo_concurrency.lockutils [-] Lock "696c794e-e4d8-4bd9-8b55-447342cd9b8f" released by "query_driver_power_state_and_sync" :: held 0.001s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:16:25.321 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.89 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:16:25.841 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" released by "do_scan" :: held 2.289s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:16:25.894 DEBUG oslo_concurrency.lockutils [req-f94b166b-b133-4740-a226-9d488b4261bd tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Lock "compute_resources" acquired by "update_usage" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:16:26.333 DEBUG oslo_concurrency.lockutils [req-f94b166b-b133-4740-a226-9d488b4261bd tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Lock "compute_resources" released by "update_usage" :: held 0.439s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:16:26.334 DEBUG nova.compute.utils [req-f94b166b-b133-4740-a226-9d488b4261bd tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Using /dev/xvd instead of None get_next_device_name /opt/stack/new/nova/nova/compute/utils.py:163 2015-08-07 17:16:26.338 13318 DEBUG nova.compute.manager [-] [instance: bcf8fed4-5825-4f82-a05c-9338adad0cda] Allocating IP information in the background. 
_allocate_network_async /opt/stack/new/nova/nova/compute/manager.py:1513 2015-08-07 17:16:26.339 13318 DEBUG oslo_concurrency.lockutils [-] Acquired semaphore "refresh_cache-bcf8fed4-5825-4f82-a05c-9338adad0cda" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:197 2015-08-07 17:16:26.991 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor: free VCPUs: -3 _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:524 2015-08-07 17:16:26.992 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor: assignable PCI devices: [] _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:532 2015-08-07 17:16:26.993 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor/Node resource view: name=localhost.localdomain free_ram=654MB free_disk=16GB free_vcpus=-3 pci_devices=[] _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:546 2015-08-07 17:16:26.994 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "compute_resources" acquired by "_update_available_resource" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:16:28.434 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Total usable vcpus: 8, total allocated vcpus: 11 2015-08-07 17:16:28.435 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Final resource view: name=localhost.localdomain phys_ram=8187MB used_ram=995MB phys_disk=27GB used_disk=0GB total_vcpus=8 used_vcpus=11 pci_stats=None 2015-08-07 17:16:29.200 DEBUG nova.virt.xenapi.vmops [req-f94b166b-b133-4740-a226-9d488b4261bd tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: bcf8fed4-5825-4f82-a05c-9338adad0cda] Block device information present: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} spawn /opt/stack/new/nova/nova/virt/xenapi/vmops.py:408 2015-08-07 17:16:29.202 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Compute_service record updated for devstack:localhost.localdomain 2015-08-07 17:16:29.203 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "compute_resources" released by "_update_available_resource" :: held 2.209s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:16:29.204 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:16:29.204 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:16:29.206 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 8.99 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:16:29.225 DEBUG nova.virt.xenapi.vm_utils [req-f94b166b-b133-4740-a226-9d488b4261bd tempest-ServersAdminTestJSON-1589181083 
tempest-ServersAdminTestJSON-1030271038] Detected vhd format for image a69d8d55-1745-492f-be26-cc75d64fc94d determine_disk_image_type /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1633 2015-08-07 17:16:29.226 DEBUG nova.virt.xenapi.vmops [req-f94b166b-b133-4740-a226-9d488b4261bd tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: bcf8fed4-5825-4f82-a05c-9338adad0cda] Updating progress to 10 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:16:29.675 DEBUG nova.virt.xenapi.vmops [req-f94b166b-b133-4740-a226-9d488b4261bd tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: bcf8fed4-5825-4f82-a05c-9338adad0cda] block device info: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} _connect_cinder_volumes /opt/stack/new/nova/nova/virt/xenapi/vmops.py:376 2015-08-07 17:16:29.737 DEBUG oslo_concurrency.lockutils [req-f94b166b-b133-4740-a226-9d488b4261bd tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" acquired by "_create_cached_image_impl" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:16:37.127 DEBUG nova.virt.xenapi.vm_utils [req-f94b166b-b133-4740-a226-9d488b4261bd tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Cloned VDI OpaqueRef:5eb6e155-f136-03a8-e40e-10365518570b from VDI OpaqueRef:f392c64e-7886-a0cc-d3a0-780674167307 _clone_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:571 2015-08-07 17:16:37.333 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 7.88 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:16:38.208 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:16:38.212 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 38.30 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:16:44.476 13318 DEBUG nova.network.base_api [-] [instance: bcf8fed4-5825-4f82-a05c-9338adad0cda] Updating instance_info_cache with network_info: [VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.8'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:23:ed:ac', 'active': 
False, 'type': u'bridge', 'id': u'2dbfc884-5fcf-4dd0-8922-a32230655335', 'qbg_params': None})] update_instance_cache_with_nw_info /opt/stack/new/nova/nova/network/base_api.py:43 2015-08-07 17:16:44.515 13318 DEBUG oslo_concurrency.lockutils [-] Releasing semaphore "refresh_cache-bcf8fed4-5825-4f82-a05c-9338adad0cda" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:210 2015-08-07 17:16:44.516 13318 DEBUG nova.compute.manager [-] [instance: bcf8fed4-5825-4f82-a05c-9338adad0cda] Instance network_info: |[VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.8'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:23:ed:ac', 'active': False, 'type': u'bridge', 'id': u'2dbfc884-5fcf-4dd0-8922-a32230655335', 'qbg_params': None})]| _allocate_network_async /opt/stack/new/nova/nova/compute/manager.py:1531 2015-08-07 17:16:45.969 DEBUG oslo_concurrency.lockutils [req-f94b166b-b133-4740-a226-9d488b4261bd tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" released by "_create_cached_image_impl" :: held 16.232s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:16:45.970 INFO nova.virt.xenapi.vm_utils [req-f94b166b-b133-4740-a226-9d488b4261bd tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Image creation data, cacheable: True, downloaded: False duration: 16.29 secs for image a69d8d55-1745-492f-be26-cc75d64fc94d 2015-08-07 17:16:47.287 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 7.93 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:16:52.573 DEBUG oslo_concurrency.processutils [req-a37011a3-c199-4a6c-94e5-0eca9e14fba0 tempest-ImagesTestJSON-351376176 tempest-ImagesTestJSON-223628044] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmpIJLu3G/configdrive of=/dev/xvdc oflag=direct,sync" returned: 0 in 58.970s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:16:52.575 DEBUG oslo_concurrency.processutils [req-a37011a3-c199-4a6c-94e5-0eca9e14fba0 tempest-ImagesTestJSON-351376176 tempest-ImagesTestJSON-223628044] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:16:53.159 DEBUG nova.virt.xenapi.vmops [req-f94b166b-b133-4740-a226-9d488b4261bd tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: bcf8fed4-5825-4f82-a05c-9338adad0cda] Updating 
progress to 20 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:16:53.560 DEBUG nova.virt.xenapi.vmops [req-f94b166b-b133-4740-a226-9d488b4261bd tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: bcf8fed4-5825-4f82-a05c-9338adad0cda] Updating progress to 30 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:16:54.323 DEBUG nova.virt.xenapi.vmops [req-f94b166b-b133-4740-a226-9d488b4261bd tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: bcf8fed4-5825-4f82-a05c-9338adad0cda] Using PV kernel: True _create_vm_record /opt/stack/new/nova/nova/virt/xenapi/vmops.py:679 2015-08-07 17:16:54.350 DEBUG nova.virt.xenapi.vm_utils [req-f94b166b-b133-4740-a226-9d488b4261bd tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: bcf8fed4-5825-4f82-a05c-9338adad0cda] Created VM create_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:312 2015-08-07 17:16:54.351 DEBUG nova.virt.xenapi.vmops [req-f94b166b-b133-4740-a226-9d488b4261bd tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: bcf8fed4-5825-4f82-a05c-9338adad0cda] Updating progress to 40 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:16:54.408 DEBUG oslo_concurrency.processutils [req-a37011a3-c199-4a6c-94e5-0eca9e14fba0 tempest-ImagesTestJSON-351376176 tempest-ImagesTestJSON-223628044] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf sync" returned: 0 in 1.833s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:16:54.435 DEBUG nova.virt.xenapi.vm_utils [req-a37011a3-c199-4a6c-94e5-0eca9e14fba0 tempest-ImagesTestJSON-351376176 tempest-ImagesTestJSON-223628044] Destroying VBD for VDI OpaqueRef:a59084b2-5cf0-d43d-504a-f03a4ed4232f ... vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2207 2015-08-07 17:16:54.436 DEBUG oslo_concurrency.lockutils [req-a37011a3-c199-4a6c-94e5-0eca9e14fba0 tempest-ImagesTestJSON-351376176 tempest-ImagesTestJSON-223628044] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_unplug" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:16:54.856 DEBUG nova.virt.xenapi.vm_utils [req-f94b166b-b133-4740-a226-9d488b4261bd tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Creating disk-type VBD for VM OpaqueRef:bd55c637-4be0-5ece-adf6-3207530069ca, VDI OpaqueRef:5eb6e155-f136-03a8-e40e-10365518570b ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:16:54.977 DEBUG nova.virt.xenapi.vm_utils [req-f94b166b-b133-4740-a226-9d488b4261bd tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Created VBD OpaqueRef:0938a19c-2ad1-da25-02ef-5e84c5a786ec for VM OpaqueRef:bd55c637-4be0-5ece-adf6-3207530069ca, VDI OpaqueRef:5eb6e155-f136-03a8-e40e-10365518570b. 
create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:16:56.373 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 8.98 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:16:57.678 DEBUG nova.virt.xenapi.vm_utils [req-f94b166b-b133-4740-a226-9d488b4261bd tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Created VDI OpaqueRef:c2276286-2ef5-8fa5-546c-73c14a64fe82 (config-2, 67108864, False) on OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99. create_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:509 2015-08-07 17:16:58.081 DEBUG nova.virt.xenapi.vm_utils [req-f94b166b-b133-4740-a226-9d488b4261bd tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Creating disk-type VBD for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:c2276286-2ef5-8fa5-546c-73c14a64fe82 ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:16:58.111 DEBUG nova.virt.xenapi.vm_utils [req-f94b166b-b133-4740-a226-9d488b4261bd tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Created VBD OpaqueRef:3b1973f9-319a-a912-7824-1b97fa04692a for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:c2276286-2ef5-8fa5-546c-73c14a64fe82. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:16:58.111 DEBUG nova.virt.xenapi.vm_utils [req-f94b166b-b133-4740-a226-9d488b4261bd tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Plugging VBD OpaqueRef:3b1973f9-319a-a912-7824-1b97fa04692a ... vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2191 2015-08-07 17:16:59.919 DEBUG oslo_concurrency.lockutils [req-a37011a3-c199-4a6c-94e5-0eca9e14fba0 tempest-ImagesTestJSON-351376176 tempest-ImagesTestJSON-223628044] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_unplug" :: held 5.483s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:16:59.921 DEBUG oslo_concurrency.lockutils [req-f94b166b-b133-4740-a226-9d488b4261bd tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_plug" :: waited 1.809s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:16:59.945 DEBUG nova.virt.xenapi.vm_utils [req-a37011a3-c199-4a6c-94e5-0eca9e14fba0 tempest-ImagesTestJSON-351376176 tempest-ImagesTestJSON-223628044] Destroying VBD for VDI OpaqueRef:a59084b2-5cf0-d43d-504a-f03a4ed4232f done. vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2215 2015-08-07 17:16:59.945 DEBUG nova.virt.xenapi.vm_utils [req-a37011a3-c199-4a6c-94e5-0eca9e14fba0 tempest-ImagesTestJSON-351376176 tempest-ImagesTestJSON-223628044] Creating disk-type VBD for VM OpaqueRef:c0d0bc63-982f-0116-d5f4-80ba47173e4b, VDI OpaqueRef:a59084b2-5cf0-d43d-504a-f03a4ed4232f ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:16:59.969 DEBUG nova.virt.xenapi.vm_utils [req-a37011a3-c199-4a6c-94e5-0eca9e14fba0 tempest-ImagesTestJSON-351376176 tempest-ImagesTestJSON-223628044] Created VBD OpaqueRef:28f81100-d3e7-8769-0f03-3baabc47d740 for VM OpaqueRef:c0d0bc63-982f-0116-d5f4-80ba47173e4b, VDI OpaqueRef:a59084b2-5cf0-d43d-504a-f03a4ed4232f. 
create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:16:59.970 DEBUG nova.objects.instance [req-a37011a3-c199-4a6c-94e5-0eca9e14fba0 tempest-ImagesTestJSON-351376176 tempest-ImagesTestJSON-223628044] Lazy-loading `pci_devices' on Instance uuid 2d2f2690-ae7c-4e46-a28c-771da8b3c2b7 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:17:00.240 DEBUG nova.virt.xenapi.vmops [req-a37011a3-c199-4a6c-94e5-0eca9e14fba0 tempest-ImagesTestJSON-351376176 tempest-ImagesTestJSON-223628044] [instance: 2d2f2690-ae7c-4e46-a28c-771da8b3c2b7] Updating progress to 50 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:17:00.769 DEBUG oslo_concurrency.lockutils [req-a37011a3-c199-4a6c-94e5-0eca9e14fba0 tempest-ImagesTestJSON-351376176 tempest-ImagesTestJSON-223628044] Lock "xenstore-2d2f2690-ae7c-4e46-a28c-771da8b3c2b7" acquired by "store_meta" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:17:00.770 DEBUG oslo_concurrency.lockutils [req-a37011a3-c199-4a6c-94e5-0eca9e14fba0 tempest-ImagesTestJSON-351376176 tempest-ImagesTestJSON-223628044] Lock "xenstore-2d2f2690-ae7c-4e46-a28c-771da8b3c2b7" released by "store_meta" :: held 0.001s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:17:00.770 DEBUG oslo_concurrency.lockutils [req-a37011a3-c199-4a6c-94e5-0eca9e14fba0 tempest-ImagesTestJSON-351376176 tempest-ImagesTestJSON-223628044] Lock "xenstore-2d2f2690-ae7c-4e46-a28c-771da8b3c2b7" acquired by "store_auto_disk_config" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:17:00.785 DEBUG oslo_concurrency.lockutils [req-a37011a3-c199-4a6c-94e5-0eca9e14fba0 tempest-ImagesTestJSON-351376176 tempest-ImagesTestJSON-223628044] Lock "xenstore-2d2f2690-ae7c-4e46-a28c-771da8b3c2b7" released by "store_auto_disk_config" :: held 0.015s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:17:00.786 DEBUG nova.virt.xenapi.vmops [req-a37011a3-c199-4a6c-94e5-0eca9e14fba0 tempest-ImagesTestJSON-351376176 tempest-ImagesTestJSON-223628044] [instance: 2d2f2690-ae7c-4e46-a28c-771da8b3c2b7] Injecting hostname (tempest.common.compute-instance-1428168797) into xenstore _inject_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1940 2015-08-07 17:17:00.787 DEBUG oslo_concurrency.lockutils [req-a37011a3-c199-4a6c-94e5-0eca9e14fba0 tempest-ImagesTestJSON-351376176 tempest-ImagesTestJSON-223628044] Lock "xenstore-2d2f2690-ae7c-4e46-a28c-771da8b3c2b7" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:17:00.819 DEBUG oslo_concurrency.lockutils [req-a37011a3-c199-4a6c-94e5-0eca9e14fba0 tempest-ImagesTestJSON-351376176 tempest-ImagesTestJSON-223628044] Lock "xenstore-2d2f2690-ae7c-4e46-a28c-771da8b3c2b7" released by "update_hostname" :: held 0.032s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:17:00.820 DEBUG nova.virt.xenapi.vmops [req-a37011a3-c199-4a6c-94e5-0eca9e14fba0 tempest-ImagesTestJSON-351376176 tempest-ImagesTestJSON-223628044] [instance: 2d2f2690-ae7c-4e46-a28c-771da8b3c2b7] Injecting network info to xenstore inject_network_info /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1869 2015-08-07 17:17:00.820 DEBUG oslo_concurrency.lockutils [req-a37011a3-c199-4a6c-94e5-0eca9e14fba0 tempest-ImagesTestJSON-351376176 
tempest-ImagesTestJSON-223628044] Lock "xenstore-2d2f2690-ae7c-4e46-a28c-771da8b3c2b7" acquired by "update_nwinfo" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:17:01.346 DEBUG oslo_concurrency.lockutils [req-a37011a3-c199-4a6c-94e5-0eca9e14fba0 tempest-ImagesTestJSON-351376176 tempest-ImagesTestJSON-223628044] Lock "xenstore-2d2f2690-ae7c-4e46-a28c-771da8b3c2b7" released by "update_nwinfo" :: held 0.525s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:17:01.347 DEBUG nova.virt.xenapi.vmops [req-a37011a3-c199-4a6c-94e5-0eca9e14fba0 tempest-ImagesTestJSON-351376176 tempest-ImagesTestJSON-223628044] [instance: 2d2f2690-ae7c-4e46-a28c-771da8b3c2b7] Updating progress to 60 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:17:01.668 DEBUG nova.virt.xenapi.vmops [req-a37011a3-c199-4a6c-94e5-0eca9e14fba0 tempest-ImagesTestJSON-351376176 tempest-ImagesTestJSON-223628044] [instance: 2d2f2690-ae7c-4e46-a28c-771da8b3c2b7] Creating vifs _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1892 2015-08-07 17:17:01.702 DEBUG nova.virt.xenapi.vmops [req-a37011a3-c199-4a6c-94e5-0eca9e14fba0 tempest-ImagesTestJSON-351376176 tempest-ImagesTestJSON-223628044] [instance: 2d2f2690-ae7c-4e46-a28c-771da8b3c2b7] Creating VIF for network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1902 2015-08-07 17:17:01.710 DEBUG nova.virt.xenapi.vmops [req-a37011a3-c199-4a6c-94e5-0eca9e14fba0 tempest-ImagesTestJSON-351376176 tempest-ImagesTestJSON-223628044] [instance: 2d2f2690-ae7c-4e46-a28c-771da8b3c2b7] Created VIF OpaqueRef:92767f2b-8bd3-0bb1-54ce-938c81bc4d6a, network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1906 2015-08-07 17:17:01.712 DEBUG nova.virt.xenapi.vmops [req-a37011a3-c199-4a6c-94e5-0eca9e14fba0 tempest-ImagesTestJSON-351376176 tempest-ImagesTestJSON-223628044] [instance: 2d2f2690-ae7c-4e46-a28c-771da8b3c2b7] Updating progress to 70 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:17:06.337 DEBUG nova.virt.xenapi.vmops [req-a37011a3-c199-4a6c-94e5-0eca9e14fba0 tempest-ImagesTestJSON-351376176 tempest-ImagesTestJSON-223628044] [instance: 2d2f2690-ae7c-4e46-a28c-771da8b3c2b7] Starting instance _start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:327 2015-08-07 17:17:06.949 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 8.40 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:17:16.251 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.10 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:17:16.520 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:17:16.554 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 1.95 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:17:16.596 DEBUG oslo_concurrency.lockutils [req-f94b166b-b133-4740-a226-9d488b4261bd tempest-ServersAdminTestJSON-1589181083 
tempest-ServersAdminTestJSON-1030271038] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_plug" :: held 16.675s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:17:16.597 DEBUG nova.virt.xenapi.vm_utils [req-f94b166b-b133-4740-a226-9d488b4261bd tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Plugging VBD OpaqueRef:3b1973f9-319a-a912-7824-1b97fa04692a done. vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2194 2015-08-07 17:17:16.608 DEBUG nova.virt.xenapi.vm_utils [req-f94b166b-b133-4740-a226-9d488b4261bd tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] VBD OpaqueRef:3b1973f9-319a-a912-7824-1b97fa04692a plugged as xvdd vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2197 2015-08-07 17:17:16.718 DEBUG oslo_concurrency.lockutils [req-fea2742a-fb30-44a0-9947-f5086b1faa4a tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] Lock "2af2868e-6009-4d8b-b3ee-90b4d32a3ddd" acquired by "do_terminate_instance" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:17:16.719 DEBUG oslo_concurrency.lockutils [req-fea2742a-fb30-44a0-9947-f5086b1faa4a tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] Lock "2af2868e-6009-4d8b-b3ee-90b4d32a3ddd-events" acquired by "_clear_events" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:17:16.719 DEBUG oslo_concurrency.lockutils [req-fea2742a-fb30-44a0-9947-f5086b1faa4a tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] Lock "2af2868e-6009-4d8b-b3ee-90b4d32a3ddd-events" released by "_clear_events" :: held 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:17:16.722 INFO nova.compute.manager [req-fea2742a-fb30-44a0-9947-f5086b1faa4a tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] [instance: 2af2868e-6009-4d8b-b3ee-90b4d32a3ddd] Terminating instance 2015-08-07 17:17:16.724 INFO nova.virt.xenapi.vmops [req-fea2742a-fb30-44a0-9947-f5086b1faa4a tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] [instance: 2af2868e-6009-4d8b-b3ee-90b4d32a3ddd] Destroying VM 2015-08-07 17:17:16.790 DEBUG nova.virt.xenapi.vm_utils [req-fea2742a-fb30-44a0-9947-f5086b1faa4a tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] [instance: 2af2868e-6009-4d8b-b3ee-90b4d32a3ddd] Shutting down VM (hard) hard_shutdown_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:348 2015-08-07 17:17:16.826 WARNING nova.virt.configdrive [req-f94b166b-b133-4740-a226-9d488b4261bd tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] The setting "always" will be deprecated in the Liberty version. 
Please use "True" instead 2015-08-07 17:17:16.845 DEBUG nova.objects.instance [req-f94b166b-b133-4740-a226-9d488b4261bd tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Lazy-loading `ec2_ids' on Instance uuid bcf8fed4-5825-4f82-a05c-9338adad0cda obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:17:17.000 DEBUG oslo_concurrency.processutils [req-f94b166b-b133-4740-a226-9d488b4261bd tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Running cmd (subprocess): genisoimage -o /tmp/tmpNd1Dn8/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmpHVvSis execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:17:19.978 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:17:19.981 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:17:19.982 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Starting heal instance info cache _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5315 2015-08-07 17:17:19.982 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Rebuilding the list of instances to heal _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5319 2015-08-07 17:17:20.708 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: 2af2868e-6009-4d8b-b3ee-90b4d32a3ddd] Skipping network cache update for instance because it is being deleted. _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5333 2015-08-07 17:17:21.871 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: 696c794e-e4d8-4bd9-8b55-447342cd9b8f] Skipping network cache update for instance because it is being deleted. _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5333 2015-08-07 17:17:21.871 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: 2d2f2690-ae7c-4e46-a28c-771da8b3c2b7] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5329 2015-08-07 17:17:21.872 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: bcf8fed4-5825-4f82-a05c-9338adad0cda] Skipping network cache update for instance because it is Building. 
_heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5329 2015-08-07 17:17:21.874 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Acquired semaphore "refresh_cache-cf8ff930-deb3-448f-bcd3-f5a84fdad9b4" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:197 2015-08-07 17:17:21.874 DEBUG nova.objects.instance [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lazy-loading `flavor' on Instance uuid cf8ff930-deb3-448f-bcd3-f5a84fdad9b4 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:17:21.877 DEBUG oslo_concurrency.processutils [req-f94b166b-b133-4740-a226-9d488b4261bd tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] CMD "genisoimage -o /tmp/tmpNd1Dn8/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmpHVvSis" returned: 0 in 4.877s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:17:21.881 DEBUG oslo_concurrency.processutils [req-f94b166b-b133-4740-a226-9d488b4261bd tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmpNd1Dn8/configdrive of=/dev/xvdd oflag=direct,sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:17:24.921 DEBUG nova.network.base_api [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: cf8ff930-deb3-448f-bcd3-f5a84fdad9b4] Updating instance_info_cache with network_info: [VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.2'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:4b:61:5b', 'active': False, 'type': u'bridge', 'id': u'1d3147dc-aab6-4dd3-a93a-a902661ccb6b', 'qbg_params': None})] update_instance_cache_with_nw_info /opt/stack/new/nova/nova/network/base_api.py:43 2015-08-07 17:17:25.056 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Releasing semaphore "refresh_cache-cf8ff930-deb3-448f-bcd3-f5a84fdad9b4" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:210 2015-08-07 17:17:25.057 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: cf8ff930-deb3-448f-bcd3-f5a84fdad9b4] Updated the network info_cache for instance _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5376 2015-08-07 17:17:25.058 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._sync_scheduler_instance_info 
run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:17:25.272 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 0.53 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:17:25.815 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:17:25.822 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /opt/stack/new/nova/nova/compute/manager.py:5976 2015-08-07 17:17:25.822 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:17:26.043 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Auditing locally available compute resources for node localhost.localdomain 2015-08-07 17:17:26.044 DEBUG nova.virt.xenapi.host [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Updating host stats update_status /opt/stack/new/nova/nova/virt/xenapi/host.py:234 2015-08-07 17:17:28.435 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 6.92 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:17:29.164 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" acquired by "do_scan" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:17:29.164 DEBUG nova.virt.xenapi.vm_utils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Scanning SR OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99 do_scan /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1839 2015-08-07 17:17:30.038 DEBUG oslo_concurrency.lockutils [req-68a8691e-014c-4202-84ea-6964ef9f03a2 tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] Lock "696c794e-e4d8-4bd9-8b55-447342cd9b8f" acquired by "do_terminate_instance" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:17:30.040 DEBUG oslo_concurrency.lockutils [req-68a8691e-014c-4202-84ea-6964ef9f03a2 tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] Lock "696c794e-e4d8-4bd9-8b55-447342cd9b8f-events" acquired by "_clear_events" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:17:30.042 DEBUG oslo_concurrency.lockutils [req-68a8691e-014c-4202-84ea-6964ef9f03a2 tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] Lock "696c794e-e4d8-4bd9-8b55-447342cd9b8f-events" released by "_clear_events" :: held 0.002s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:17:30.044 INFO nova.compute.manager [req-68a8691e-014c-4202-84ea-6964ef9f03a2 tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] [instance: 696c794e-e4d8-4bd9-8b55-447342cd9b8f] Terminating instance 2015-08-07 17:17:30.046 INFO nova.virt.xenapi.vmops 
[req-68a8691e-014c-4202-84ea-6964ef9f03a2 tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] [instance: 696c794e-e4d8-4bd9-8b55-447342cd9b8f] Destroying VM 2015-08-07 17:17:30.066 DEBUG nova.virt.xenapi.vm_utils [req-68a8691e-014c-4202-84ea-6964ef9f03a2 tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] [instance: 696c794e-e4d8-4bd9-8b55-447342cd9b8f] Shutting down VM (hard) hard_shutdown_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:348 2015-08-07 17:17:34.373 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" released by "do_scan" :: held 5.209s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:17:35.058 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor: free VCPUs: -3 _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:524 2015-08-07 17:17:35.059 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor: assignable PCI devices: [] _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:532 2015-08-07 17:17:35.059 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor/Node resource view: name=localhost.localdomain free_ram=587MB free_disk=16GB free_vcpus=-3 pci_devices=[] _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:546 2015-08-07 17:17:35.060 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "compute_resources" acquired by "_update_available_resource" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:17:35.608 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.75 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:17:35.965 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Total usable vcpus: 8, total allocated vcpus: 11 2015-08-07 17:17:35.965 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Final resource view: name=localhost.localdomain phys_ram=8187MB used_ram=995MB phys_disk=27GB used_disk=0GB total_vcpus=8 used_vcpus=11 pci_stats=None 2015-08-07 17:17:36.522 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Compute_service record updated for devstack:localhost.localdomain 2015-08-07 17:17:36.523 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "compute_resources" released by "_update_available_resource" :: held 1.463s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:17:36.523 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:17:36.524 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:17:36.524 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 
None None] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:17:36.525 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 6.70 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:17:43.237 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:17:43.240 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 34.27 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:17:46.302 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.06 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:17:57.323 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 8.04 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:18:01.909 DEBUG nova.virt.xenapi.vmops [req-fea2742a-fb30-44a0-9947-f5086b1faa4a tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] [instance: 2af2868e-6009-4d8b-b3ee-90b4d32a3ddd] Destroying VDIs _destroy_vdis /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1428 2015-08-07 17:18:02.150 DEBUG nova.virt.xenapi.vm_utils [req-fea2742a-fb30-44a0-9947-f5086b1faa4a tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] VDI 20021ae3-8607-41ab-98a7-b3dc860c7f34 is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:18:02.195 DEBUG nova.virt.xenapi.vm_utils [req-fea2742a-fb30-44a0-9947-f5086b1faa4a tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] VDI c15489af-df57-40e9-8b98-4ddef1b67d78 is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:18:11.944 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 3.42 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:18:16.398 DEBUG nova.virt.xenapi.vmops [req-fea2742a-fb30-44a0-9947-f5086b1faa4a tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] [instance: 2af2868e-6009-4d8b-b3ee-90b4d32a3ddd] Using RAW or VHD, skipping kernel and ramdisk deletion _destroy_kernel_ramdisk /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1456 2015-08-07 17:18:16.503 DEBUG nova.virt.xenapi.vm_utils [req-fea2742a-fb30-44a0-9947-f5086b1faa4a tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] [instance: 2af2868e-6009-4d8b-b3ee-90b4d32a3ddd] VM destroyed destroy_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:324 2015-08-07 17:18:16.504 DEBUG nova.compute.manager [req-fea2742a-fb30-44a0-9947-f5086b1faa4a tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] [instance: 2af2868e-6009-4d8b-b3ee-90b4d32a3ddd] Deallocating network for instance _deallocate_network /opt/stack/new/nova/nova/compute/manager.py:1792 2015-08-07 17:18:16.580 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 8.79 seconds _run_loop 
/usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:18:17.591 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:18:17.597 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 1.92 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:18:19.307 DEBUG nova.virt.xenapi.vmops [req-68a8691e-014c-4202-84ea-6964ef9f03a2 tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] [instance: 696c794e-e4d8-4bd9-8b55-447342cd9b8f] Destroying VDIs _destroy_vdis /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1428 2015-08-07 17:18:19.779 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:18:19.779 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Starting heal instance info cache _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5315 2015-08-07 17:18:20.032 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Acquired semaphore "refresh_cache-fda3bb4d-ccb2-4500-8c24-8c38815626fa" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:197 2015-08-07 17:18:20.138 DEBUG nova.virt.xenapi.vm_utils [req-68a8691e-014c-4202-84ea-6964ef9f03a2 tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] VDI 2fe94b9d-303a-4b8b-805e-bb1d9ef20482 is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:18:20.150 DEBUG nova.virt.xenapi.vm_utils [req-68a8691e-014c-4202-84ea-6964ef9f03a2 tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] VDI fbc38591-6b51-43fb-aba5-57375366c1db is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:18:20.564 DEBUG nova.network.base_api [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: fda3bb4d-ccb2-4500-8c24-8c38815626fa] Updating instance_info_cache with network_info: [VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.5'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:46:1f:75', 'active': False, 'type': u'bridge', 'id': u'56b2e3c5-379b-4990-a2f4-5ae5440c9c97', 'qbg_params': None})] 
update_instance_cache_with_nw_info /opt/stack/new/nova/nova/network/base_api.py:43 2015-08-07 17:18:20.591 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Releasing semaphore "refresh_cache-fda3bb4d-ccb2-4500-8c24-8c38815626fa" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:210 2015-08-07 17:18:20.592 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: fda3bb4d-ccb2-4500-8c24-8c38815626fa] Updated the network info_cache for instance _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5376 2015-08-07 17:18:20.593 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 1.73 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:18:22.327 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:18:22.334 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:18:22.334 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 2.17 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:18:24.365 DEBUG oslo_concurrency.processutils [req-f94b166b-b133-4740-a226-9d488b4261bd tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmpNd1Dn8/configdrive of=/dev/xvdd oflag=direct,sync" returned: 0 in 62.484s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:18:24.367 DEBUG oslo_concurrency.processutils [req-f94b166b-b133-4740-a226-9d488b4261bd tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:18:25.263 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:18:25.265 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /opt/stack/new/nova/nova/compute/manager.py:5976 2015-08-07 17:18:25.266 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:18:25.267 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 0.25 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:18:25.515 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:18:25.594 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Auditing locally available compute resources for node localhost.localdomain 2015-08-07 17:18:25.602 DEBUG nova.virt.xenapi.host [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Updating host stats update_status /opt/stack/new/nova/nova/virt/xenapi/host.py:234 2015-08-07 17:18:25.737 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.64 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:18:25.984 DEBUG nova.compute.manager [req-fea2742a-fb30-44a0-9947-f5086b1faa4a tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] [instance: 2af2868e-6009-4d8b-b3ee-90b4d32a3ddd] terminating bdm BlockDeviceMapping(boot_index=0,connection_info=None,created_at=2015-08-07T17:10:58Z,delete_on_termination=True,deleted=False,deleted_at=None,destination_type='local',device_name='/dev/xvda',device_type='disk',disk_bus=None,guest_format=None,id=12,image_id='a69d8d55-1745-492f-be26-cc75d64fc94d',instance=,instance_uuid=2af2868e-6009-4d8b-b3ee-90b4d32a3ddd,no_device=False,snapshot_id=None,source_type='image',updated_at=2015-08-07T17:11:01Z,volume_id=None,volume_size=None) _cleanup_volumes /opt/stack/new/nova/nova/compute/manager.py:2272 2015-08-07 17:18:27.119 DEBUG oslo_concurrency.lockutils [req-fea2742a-fb30-44a0-9947-f5086b1faa4a tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] Lock "compute_resources" acquired by "update_usage" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:18:27.120 DEBUG nova.objects.instance [req-fea2742a-fb30-44a0-9947-f5086b1faa4a tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] Lazy-loading `numa_topology' on Instance uuid 2af2868e-6009-4d8b-b3ee-90b4d32a3ddd obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:18:28.201 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" acquired by "do_scan" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:18:28.201 DEBUG nova.virt.xenapi.vm_utils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Scanning SR OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99 do_scan /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1839 2015-08-07 17:18:28.203 DEBUG nova.virt.xenapi.vmops [req-68a8691e-014c-4202-84ea-6964ef9f03a2 tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] [instance: 
696c794e-e4d8-4bd9-8b55-447342cd9b8f] Using RAW or VHD, skipping kernel and ramdisk deletion _destroy_kernel_ramdisk /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1456 2015-08-07 17:18:28.311 DEBUG nova.virt.xenapi.vm_utils [req-68a8691e-014c-4202-84ea-6964ef9f03a2 tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] [instance: 696c794e-e4d8-4bd9-8b55-447342cd9b8f] VM destroyed destroy_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:324 2015-08-07 17:18:28.312 DEBUG nova.compute.manager [req-68a8691e-014c-4202-84ea-6964ef9f03a2 tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] [instance: 696c794e-e4d8-4bd9-8b55-447342cd9b8f] Deallocating network for instance _deallocate_network /opt/stack/new/nova/nova/compute/manager.py:1792 2015-08-07 17:18:29.684 DEBUG oslo_concurrency.processutils [req-f94b166b-b133-4740-a226-9d488b4261bd tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf sync" returned: 0 in 5.316s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:18:29.685 DEBUG nova.virt.xenapi.vm_utils [req-f94b166b-b133-4740-a226-9d488b4261bd tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Destroying VBD for VDI OpaqueRef:c2276286-2ef5-8fa5-546c-73c14a64fe82 ... vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2207 2015-08-07 17:18:29.685 DEBUG oslo_concurrency.lockutils [req-f94b166b-b133-4740-a226-9d488b4261bd tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_unplug" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:18:30.716 DEBUG oslo_concurrency.lockutils [req-fea2742a-fb30-44a0-9947-f5086b1faa4a tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] Lock "compute_resources" released by "update_usage" :: held 3.598s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:18:36.937 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 8.44 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:18:38.583 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" released by "do_scan" :: held 10.382s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:18:40.986 DEBUG nova.virt.xenapi.vmops [req-a37011a3-c199-4a6c-94e5-0eca9e14fba0 tempest-ImagesTestJSON-351376176 tempest-ImagesTestJSON-223628044] [instance: 2d2f2690-ae7c-4e46-a28c-771da8b3c2b7] Waiting for instance state to become running _wait_for_instance_to_start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:777 2015-08-07 17:18:42.789 DEBUG nova.virt.xenapi.vmops [req-a37011a3-c199-4a6c-94e5-0eca9e14fba0 tempest-ImagesTestJSON-351376176 tempest-ImagesTestJSON-223628044] [instance: 2d2f2690-ae7c-4e46-a28c-771da8b3c2b7] Updating progress to 80 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:18:45.252 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor: free VCPUs: -2 _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:524 2015-08-07 17:18:45.745 DEBUG 
nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor: assignable PCI devices: [] _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:532 2015-08-07 17:18:45.746 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor/Node resource view: name=localhost.localdomain free_ram=721MB free_disk=16GB free_vcpus=-2 pci_devices=[] _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:546 2015-08-07 17:18:45.746 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "compute_resources" acquired by "_update_available_resource" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:18:47.055 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 8.69 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:18:47.579 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Total usable vcpus: 8, total allocated vcpus: 10 2015-08-07 17:18:47.580 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Final resource view: name=localhost.localdomain phys_ram=8187MB used_ram=926MB phys_disk=27GB used_disk=0GB total_vcpus=8 used_vcpus=10 pci_stats=None 2015-08-07 17:18:50.733 DEBUG oslo_concurrency.lockutils [req-fea2742a-fb30-44a0-9947-f5086b1faa4a tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] Lock "2af2868e-6009-4d8b-b3ee-90b4d32a3ddd" released by "do_terminate_instance" :: held 94.014s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:18:51.639 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Compute_service record updated for devstack:localhost.localdomain 2015-08-07 17:18:51.646 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "compute_resources" released by "_update_available_resource" :: held 5.893s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:18:51.661 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:18:51.669 DEBUG nova.virt.xenapi.vmops [req-a37011a3-c199-4a6c-94e5-0eca9e14fba0 tempest-ImagesTestJSON-351376176 tempest-ImagesTestJSON-223628044] [instance: 2d2f2690-ae7c-4e46-a28c-771da8b3c2b7] Skip agent setup, not enabled. 
_configure_new_instance_with_agent /opt/stack/new/nova/nova/virt/xenapi/vmops.py:788 2015-08-07 17:18:51.669 DEBUG nova.virt.xenapi.vmops [req-a37011a3-c199-4a6c-94e5-0eca9e14fba0 tempest-ImagesTestJSON-351376176 tempest-ImagesTestJSON-223628044] [instance: 2d2f2690-ae7c-4e46-a28c-771da8b3c2b7] Removing hostname from xenstore _remove_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1949 2015-08-07 17:18:51.670 DEBUG oslo_concurrency.lockutils [req-a37011a3-c199-4a6c-94e5-0eca9e14fba0 tempest-ImagesTestJSON-351376176 tempest-ImagesTestJSON-223628044] Lock "xenstore-2d2f2690-ae7c-4e46-a28c-771da8b3c2b7" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:18:51.673 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 9.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:18:51.823 DEBUG oslo_concurrency.lockutils [req-a37011a3-c199-4a6c-94e5-0eca9e14fba0 tempest-ImagesTestJSON-351376176 tempest-ImagesTestJSON-223628044] Lock "xenstore-2d2f2690-ae7c-4e46-a28c-771da8b3c2b7" released by "update_hostname" :: held 0.153s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:18:51.826 DEBUG nova.virt.xenapi.vmops [req-a37011a3-c199-4a6c-94e5-0eca9e14fba0 tempest-ImagesTestJSON-351376176 tempest-ImagesTestJSON-223628044] [instance: 2d2f2690-ae7c-4e46-a28c-771da8b3c2b7] Updating progress to 90 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:18:52.317 DEBUG oslo_concurrency.lockutils [req-f94b166b-b133-4740-a226-9d488b4261bd tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_unplug" :: held 22.629s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:18:52.377 DEBUG nova.virt.xenapi.vm_utils [req-f94b166b-b133-4740-a226-9d488b4261bd tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Destroying VBD for VDI OpaqueRef:c2276286-2ef5-8fa5-546c-73c14a64fe82 done. vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2215 2015-08-07 17:18:52.378 DEBUG nova.virt.xenapi.vm_utils [req-f94b166b-b133-4740-a226-9d488b4261bd tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Creating disk-type VBD for VM OpaqueRef:bd55c637-4be0-5ece-adf6-3207530069ca, VDI OpaqueRef:c2276286-2ef5-8fa5-546c-73c14a64fe82 ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:18:52.398 DEBUG nova.virt.xenapi.vm_utils [req-f94b166b-b133-4740-a226-9d488b4261bd tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Created VBD OpaqueRef:e265aaaf-7eb3-7850-7d2d-78356b1f47d3 for VM OpaqueRef:bd55c637-4be0-5ece-adf6-3207530069ca, VDI OpaqueRef:c2276286-2ef5-8fa5-546c-73c14a64fe82. 
create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:18:52.399 DEBUG nova.objects.instance [req-f94b166b-b133-4740-a226-9d488b4261bd tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Lazy-loading `pci_devices' on Instance uuid bcf8fed4-5825-4f82-a05c-9338adad0cda obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:18:52.615 DEBUG nova.virt.xenapi.vmops [req-f94b166b-b133-4740-a226-9d488b4261bd tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: bcf8fed4-5825-4f82-a05c-9338adad0cda] Updating progress to 50 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:18:52.638 DEBUG nova.virt.xenapi.vmops [req-a37011a3-c199-4a6c-94e5-0eca9e14fba0 tempest-ImagesTestJSON-351376176 tempest-ImagesTestJSON-223628044] [instance: 2d2f2690-ae7c-4e46-a28c-771da8b3c2b7] Updating progress to 100 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:18:52.714 DEBUG nova.compute.manager [req-68a8691e-014c-4202-84ea-6964ef9f03a2 tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] [instance: 696c794e-e4d8-4bd9-8b55-447342cd9b8f] terminating bdm BlockDeviceMapping(boot_index=0,connection_info=None,created_at=2015-08-07T17:13:08Z,delete_on_termination=True,deleted=False,deleted_at=None,destination_type='local',device_name='/dev/xvda',device_type='disk',disk_bus=None,guest_format=None,id=14,image_id='a69d8d55-1745-492f-be26-cc75d64fc94d',instance=,instance_uuid=696c794e-e4d8-4bd9-8b55-447342cd9b8f,no_device=False,snapshot_id=None,source_type='image',updated_at=2015-08-07T17:13:13Z,volume_id=None,volume_size=None) _cleanup_volumes /opt/stack/new/nova/nova/compute/manager.py:2272 2015-08-07 17:18:53.099 DEBUG nova.compute.manager [req-a37011a3-c199-4a6c-94e5-0eca9e14fba0 tempest-ImagesTestJSON-351376176 tempest-ImagesTestJSON-223628044] [instance: 2d2f2690-ae7c-4e46-a28c-771da8b3c2b7] Checking state _get_power_state /opt/stack/new/nova/nova/compute/manager.py:1296 2015-08-07 17:18:53.448 DEBUG oslo_concurrency.lockutils [req-68a8691e-014c-4202-84ea-6964ef9f03a2 tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] Lock "compute_resources" acquired by "update_usage" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:18:53.449 DEBUG nova.objects.instance [req-68a8691e-014c-4202-84ea-6964ef9f03a2 tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] Lazy-loading `numa_topology' on Instance uuid 696c794e-e4d8-4bd9-8b55-447342cd9b8f obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:18:53.598 DEBUG oslo_concurrency.lockutils [req-f94b166b-b133-4740-a226-9d488b4261bd tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Lock "xenstore-bcf8fed4-5825-4f82-a05c-9338adad0cda" acquired by "store_meta" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:18:53.598 DEBUG oslo_concurrency.lockutils [req-f94b166b-b133-4740-a226-9d488b4261bd tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Lock "xenstore-bcf8fed4-5825-4f82-a05c-9338adad0cda" released by "store_meta" :: held 0.001s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:18:53.599 DEBUG oslo_concurrency.lockutils [req-f94b166b-b133-4740-a226-9d488b4261bd 
tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Lock "xenstore-bcf8fed4-5825-4f82-a05c-9338adad0cda" acquired by "store_auto_disk_config" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:18:53.625 DEBUG nova.compute.utils [req-a37011a3-c199-4a6c-94e5-0eca9e14fba0 tempest-ImagesTestJSON-351376176 tempest-ImagesTestJSON-223628044] [instance: 2d2f2690-ae7c-4e46-a28c-771da8b3c2b7] Unexpected task state: expecting (u'spawning',) but the actual state is deleting notify_about_instance_usage /opt/stack/new/nova/nova/compute/utils.py:283 2015-08-07 17:18:53.628 DEBUG nova.compute.manager [req-a37011a3-c199-4a6c-94e5-0eca9e14fba0 tempest-ImagesTestJSON-351376176 tempest-ImagesTestJSON-223628044] [instance: 2d2f2690-ae7c-4e46-a28c-771da8b3c2b7] Instance disappeared during build. _do_build_and_run_instance /opt/stack/new/nova/nova/compute/manager.py:1929 2015-08-07 17:18:53.628 DEBUG nova.compute.manager [req-a37011a3-c199-4a6c-94e5-0eca9e14fba0 tempest-ImagesTestJSON-351376176 tempest-ImagesTestJSON-223628044] [instance: 2d2f2690-ae7c-4e46-a28c-771da8b3c2b7] Deallocating network for instance _deallocate_network /opt/stack/new/nova/nova/compute/manager.py:1792 2015-08-07 17:18:53.655 DEBUG oslo_concurrency.lockutils [req-f94b166b-b133-4740-a226-9d488b4261bd tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Lock "xenstore-bcf8fed4-5825-4f82-a05c-9338adad0cda" released by "store_auto_disk_config" :: held 0.056s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:18:53.655 DEBUG nova.virt.xenapi.vmops [req-f94b166b-b133-4740-a226-9d488b4261bd tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: bcf8fed4-5825-4f82-a05c-9338adad0cda] Injecting hostname (tempest.common.compute-instance-1631278566) into xenstore _inject_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1940 2015-08-07 17:18:53.656 DEBUG oslo_concurrency.lockutils [req-f94b166b-b133-4740-a226-9d488b4261bd tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Lock "xenstore-bcf8fed4-5825-4f82-a05c-9338adad0cda" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:18:53.669 DEBUG oslo_concurrency.lockutils [req-f94b166b-b133-4740-a226-9d488b4261bd tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Lock "xenstore-bcf8fed4-5825-4f82-a05c-9338adad0cda" released by "update_hostname" :: held 0.013s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:18:53.670 DEBUG nova.virt.xenapi.vmops [req-f94b166b-b133-4740-a226-9d488b4261bd tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: bcf8fed4-5825-4f82-a05c-9338adad0cda] Injecting network info to xenstore inject_network_info /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1869 2015-08-07 17:18:53.671 DEBUG oslo_concurrency.lockutils [req-f94b166b-b133-4740-a226-9d488b4261bd tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Lock "xenstore-bcf8fed4-5825-4f82-a05c-9338adad0cda" acquired by "update_nwinfo" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:18:53.742 DEBUG oslo_concurrency.lockutils [req-68a8691e-014c-4202-84ea-6964ef9f03a2 tempest-FloatingIPsTestJSON-2012724132 
tempest-FloatingIPsTestJSON-337032006] Lock "compute_resources" released by "update_usage" :: held 0.294s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:18:54.966 DEBUG oslo_concurrency.lockutils [req-68a8691e-014c-4202-84ea-6964ef9f03a2 tempest-FloatingIPsTestJSON-2012724132 tempest-FloatingIPsTestJSON-337032006] Lock "696c794e-e4d8-4bd9-8b55-447342cd9b8f" released by "do_terminate_instance" :: held 84.927s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:18:55.356 DEBUG oslo_concurrency.lockutils [req-f94b166b-b133-4740-a226-9d488b4261bd tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Lock "xenstore-bcf8fed4-5825-4f82-a05c-9338adad0cda" released by "update_nwinfo" :: held 1.685s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:18:55.357 DEBUG nova.virt.xenapi.vmops [req-f94b166b-b133-4740-a226-9d488b4261bd tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: bcf8fed4-5825-4f82-a05c-9338adad0cda] Updating progress to 60 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:18:56.259 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.49 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:18:56.358 DEBUG oslo_concurrency.lockutils [req-a37011a3-c199-4a6c-94e5-0eca9e14fba0 tempest-ImagesTestJSON-351376176 tempest-ImagesTestJSON-223628044] Lock "2d2f2690-ae7c-4e46-a28c-771da8b3c2b7" released by "_locked_do_build_and_run_instance" :: held 268.527s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:18:56.379 13318 DEBUG oslo_concurrency.lockutils [-] Lock "2d2f2690-ae7c-4e46-a28c-771da8b3c2b7" acquired by "query_driver_power_state_and_sync" :: waited 168.390s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:18:56.379 13318 INFO nova.compute.manager [-] [instance: 2d2f2690-ae7c-4e46-a28c-771da8b3c2b7] During sync_power_state the instance has a pending task (spawning). Skip. 
2015-08-07 17:18:56.380 13318 DEBUG oslo_concurrency.lockutils [-] Lock "2d2f2690-ae7c-4e46-a28c-771da8b3c2b7" released by "query_driver_power_state_and_sync" :: held 0.001s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:18:56.380 DEBUG oslo_concurrency.lockutils [req-d97af960-807e-4c24-ab19-49422079e524 tempest-ImagesTestJSON-351376176 tempest-ImagesTestJSON-223628044] Lock "2d2f2690-ae7c-4e46-a28c-771da8b3c2b7" acquired by "do_terminate_instance" :: waited 64.258s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:18:56.381 DEBUG oslo_concurrency.lockutils [req-d97af960-807e-4c24-ab19-49422079e524 tempest-ImagesTestJSON-351376176 tempest-ImagesTestJSON-223628044] Lock "2d2f2690-ae7c-4e46-a28c-771da8b3c2b7-events" acquired by "_clear_events" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:18:56.382 DEBUG oslo_concurrency.lockutils [req-d97af960-807e-4c24-ab19-49422079e524 tempest-ImagesTestJSON-351376176 tempest-ImagesTestJSON-223628044] Lock "2d2f2690-ae7c-4e46-a28c-771da8b3c2b7-events" released by "_clear_events" :: held 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:18:56.384 INFO nova.compute.manager [req-d97af960-807e-4c24-ab19-49422079e524 tempest-ImagesTestJSON-351376176 tempest-ImagesTestJSON-223628044] [instance: 2d2f2690-ae7c-4e46-a28c-771da8b3c2b7] Terminating instance 2015-08-07 17:18:56.386 INFO nova.virt.xenapi.vmops [req-d97af960-807e-4c24-ab19-49422079e524 tempest-ImagesTestJSON-351376176 tempest-ImagesTestJSON-223628044] [instance: 2d2f2690-ae7c-4e46-a28c-771da8b3c2b7] Destroying VM 2015-08-07 17:18:56.401 DEBUG nova.virt.xenapi.vm_utils [req-d97af960-807e-4c24-ab19-49422079e524 tempest-ImagesTestJSON-351376176 tempest-ImagesTestJSON-223628044] [instance: 2d2f2690-ae7c-4e46-a28c-771da8b3c2b7] Shutting down VM (hard) hard_shutdown_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:348 2015-08-07 17:18:56.415 DEBUG nova.virt.xenapi.vmops [req-f94b166b-b133-4740-a226-9d488b4261bd tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: bcf8fed4-5825-4f82-a05c-9338adad0cda] Creating vifs _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1892 2015-08-07 17:18:56.431 DEBUG nova.virt.xenapi.vmops [req-f94b166b-b133-4740-a226-9d488b4261bd tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: bcf8fed4-5825-4f82-a05c-9338adad0cda] Creating VIF for network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1902 2015-08-07 17:18:56.439 DEBUG nova.virt.xenapi.vmops [req-f94b166b-b133-4740-a226-9d488b4261bd tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: bcf8fed4-5825-4f82-a05c-9338adad0cda] Created VIF OpaqueRef:2dfe4afe-18d9-e01e-290b-6cf668773777, network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1906 2015-08-07 17:18:56.440 DEBUG nova.virt.xenapi.vmops [req-f94b166b-b133-4740-a226-9d488b4261bd tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: bcf8fed4-5825-4f82-a05c-9338adad0cda] Updating progress to 70 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:18:57.011 DEBUG nova.virt.xenapi.vmops [req-f94b166b-b133-4740-a226-9d488b4261bd 
tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: bcf8fed4-5825-4f82-a05c-9338adad0cda] Starting instance _start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:327 2015-08-07 17:19:00.826 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:19:00.831 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 17.68 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:19:08.989 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 6.83 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:19:16.391 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.43 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:19:18.520 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:19:18.538 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 0.97 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:19:19.639 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:19:19.640 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Starting heal instance info cache _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5315 2015-08-07 17:19:19.897 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Acquired semaphore "refresh_cache-091727e3-644b-4029-98ad-5a102868d2d5" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:197 2015-08-07 17:19:20.211 DEBUG nova.network.base_api [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: 091727e3-644b-4029-98ad-5a102868d2d5] Updating instance_info_cache with network_info: [VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.6'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 
'address': u'fa:16:3e:1f:57:17', 'active': False, 'type': u'bridge', 'id': u'74506357-e4e2-4a19-b942-18e58c964d1e', 'qbg_params': None})] update_instance_cache_with_nw_info /opt/stack/new/nova/nova/network/base_api.py:43 2015-08-07 17:19:20.263 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Releasing semaphore "refresh_cache-091727e3-644b-4029-98ad-5a102868d2d5" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:210 2015-08-07 17:19:20.264 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: 091727e3-644b-4029-98ad-5a102868d2d5] Updated the network info_cache for instance _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5376 2015-08-07 17:19:20.264 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 1.87 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:19:20.439 DEBUG nova.virt.xenapi.vmops [req-d97af960-807e-4c24-ab19-49422079e524 tempest-ImagesTestJSON-351376176 tempest-ImagesTestJSON-223628044] [instance: 2d2f2690-ae7c-4e46-a28c-771da8b3c2b7] Destroying VDIs _destroy_vdis /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1428 2015-08-07 17:19:20.479 DEBUG nova.virt.xenapi.vm_utils [req-d97af960-807e-4c24-ab19-49422079e524 tempest-ImagesTestJSON-351376176 tempest-ImagesTestJSON-223628044] VDI 8100d11e-b809-4953-8243-d3592ae1c484 is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:19:20.488 DEBUG nova.virt.xenapi.vm_utils [req-d97af960-807e-4c24-ab19-49422079e524 tempest-ImagesTestJSON-351376176 tempest-ImagesTestJSON-223628044] VDI 545496a7-1ed0-46bc-a447-f67d5c19cb81 is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:19:21.497 DEBUG oslo_concurrency.lockutils [req-21a74425-4ba4-45b0-afd7-2c7b006729a0 tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] Lock "45057175-41ca-4ad9-96c4-36ae4b86e6d4" acquired by "_locked_do_build_and_run_instance" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:19:21.746 INFO nova.compute.manager [req-21a74425-4ba4-45b0-afd7-2c7b006729a0 tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] [instance: 45057175-41ca-4ad9-96c4-36ae4b86e6d4] Starting instance... 
2015-08-07 17:19:22.139 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:19:22.140 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 0.37 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:19:22.324 DEBUG oslo_concurrency.lockutils [req-21a74425-4ba4-45b0-afd7-2c7b006729a0 tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] Lock "compute_resources" acquired by "instance_claim" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:19:22.324 DEBUG nova.compute.resource_tracker [req-21a74425-4ba4-45b0-afd7-2c7b006729a0 tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] Memory overhead for 64 MB instance; 5 MB instance_claim /opt/stack/new/nova/nova/compute/resource_tracker.py:126 2015-08-07 17:19:22.331 INFO nova.compute.claims [req-21a74425-4ba4-45b0-afd7-2c7b006729a0 tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] [instance: 45057175-41ca-4ad9-96c4-36ae4b86e6d4] Attempting claim: memory 69 MB, disk 0 GB 2015-08-07 17:19:22.342 INFO nova.compute.claims [req-21a74425-4ba4-45b0-afd7-2c7b006729a0 tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] [instance: 45057175-41ca-4ad9-96c4-36ae4b86e6d4] Total memory: 8187 MB, used: 857.00 MB 2015-08-07 17:19:22.343 INFO nova.compute.claims [req-21a74425-4ba4-45b0-afd7-2c7b006729a0 tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] [instance: 45057175-41ca-4ad9-96c4-36ae4b86e6d4] memory limit: 12280.50 MB, free: 11423.50 MB 2015-08-07 17:19:22.345 INFO nova.compute.claims [req-21a74425-4ba4-45b0-afd7-2c7b006729a0 tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] [instance: 45057175-41ca-4ad9-96c4-36ae4b86e6d4] Total disk: 27 GB, used: 0.00 GB 2015-08-07 17:19:22.345 INFO nova.compute.claims [req-21a74425-4ba4-45b0-afd7-2c7b006729a0 tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] [instance: 45057175-41ca-4ad9-96c4-36ae4b86e6d4] disk limit not specified, defaulting to unlimited 2015-08-07 17:19:22.391 DEBUG nova.compute.resources.vcpu [req-21a74425-4ba4-45b0-afd7-2c7b006729a0 tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] Total CPUs: 8 VCPUs, used: 5.00 VCPUs test /opt/stack/new/nova/nova/compute/resources/vcpu.py:52 2015-08-07 17:19:22.391 DEBUG nova.compute.resources.vcpu [req-21a74425-4ba4-45b0-afd7-2c7b006729a0 tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] CPUs limit not specified, defaulting to unlimited test /opt/stack/new/nova/nova/compute/resources/vcpu.py:56 2015-08-07 17:19:22.402 INFO nova.compute.claims [req-21a74425-4ba4-45b0-afd7-2c7b006729a0 tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] [instance: 45057175-41ca-4ad9-96c4-36ae4b86e6d4] Claim successful 2015-08-07 17:19:22.508 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks 
/usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:19:22.621 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 0.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:19:22.631 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:19:22.830 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 2.67 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:19:23.240 DEBUG oslo_concurrency.lockutils [req-21a74425-4ba4-45b0-afd7-2c7b006729a0 tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] Lock "compute_resources" released by "instance_claim" :: held 0.916s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:19:23.530 DEBUG nova.virt.xenapi.vmops [req-d97af960-807e-4c24-ab19-49422079e524 tempest-ImagesTestJSON-351376176 tempest-ImagesTestJSON-223628044] [instance: 2d2f2690-ae7c-4e46-a28c-771da8b3c2b7] Using RAW or VHD, skipping kernel and ramdisk deletion _destroy_kernel_ramdisk /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1456 2015-08-07 17:19:23.599 DEBUG nova.virt.xenapi.vm_utils [req-d97af960-807e-4c24-ab19-49422079e524 tempest-ImagesTestJSON-351376176 tempest-ImagesTestJSON-223628044] [instance: 2d2f2690-ae7c-4e46-a28c-771da8b3c2b7] VM destroyed destroy_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:324 2015-08-07 17:19:23.621 DEBUG nova.compute.manager [req-d97af960-807e-4c24-ab19-49422079e524 tempest-ImagesTestJSON-351376176 tempest-ImagesTestJSON-223628044] [instance: 2d2f2690-ae7c-4e46-a28c-771da8b3c2b7] Deallocating network for instance _deallocate_network /opt/stack/new/nova/nova/compute/manager.py:1792 2015-08-07 17:19:23.640 DEBUG oslo_concurrency.lockutils [req-ea4bd899-411d-4a36-9599-4f405cef238e tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] Lock "4bb27132-2638-467d-a5be-49e1ad01b113" acquired by "_locked_do_build_and_run_instance" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:19:23.769 DEBUG oslo_concurrency.lockutils [req-21a74425-4ba4-45b0-afd7-2c7b006729a0 tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] Lock "compute_resources" acquired by "update_usage" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:19:23.959 INFO nova.compute.manager [req-ea4bd899-411d-4a36-9599-4f405cef238e tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] [instance: 4bb27132-2638-467d-a5be-49e1ad01b113] Starting instance... 
2015-08-07 17:19:24.390 DEBUG oslo_concurrency.lockutils [req-21a74425-4ba4-45b0-afd7-2c7b006729a0 tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] Lock "compute_resources" released by "update_usage" :: held 0.622s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:19:24.391 DEBUG nova.compute.utils [req-21a74425-4ba4-45b0-afd7-2c7b006729a0 tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] Using /dev/xvd instead of None get_next_device_name /opt/stack/new/nova/nova/compute/utils.py:163 2015-08-07 17:19:24.396 13318 DEBUG nova.compute.manager [-] [instance: 45057175-41ca-4ad9-96c4-36ae4b86e6d4] Allocating IP information in the background. _allocate_network_async /opt/stack/new/nova/nova/compute/manager.py:1513 2015-08-07 17:19:24.398 13318 DEBUG oslo_concurrency.lockutils [-] Acquired semaphore "refresh_cache-45057175-41ca-4ad9-96c4-36ae4b86e6d4" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:197 2015-08-07 17:19:25.378 DEBUG oslo_concurrency.lockutils [req-ea4bd899-411d-4a36-9599-4f405cef238e tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] Lock "compute_resources" acquired by "instance_claim" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:19:25.378 DEBUG nova.compute.resource_tracker [req-ea4bd899-411d-4a36-9599-4f405cef238e tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] Memory overhead for 64 MB instance; 5 MB instance_claim /opt/stack/new/nova/nova/compute/resource_tracker.py:126 2015-08-07 17:19:25.436 INFO nova.compute.claims [req-ea4bd899-411d-4a36-9599-4f405cef238e tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] [instance: 4bb27132-2638-467d-a5be-49e1ad01b113] Attempting claim: memory 69 MB, disk 0 GB 2015-08-07 17:19:25.446 INFO nova.compute.claims [req-ea4bd899-411d-4a36-9599-4f405cef238e tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] [instance: 4bb27132-2638-467d-a5be-49e1ad01b113] Total memory: 8187 MB, used: 926.00 MB 2015-08-07 17:19:25.447 INFO nova.compute.claims [req-ea4bd899-411d-4a36-9599-4f405cef238e tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] [instance: 4bb27132-2638-467d-a5be-49e1ad01b113] memory limit: 12280.50 MB, free: 11354.50 MB 2015-08-07 17:19:25.447 INFO nova.compute.claims [req-ea4bd899-411d-4a36-9599-4f405cef238e tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] [instance: 4bb27132-2638-467d-a5be-49e1ad01b113] Total disk: 27 GB, used: 0.00 GB 2015-08-07 17:19:25.448 INFO nova.compute.claims [req-ea4bd899-411d-4a36-9599-4f405cef238e tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] [instance: 4bb27132-2638-467d-a5be-49e1ad01b113] disk limit not specified, defaulting to unlimited 2015-08-07 17:19:25.507 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:19:25.507 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 0.01 seconds _run_loop 
/usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:19:25.515 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:19:25.515 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /opt/stack/new/nova/nova/compute/manager.py:5976 2015-08-07 17:19:25.516 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 2.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:19:25.525 DEBUG nova.compute.resources.vcpu [req-ea4bd899-411d-4a36-9599-4f405cef238e tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] Total CPUs: 8 VCPUs, used: 6.00 VCPUs test /opt/stack/new/nova/nova/compute/resources/vcpu.py:52 2015-08-07 17:19:25.526 DEBUG nova.compute.resources.vcpu [req-ea4bd899-411d-4a36-9599-4f405cef238e tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] CPUs limit not specified, defaulting to unlimited test /opt/stack/new/nova/nova/compute/resources/vcpu.py:56 2015-08-07 17:19:25.526 INFO nova.compute.claims [req-ea4bd899-411d-4a36-9599-4f405cef238e tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] [instance: 4bb27132-2638-467d-a5be-49e1ad01b113] Claim successful 2015-08-07 17:19:26.073 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.75 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:19:27.051 DEBUG nova.compute.manager [req-d97af960-807e-4c24-ab19-49422079e524 tempest-ImagesTestJSON-351376176 tempest-ImagesTestJSON-223628044] [instance: 2d2f2690-ae7c-4e46-a28c-771da8b3c2b7] terminating bdm BlockDeviceMapping(boot_index=0,connection_info=None,created_at=2015-08-07T17:14:26Z,delete_on_termination=True,deleted=False,deleted_at=None,destination_type='local',device_name='/dev/xvda',device_type='disk',disk_bus=None,guest_format=None,id=15,image_id='a69d8d55-1745-492f-be26-cc75d64fc94d',instance=,instance_uuid=2d2f2690-ae7c-4e46-a28c-771da8b3c2b7,no_device=False,snapshot_id=None,source_type='image',updated_at=2015-08-07T17:14:30Z,volume_id=None,volume_size=None) _cleanup_volumes /opt/stack/new/nova/nova/compute/manager.py:2272 2015-08-07 17:19:27.074 DEBUG nova.virt.xenapi.vmops [req-21a74425-4ba4-45b0-afd7-2c7b006729a0 tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] [instance: 45057175-41ca-4ad9-96c4-36ae4b86e6d4] Block device information present: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} spawn /opt/stack/new/nova/nova/virt/xenapi/vmops.py:408 2015-08-07 17:19:27.120 DEBUG nova.virt.xenapi.vm_utils [req-21a74425-4ba4-45b0-afd7-2c7b006729a0 tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] Detected vhd format for image a69d8d55-1745-492f-be26-cc75d64fc94d determine_disk_image_type /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1633 2015-08-07 17:19:27.132 DEBUG nova.virt.xenapi.vmops [req-21a74425-4ba4-45b0-afd7-2c7b006729a0 tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] [instance: 
45057175-41ca-4ad9-96c4-36ae4b86e6d4] Updating progress to 10 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:19:27.176 DEBUG oslo_concurrency.lockutils [req-ea4bd899-411d-4a36-9599-4f405cef238e tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] Lock "compute_resources" released by "instance_claim" :: held 1.799s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:19:27.485 DEBUG oslo_concurrency.lockutils [req-d97af960-807e-4c24-ab19-49422079e524 tempest-ImagesTestJSON-351376176 tempest-ImagesTestJSON-223628044] Lock "compute_resources" acquired by "update_usage" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:19:27.486 DEBUG nova.objects.instance [req-d97af960-807e-4c24-ab19-49422079e524 tempest-ImagesTestJSON-351376176 tempest-ImagesTestJSON-223628044] Lazy-loading `numa_topology' on Instance uuid 2d2f2690-ae7c-4e46-a28c-771da8b3c2b7 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:19:27.514 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:19:27.778 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Auditing locally available compute resources for node localhost.localdomain 2015-08-07 17:19:27.779 DEBUG nova.virt.xenapi.host [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Updating host stats update_status /opt/stack/new/nova/nova/virt/xenapi/host.py:234 2015-08-07 17:19:27.784 DEBUG nova.virt.xenapi.vmops [req-21a74425-4ba4-45b0-afd7-2c7b006729a0 tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] [instance: 45057175-41ca-4ad9-96c4-36ae4b86e6d4] block device info: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} _connect_cinder_volumes /opt/stack/new/nova/nova/virt/xenapi/vmops.py:376 2015-08-07 17:19:27.787 DEBUG oslo_concurrency.lockutils [req-d97af960-807e-4c24-ab19-49422079e524 tempest-ImagesTestJSON-351376176 tempest-ImagesTestJSON-223628044] Lock "compute_resources" released by "update_usage" :: held 0.303s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:19:27.799 DEBUG oslo_concurrency.lockutils [req-ea4bd899-411d-4a36-9599-4f405cef238e tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] Lock "compute_resources" acquired by "update_usage" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:19:27.877 DEBUG oslo_concurrency.lockutils [req-21a74425-4ba4-45b0-afd7-2c7b006729a0 tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" acquired by "_create_cached_image_impl" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:19:29.305 DEBUG oslo_concurrency.lockutils [req-ea4bd899-411d-4a36-9599-4f405cef238e tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] Lock "compute_resources" released by "update_usage" :: held 1.506s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:19:29.306 DEBUG 
nova.compute.utils [req-ea4bd899-411d-4a36-9599-4f405cef238e tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] Using /dev/xvd instead of None get_next_device_name /opt/stack/new/nova/nova/compute/utils.py:163 2015-08-07 17:19:29.318 13318 DEBUG nova.compute.manager [-] [instance: 4bb27132-2638-467d-a5be-49e1ad01b113] Allocating IP information in the background. _allocate_network_async /opt/stack/new/nova/nova/compute/manager.py:1513 2015-08-07 17:19:29.319 13318 DEBUG oslo_concurrency.lockutils [-] Acquired semaphore "refresh_cache-4bb27132-2638-467d-a5be-49e1ad01b113" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:197 2015-08-07 17:19:31.986 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" acquired by "do_scan" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:19:31.987 DEBUG nova.virt.xenapi.vm_utils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Scanning SR OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99 do_scan /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1839 2015-08-07 17:19:34.721 DEBUG oslo_concurrency.lockutils [req-d97af960-807e-4c24-ab19-49422079e524 tempest-ImagesTestJSON-351376176 tempest-ImagesTestJSON-223628044] Lock "2d2f2690-ae7c-4e46-a28c-771da8b3c2b7" released by "do_terminate_instance" :: held 38.340s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:19:36.155 DEBUG nova.virt.xenapi.vmops [req-ea4bd899-411d-4a36-9599-4f405cef238e tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] [instance: 4bb27132-2638-467d-a5be-49e1ad01b113] Block device information present: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} spawn /opt/stack/new/nova/nova/virt/xenapi/vmops.py:408 2015-08-07 17:19:36.209 DEBUG nova.virt.xenapi.vm_utils [req-ea4bd899-411d-4a36-9599-4f405cef238e tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] Detected vhd format for image a69d8d55-1745-492f-be26-cc75d64fc94d determine_disk_image_type /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1633 2015-08-07 17:19:36.210 DEBUG nova.virt.xenapi.vmops [req-ea4bd899-411d-4a36-9599-4f405cef238e tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] [instance: 4bb27132-2638-467d-a5be-49e1ad01b113] Updating progress to 10 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:19:36.609 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.22 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:19:37.018 DEBUG nova.virt.xenapi.vmops [req-ea4bd899-411d-4a36-9599-4f405cef238e tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] [instance: 4bb27132-2638-467d-a5be-49e1ad01b113] block device info: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} _connect_cinder_volumes /opt/stack/new/nova/nova/virt/xenapi/vmops.py:376 2015-08-07 17:19:45.056 DEBUG nova.virt.xenapi.vm_utils [req-21a74425-4ba4-45b0-afd7-2c7b006729a0 tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] Cloned VDI OpaqueRef:3183f0c2-15db-48b1-08a2-db7d0166260c from VDI 
OpaqueRef:f392c64e-7886-a0cc-d3a0-780674167307 _clone_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:571 2015-08-07 17:19:47.479 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 8.35 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:19:49.116 DEBUG nova.virt.xenapi.vmops [req-f94b166b-b133-4740-a226-9d488b4261bd tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: bcf8fed4-5825-4f82-a05c-9338adad0cda] Waiting for instance state to become running _wait_for_instance_to_start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:777 2015-08-07 17:19:49.457 DEBUG nova.virt.xenapi.vmops [req-f94b166b-b133-4740-a226-9d488b4261bd tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: bcf8fed4-5825-4f82-a05c-9338adad0cda] Updating progress to 80 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:19:53.422 13318 DEBUG nova.network.base_api [-] [instance: 45057175-41ca-4ad9-96c4-36ae4b86e6d4] Updating instance_info_cache with network_info: [VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.4'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:59:03:03', 'active': False, 'type': u'bridge', 'id': u'4791b56d-604f-43e6-a8a4-cb60e88b5ddb', 'qbg_params': None})] update_instance_cache_with_nw_info /opt/stack/new/nova/nova/network/base_api.py:43 2015-08-07 17:19:53.659 DEBUG nova.virt.xenapi.vmops [req-f94b166b-b133-4740-a226-9d488b4261bd tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: bcf8fed4-5825-4f82-a05c-9338adad0cda] Skip agent setup, not enabled. 
_configure_new_instance_with_agent /opt/stack/new/nova/nova/virt/xenapi/vmops.py:788 2015-08-07 17:19:53.659 DEBUG nova.virt.xenapi.vmops [req-f94b166b-b133-4740-a226-9d488b4261bd tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: bcf8fed4-5825-4f82-a05c-9338adad0cda] Removing hostname from xenstore _remove_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1949 2015-08-07 17:19:53.660 DEBUG oslo_concurrency.lockutils [req-f94b166b-b133-4740-a226-9d488b4261bd tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Lock "xenstore-bcf8fed4-5825-4f82-a05c-9338adad0cda" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:19:53.722 13318 DEBUG oslo_concurrency.lockutils [-] Releasing semaphore "refresh_cache-45057175-41ca-4ad9-96c4-36ae4b86e6d4" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:210 2015-08-07 17:19:54.068 13318 DEBUG nova.compute.manager [-] [instance: 45057175-41ca-4ad9-96c4-36ae4b86e6d4] Instance network_info: |[VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.4'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:59:03:03', 'active': False, 'type': u'bridge', 'id': u'4791b56d-604f-43e6-a8a4-cb60e88b5ddb', 'qbg_params': None})]| _allocate_network_async /opt/stack/new/nova/nova/compute/manager.py:1531 2015-08-07 17:19:54.089 DEBUG oslo_concurrency.lockutils [req-f94b166b-b133-4740-a226-9d488b4261bd tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Lock "xenstore-bcf8fed4-5825-4f82-a05c-9338adad0cda" released by "update_hostname" :: held 0.429s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:19:54.118 DEBUG nova.virt.xenapi.vmops [req-f94b166b-b133-4740-a226-9d488b4261bd tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: bcf8fed4-5825-4f82-a05c-9338adad0cda] Updating progress to 90 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:19:54.126 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" released by "do_scan" :: held 22.140s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:19:56.451 DEBUG oslo_concurrency.lockutils [req-21a74425-4ba4-45b0-afd7-2c7b006729a0 tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" released by 
"_create_cached_image_impl" :: held 28.573s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:19:56.452 INFO nova.virt.xenapi.vm_utils [req-21a74425-4ba4-45b0-afd7-2c7b006729a0 tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] Image creation data, cacheable: True, downloaded: False duration: 28.67 secs for image a69d8d55-1745-492f-be26-cc75d64fc94d 2015-08-07 17:19:56.453 DEBUG oslo_concurrency.lockutils [req-ea4bd899-411d-4a36-9599-4f405cef238e tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" acquired by "_create_cached_image_impl" :: waited 18.925s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:19:58.092 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor: free VCPUs: -2 _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:524 2015-08-07 17:19:58.093 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor: assignable PCI devices: [] _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:532 2015-08-07 17:19:58.093 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor/Node resource view: name=localhost.localdomain free_ram=721MB free_disk=16GB free_vcpus=-2 pci_devices=[] _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:546 2015-08-07 17:19:58.094 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "compute_resources" acquired by "_update_available_resource" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:19:58.309 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 7.52 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:19:59.257 DEBUG nova.virt.xenapi.vmops [req-f94b166b-b133-4740-a226-9d488b4261bd tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: bcf8fed4-5825-4f82-a05c-9338adad0cda] Updating progress to 100 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:19:59.694 13318 DEBUG nova.network.base_api [-] [instance: 4bb27132-2638-467d-a5be-49e1ad01b113] Updating instance_info_cache with network_info: [VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.10'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 
'address': u'fa:16:3e:23:80:8d', 'active': False, 'type': u'bridge', 'id': u'aedb05ee-a0d2-415e-a700-7cb1e38f590a', 'qbg_params': None})] update_instance_cache_with_nw_info /opt/stack/new/nova/nova/network/base_api.py:43 2015-08-07 17:19:59.806 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Total usable vcpus: 8, total allocated vcpus: 10 2015-08-07 17:19:59.807 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Final resource view: name=localhost.localdomain phys_ram=8187MB used_ram=926MB phys_disk=27GB used_disk=0GB total_vcpus=8 used_vcpus=10 pci_stats=None 2015-08-07 17:19:59.820 13318 DEBUG oslo_concurrency.lockutils [-] Releasing semaphore "refresh_cache-4bb27132-2638-467d-a5be-49e1ad01b113" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:210 2015-08-07 17:19:59.820 13318 DEBUG nova.compute.manager [-] [instance: 4bb27132-2638-467d-a5be-49e1ad01b113] Instance network_info: |[VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.10'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:23:80:8d', 'active': False, 'type': u'bridge', 'id': u'aedb05ee-a0d2-415e-a700-7cb1e38f590a', 'qbg_params': None})]| _allocate_network_async /opt/stack/new/nova/nova/compute/manager.py:1531 2015-08-07 17:20:00.848 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Compute_service record updated for devstack:localhost.localdomain 2015-08-07 17:20:00.849 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "compute_resources" released by "_update_available_resource" :: held 2.755s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:20:00.853 DEBUG nova.compute.manager [req-f94b166b-b133-4740-a226-9d488b4261bd tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: bcf8fed4-5825-4f82-a05c-9338adad0cda] Checking state _get_power_state /opt/stack/new/nova/nova/compute/manager.py:1296 2015-08-07 17:20:00.854 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:20:00.855 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 9.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:20:02.492 DEBUG nova.compute.utils [req-f94b166b-b133-4740-a226-9d488b4261bd 
tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: bcf8fed4-5825-4f82-a05c-9338adad0cda] Unexpected task state: expecting (u'spawning',) but the actual state is deleting notify_about_instance_usage /opt/stack/new/nova/nova/compute/utils.py:283 2015-08-07 17:20:02.494 DEBUG nova.compute.manager [req-f94b166b-b133-4740-a226-9d488b4261bd tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: bcf8fed4-5825-4f82-a05c-9338adad0cda] Instance disappeared during build. _do_build_and_run_instance /opt/stack/new/nova/nova/compute/manager.py:1929 2015-08-07 17:20:02.494 DEBUG nova.compute.manager [req-f94b166b-b133-4740-a226-9d488b4261bd tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: bcf8fed4-5825-4f82-a05c-9338adad0cda] Deallocating network for instance _deallocate_network /opt/stack/new/nova/nova/compute/manager.py:1792 2015-08-07 17:20:09.867 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:20:09.874 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 8.64 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:20:18.632 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:20:18.633 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 1.88 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:20:20.514 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:20:20.515 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Starting heal instance info cache _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5315 2015-08-07 17:20:20.515 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Rebuilding the list of instances to heal _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5319 2015-08-07 17:20:20.895 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: cf8ff930-deb3-448f-bcd3-f5a84fdad9b4] Skipping network cache update for instance because it is being deleted. _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5333 2015-08-07 17:20:20.896 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: bcf8fed4-5825-4f82-a05c-9338adad0cda] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5329 2015-08-07 17:20:20.896 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: 45057175-41ca-4ad9-96c4-36ae4b86e6d4] Skipping network cache update for instance because it is Building. 
_heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5329 2015-08-07 17:20:20.897 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: 4bb27132-2638-467d-a5be-49e1ad01b113] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5329 2015-08-07 17:20:20.898 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Acquired semaphore "refresh_cache-fda3bb4d-ccb2-4500-8c24-8c38815626fa" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:197 2015-08-07 17:20:20.898 DEBUG nova.objects.instance [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lazy-loading `flavor' on Instance uuid fda3bb4d-ccb2-4500-8c24-8c38815626fa obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:20:22.236 DEBUG nova.network.base_api [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: fda3bb4d-ccb2-4500-8c24-8c38815626fa] Updating instance_info_cache with network_info: [VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.5'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:46:1f:75', 'active': False, 'type': u'bridge', 'id': u'56b2e3c5-379b-4990-a2f4-5ae5440c9c97', 'qbg_params': None})] update_instance_cache_with_nw_info /opt/stack/new/nova/nova/network/base_api.py:43 2015-08-07 17:20:22.410 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Releasing semaphore "refresh_cache-fda3bb4d-ccb2-4500-8c24-8c38815626fa" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:210 2015-08-07 17:20:22.411 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: fda3bb4d-ccb2-4500-8c24-8c38815626fa] Updated the network info_cache for instance _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5376 2015-08-07 17:20:22.411 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 3.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:20:23.516 13318 WARNING oslo_service.loopingcall [-] Function > run outlasted interval by 7.61 sec 2015-08-07 17:20:23.516 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 0.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:20:24.495 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.02 seconds _run_loop 
/usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:20:24.497 DEBUG oslo_concurrency.lockutils [req-ac43354e-24e5-41fe-8d5b-41b0cab26176 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Lock "cf8ff930-deb3-448f-bcd3-f5a84fdad9b4" acquired by "do_terminate_instance" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:20:24.498 DEBUG oslo_concurrency.lockutils [req-ac43354e-24e5-41fe-8d5b-41b0cab26176 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Lock "cf8ff930-deb3-448f-bcd3-f5a84fdad9b4-events" acquired by "_clear_events" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:20:24.498 DEBUG oslo_concurrency.lockutils [req-ac43354e-24e5-41fe-8d5b-41b0cab26176 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Lock "cf8ff930-deb3-448f-bcd3-f5a84fdad9b4-events" released by "_clear_events" :: held 0.001s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:20:24.500 INFO nova.compute.manager [req-ac43354e-24e5-41fe-8d5b-41b0cab26176 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: cf8ff930-deb3-448f-bcd3-f5a84fdad9b4] Terminating instance 2015-08-07 17:20:24.502 INFO nova.virt.xenapi.vmops [req-ac43354e-24e5-41fe-8d5b-41b0cab26176 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: cf8ff930-deb3-448f-bcd3-f5a84fdad9b4] Destroying VM 2015-08-07 17:20:24.574 DEBUG nova.virt.xenapi.vm_utils [req-ac43354e-24e5-41fe-8d5b-41b0cab26176 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: cf8ff930-deb3-448f-bcd3-f5a84fdad9b4] Shutting down VM (hard) hard_shutdown_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:348 2015-08-07 17:20:25.439 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:20:25.450 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:20:25.451 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 0.05 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:20:25.541 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:20:25.547 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 1.97 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:20:27.163 DEBUG oslo_concurrency.lockutils [req-f94b166b-b133-4740-a226-9d488b4261bd tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Lock "bcf8fed4-5825-4f82-a05c-9338adad0cda" released by "_locked_do_build_and_run_instance" :: held 244.176s inner 
/usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:20:27.163 DEBUG oslo_concurrency.lockutils [req-aebcfb5f-a29b-4fd4-b43e-bf24741017de tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Lock "bcf8fed4-5825-4f82-a05c-9338adad0cda" acquired by "do_terminate_instance" :: waited 37.530s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:20:27.164 DEBUG oslo_concurrency.lockutils [req-aebcfb5f-a29b-4fd4-b43e-bf24741017de tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Lock "bcf8fed4-5825-4f82-a05c-9338adad0cda-events" acquired by "_clear_events" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:20:27.164 DEBUG oslo_concurrency.lockutils [req-aebcfb5f-a29b-4fd4-b43e-bf24741017de tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Lock "bcf8fed4-5825-4f82-a05c-9338adad0cda-events" released by "_clear_events" :: held 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:20:27.166 INFO nova.compute.manager [req-aebcfb5f-a29b-4fd4-b43e-bf24741017de tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: bcf8fed4-5825-4f82-a05c-9338adad0cda] Terminating instance 2015-08-07 17:20:27.168 INFO nova.virt.xenapi.vmops [req-aebcfb5f-a29b-4fd4-b43e-bf24741017de tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: bcf8fed4-5825-4f82-a05c-9338adad0cda] Destroying VM 2015-08-07 17:20:27.507 DEBUG nova.virt.xenapi.vm_utils [req-aebcfb5f-a29b-4fd4-b43e-bf24741017de tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: bcf8fed4-5825-4f82-a05c-9338adad0cda] Shutting down VM (hard) hard_shutdown_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:348 2015-08-07 17:20:27.566 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:20:27.567 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /opt/stack/new/nova/nova/compute/manager.py:5976 2015-08-07 17:20:27.567 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:20:27.602 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Auditing locally available compute resources for node localhost.localdomain 2015-08-07 17:20:27.603 DEBUG nova.virt.xenapi.host [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Updating host stats update_status /opt/stack/new/nova/nova/virt/xenapi/host.py:234 2015-08-07 17:20:29.705 DEBUG nova.virt.xenapi.vm_utils [req-ea4bd899-411d-4a36-9599-4f405cef238e tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] Cloned VDI OpaqueRef:21fe3efb-6231-8888-5a8e-a5d43ecf25f3 from VDI OpaqueRef:f392c64e-7886-a0cc-d3a0-780674167307 _clone_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:571 2015-08-07 17:20:31.846 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" acquired by "do_scan" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:20:33.153 DEBUG nova.virt.xenapi.vm_utils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Scanning SR OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99 do_scan /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1839 2015-08-07 17:20:33.595 DEBUG nova.virt.xenapi.vmops [req-21a74425-4ba4-45b0-afd7-2c7b006729a0 tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] [instance: 45057175-41ca-4ad9-96c4-36ae4b86e6d4] Updating progress to 20 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:20:34.266 DEBUG oslo_concurrency.lockutils [req-ad264d08-0198-4acc-bb84-ffdb626b78a4 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Lock "fda3bb4d-ccb2-4500-8c24-8c38815626fa" acquired by "do_terminate_instance" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:20:34.267 DEBUG oslo_concurrency.lockutils [req-ad264d08-0198-4acc-bb84-ffdb626b78a4 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Lock "fda3bb4d-ccb2-4500-8c24-8c38815626fa-events" acquired by "_clear_events" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:20:34.267 DEBUG oslo_concurrency.lockutils [req-ad264d08-0198-4acc-bb84-ffdb626b78a4 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Lock "fda3bb4d-ccb2-4500-8c24-8c38815626fa-events" released by "_clear_events" :: held 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:20:34.269 INFO nova.compute.manager [req-ad264d08-0198-4acc-bb84-ffdb626b78a4 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: fda3bb4d-ccb2-4500-8c24-8c38815626fa] Terminating instance 2015-08-07 17:20:34.271 INFO nova.virt.xenapi.vmops [req-ad264d08-0198-4acc-bb84-ffdb626b78a4 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: fda3bb4d-ccb2-4500-8c24-8c38815626fa] Destroying VM 2015-08-07 17:20:34.315 DEBUG nova.virt.xenapi.vm_utils 
[req-ad264d08-0198-4acc-bb84-ffdb626b78a4 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: fda3bb4d-ccb2-4500-8c24-8c38815626fa] Shutting down VM (hard) hard_shutdown_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:348 2015-08-07 17:20:34.767 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 8.78 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:20:36.660 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" released by "do_scan" :: held 4.814s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:20:36.828 DEBUG nova.virt.xenapi.vmops [req-21a74425-4ba4-45b0-afd7-2c7b006729a0 tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] [instance: 45057175-41ca-4ad9-96c4-36ae4b86e6d4] Updating progress to 30 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:20:37.348 DEBUG oslo_concurrency.lockutils [req-5cfad40a-28bc-4158-96eb-70342ae8ef7f tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Lock "091727e3-644b-4029-98ad-5a102868d2d5" acquired by "do_terminate_instance" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:20:37.349 DEBUG oslo_concurrency.lockutils [req-5cfad40a-28bc-4158-96eb-70342ae8ef7f tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Lock "091727e3-644b-4029-98ad-5a102868d2d5-events" acquired by "_clear_events" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:20:37.350 DEBUG oslo_concurrency.lockutils [req-5cfad40a-28bc-4158-96eb-70342ae8ef7f tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Lock "091727e3-644b-4029-98ad-5a102868d2d5-events" released by "_clear_events" :: held 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:20:37.352 INFO nova.compute.manager [req-5cfad40a-28bc-4158-96eb-70342ae8ef7f tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: 091727e3-644b-4029-98ad-5a102868d2d5] Terminating instance 2015-08-07 17:20:37.354 INFO nova.virt.xenapi.vmops [req-5cfad40a-28bc-4158-96eb-70342ae8ef7f tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: 091727e3-644b-4029-98ad-5a102868d2d5] Destroying VM 2015-08-07 17:20:37.621 DEBUG nova.virt.xenapi.vmops [req-21a74425-4ba4-45b0-afd7-2c7b006729a0 tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] [instance: 45057175-41ca-4ad9-96c4-36ae4b86e6d4] Using PV kernel: True _create_vm_record /opt/stack/new/nova/nova/virt/xenapi/vmops.py:679 2015-08-07 17:20:37.831 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor: free VCPUs: -2 _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:524 2015-08-07 17:20:37.832 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor: assignable PCI devices: [] _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:532 2015-08-07 17:20:37.832 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] 
Hypervisor/Node resource view: name=localhost.localdomain free_ram=721MB free_disk=16GB free_vcpus=-2 pci_devices=[] _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:546 2015-08-07 17:20:37.833 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "compute_resources" acquired by "_update_available_resource" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:20:37.861 DEBUG nova.virt.xenapi.vm_utils [req-21a74425-4ba4-45b0-afd7-2c7b006729a0 tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] [instance: 45057175-41ca-4ad9-96c4-36ae4b86e6d4] Created VM create_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:312 2015-08-07 17:20:37.862 DEBUG nova.virt.xenapi.vmops [req-21a74425-4ba4-45b0-afd7-2c7b006729a0 tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] [instance: 45057175-41ca-4ad9-96c4-36ae4b86e6d4] Updating progress to 40 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:20:38.292 DEBUG oslo_concurrency.lockutils [req-ea4bd899-411d-4a36-9599-4f405cef238e tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" released by "_create_cached_image_impl" :: held 41.838s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:20:38.292 INFO nova.virt.xenapi.vm_utils [req-ea4bd899-411d-4a36-9599-4f405cef238e tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] Image creation data, cacheable: True, downloaded: False duration: 61.27 secs for image a69d8d55-1745-492f-be26-cc75d64fc94d 2015-08-07 17:20:38.295 DEBUG nova.virt.xenapi.vm_utils [req-5cfad40a-28bc-4158-96eb-70342ae8ef7f tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: 091727e3-644b-4029-98ad-5a102868d2d5] Shutting down VM (hard) hard_shutdown_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:348 2015-08-07 17:20:39.250 DEBUG nova.virt.xenapi.vm_utils [req-21a74425-4ba4-45b0-afd7-2c7b006729a0 tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] Creating disk-type VBD for VM OpaqueRef:f4c40d70-c549-651a-7c25-e04d327051c2, VDI OpaqueRef:3183f0c2-15db-48b1-08a2-db7d0166260c ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:20:46.309 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 7.76 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:20:49.145 DEBUG nova.virt.xenapi.vmops [req-ea4bd899-411d-4a36-9599-4f405cef238e tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] [instance: 4bb27132-2638-467d-a5be-49e1ad01b113] Updating progress to 20 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:20:49.166 DEBUG nova.virt.xenapi.vm_utils [req-21a74425-4ba4-45b0-afd7-2c7b006729a0 tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] Created VBD OpaqueRef:9932f480-8491-67dc-abda-1d5b86932d96 for VM OpaqueRef:f4c40d70-c549-651a-7c25-e04d327051c2, VDI OpaqueRef:3183f0c2-15db-48b1-08a2-db7d0166260c. 
create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:20:50.938 DEBUG nova.virt.xenapi.vmops [req-ea4bd899-411d-4a36-9599-4f405cef238e tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] [instance: 4bb27132-2638-467d-a5be-49e1ad01b113] Updating progress to 30 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:20:51.699 DEBUG nova.virt.xenapi.vm_utils [req-21a74425-4ba4-45b0-afd7-2c7b006729a0 tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] Created VDI OpaqueRef:7f6e54b9-35e5-38ef-3d29-64d61a19aa9e (config-2, 67108864, False) on OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99. create_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:509 2015-08-07 17:20:51.717 DEBUG nova.virt.xenapi.vm_utils [req-21a74425-4ba4-45b0-afd7-2c7b006729a0 tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] Creating disk-type VBD for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:7f6e54b9-35e5-38ef-3d29-64d61a19aa9e ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:20:51.723 DEBUG nova.virt.xenapi.vmops [req-ea4bd899-411d-4a36-9599-4f405cef238e tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] [instance: 4bb27132-2638-467d-a5be-49e1ad01b113] Using PV kernel: True _create_vm_record /opt/stack/new/nova/nova/virt/xenapi/vmops.py:679 2015-08-07 17:20:51.728 DEBUG nova.virt.xenapi.vm_utils [req-21a74425-4ba4-45b0-afd7-2c7b006729a0 tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] Created VBD OpaqueRef:412c354d-4c0c-53a5-1703-271055635cc6 for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:7f6e54b9-35e5-38ef-3d29-64d61a19aa9e. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:20:51.729 DEBUG nova.virt.xenapi.vm_utils [req-21a74425-4ba4-45b0-afd7-2c7b006729a0 tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] Plugging VBD OpaqueRef:412c354d-4c0c-53a5-1703-271055635cc6 ... 
vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2191 2015-08-07 17:20:51.742 DEBUG oslo_concurrency.lockutils [req-21a74425-4ba4-45b0-afd7-2c7b006729a0 tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_plug" :: waited 0.012s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:20:57.162 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 6.91 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:21:11.166 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 2.91 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:21:17.573 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 6.51 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:21:24.708 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.37 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:21:26.158 DEBUG nova.virt.xenapi.vmops [req-aebcfb5f-a29b-4fd4-b43e-bf24741017de tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: bcf8fed4-5825-4f82-a05c-9338adad0cda] Destroying VDIs _destroy_vdis /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1428 2015-08-07 17:21:26.324 DEBUG nova.virt.xenapi.vm_utils [req-aebcfb5f-a29b-4fd4-b43e-bf24741017de tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] VDI 149460b0-65cf-41d6-a412-53cdb475ff10 is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:21:26.387 DEBUG nova.virt.xenapi.vm_utils [req-aebcfb5f-a29b-4fd4-b43e-bf24741017de tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] VDI 4ceba9d1-1faa-488b-b60b-4b26a463f416 is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:21:28.675 DEBUG nova.virt.xenapi.vmops [req-ac43354e-24e5-41fe-8d5b-41b0cab26176 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: cf8ff930-deb3-448f-bcd3-f5a84fdad9b4] Destroying VDIs _destroy_vdis /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1428 2015-08-07 17:21:28.801 DEBUG oslo_concurrency.lockutils [req-21a74425-4ba4-45b0-afd7-2c7b006729a0 tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_plug" :: held 37.059s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:21:28.802 DEBUG nova.virt.xenapi.vm_utils [req-21a74425-4ba4-45b0-afd7-2c7b006729a0 tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] Plugging VBD OpaqueRef:412c354d-4c0c-53a5-1703-271055635cc6 done. 
vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2194 2015-08-07 17:21:28.803 DEBUG nova.virt.xenapi.vm_utils [req-ac43354e-24e5-41fe-8d5b-41b0cab26176 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] VDI a880aeef-7083-44c3-8dfc-fba272d0c3ac is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:21:28.886 DEBUG nova.virt.xenapi.vm_utils [req-21a74425-4ba4-45b0-afd7-2c7b006729a0 tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] VBD OpaqueRef:412c354d-4c0c-53a5-1703-271055635cc6 plugged as xvdc vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2197 2015-08-07 17:21:28.937 DEBUG nova.virt.xenapi.vm_utils [req-ac43354e-24e5-41fe-8d5b-41b0cab26176 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] VDI ed068b91-4e58-437a-a521-dd2b18494d74 is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:21:29.039 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Total usable vcpus: 8, total allocated vcpus: 10 2015-08-07 17:21:29.040 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Final resource view: name=localhost.localdomain phys_ram=8187MB used_ram=926MB phys_disk=27GB used_disk=0GB total_vcpus=8 used_vcpus=10 pci_stats=None 2015-08-07 17:21:29.106 DEBUG nova.virt.xenapi.vm_utils [req-ea4bd899-411d-4a36-9599-4f405cef238e tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] [instance: 4bb27132-2638-467d-a5be-49e1ad01b113] Created VM create_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:312 2015-08-07 17:21:29.107 DEBUG nova.virt.xenapi.vmops [req-ea4bd899-411d-4a36-9599-4f405cef238e tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] [instance: 4bb27132-2638-467d-a5be-49e1ad01b113] Updating progress to 40 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:21:29.119 WARNING nova.virt.configdrive [req-21a74425-4ba4-45b0-afd7-2c7b006729a0 tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] The setting "always" will be deprecated in the Liberty version. 
Please use "True" instead 2015-08-07 17:21:29.120 DEBUG nova.objects.instance [req-21a74425-4ba4-45b0-afd7-2c7b006729a0 tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] Lazy-loading `ec2_ids' on Instance uuid 45057175-41ca-4ad9-96c4-36ae4b86e6d4 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:21:29.178 DEBUG oslo_concurrency.processutils [req-21a74425-4ba4-45b0-afd7-2c7b006729a0 tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] Running cmd (subprocess): genisoimage -o /tmp/tmpESQc5O/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmpsDMNOw execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:21:30.303 DEBUG oslo_concurrency.processutils [req-21a74425-4ba4-45b0-afd7-2c7b006729a0 tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] CMD "genisoimage -o /tmp/tmpESQc5O/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmpsDMNOw" returned: 0 in 1.125s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:21:30.614 DEBUG oslo_concurrency.processutils [req-21a74425-4ba4-45b0-afd7-2c7b006729a0 tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmpESQc5O/configdrive of=/dev/xvdc oflag=direct,sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:21:31.487 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Compute_service record updated for devstack:localhost.localdomain 2015-08-07 17:21:31.489 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "compute_resources" released by "_update_available_resource" :: held 53.656s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:21:31.490 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:21:31.491 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:21:31.492 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:21:31.492 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Cleaning up deleted instances _run_pending_deletes /opt/stack/new/nova/nova/compute/manager.py:6276 2015-08-07 17:21:32.176 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] There are 3 instances to clean _run_pending_deletes /opt/stack/new/nova/nova/compute/manager.py:6285 2015-08-07 17:21:32.177 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: 2d2f2690-ae7c-4e46-a28c-771da8b3c2b7] Instance has had 0 of 5 cleanup attempts _run_pending_deletes 
/opt/stack/new/nova/nova/compute/manager.py:6293 2015-08-07 17:21:32.239 DEBUG nova.virt.xenapi.vm_utils [req-ea4bd899-411d-4a36-9599-4f405cef238e tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] Creating disk-type VBD for VM OpaqueRef:b2e9496d-c7f9-d770-3d3a-cc5f47d55581, VDI OpaqueRef:21fe3efb-6231-8888-5a8e-a5d43ecf25f3 ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:21:32.460 DEBUG nova.virt.xenapi.vm_utils [req-ea4bd899-411d-4a36-9599-4f405cef238e tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] Created VBD OpaqueRef:70423f4b-0d95-ff41-c5a6-e46fac9a3014 for VM OpaqueRef:b2e9496d-c7f9-d770-3d3a-cc5f47d55581, VDI OpaqueRef:21fe3efb-6231-8888-5a8e-a5d43ecf25f3. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:21:36.897 DEBUG nova.virt.xenapi.vmops [req-aebcfb5f-a29b-4fd4-b43e-bf24741017de tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: bcf8fed4-5825-4f82-a05c-9338adad0cda] Using RAW or VHD, skipping kernel and ramdisk deletion _destroy_kernel_ramdisk /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1456 2015-08-07 17:21:37.027 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: 696c794e-e4d8-4bd9-8b55-447342cd9b8f] Instance has had 0 of 5 cleanup attempts _run_pending_deletes /opt/stack/new/nova/nova/compute/manager.py:6293 2015-08-07 17:21:37.070 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 7.01 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:21:37.071 DEBUG nova.virt.xenapi.vm_utils [req-aebcfb5f-a29b-4fd4-b43e-bf24741017de tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: bcf8fed4-5825-4f82-a05c-9338adad0cda] VM destroyed destroy_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:324 2015-08-07 17:21:37.072 DEBUG nova.compute.manager [req-aebcfb5f-a29b-4fd4-b43e-bf24741017de tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: bcf8fed4-5825-4f82-a05c-9338adad0cda] Deallocating network for instance _deallocate_network /opt/stack/new/nova/nova/compute/manager.py:1792 2015-08-07 17:21:37.416 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: 2af2868e-6009-4d8b-b3ee-90b4d32a3ddd] Instance has had 0 of 5 cleanup attempts _run_pending_deletes /opt/stack/new/nova/nova/compute/manager.py:6293 2015-08-07 17:21:38.050 DEBUG nova.virt.xenapi.vmops [req-ad264d08-0198-4acc-bb84-ffdb626b78a4 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: fda3bb4d-ccb2-4500-8c24-8c38815626fa] Destroying VDIs _destroy_vdis /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1428 2015-08-07 17:21:38.071 DEBUG nova.virt.xenapi.vm_utils [req-ad264d08-0198-4acc-bb84-ffdb626b78a4 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] VDI 74d18b2d-0150-43d6-b753-098c7f889676 is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:21:38.080 DEBUG nova.virt.xenapi.vm_utils [req-ad264d08-0198-4acc-bb84-ffdb626b78a4 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] VDI 59bf10f8-4dee-43ad-a236-ca31af17fb37 is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:21:38.470 DEBUG oslo_service.periodic_task 
[req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:21:38.471 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:21:38.472 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Starting heal instance info cache _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5315 2015-08-07 17:21:38.754 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: 091727e3-644b-4029-98ad-5a102868d2d5] Skipping network cache update for instance because it is being deleted. _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5364 2015-08-07 17:21:38.754 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Didn't find any instances for network info cache update. _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5386 2015-08-07 17:21:38.755 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:21:39.049 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 10.95 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:21:46.348 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 7.88 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:21:46.358 DEBUG nova.compute.manager [req-aebcfb5f-a29b-4fd4-b43e-bf24741017de tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: bcf8fed4-5825-4f82-a05c-9338adad0cda] terminating bdm BlockDeviceMapping(boot_index=0,connection_info=None,created_at=2015-08-07T17:16:21Z,delete_on_termination=True,deleted=False,deleted_at=None,destination_type='local',device_name='/dev/xvda',device_type='disk',disk_bus=None,guest_format=None,id=16,image_id='a69d8d55-1745-492f-be26-cc75d64fc94d',instance=,instance_uuid=bcf8fed4-5825-4f82-a05c-9338adad0cda,no_device=False,snapshot_id=None,source_type='image',updated_at=2015-08-07T17:16:26Z,volume_id=None,volume_size=None) _cleanup_volumes /opt/stack/new/nova/nova/compute/manager.py:2272 2015-08-07 17:21:46.996 DEBUG nova.virt.xenapi.vmops [req-5cfad40a-28bc-4158-96eb-70342ae8ef7f tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: 091727e3-644b-4029-98ad-5a102868d2d5] Destroying VDIs _destroy_vdis /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1428 2015-08-07 17:21:47.146 DEBUG nova.virt.xenapi.vm_utils [req-5cfad40a-28bc-4158-96eb-70342ae8ef7f tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] VDI 0f94779a-0091-430a-b20f-eab92fb8aabb is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:21:47.319 DEBUG oslo_concurrency.lockutils [req-aebcfb5f-a29b-4fd4-b43e-bf24741017de tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Lock "compute_resources" acquired by 
"update_usage" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:21:47.320 DEBUG nova.objects.instance [req-aebcfb5f-a29b-4fd4-b43e-bf24741017de tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Lazy-loading `numa_topology' on Instance uuid bcf8fed4-5825-4f82-a05c-9338adad0cda obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:21:47.396 DEBUG nova.virt.xenapi.vm_utils [req-5cfad40a-28bc-4158-96eb-70342ae8ef7f tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] VDI e05becdd-39b2-453c-acae-cbb07ea9675d is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:21:48.693 DEBUG oslo_concurrency.lockutils [req-aebcfb5f-a29b-4fd4-b43e-bf24741017de tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Lock "compute_resources" released by "update_usage" :: held 1.374s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:21:50.014 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:21:50.015 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:21:50.016 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:21:50.016 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:21:50.017 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /opt/stack/new/nova/nova/compute/manager.py:5976 2015-08-07 17:21:50.017 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:21:50.058 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Auditing locally available compute resources for node localhost.localdomain 2015-08-07 17:21:50.059 DEBUG nova.virt.xenapi.host [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Updating host stats update_status /opt/stack/new/nova/nova/virt/xenapi/host.py:234 2015-08-07 17:21:51.001 DEBUG nova.virt.xenapi.vmops [req-ad264d08-0198-4acc-bb84-ffdb626b78a4 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: fda3bb4d-ccb2-4500-8c24-8c38815626fa] Using RAW or VHD, skipping kernel and ramdisk deletion _destroy_kernel_ramdisk /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1456 2015-08-07 17:21:51.024 DEBUG nova.virt.xenapi.vm_utils [req-ad264d08-0198-4acc-bb84-ffdb626b78a4 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: fda3bb4d-ccb2-4500-8c24-8c38815626fa] VM destroyed destroy_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:324 2015-08-07 17:21:51.025 DEBUG nova.compute.manager [req-ad264d08-0198-4acc-bb84-ffdb626b78a4 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: fda3bb4d-ccb2-4500-8c24-8c38815626fa] Deallocating network for instance _deallocate_network /opt/stack/new/nova/nova/compute/manager.py:1792 2015-08-07 17:21:51.042 DEBUG oslo_concurrency.lockutils [req-aebcfb5f-a29b-4fd4-b43e-bf24741017de tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Lock "bcf8fed4-5825-4f82-a05c-9338adad0cda" released by "do_terminate_instance" :: held 83.878s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:21:51.216 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" acquired by "do_scan" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:21:51.217 DEBUG nova.virt.xenapi.vm_utils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Scanning SR OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99 do_scan /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1839 2015-08-07 17:21:53.021 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" released by "do_scan" :: held 1.795s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:21:55.075 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.15 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:21:55.199 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor: free VCPUs: 2 _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:524 2015-08-07 17:21:55.199 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor: assignable PCI devices: [] _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:532 2015-08-07 
17:21:55.200 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor/Node resource view: name=localhost.localdomain free_ram=989MB free_disk=16GB free_vcpus=2 pci_devices=[] _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:546 2015-08-07 17:21:55.200 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "compute_resources" acquired by "_update_available_resource" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:21:57.233 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Total usable vcpus: 8, total allocated vcpus: 6 2015-08-07 17:21:57.233 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Final resource view: name=localhost.localdomain phys_ram=8187MB used_ram=857MB phys_disk=27GB used_disk=0GB total_vcpus=8 used_vcpus=6 pci_stats=None 2015-08-07 17:21:57.875 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Compute_service record updated for devstack:localhost.localdomain 2015-08-07 17:21:57.875 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "compute_resources" released by "_update_available_resource" :: held 2.675s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:21:57.876 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 24.64 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:21:58.143 DEBUG nova.compute.manager [req-ad264d08-0198-4acc-bb84-ffdb626b78a4 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: fda3bb4d-ccb2-4500-8c24-8c38815626fa] terminating bdm BlockDeviceMapping(boot_index=0,connection_info=None,created_at=2015-08-07T17:04:48Z,delete_on_termination=True,deleted=False,deleted_at=None,destination_type='local',device_name='/dev/xvda',device_type='disk',disk_bus=None,guest_format=None,id=6,image_id='a69d8d55-1745-492f-be26-cc75d64fc94d',instance=,instance_uuid=fda3bb4d-ccb2-4500-8c24-8c38815626fa,no_device=False,snapshot_id=None,source_type='image',updated_at=2015-08-07T17:04:50Z,volume_id=None,volume_size=None) _cleanup_volumes /opt/stack/new/nova/nova/compute/manager.py:2272 2015-08-07 17:21:58.516 DEBUG oslo_concurrency.lockutils [req-ad264d08-0198-4acc-bb84-ffdb626b78a4 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Lock "compute_resources" acquired by "update_usage" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:21:58.518 DEBUG nova.objects.instance [req-ad264d08-0198-4acc-bb84-ffdb626b78a4 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Lazy-loading `numa_topology' on Instance uuid fda3bb4d-ccb2-4500-8c24-8c38815626fa obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:21:58.783 DEBUG oslo_concurrency.lockutils [req-ad264d08-0198-4acc-bb84-ffdb626b78a4 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Lock "compute_resources" released by "update_usage" :: held 0.266s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:21:59.331 DEBUG oslo_concurrency.lockutils [req-ad264d08-0198-4acc-bb84-ffdb626b78a4 
tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Lock "fda3bb4d-ccb2-4500-8c24-8c38815626fa" released by "do_terminate_instance" :: held 85.065s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:22:00.360 DEBUG nova.virt.xenapi.vmops [req-5cfad40a-28bc-4158-96eb-70342ae8ef7f tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: 091727e3-644b-4029-98ad-5a102868d2d5] Using RAW or VHD, skipping kernel and ramdisk deletion _destroy_kernel_ramdisk /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1456 2015-08-07 17:22:00.376 DEBUG nova.virt.xenapi.vm_utils [req-5cfad40a-28bc-4158-96eb-70342ae8ef7f tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: 091727e3-644b-4029-98ad-5a102868d2d5] VM destroyed destroy_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:324 2015-08-07 17:22:00.377 DEBUG nova.compute.manager [req-5cfad40a-28bc-4158-96eb-70342ae8ef7f tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: 091727e3-644b-4029-98ad-5a102868d2d5] Deallocating network for instance _deallocate_network /opt/stack/new/nova/nova/compute/manager.py:1792 2015-08-07 17:22:04.454 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.78 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:22:07.019 DEBUG nova.virt.xenapi.vmops [req-ac43354e-24e5-41fe-8d5b-41b0cab26176 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: cf8ff930-deb3-448f-bcd3-f5a84fdad9b4] Using RAW or VHD, skipping kernel and ramdisk deletion _destroy_kernel_ramdisk /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1456 2015-08-07 17:22:07.035 DEBUG nova.virt.xenapi.vm_utils [req-ac43354e-24e5-41fe-8d5b-41b0cab26176 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: cf8ff930-deb3-448f-bcd3-f5a84fdad9b4] VM destroyed destroy_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:324 2015-08-07 17:22:07.036 DEBUG nova.compute.manager [req-ac43354e-24e5-41fe-8d5b-41b0cab26176 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: cf8ff930-deb3-448f-bcd3-f5a84fdad9b4] Deallocating network for instance _deallocate_network /opt/stack/new/nova/nova/compute/manager.py:1792 2015-08-07 17:22:07.342 DEBUG nova.compute.manager [req-5cfad40a-28bc-4158-96eb-70342ae8ef7f tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: 091727e3-644b-4029-98ad-5a102868d2d5] terminating bdm BlockDeviceMapping(boot_index=0,connection_info=None,created_at=2015-08-07T17:06:32Z,delete_on_termination=True,deleted=False,deleted_at=None,destination_type='local',device_name='/dev/xvda',device_type='disk',disk_bus=None,guest_format=None,id=8,image_id='a69d8d55-1745-492f-be26-cc75d64fc94d',instance=,instance_uuid=091727e3-644b-4029-98ad-5a102868d2d5,no_device=False,snapshot_id=None,source_type='image',updated_at=2015-08-07T17:06:37Z,volume_id=None,volume_size=None) _cleanup_volumes /opt/stack/new/nova/nova/compute/manager.py:2272 2015-08-07 17:22:08.051 DEBUG oslo_concurrency.lockutils [req-5cfad40a-28bc-4158-96eb-70342ae8ef7f tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Lock "compute_resources" acquired by "update_usage" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 
17:22:08.053 DEBUG nova.objects.instance [req-5cfad40a-28bc-4158-96eb-70342ae8ef7f tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Lazy-loading `numa_topology' on Instance uuid 091727e3-644b-4029-98ad-5a102868d2d5 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:22:08.495 DEBUG oslo_concurrency.lockutils [req-5cfad40a-28bc-4158-96eb-70342ae8ef7f tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Lock "compute_resources" released by "update_usage" :: held 0.444s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:22:09.746 DEBUG oslo_concurrency.lockutils [req-5cfad40a-28bc-4158-96eb-70342ae8ef7f tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Lock "091727e3-644b-4029-98ad-5a102868d2d5" released by "do_terminate_instance" :: held 92.397s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:22:10.806 DEBUG oslo_concurrency.processutils [req-21a74425-4ba4-45b0-afd7-2c7b006729a0 tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmpESQc5O/configdrive of=/dev/xvdc oflag=direct,sync" returned: 0 in 40.192s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:22:10.808 DEBUG oslo_concurrency.processutils [req-21a74425-4ba4-45b0-afd7-2c7b006729a0 tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:22:12.144 DEBUG nova.virt.xenapi.vm_utils [req-ea4bd899-411d-4a36-9599-4f405cef238e tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] Created VDI OpaqueRef:f6aa229d-74d4-9207-6ed5-dd94459d78fe (config-2, 67108864, False) on OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99. create_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:509 2015-08-07 17:22:12.241 DEBUG nova.virt.xenapi.vm_utils [req-ea4bd899-411d-4a36-9599-4f405cef238e tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] Creating disk-type VBD for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:f6aa229d-74d4-9207-6ed5-dd94459d78fe ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:22:12.301 DEBUG nova.virt.xenapi.vm_utils [req-ea4bd899-411d-4a36-9599-4f405cef238e tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] Created VBD OpaqueRef:8147bb98-d31b-ee0f-2505-78e78a0461bf for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:f6aa229d-74d4-9207-6ed5-dd94459d78fe. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:22:12.302 DEBUG nova.virt.xenapi.vm_utils [req-ea4bd899-411d-4a36-9599-4f405cef238e tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] Plugging VBD OpaqueRef:8147bb98-d31b-ee0f-2505-78e78a0461bf ... 
vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2191 2015-08-07 17:22:12.303 DEBUG oslo_concurrency.lockutils [req-ea4bd899-411d-4a36-9599-4f405cef238e tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_plug" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:22:13.278 DEBUG oslo_concurrency.processutils [req-21a74425-4ba4-45b0-afd7-2c7b006729a0 tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf sync" returned: 0 in 2.469s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:22:13.279 DEBUG nova.virt.xenapi.vm_utils [req-21a74425-4ba4-45b0-afd7-2c7b006729a0 tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] Destroying VBD for VDI OpaqueRef:7f6e54b9-35e5-38ef-3d29-64d61a19aa9e ... vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2207 2015-08-07 17:22:13.527 DEBUG nova.compute.manager [req-ac43354e-24e5-41fe-8d5b-41b0cab26176 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] [instance: cf8ff930-deb3-448f-bcd3-f5a84fdad9b4] terminating bdm BlockDeviceMapping(boot_index=0,connection_info=None,created_at=2015-08-07T17:03:24Z,delete_on_termination=True,deleted=False,deleted_at=None,destination_type='local',device_name='/dev/xvda',device_type='disk',disk_bus=None,guest_format=None,id=4,image_id='a69d8d55-1745-492f-be26-cc75d64fc94d',instance=,instance_uuid=cf8ff930-deb3-448f-bcd3-f5a84fdad9b4,no_device=False,snapshot_id=None,source_type='image',updated_at=2015-08-07T17:03:27Z,volume_id=None,volume_size=None) _cleanup_volumes /opt/stack/new/nova/nova/compute/manager.py:2272 2015-08-07 17:22:14.009 DEBUG oslo_concurrency.lockutils [req-db0df955-3b45-4eb8-9421-1d51f0d82a68 tempest-SecurityGroupsTestJSON-162787647 tempest-SecurityGroupsTestJSON-1523642313] Lock "2d94b230-ee5f-44bb-9ce8-17e52b082de7" acquired by "_locked_do_build_and_run_instance" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:22:14.015 DEBUG oslo_concurrency.lockutils [req-ac43354e-24e5-41fe-8d5b-41b0cab26176 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Lock "compute_resources" acquired by "update_usage" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:22:14.017 DEBUG nova.objects.instance [req-ac43354e-24e5-41fe-8d5b-41b0cab26176 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Lazy-loading `numa_topology' on Instance uuid cf8ff930-deb3-448f-bcd3-f5a84fdad9b4 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:22:14.080 INFO nova.compute.manager [req-db0df955-3b45-4eb8-9421-1d51f0d82a68 tempest-SecurityGroupsTestJSON-162787647 tempest-SecurityGroupsTestJSON-1523642313] [instance: 2d94b230-ee5f-44bb-9ce8-17e52b082de7] Starting instance... 
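The config drive flow visible in the entries above (genisoimage builds an ISO9660 image from a staging directory, nova-rootwrap dd then writes it to the plugged VBD device with oflag=direct,sync, and a final sync flushes the device) can be approximated outside Nova with a short script. A minimal sketch, not the service's own code: plain sudo stands in for nova-rootwrap, and /tmp/drive_src, /tmp/configdrive.iso and /dev/xvdc are placeholder paths for the staging directory, image and target block device.

    import subprocess

    def build_and_write_configdrive(src_dir, iso_path, target_dev):
        """Mirror the two-step flow from the log: genisoimage, then dd with direct+sync."""
        # Build the ISO9660 config drive image (same flags as in the log entries).
        subprocess.run(
            ["genisoimage", "-o", iso_path,
             "-ldots", "-allow-lowercase", "-allow-multidot", "-l",
             "-publisher", "OpenStack Nova 12.0.0",
             "-quiet", "-J", "-r", "-V", "config-2", src_dir],
            check=True)
        # Write the image to the attached block device, bypassing the page cache.
        subprocess.run(
            ["sudo", "dd", "if=%s" % iso_path, "of=%s" % target_dev,
             "oflag=direct,sync"],
            check=True)
        # Flush outstanding writes, as the service does before unplugging the VBD.
        subprocess.run(["sudo", "sync"], check=True)

    # Example with the placeholder paths named above:
    # build_and_write_configdrive("/tmp/drive_src", "/tmp/configdrive.iso", "/dev/xvdc")

The 40.192s dd and 2.469s sync recorded above suggest the direct, synchronous write to the VBD dominates the config drive step on this host.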
2015-08-07 17:22:14.151 DEBUG oslo_concurrency.lockutils [req-ac43354e-24e5-41fe-8d5b-41b0cab26176 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Lock "compute_resources" released by "update_usage" :: held 0.136s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:22:14.433 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.80 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:22:14.595 DEBUG oslo_concurrency.lockutils [req-db0df955-3b45-4eb8-9421-1d51f0d82a68 tempest-SecurityGroupsTestJSON-162787647 tempest-SecurityGroupsTestJSON-1523642313] Lock "compute_resources" acquired by "instance_claim" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:22:14.596 DEBUG nova.compute.resource_tracker [req-db0df955-3b45-4eb8-9421-1d51f0d82a68 tempest-SecurityGroupsTestJSON-162787647 tempest-SecurityGroupsTestJSON-1523642313] Memory overhead for 64 MB instance; 5 MB instance_claim /opt/stack/new/nova/nova/compute/resource_tracker.py:126 2015-08-07 17:22:14.604 INFO nova.compute.claims [req-db0df955-3b45-4eb8-9421-1d51f0d82a68 tempest-SecurityGroupsTestJSON-162787647 tempest-SecurityGroupsTestJSON-1523642313] [instance: 2d94b230-ee5f-44bb-9ce8-17e52b082de7] Attempting claim: memory 69 MB, disk 0 GB 2015-08-07 17:22:14.604 INFO nova.compute.claims [req-db0df955-3b45-4eb8-9421-1d51f0d82a68 tempest-SecurityGroupsTestJSON-162787647 tempest-SecurityGroupsTestJSON-1523642313] [instance: 2d94b230-ee5f-44bb-9ce8-17e52b082de7] Total memory: 8187 MB, used: 650.00 MB 2015-08-07 17:22:14.605 INFO nova.compute.claims [req-db0df955-3b45-4eb8-9421-1d51f0d82a68 tempest-SecurityGroupsTestJSON-162787647 tempest-SecurityGroupsTestJSON-1523642313] [instance: 2d94b230-ee5f-44bb-9ce8-17e52b082de7] memory limit: 12280.50 MB, free: 11630.50 MB 2015-08-07 17:22:14.605 INFO nova.compute.claims [req-db0df955-3b45-4eb8-9421-1d51f0d82a68 tempest-SecurityGroupsTestJSON-162787647 tempest-SecurityGroupsTestJSON-1523642313] [instance: 2d94b230-ee5f-44bb-9ce8-17e52b082de7] Total disk: 27 GB, used: 0.00 GB 2015-08-07 17:22:14.606 INFO nova.compute.claims [req-db0df955-3b45-4eb8-9421-1d51f0d82a68 tempest-SecurityGroupsTestJSON-162787647 tempest-SecurityGroupsTestJSON-1523642313] [instance: 2d94b230-ee5f-44bb-9ce8-17e52b082de7] disk limit not specified, defaulting to unlimited 2015-08-07 17:22:14.631 DEBUG nova.compute.resources.vcpu [req-db0df955-3b45-4eb8-9421-1d51f0d82a68 tempest-SecurityGroupsTestJSON-162787647 tempest-SecurityGroupsTestJSON-1523642313] Total CPUs: 8 VCPUs, used: 2.00 VCPUs test /opt/stack/new/nova/nova/compute/resources/vcpu.py:52 2015-08-07 17:22:14.652 DEBUG nova.compute.resources.vcpu [req-db0df955-3b45-4eb8-9421-1d51f0d82a68 tempest-SecurityGroupsTestJSON-162787647 tempest-SecurityGroupsTestJSON-1523642313] CPUs limit not specified, defaulting to unlimited test /opt/stack/new/nova/nova/compute/resources/vcpu.py:56 2015-08-07 17:22:14.652 INFO nova.compute.claims [req-db0df955-3b45-4eb8-9421-1d51f0d82a68 tempest-SecurityGroupsTestJSON-162787647 tempest-SecurityGroupsTestJSON-1523642313] [instance: 2d94b230-ee5f-44bb-9ce8-17e52b082de7] Claim successful 2015-08-07 17:22:15.063 DEBUG oslo_concurrency.lockutils [req-ac43354e-24e5-41fe-8d5b-41b0cab26176 tempest-ServersAdminTestJSON-1589181083 tempest-ServersAdminTestJSON-1030271038] Lock "cf8ff930-deb3-448f-bcd3-f5a84fdad9b4" released by 
"do_terminate_instance" :: held 110.566s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:22:15.193 DEBUG oslo_concurrency.lockutils [req-db0df955-3b45-4eb8-9421-1d51f0d82a68 tempest-SecurityGroupsTestJSON-162787647 tempest-SecurityGroupsTestJSON-1523642313] Lock "compute_resources" released by "instance_claim" :: held 0.597s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:22:15.533 DEBUG oslo_concurrency.lockutils [req-db0df955-3b45-4eb8-9421-1d51f0d82a68 tempest-SecurityGroupsTestJSON-162787647 tempest-SecurityGroupsTestJSON-1523642313] Lock "compute_resources" acquired by "update_usage" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:22:15.908 DEBUG oslo_concurrency.lockutils [req-db0df955-3b45-4eb8-9421-1d51f0d82a68 tempest-SecurityGroupsTestJSON-162787647 tempest-SecurityGroupsTestJSON-1523642313] Lock "compute_resources" released by "update_usage" :: held 0.375s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:22:15.909 DEBUG nova.compute.utils [req-db0df955-3b45-4eb8-9421-1d51f0d82a68 tempest-SecurityGroupsTestJSON-162787647 tempest-SecurityGroupsTestJSON-1523642313] Using /dev/xvd instead of None get_next_device_name /opt/stack/new/nova/nova/compute/utils.py:163 2015-08-07 17:22:15.913 13318 DEBUG nova.compute.manager [-] [instance: 2d94b230-ee5f-44bb-9ce8-17e52b082de7] Allocating IP information in the background. _allocate_network_async /opt/stack/new/nova/nova/compute/manager.py:1513 2015-08-07 17:22:15.915 13318 DEBUG oslo_concurrency.lockutils [-] Acquired semaphore "refresh_cache-2d94b230-ee5f-44bb-9ce8-17e52b082de7" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:197 2015-08-07 17:22:15.919 DEBUG oslo_concurrency.lockutils [req-ea4bd899-411d-4a36-9599-4f405cef238e tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_plug" :: held 3.617s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:22:15.937 DEBUG nova.virt.xenapi.vm_utils [req-ea4bd899-411d-4a36-9599-4f405cef238e tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] Plugging VBD OpaqueRef:8147bb98-d31b-ee0f-2505-78e78a0461bf done. vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2194 2015-08-07 17:22:15.938 DEBUG oslo_concurrency.lockutils [req-21a74425-4ba4-45b0-afd7-2c7b006729a0 tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_unplug" :: waited 2.658s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:22:15.979 DEBUG nova.virt.xenapi.vm_utils [req-ea4bd899-411d-4a36-9599-4f405cef238e tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] VBD OpaqueRef:8147bb98-d31b-ee0f-2505-78e78a0461bf plugged as xvdd vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2197 2015-08-07 17:22:16.150 WARNING nova.virt.configdrive [req-ea4bd899-411d-4a36-9599-4f405cef238e tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] The setting "always" will be deprecated in the Liberty version. 
Please use "True" instead 2015-08-07 17:22:16.151 DEBUG nova.objects.instance [req-ea4bd899-411d-4a36-9599-4f405cef238e tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] Lazy-loading `ec2_ids' on Instance uuid 4bb27132-2638-467d-a5be-49e1ad01b113 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:22:16.321 DEBUG oslo_concurrency.processutils [req-ea4bd899-411d-4a36-9599-4f405cef238e tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] Running cmd (subprocess): genisoimage -o /tmp/tmpU0H0O1/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmpbRaXQ8 execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:22:17.078 DEBUG oslo_concurrency.processutils [req-ea4bd899-411d-4a36-9599-4f405cef238e tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] CMD "genisoimage -o /tmp/tmpU0H0O1/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmpbRaXQ8" returned: 0 in 0.758s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:22:17.224 DEBUG oslo_concurrency.processutils [req-ea4bd899-411d-4a36-9599-4f405cef238e tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmpU0H0O1/configdrive of=/dev/xvdd oflag=direct,sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:22:18.922 DEBUG nova.virt.xenapi.vmops [req-db0df955-3b45-4eb8-9421-1d51f0d82a68 tempest-SecurityGroupsTestJSON-162787647 tempest-SecurityGroupsTestJSON-1523642313] [instance: 2d94b230-ee5f-44bb-9ce8-17e52b082de7] Block device information present: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} spawn /opt/stack/new/nova/nova/virt/xenapi/vmops.py:408 2015-08-07 17:22:18.973 DEBUG nova.virt.xenapi.vm_utils [req-db0df955-3b45-4eb8-9421-1d51f0d82a68 tempest-SecurityGroupsTestJSON-162787647 tempest-SecurityGroupsTestJSON-1523642313] Detected vhd format for image a69d8d55-1745-492f-be26-cc75d64fc94d determine_disk_image_type /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1633 2015-08-07 17:22:18.974 DEBUG nova.virt.xenapi.vmops [req-db0df955-3b45-4eb8-9421-1d51f0d82a68 tempest-SecurityGroupsTestJSON-162787647 tempest-SecurityGroupsTestJSON-1523642313] [instance: 2d94b230-ee5f-44bb-9ce8-17e52b082de7] Updating progress to 10 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:22:19.853 DEBUG nova.virt.xenapi.vmops [req-db0df955-3b45-4eb8-9421-1d51f0d82a68 tempest-SecurityGroupsTestJSON-162787647 tempest-SecurityGroupsTestJSON-1523642313] [instance: 2d94b230-ee5f-44bb-9ce8-17e52b082de7] block device info: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} _connect_cinder_volumes /opt/stack/new/nova/nova/virt/xenapi/vmops.py:376 2015-08-07 17:22:19.891 DEBUG oslo_concurrency.lockutils [req-21a74425-4ba4-45b0-afd7-2c7b006729a0 tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_unplug" :: held 3.953s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 
17:22:19.899 DEBUG nova.virt.xenapi.vm_utils [req-21a74425-4ba4-45b0-afd7-2c7b006729a0 tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] Destroying VBD for VDI OpaqueRef:7f6e54b9-35e5-38ef-3d29-64d61a19aa9e done. vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2215 2015-08-07 17:22:19.900 DEBUG nova.virt.xenapi.vm_utils [req-21a74425-4ba4-45b0-afd7-2c7b006729a0 tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] Creating disk-type VBD for VM OpaqueRef:f4c40d70-c549-651a-7c25-e04d327051c2, VDI OpaqueRef:7f6e54b9-35e5-38ef-3d29-64d61a19aa9e ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:22:19.909 DEBUG nova.virt.xenapi.vm_utils [req-21a74425-4ba4-45b0-afd7-2c7b006729a0 tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] Created VBD OpaqueRef:5097a478-bc16-34ef-8cd2-2561a8211315 for VM OpaqueRef:f4c40d70-c549-651a-7c25-e04d327051c2, VDI OpaqueRef:7f6e54b9-35e5-38ef-3d29-64d61a19aa9e. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:22:19.910 DEBUG nova.objects.instance [req-21a74425-4ba4-45b0-afd7-2c7b006729a0 tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] Lazy-loading `pci_devices' on Instance uuid 45057175-41ca-4ad9-96c4-36ae4b86e6d4 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:22:19.934 DEBUG oslo_concurrency.lockutils [req-db0df955-3b45-4eb8-9421-1d51f0d82a68 tempest-SecurityGroupsTestJSON-162787647 tempest-SecurityGroupsTestJSON-1523642313] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" acquired by "_create_cached_image_impl" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:22:20.126 DEBUG nova.virt.xenapi.vmops [req-21a74425-4ba4-45b0-afd7-2c7b006729a0 tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] [instance: 45057175-41ca-4ad9-96c4-36ae4b86e6d4] Updating progress to 50 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:22:20.588 DEBUG oslo_concurrency.lockutils [req-21a74425-4ba4-45b0-afd7-2c7b006729a0 tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] Lock "xenstore-45057175-41ca-4ad9-96c4-36ae4b86e6d4" acquired by "store_meta" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:22:20.589 DEBUG oslo_concurrency.lockutils [req-21a74425-4ba4-45b0-afd7-2c7b006729a0 tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] Lock "xenstore-45057175-41ca-4ad9-96c4-36ae4b86e6d4" released by "store_meta" :: held 0.001s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:22:20.590 DEBUG oslo_concurrency.lockutils [req-21a74425-4ba4-45b0-afd7-2c7b006729a0 tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] Lock "xenstore-45057175-41ca-4ad9-96c4-36ae4b86e6d4" acquired by "store_auto_disk_config" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:22:20.646 DEBUG oslo_concurrency.lockutils [req-21a74425-4ba4-45b0-afd7-2c7b006729a0 tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] Lock "xenstore-45057175-41ca-4ad9-96c4-36ae4b86e6d4" released by "store_auto_disk_config" :: held 0.056s 
inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:22:20.667 DEBUG nova.virt.xenapi.vmops [req-21a74425-4ba4-45b0-afd7-2c7b006729a0 tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] [instance: 45057175-41ca-4ad9-96c4-36ae4b86e6d4] Injecting hostname (tempest.common.compute-instance-1079309654) into xenstore _inject_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1940 2015-08-07 17:22:20.668 DEBUG oslo_concurrency.lockutils [req-21a74425-4ba4-45b0-afd7-2c7b006729a0 tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] Lock "xenstore-45057175-41ca-4ad9-96c4-36ae4b86e6d4" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:22:20.685 DEBUG oslo_concurrency.lockutils [req-21a74425-4ba4-45b0-afd7-2c7b006729a0 tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] Lock "xenstore-45057175-41ca-4ad9-96c4-36ae4b86e6d4" released by "update_hostname" :: held 0.017s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:22:20.686 DEBUG nova.virt.xenapi.vmops [req-21a74425-4ba4-45b0-afd7-2c7b006729a0 tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] [instance: 45057175-41ca-4ad9-96c4-36ae4b86e6d4] Injecting network info to xenstore inject_network_info /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1869 2015-08-07 17:22:20.686 DEBUG oslo_concurrency.lockutils [req-21a74425-4ba4-45b0-afd7-2c7b006729a0 tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] Lock "xenstore-45057175-41ca-4ad9-96c4-36ae4b86e6d4" acquired by "update_nwinfo" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:22:21.532 DEBUG oslo_concurrency.lockutils [req-21a74425-4ba4-45b0-afd7-2c7b006729a0 tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] Lock "xenstore-45057175-41ca-4ad9-96c4-36ae4b86e6d4" released by "update_nwinfo" :: held 0.846s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:22:21.533 DEBUG nova.virt.xenapi.vmops [req-21a74425-4ba4-45b0-afd7-2c7b006729a0 tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] [instance: 45057175-41ca-4ad9-96c4-36ae4b86e6d4] Updating progress to 60 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:22:22.514 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:22:22.515 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:22:22.516 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Starting heal instance info cache _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5315 2015-08-07 17:22:22.516 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Rebuilding the list of instances to heal _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5319 
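The nova.compute.claims entries at 17:22:14 above record the resource-claim arithmetic: a 64 MB flavor plus a 5 MB per-instance overhead gives the 69 MB claim, and the 12280.50 MB memory limit is the 8187 MB of physical RAM scaled by an over-commit ratio. The ratio is not printed in the log; 1.5 is assumed below because it reproduces the logged limit exactly. A worked sketch of the check with the numbers from the log:

    # Values taken from the claim log entries; the 1.5 ratio is an assumed default.
    phys_ram_mb = 8187.0
    used_ram_mb = 650.0
    flavor_ram_mb = 64.0
    overhead_mb = 5.0           # "Memory overhead for 64 MB instance; 5 MB"
    ram_allocation_ratio = 1.5  # assumption: over-commit ratio that matches the logged limit

    claim_mb = flavor_ram_mb + overhead_mb                 # 69 MB, as logged
    memory_limit_mb = phys_ram_mb * ram_allocation_ratio   # 12280.5 MB, as logged
    free_mb = memory_limit_mb - used_ram_mb                # 11630.5 MB, as logged

    print("claim %.0f MB, limit %.2f MB, free %.2f MB -> %s"
          % (claim_mb, memory_limit_mb, free_mb,
             "Claim successful" if claim_mb <= free_mb else "claim rejected"))

The same over-commit logic explains the earlier "Total usable vcpus: 8, total allocated vcpus: 10" line: allocated vCPUs may exceed the physical count as long as the scaled limit is not reached.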
2015-08-07 17:22:22.556 DEBUG nova.virt.xenapi.vmops [req-21a74425-4ba4-45b0-afd7-2c7b006729a0 tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] [instance: 45057175-41ca-4ad9-96c4-36ae4b86e6d4] Creating vifs _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1892 2015-08-07 17:22:22.597 DEBUG nova.virt.xenapi.vmops [req-21a74425-4ba4-45b0-afd7-2c7b006729a0 tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] [instance: 45057175-41ca-4ad9-96c4-36ae4b86e6d4] Creating VIF for network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1902 2015-08-07 17:22:22.606 DEBUG nova.virt.xenapi.vmops [req-21a74425-4ba4-45b0-afd7-2c7b006729a0 tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] [instance: 45057175-41ca-4ad9-96c4-36ae4b86e6d4] Created VIF OpaqueRef:f29afd5f-2e66-0472-bfe5-f60853a388cf, network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1906 2015-08-07 17:22:22.607 DEBUG nova.virt.xenapi.vmops [req-21a74425-4ba4-45b0-afd7-2c7b006729a0 tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] [instance: 45057175-41ca-4ad9-96c4-36ae4b86e6d4] Updating progress to 70 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:22:22.628 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: 45057175-41ca-4ad9-96c4-36ae4b86e6d4] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5329 2015-08-07 17:22:22.629 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: 4bb27132-2638-467d-a5be-49e1ad01b113] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5329 2015-08-07 17:22:22.629 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: 2d94b230-ee5f-44bb-9ce8-17e52b082de7] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5329 2015-08-07 17:22:22.630 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Didn't find any instances for network info cache update. 
_heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5386 2015-08-07 17:22:22.632 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 2.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:22:23.305 DEBUG nova.virt.xenapi.vmops [req-21a74425-4ba4-45b0-afd7-2c7b006729a0 tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] [instance: 45057175-41ca-4ad9-96c4-36ae4b86e6d4] Starting instance _start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:327 2015-08-07 17:22:24.645 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:22:24.656 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 3.85 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:22:25.598 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 8.65 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:22:26.578 DEBUG nova.virt.xenapi.vm_utils [req-db0df955-3b45-4eb8-9421-1d51f0d82a68 tempest-SecurityGroupsTestJSON-162787647 tempest-SecurityGroupsTestJSON-1523642313] Cloned VDI OpaqueRef:af586590-4a0b-1675-a4a4-15945ae8f3fb from VDI OpaqueRef:f392c64e-7886-a0cc-d3a0-780674167307 _clone_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:571 2015-08-07 17:22:26.901 13318 DEBUG nova.network.base_api [-] [instance: 2d94b230-ee5f-44bb-9ce8-17e52b082de7] Updating instance_info_cache with network_info: [VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.11'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:07:59:ae', 'active': False, 'type': u'bridge', 'id': u'c5dcdb81-a2ab-40b3-9886-dfd0525af78c', 'qbg_params': None})] update_instance_cache_with_nw_info /opt/stack/new/nova/nova/network/base_api.py:43 2015-08-07 17:22:26.964 13318 DEBUG oslo_concurrency.lockutils [-] Releasing semaphore "refresh_cache-2d94b230-ee5f-44bb-9ce8-17e52b082de7" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:210 2015-08-07 17:22:26.965 13318 DEBUG nova.compute.manager [-] [instance: 2d94b230-ee5f-44bb-9ce8-17e52b082de7] Instance network_info: |[VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': 
[FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.11'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:07:59:ae', 'active': False, 'type': u'bridge', 'id': u'c5dcdb81-a2ab-40b3-9886-dfd0525af78c', 'qbg_params': None})]| _allocate_network_async /opt/stack/new/nova/nova/compute/manager.py:1531 2015-08-07 17:22:28.507 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:22:28.508 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 0.01 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:22:28.516 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:22:28.517 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 1.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:22:29.103 DEBUG oslo_concurrency.lockutils [req-db0df955-3b45-4eb8-9421-1d51f0d82a68 tempest-SecurityGroupsTestJSON-162787647 tempest-SecurityGroupsTestJSON-1523642313] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" released by "_create_cached_image_impl" :: held 9.169s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:22:29.104 INFO nova.virt.xenapi.vm_utils [req-db0df955-3b45-4eb8-9421-1d51f0d82a68 tempest-SecurityGroupsTestJSON-162787647 tempest-SecurityGroupsTestJSON-1523642313] Image creation data, cacheable: True, downloaded: False duration: 9.25 secs for image a69d8d55-1745-492f-be26-cc75d64fc94d 2015-08-07 17:22:29.514 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:22:29.515 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /opt/stack/new/nova/nova/compute/manager.py:5976 2015-08-07 17:22:29.515 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:22:29.516 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 2.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:22:31.515 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:22:31.668 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Auditing locally available compute resources for node localhost.localdomain 2015-08-07 17:22:31.703 DEBUG nova.virt.xenapi.host [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Updating host stats update_status /opt/stack/new/nova/nova/virt/xenapi/host.py:234 2015-08-07 17:22:31.735 DEBUG nova.virt.xenapi.vmops [req-db0df955-3b45-4eb8-9421-1d51f0d82a68 tempest-SecurityGroupsTestJSON-162787647 tempest-SecurityGroupsTestJSON-1523642313] [instance: 2d94b230-ee5f-44bb-9ce8-17e52b082de7] Updating progress to 20 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:22:32.470 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" acquired by "do_scan" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:22:32.470 DEBUG nova.virt.xenapi.vm_utils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Scanning SR OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99 do_scan /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1839 2015-08-07 17:22:32.915 DEBUG nova.virt.xenapi.vmops [req-db0df955-3b45-4eb8-9421-1d51f0d82a68 tempest-SecurityGroupsTestJSON-162787647 tempest-SecurityGroupsTestJSON-1523642313] [instance: 2d94b230-ee5f-44bb-9ce8-17e52b082de7] Updating progress to 30 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:22:33.681 DEBUG nova.virt.xenapi.vmops [req-db0df955-3b45-4eb8-9421-1d51f0d82a68 tempest-SecurityGroupsTestJSON-162787647 tempest-SecurityGroupsTestJSON-1523642313] [instance: 2d94b230-ee5f-44bb-9ce8-17e52b082de7] Using PV kernel: True _create_vm_record /opt/stack/new/nova/nova/virt/xenapi/vmops.py:679 2015-08-07 17:22:33.717 DEBUG nova.virt.xenapi.vm_utils [req-db0df955-3b45-4eb8-9421-1d51f0d82a68 tempest-SecurityGroupsTestJSON-162787647 tempest-SecurityGroupsTestJSON-1523642313] [instance: 2d94b230-ee5f-44bb-9ce8-17e52b082de7] Created VM create_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:312 2015-08-07 17:22:33.717 DEBUG nova.virt.xenapi.vmops [req-db0df955-3b45-4eb8-9421-1d51f0d82a68 tempest-SecurityGroupsTestJSON-162787647 tempest-SecurityGroupsTestJSON-1523642313] [instance: 2d94b230-ee5f-44bb-9ce8-17e52b082de7] Updating progress to 40 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:22:33.943 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" released by "do_scan" :: held 
1.473s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:22:34.430 DEBUG nova.virt.xenapi.vm_utils [req-db0df955-3b45-4eb8-9421-1d51f0d82a68 tempest-SecurityGroupsTestJSON-162787647 tempest-SecurityGroupsTestJSON-1523642313] Creating disk-type VBD for VM OpaqueRef:b2940078-8522-5b37-c613-a3e4c33adcc1, VDI OpaqueRef:af586590-4a0b-1675-a4a4-15945ae8f3fb ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:22:34.442 DEBUG nova.virt.xenapi.vm_utils [req-db0df955-3b45-4eb8-9421-1d51f0d82a68 tempest-SecurityGroupsTestJSON-162787647 tempest-SecurityGroupsTestJSON-1523642313] Created VBD OpaqueRef:eafa2b69-2483-49e0-4960-eb9f28bf4636 for VM OpaqueRef:b2940078-8522-5b37-c613-a3e4c33adcc1, VDI OpaqueRef:af586590-4a0b-1675-a4a4-15945ae8f3fb. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:22:34.807 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.44 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:22:35.154 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor: free VCPUs: 2 _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:524 2015-08-07 17:22:35.154 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor: assignable PCI devices: [] _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:532 2015-08-07 17:22:35.155 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor/Node resource view: name=localhost.localdomain free_ram=922MB free_disk=16GB free_vcpus=2 pci_devices=[] _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:546 2015-08-07 17:22:35.155 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "compute_resources" acquired by "_update_available_resource" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:22:35.742 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Total usable vcpus: 8, total allocated vcpus: 6 2015-08-07 17:22:35.743 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Final resource view: name=localhost.localdomain phys_ram=8187MB used_ram=719MB phys_disk=27GB used_disk=0GB total_vcpus=8 used_vcpus=6 pci_stats=None 2015-08-07 17:22:35.971 DEBUG nova.virt.xenapi.vm_utils [req-db0df955-3b45-4eb8-9421-1d51f0d82a68 tempest-SecurityGroupsTestJSON-162787647 tempest-SecurityGroupsTestJSON-1523642313] Created VDI OpaqueRef:a148b057-ae69-c916-df83-bdeb734d4a07 (config-2, 67108864, False) on OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99. create_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:509 2015-08-07 17:22:35.978 DEBUG nova.virt.xenapi.vm_utils [req-db0df955-3b45-4eb8-9421-1d51f0d82a68 tempest-SecurityGroupsTestJSON-162787647 tempest-SecurityGroupsTestJSON-1523642313] Creating disk-type VBD for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:a148b057-ae69-c916-df83-bdeb734d4a07 ... 
create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:22:35.991 DEBUG nova.virt.xenapi.vm_utils [req-db0df955-3b45-4eb8-9421-1d51f0d82a68 tempest-SecurityGroupsTestJSON-162787647 tempest-SecurityGroupsTestJSON-1523642313] Created VBD OpaqueRef:b529bba7-24af-cfe9-5898-65714c64d776 for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:a148b057-ae69-c916-df83-bdeb734d4a07. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:22:35.991 DEBUG nova.virt.xenapi.vm_utils [req-db0df955-3b45-4eb8-9421-1d51f0d82a68 tempest-SecurityGroupsTestJSON-162787647 tempest-SecurityGroupsTestJSON-1523642313] Plugging VBD OpaqueRef:b529bba7-24af-cfe9-5898-65714c64d776 ... vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2191 2015-08-07 17:22:35.992 DEBUG oslo_concurrency.lockutils [req-db0df955-3b45-4eb8-9421-1d51f0d82a68 tempest-SecurityGroupsTestJSON-162787647 tempest-SecurityGroupsTestJSON-1523642313] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_plug" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:22:36.036 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Compute_service record updated for devstack:localhost.localdomain 2015-08-07 17:22:36.037 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "compute_resources" released by "_update_available_resource" :: held 0.882s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:22:36.038 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 7.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:22:42.601 DEBUG oslo_concurrency.lockutils [req-db0df955-3b45-4eb8-9421-1d51f0d82a68 tempest-SecurityGroupsTestJSON-162787647 tempest-SecurityGroupsTestJSON-1523642313] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_plug" :: held 6.609s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:22:42.602 DEBUG nova.virt.xenapi.vm_utils [req-db0df955-3b45-4eb8-9421-1d51f0d82a68 tempest-SecurityGroupsTestJSON-162787647 tempest-SecurityGroupsTestJSON-1523642313] Plugging VBD OpaqueRef:b529bba7-24af-cfe9-5898-65714c64d776 done. vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2194 2015-08-07 17:22:42.608 DEBUG nova.virt.xenapi.vm_utils [req-db0df955-3b45-4eb8-9421-1d51f0d82a68 tempest-SecurityGroupsTestJSON-162787647 tempest-SecurityGroupsTestJSON-1523642313] VBD OpaqueRef:b529bba7-24af-cfe9-5898-65714c64d776 plugged as xvdc vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2197 2015-08-07 17:22:42.738 WARNING nova.virt.configdrive [req-db0df955-3b45-4eb8-9421-1d51f0d82a68 tempest-SecurityGroupsTestJSON-162787647 tempest-SecurityGroupsTestJSON-1523642313] The setting "always" will be deprecated in the Liberty version. 
Please use "True" instead 2015-08-07 17:22:42.804 DEBUG nova.objects.instance [req-db0df955-3b45-4eb8-9421-1d51f0d82a68 tempest-SecurityGroupsTestJSON-162787647 tempest-SecurityGroupsTestJSON-1523642313] Lazy-loading `ec2_ids' on Instance uuid 2d94b230-ee5f-44bb-9ce8-17e52b082de7 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:22:42.978 DEBUG oslo_concurrency.processutils [req-db0df955-3b45-4eb8-9421-1d51f0d82a68 tempest-SecurityGroupsTestJSON-162787647 tempest-SecurityGroupsTestJSON-1523642313] Running cmd (subprocess): genisoimage -o /tmp/tmp5yLURo/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmpAv5UP8 execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:22:43.475 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:22:43.480 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 39.03 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:22:43.645 DEBUG oslo_concurrency.processutils [req-db0df955-3b45-4eb8-9421-1d51f0d82a68 tempest-SecurityGroupsTestJSON-162787647 tempest-SecurityGroupsTestJSON-1523642313] CMD "genisoimage -o /tmp/tmp5yLURo/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmpAv5UP8" returned: 0 in 0.667s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:22:43.649 DEBUG oslo_concurrency.processutils [req-db0df955-3b45-4eb8-9421-1d51f0d82a68 tempest-SecurityGroupsTestJSON-162787647 tempest-SecurityGroupsTestJSON-1523642313] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmp5yLURo/configdrive of=/dev/xvdc oflag=direct,sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:22:44.528 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.78 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:22:46.752 DEBUG oslo_concurrency.lockutils [req-4408bbee-0f01-405c-aca7-22d3b9daaa9f tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] Lock "448f07ac-11c1-4844-84f7-c887efb5826a" acquired by "_locked_do_build_and_run_instance" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:22:47.352 INFO nova.compute.manager [req-4408bbee-0f01-405c-aca7-22d3b9daaa9f tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] [instance: 448f07ac-11c1-4844-84f7-c887efb5826a] Starting instance... 
2015-08-07 17:22:47.838 DEBUG oslo_concurrency.lockutils [req-4408bbee-0f01-405c-aca7-22d3b9daaa9f tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] Lock "compute_resources" acquired by "instance_claim" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:22:47.839 DEBUG nova.compute.resource_tracker [req-4408bbee-0f01-405c-aca7-22d3b9daaa9f tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] Memory overhead for 64 MB instance; 5 MB instance_claim /opt/stack/new/nova/nova/compute/resource_tracker.py:126 2015-08-07 17:22:47.848 INFO nova.compute.claims [req-4408bbee-0f01-405c-aca7-22d3b9daaa9f tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] [instance: 448f07ac-11c1-4844-84f7-c887efb5826a] Attempting claim: memory 69 MB, disk 0 GB 2015-08-07 17:22:47.849 INFO nova.compute.claims [req-4408bbee-0f01-405c-aca7-22d3b9daaa9f tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] [instance: 448f07ac-11c1-4844-84f7-c887efb5826a] Total memory: 8187 MB, used: 719.00 MB 2015-08-07 17:22:47.849 INFO nova.compute.claims [req-4408bbee-0f01-405c-aca7-22d3b9daaa9f tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] [instance: 448f07ac-11c1-4844-84f7-c887efb5826a] memory limit: 12280.50 MB, free: 11561.50 MB 2015-08-07 17:22:47.850 INFO nova.compute.claims [req-4408bbee-0f01-405c-aca7-22d3b9daaa9f tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] [instance: 448f07ac-11c1-4844-84f7-c887efb5826a] Total disk: 27 GB, used: 0.00 GB 2015-08-07 17:22:47.850 INFO nova.compute.claims [req-4408bbee-0f01-405c-aca7-22d3b9daaa9f tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] [instance: 448f07ac-11c1-4844-84f7-c887efb5826a] disk limit not specified, defaulting to unlimited 2015-08-07 17:22:47.915 DEBUG nova.compute.resources.vcpu [req-4408bbee-0f01-405c-aca7-22d3b9daaa9f tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] Total CPUs: 8 VCPUs, used: 3.00 VCPUs test /opt/stack/new/nova/nova/compute/resources/vcpu.py:52 2015-08-07 17:22:47.916 DEBUG nova.compute.resources.vcpu [req-4408bbee-0f01-405c-aca7-22d3b9daaa9f tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] CPUs limit not specified, defaulting to unlimited test /opt/stack/new/nova/nova/compute/resources/vcpu.py:56 2015-08-07 17:22:47.916 INFO nova.compute.claims [req-4408bbee-0f01-405c-aca7-22d3b9daaa9f tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] [instance: 448f07ac-11c1-4844-84f7-c887efb5826a] Claim successful 2015-08-07 17:22:49.765 DEBUG oslo_concurrency.lockutils [req-4408bbee-0f01-405c-aca7-22d3b9daaa9f tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] Lock "compute_resources" released by "instance_claim" :: held 1.927s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:22:50.526 DEBUG oslo_concurrency.processutils [req-ea4bd899-411d-4a36-9599-4f405cef238e tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmpU0H0O1/configdrive of=/dev/xvdd oflag=direct,sync" returned: 0 in 33.302s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:22:50.528 
DEBUG oslo_concurrency.processutils [req-ea4bd899-411d-4a36-9599-4f405cef238e tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:22:51.465 DEBUG oslo_concurrency.lockutils [req-4408bbee-0f01-405c-aca7-22d3b9daaa9f tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] Lock "compute_resources" acquired by "update_usage" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:22:52.605 DEBUG oslo_concurrency.lockutils [req-4408bbee-0f01-405c-aca7-22d3b9daaa9f tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] Lock "compute_resources" released by "update_usage" :: held 1.140s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:22:52.613 DEBUG nova.compute.utils [req-4408bbee-0f01-405c-aca7-22d3b9daaa9f tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] Using /dev/xvd instead of None get_next_device_name /opt/stack/new/nova/nova/compute/utils.py:163 2015-08-07 17:22:52.641 13318 DEBUG nova.compute.manager [-] [instance: 448f07ac-11c1-4844-84f7-c887efb5826a] Allocating IP information in the background. _allocate_network_async /opt/stack/new/nova/nova/compute/manager.py:1513 2015-08-07 17:22:52.642 13318 DEBUG oslo_concurrency.lockutils [-] Acquired semaphore "refresh_cache-448f07ac-11c1-4844-84f7-c887efb5826a" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:197 2015-08-07 17:22:57.019 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 7.29 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:22:57.301 DEBUG oslo_concurrency.processutils [req-ea4bd899-411d-4a36-9599-4f405cef238e tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf sync" returned: 0 in 6.772s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:22:57.302 DEBUG nova.virt.xenapi.vm_utils [req-ea4bd899-411d-4a36-9599-4f405cef238e tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] Destroying VBD for VDI OpaqueRef:f6aa229d-74d4-9207-6ed5-dd94459d78fe ... 
vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2207 2015-08-07 17:22:57.303 DEBUG oslo_concurrency.lockutils [req-ea4bd899-411d-4a36-9599-4f405cef238e tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_unplug" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:23:07.031 DEBUG nova.virt.xenapi.vmops [req-21a74425-4ba4-45b0-afd7-2c7b006729a0 tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] [instance: 45057175-41ca-4ad9-96c4-36ae4b86e6d4] Waiting for instance state to become running _wait_for_instance_to_start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:777 2015-08-07 17:23:07.978 DEBUG nova.virt.xenapi.vmops [req-21a74425-4ba4-45b0-afd7-2c7b006729a0 tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] [instance: 45057175-41ca-4ad9-96c4-36ae4b86e6d4] Updating progress to 80 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:23:10.322 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 3.99 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:23:11.324 DEBUG nova.virt.xenapi.vmops [req-21a74425-4ba4-45b0-afd7-2c7b006729a0 tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] [instance: 45057175-41ca-4ad9-96c4-36ae4b86e6d4] Skip agent setup, not enabled. _configure_new_instance_with_agent /opt/stack/new/nova/nova/virt/xenapi/vmops.py:788 2015-08-07 17:23:11.325 DEBUG nova.virt.xenapi.vmops [req-21a74425-4ba4-45b0-afd7-2c7b006729a0 tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] [instance: 45057175-41ca-4ad9-96c4-36ae4b86e6d4] Removing hostname from xenstore _remove_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1949 2015-08-07 17:23:11.325 DEBUG oslo_concurrency.lockutils [req-21a74425-4ba4-45b0-afd7-2c7b006729a0 tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] Lock "xenstore-45057175-41ca-4ad9-96c4-36ae4b86e6d4" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:23:11.508 DEBUG oslo_concurrency.lockutils [req-21a74425-4ba4-45b0-afd7-2c7b006729a0 tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] Lock "xenstore-45057175-41ca-4ad9-96c4-36ae4b86e6d4" released by "update_hostname" :: held 0.183s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:23:11.509 DEBUG nova.virt.xenapi.vmops [req-21a74425-4ba4-45b0-afd7-2c7b006729a0 tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] [instance: 45057175-41ca-4ad9-96c4-36ae4b86e6d4] Updating progress to 90 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:23:12.270 DEBUG oslo_concurrency.lockutils [req-ea4bd899-411d-4a36-9599-4f405cef238e tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_unplug" :: held 14.967s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:23:12.281 DEBUG nova.virt.xenapi.vm_utils 
[req-ea4bd899-411d-4a36-9599-4f405cef238e tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] Destroying VBD for VDI OpaqueRef:f6aa229d-74d4-9207-6ed5-dd94459d78fe done. vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2215 2015-08-07 17:23:12.282 DEBUG nova.virt.xenapi.vm_utils [req-ea4bd899-411d-4a36-9599-4f405cef238e tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] Creating disk-type VBD for VM OpaqueRef:b2e9496d-c7f9-d770-3d3a-cc5f47d55581, VDI OpaqueRef:f6aa229d-74d4-9207-6ed5-dd94459d78fe ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:23:12.323 DEBUG nova.virt.xenapi.vm_utils [req-ea4bd899-411d-4a36-9599-4f405cef238e tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] Created VBD OpaqueRef:cdb55fe1-6181-7713-40e2-9844714d4eba for VM OpaqueRef:b2e9496d-c7f9-d770-3d3a-cc5f47d55581, VDI OpaqueRef:f6aa229d-74d4-9207-6ed5-dd94459d78fe. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:23:12.324 DEBUG nova.objects.instance [req-ea4bd899-411d-4a36-9599-4f405cef238e tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] Lazy-loading `pci_devices' on Instance uuid 4bb27132-2638-467d-a5be-49e1ad01b113 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:23:12.578 DEBUG nova.virt.xenapi.vmops [req-ea4bd899-411d-4a36-9599-4f405cef238e tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] [instance: 4bb27132-2638-467d-a5be-49e1ad01b113] Updating progress to 50 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:23:13.905 DEBUG nova.virt.xenapi.vmops [req-21a74425-4ba4-45b0-afd7-2c7b006729a0 tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] [instance: 45057175-41ca-4ad9-96c4-36ae4b86e6d4] Updating progress to 100 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:23:13.934 DEBUG oslo_concurrency.lockutils [req-ea4bd899-411d-4a36-9599-4f405cef238e tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] Lock "xenstore-4bb27132-2638-467d-a5be-49e1ad01b113" acquired by "store_meta" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:23:13.935 DEBUG oslo_concurrency.lockutils [req-ea4bd899-411d-4a36-9599-4f405cef238e tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] Lock "xenstore-4bb27132-2638-467d-a5be-49e1ad01b113" released by "store_meta" :: held 0.001s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:23:13.935 DEBUG oslo_concurrency.lockutils [req-ea4bd899-411d-4a36-9599-4f405cef238e tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] Lock "xenstore-4bb27132-2638-467d-a5be-49e1ad01b113" acquired by "store_auto_disk_config" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:23:13.946 DEBUG oslo_concurrency.lockutils [req-ea4bd899-411d-4a36-9599-4f405cef238e tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] Lock "xenstore-4bb27132-2638-467d-a5be-49e1ad01b113" released by "store_auto_disk_config" :: held 0.011s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 
17:23:13.947 DEBUG nova.virt.xenapi.vmops [req-ea4bd899-411d-4a36-9599-4f405cef238e tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] [instance: 4bb27132-2638-467d-a5be-49e1ad01b113] Injecting hostname (tempest.common.compute-instance-1619978930) into xenstore _inject_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1940 2015-08-07 17:23:13.948 DEBUG oslo_concurrency.lockutils [req-ea4bd899-411d-4a36-9599-4f405cef238e tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] Lock "xenstore-4bb27132-2638-467d-a5be-49e1ad01b113" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:23:13.970 DEBUG oslo_concurrency.lockutils [req-ea4bd899-411d-4a36-9599-4f405cef238e tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] Lock "xenstore-4bb27132-2638-467d-a5be-49e1ad01b113" released by "update_hostname" :: held 0.022s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:23:13.971 DEBUG nova.virt.xenapi.vmops [req-ea4bd899-411d-4a36-9599-4f405cef238e tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] [instance: 4bb27132-2638-467d-a5be-49e1ad01b113] Injecting network info to xenstore inject_network_info /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1869 2015-08-07 17:23:13.971 DEBUG oslo_concurrency.lockutils [req-ea4bd899-411d-4a36-9599-4f405cef238e tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] Lock "xenstore-4bb27132-2638-467d-a5be-49e1ad01b113" acquired by "update_nwinfo" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:23:14.023 DEBUG nova.virt.xenapi.vmops [req-4408bbee-0f01-405c-aca7-22d3b9daaa9f tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] [instance: 448f07ac-11c1-4844-84f7-c887efb5826a] Block device information present: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} spawn /opt/stack/new/nova/nova/virt/xenapi/vmops.py:408 2015-08-07 17:23:14.074 DEBUG nova.virt.xenapi.vm_utils [req-4408bbee-0f01-405c-aca7-22d3b9daaa9f tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] Detected vhd format for image a69d8d55-1745-492f-be26-cc75d64fc94d determine_disk_image_type /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1633 2015-08-07 17:23:14.075 DEBUG nova.virt.xenapi.vmops [req-4408bbee-0f01-405c-aca7-22d3b9daaa9f tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] [instance: 448f07ac-11c1-4844-84f7-c887efb5826a] Updating progress to 10 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:23:14.550 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.76 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:23:14.756 DEBUG nova.compute.manager [req-21a74425-4ba4-45b0-afd7-2c7b006729a0 tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] [instance: 45057175-41ca-4ad9-96c4-36ae4b86e6d4] Checking state _get_power_state /opt/stack/new/nova/nova/compute/manager.py:1296 2015-08-07 17:23:14.771 DEBUG nova.virt.xenapi.vmops [req-4408bbee-0f01-405c-aca7-22d3b9daaa9f tempest-ImagesNegativeTestJSON-1431277351 
tempest-ImagesNegativeTestJSON-1581008911] [instance: 448f07ac-11c1-4844-84f7-c887efb5826a] block device info: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} _connect_cinder_volumes /opt/stack/new/nova/nova/virt/xenapi/vmops.py:376 2015-08-07 17:23:14.839 DEBUG oslo_concurrency.lockutils [req-4408bbee-0f01-405c-aca7-22d3b9daaa9f tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" acquired by "_create_cached_image_impl" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:23:14.864 DEBUG oslo_concurrency.lockutils [req-ea4bd899-411d-4a36-9599-4f405cef238e tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] Lock "xenstore-4bb27132-2638-467d-a5be-49e1ad01b113" released by "update_nwinfo" :: held 0.893s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:23:14.865 DEBUG nova.virt.xenapi.vmops [req-ea4bd899-411d-4a36-9599-4f405cef238e tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] [instance: 4bb27132-2638-467d-a5be-49e1ad01b113] Updating progress to 60 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:23:15.998 DEBUG nova.compute.utils [req-21a74425-4ba4-45b0-afd7-2c7b006729a0 tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] [instance: 45057175-41ca-4ad9-96c4-36ae4b86e6d4] Unexpected task state: expecting (u'spawning',) but the actual state is deleting notify_about_instance_usage /opt/stack/new/nova/nova/compute/utils.py:283 2015-08-07 17:23:15.999 DEBUG nova.compute.manager [req-21a74425-4ba4-45b0-afd7-2c7b006729a0 tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] [instance: 45057175-41ca-4ad9-96c4-36ae4b86e6d4] Instance disappeared during build. 
_do_build_and_run_instance /opt/stack/new/nova/nova/compute/manager.py:1929 2015-08-07 17:23:16.000 DEBUG nova.compute.manager [req-21a74425-4ba4-45b0-afd7-2c7b006729a0 tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] [instance: 45057175-41ca-4ad9-96c4-36ae4b86e6d4] Deallocating network for instance _deallocate_network /opt/stack/new/nova/nova/compute/manager.py:1792 2015-08-07 17:23:16.961 DEBUG nova.virt.xenapi.vmops [req-ea4bd899-411d-4a36-9599-4f405cef238e tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] [instance: 4bb27132-2638-467d-a5be-49e1ad01b113] Creating vifs _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1892 2015-08-07 17:23:17.225 DEBUG nova.virt.xenapi.vmops [req-ea4bd899-411d-4a36-9599-4f405cef238e tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] [instance: 4bb27132-2638-467d-a5be-49e1ad01b113] Creating VIF for network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1902 2015-08-07 17:23:17.401 DEBUG nova.virt.xenapi.vmops [req-ea4bd899-411d-4a36-9599-4f405cef238e tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] [instance: 4bb27132-2638-467d-a5be-49e1ad01b113] Created VIF OpaqueRef:eeec1b57-cd17-9cde-f3f5-081eb377a7a0, network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1906 2015-08-07 17:23:17.402 DEBUG nova.virt.xenapi.vmops [req-ea4bd899-411d-4a36-9599-4f405cef238e tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] [instance: 4bb27132-2638-467d-a5be-49e1ad01b113] Updating progress to 70 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:23:18.382 DEBUG nova.virt.xenapi.vmops [req-ea4bd899-411d-4a36-9599-4f405cef238e tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] [instance: 4bb27132-2638-467d-a5be-49e1ad01b113] Starting instance _start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:327 2015-08-07 17:23:18.384 DEBUG oslo_concurrency.lockutils [req-21a74425-4ba4-45b0-afd7-2c7b006729a0 tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] Lock "45057175-41ca-4ad9-96c4-36ae4b86e6d4" released by "_locked_do_build_and_run_instance" :: held 236.887s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:23:18.385 DEBUG oslo_concurrency.lockutils [req-31e6a261-17c8-4515-bad1-97b950dc7d17 tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] Lock "45057175-41ca-4ad9-96c4-36ae4b86e6d4" acquired by "do_terminate_instance" :: waited 5.224s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:23:18.386 DEBUG oslo_concurrency.lockutils [req-31e6a261-17c8-4515-bad1-97b950dc7d17 tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] Lock "45057175-41ca-4ad9-96c4-36ae4b86e6d4-events" acquired by "_clear_events" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:23:18.386 DEBUG oslo_concurrency.lockutils [req-31e6a261-17c8-4515-bad1-97b950dc7d17 tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] Lock "45057175-41ca-4ad9-96c4-36ae4b86e6d4-events" released by "_clear_events" :: held 0.000s inner 
/usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:23:18.388 INFO nova.compute.manager [req-31e6a261-17c8-4515-bad1-97b950dc7d17 tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] [instance: 45057175-41ca-4ad9-96c4-36ae4b86e6d4] Terminating instance 2015-08-07 17:23:18.389 INFO nova.virt.xenapi.vmops [req-31e6a261-17c8-4515-bad1-97b950dc7d17 tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] [instance: 45057175-41ca-4ad9-96c4-36ae4b86e6d4] Destroying VM 2015-08-07 17:23:18.441 DEBUG nova.virt.xenapi.vm_utils [req-31e6a261-17c8-4515-bad1-97b950dc7d17 tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] [instance: 45057175-41ca-4ad9-96c4-36ae4b86e6d4] Shutting down VM (hard) hard_shutdown_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:348 2015-08-07 17:23:22.270 13318 DEBUG nova.network.base_api [-] [instance: 448f07ac-11c1-4844-84f7-c887efb5826a] Updating instance_info_cache with network_info: [VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.12'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:5a:c9:ee', 'active': False, 'type': u'bridge', 'id': u'acaf6f31-eac3-4d64-965f-7166ef5f3894', 'qbg_params': None})] update_instance_cache_with_nw_info /opt/stack/new/nova/nova/network/base_api.py:43 2015-08-07 17:23:22.299 13318 DEBUG oslo_concurrency.lockutils [-] Releasing semaphore "refresh_cache-448f07ac-11c1-4844-84f7-c887efb5826a" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:210 2015-08-07 17:23:22.299 13318 DEBUG nova.compute.manager [-] [instance: 448f07ac-11c1-4844-84f7-c887efb5826a] Instance network_info: |[VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.12'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': 
None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:5a:c9:ee', 'active': False, 'type': u'bridge', 'id': u'acaf6f31-eac3-4d64-965f-7166ef5f3894', 'qbg_params': None})]| _allocate_network_async /opt/stack/new/nova/nova/compute/manager.py:1531 2015-08-07 17:23:22.508 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:23:22.628 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 0.01 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:23:22.634 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:23:22.635 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:23:22.636 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Starting heal instance info cache _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5315 2015-08-07 17:23:22.636 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Rebuilding the list of instances to heal _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5319 2015-08-07 17:23:22.722 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: 4bb27132-2638-467d-a5be-49e1ad01b113] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5329 2015-08-07 17:23:22.723 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: 2d94b230-ee5f-44bb-9ce8-17e52b082de7] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5329 2015-08-07 17:23:22.723 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: 448f07ac-11c1-4844-84f7-c887efb5826a] Skipping network cache update for instance because it is Building. 
_heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5329 2015-08-07 17:23:22.724 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Acquired semaphore "refresh_cache-45057175-41ca-4ad9-96c4-36ae4b86e6d4" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:197 2015-08-07 17:23:22.724 DEBUG nova.objects.instance [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lazy-loading `flavor' on Instance uuid 45057175-41ca-4ad9-96c4-36ae4b86e6d4 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:23:23.160 DEBUG nova.network.base_api [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: 45057175-41ca-4ad9-96c4-36ae4b86e6d4] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /opt/stack/new/nova/nova/network/base_api.py:43 2015-08-07 17:23:23.628 DEBUG nova.virt.xenapi.vm_utils [req-4408bbee-0f01-405c-aca7-22d3b9daaa9f tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] Cloned VDI OpaqueRef:9365f655-c0e3-4b0f-bb2c-acf250e0bc99 from VDI OpaqueRef:f392c64e-7886-a0cc-d3a0-780674167307 _clone_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:571 2015-08-07 17:23:24.150 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Releasing semaphore "refresh_cache-45057175-41ca-4ad9-96c4-36ae4b86e6d4" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:210 2015-08-07 17:23:24.151 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: 45057175-41ca-4ad9-96c4-36ae4b86e6d4] Updated the network info_cache for instance _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5376 2015-08-07 17:23:24.151 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 2.88 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:23:27.030 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:23:27.031 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 1.47 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:23:28.227 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 6.10 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:23:28.345 DEBUG oslo_concurrency.lockutils [req-4408bbee-0f01-405c-aca7-22d3b9daaa9f tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" released by "_create_cached_image_impl" :: held 13.505s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:23:28.346 INFO nova.virt.xenapi.vm_utils [req-4408bbee-0f01-405c-aca7-22d3b9daaa9f tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] Image creation data, cacheable: True, downloaded: False duration: 13.57 secs for image a69d8d55-1745-492f-be26-cc75d64fc94d 2015-08-07 17:23:28.507 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task 
ComputeManager._check_instance_build_time run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:23:28.596 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 1.01 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:23:29.603 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:23:29.604 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 1.91 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:23:29.990 DEBUG oslo_concurrency.processutils [req-db0df955-3b45-4eb8-9421-1d51f0d82a68 tempest-SecurityGroupsTestJSON-162787647 tempest-SecurityGroupsTestJSON-1523642313] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmp5yLURo/configdrive of=/dev/xvdc oflag=direct,sync" returned: 0 in 46.341s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:23:29.991 DEBUG oslo_concurrency.processutils [req-db0df955-3b45-4eb8-9421-1d51f0d82a68 tempest-SecurityGroupsTestJSON-162787647 tempest-SecurityGroupsTestJSON-1523642313] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:23:31.206 DEBUG oslo_concurrency.processutils [req-db0df955-3b45-4eb8-9421-1d51f0d82a68 tempest-SecurityGroupsTestJSON-162787647 tempest-SecurityGroupsTestJSON-1523642313] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf sync" returned: 0 in 1.214s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:23:31.209 DEBUG nova.virt.xenapi.vm_utils [req-db0df955-3b45-4eb8-9421-1d51f0d82a68 tempest-SecurityGroupsTestJSON-162787647 tempest-SecurityGroupsTestJSON-1523642313] Destroying VBD for VDI OpaqueRef:a148b057-ae69-c916-df83-bdeb734d4a07 ... vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2207 2015-08-07 17:23:31.210 DEBUG oslo_concurrency.lockutils [req-db0df955-3b45-4eb8-9421-1d51f0d82a68 tempest-SecurityGroupsTestJSON-162787647 tempest-SecurityGroupsTestJSON-1523642313] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_unplug" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:23:31.515 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:23:31.516 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /opt/stack/new/nova/nova/compute/manager.py:5976 2015-08-07 17:23:31.517 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:23:31.518 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 1.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:23:32.516 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:23:32.589 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Auditing locally available compute resources for node localhost.localdomain 2015-08-07 17:23:32.590 DEBUG nova.virt.xenapi.host [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Updating host stats update_status /opt/stack/new/nova/nova/virt/xenapi/host.py:234 2015-08-07 17:23:35.040 DEBUG nova.virt.xenapi.vmops [req-4408bbee-0f01-405c-aca7-22d3b9daaa9f tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] [instance: 448f07ac-11c1-4844-84f7-c887efb5826a] Updating progress to 20 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:23:37.272 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 7.07 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:23:37.446 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" acquired by "do_scan" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:23:37.447 DEBUG nova.virt.xenapi.vm_utils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Scanning SR OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99 do_scan /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1839 2015-08-07 17:23:39.129 DEBUG nova.virt.xenapi.vmops [req-4408bbee-0f01-405c-aca7-22d3b9daaa9f tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] [instance: 448f07ac-11c1-4844-84f7-c887efb5826a] Updating progress to 30 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:23:41.874 DEBUG nova.virt.xenapi.vmops [req-4408bbee-0f01-405c-aca7-22d3b9daaa9f tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] [instance: 448f07ac-11c1-4844-84f7-c887efb5826a] Using PV kernel: True _create_vm_record /opt/stack/new/nova/nova/virt/xenapi/vmops.py:679 2015-08-07 17:23:41.994 DEBUG nova.virt.xenapi.vm_utils [req-4408bbee-0f01-405c-aca7-22d3b9daaa9f tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] [instance: 448f07ac-11c1-4844-84f7-c887efb5826a] Created VM create_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:312 2015-08-07 17:23:41.994 DEBUG nova.virt.xenapi.vmops [req-4408bbee-0f01-405c-aca7-22d3b9daaa9f tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] [instance: 448f07ac-11c1-4844-84f7-c887efb5826a] Updating progress to 40 _update_instance_progress 
/opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:23:45.603 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 8.77 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:23:46.795 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" released by "do_scan" :: held 9.348s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:23:47.261 DEBUG nova.virt.xenapi.vm_utils [req-4408bbee-0f01-405c-aca7-22d3b9daaa9f tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] Creating disk-type VBD for VM OpaqueRef:1e9bd9ab-9495-bd16-1cea-64bbed200034, VDI OpaqueRef:9365f655-c0e3-4b0f-bb2c-acf250e0bc99 ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:23:47.361 DEBUG nova.virt.xenapi.vm_utils [req-4408bbee-0f01-405c-aca7-22d3b9daaa9f tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] Created VBD OpaqueRef:dc01bff7-a47e-653d-b9ac-6932df681dfc for VM OpaqueRef:1e9bd9ab-9495-bd16-1cea-64bbed200034, VDI OpaqueRef:9365f655-c0e3-4b0f-bb2c-acf250e0bc99. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:23:48.610 DEBUG oslo_concurrency.lockutils [req-db0df955-3b45-4eb8-9421-1d51f0d82a68 tempest-SecurityGroupsTestJSON-162787647 tempest-SecurityGroupsTestJSON-1523642313] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_unplug" :: held 17.399s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:23:48.685 DEBUG nova.virt.xenapi.vm_utils [req-db0df955-3b45-4eb8-9421-1d51f0d82a68 tempest-SecurityGroupsTestJSON-162787647 tempest-SecurityGroupsTestJSON-1523642313] Destroying VBD for VDI OpaqueRef:a148b057-ae69-c916-df83-bdeb734d4a07 done. vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2215 2015-08-07 17:23:48.686 DEBUG nova.virt.xenapi.vm_utils [req-db0df955-3b45-4eb8-9421-1d51f0d82a68 tempest-SecurityGroupsTestJSON-162787647 tempest-SecurityGroupsTestJSON-1523642313] Creating disk-type VBD for VM OpaqueRef:b2940078-8522-5b37-c613-a3e4c33adcc1, VDI OpaqueRef:a148b057-ae69-c916-df83-bdeb734d4a07 ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:23:48.838 DEBUG nova.virt.xenapi.vm_utils [req-db0df955-3b45-4eb8-9421-1d51f0d82a68 tempest-SecurityGroupsTestJSON-162787647 tempest-SecurityGroupsTestJSON-1523642313] Created VBD OpaqueRef:30f6ecd3-ce60-afbb-d731-8382ee19cfef for VM OpaqueRef:b2940078-8522-5b37-c613-a3e4c33adcc1, VDI OpaqueRef:a148b057-ae69-c916-df83-bdeb734d4a07. 
create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:23:48.839 DEBUG nova.objects.instance [req-db0df955-3b45-4eb8-9421-1d51f0d82a68 tempest-SecurityGroupsTestJSON-162787647 tempest-SecurityGroupsTestJSON-1523642313] Lazy-loading `pci_devices' on Instance uuid 2d94b230-ee5f-44bb-9ce8-17e52b082de7 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:23:49.129 DEBUG nova.virt.xenapi.vmops [req-db0df955-3b45-4eb8-9421-1d51f0d82a68 tempest-SecurityGroupsTestJSON-162787647 tempest-SecurityGroupsTestJSON-1523642313] [instance: 2d94b230-ee5f-44bb-9ce8-17e52b082de7] Updating progress to 50 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:23:49.402 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor: free VCPUs: 1 _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:524 2015-08-07 17:23:49.519 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor: assignable PCI devices: [] _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:532 2015-08-07 17:23:49.520 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor/Node resource view: name=localhost.localdomain free_ram=855MB free_disk=16GB free_vcpus=1 pci_devices=[] _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:546 2015-08-07 17:23:49.520 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "compute_resources" acquired by "_update_available_resource" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:23:50.166 DEBUG oslo_concurrency.lockutils [req-db0df955-3b45-4eb8-9421-1d51f0d82a68 tempest-SecurityGroupsTestJSON-162787647 tempest-SecurityGroupsTestJSON-1523642313] Lock "xenstore-2d94b230-ee5f-44bb-9ce8-17e52b082de7" acquired by "store_meta" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:23:50.167 DEBUG oslo_concurrency.lockutils [req-db0df955-3b45-4eb8-9421-1d51f0d82a68 tempest-SecurityGroupsTestJSON-162787647 tempest-SecurityGroupsTestJSON-1523642313] Lock "xenstore-2d94b230-ee5f-44bb-9ce8-17e52b082de7" released by "store_meta" :: held 0.001s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:23:50.168 DEBUG oslo_concurrency.lockutils [req-db0df955-3b45-4eb8-9421-1d51f0d82a68 tempest-SecurityGroupsTestJSON-162787647 tempest-SecurityGroupsTestJSON-1523642313] Lock "xenstore-2d94b230-ee5f-44bb-9ce8-17e52b082de7" acquired by "store_auto_disk_config" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:23:50.174 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Total usable vcpus: 8, total allocated vcpus: 7 2015-08-07 17:23:50.174 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Final resource view: name=localhost.localdomain phys_ram=8187MB used_ram=788MB phys_disk=27GB used_disk=0GB total_vcpus=8 used_vcpus=7 pci_stats=None 2015-08-07 17:23:50.351 DEBUG oslo_concurrency.lockutils [req-db0df955-3b45-4eb8-9421-1d51f0d82a68 tempest-SecurityGroupsTestJSON-162787647 tempest-SecurityGroupsTestJSON-1523642313] Lock "xenstore-2d94b230-ee5f-44bb-9ce8-17e52b082de7" released by "store_auto_disk_config" :: 
held 0.183s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:23:50.352 DEBUG nova.virt.xenapi.vmops [req-db0df955-3b45-4eb8-9421-1d51f0d82a68 tempest-SecurityGroupsTestJSON-162787647 tempest-SecurityGroupsTestJSON-1523642313] [instance: 2d94b230-ee5f-44bb-9ce8-17e52b082de7] Injecting hostname (tempest-server-730902090) into xenstore _inject_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1940 2015-08-07 17:23:50.352 DEBUG oslo_concurrency.lockutils [req-db0df955-3b45-4eb8-9421-1d51f0d82a68 tempest-SecurityGroupsTestJSON-162787647 tempest-SecurityGroupsTestJSON-1523642313] Lock "xenstore-2d94b230-ee5f-44bb-9ce8-17e52b082de7" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:23:50.367 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Compute_service record updated for devstack:localhost.localdomain 2015-08-07 17:23:50.367 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "compute_resources" released by "_update_available_resource" :: held 0.847s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:23:50.370 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 7.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:23:50.460 DEBUG nova.virt.xenapi.vmops [req-31e6a261-17c8-4515-bad1-97b950dc7d17 tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] [instance: 45057175-41ca-4ad9-96c4-36ae4b86e6d4] Destroying VDIs _destroy_vdis /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1428 2015-08-07 17:23:50.462 DEBUG oslo_concurrency.lockutils [req-db0df955-3b45-4eb8-9421-1d51f0d82a68 tempest-SecurityGroupsTestJSON-162787647 tempest-SecurityGroupsTestJSON-1523642313] Lock "xenstore-2d94b230-ee5f-44bb-9ce8-17e52b082de7" released by "update_hostname" :: held 0.110s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:23:50.463 DEBUG nova.virt.xenapi.vmops [req-db0df955-3b45-4eb8-9421-1d51f0d82a68 tempest-SecurityGroupsTestJSON-162787647 tempest-SecurityGroupsTestJSON-1523642313] [instance: 2d94b230-ee5f-44bb-9ce8-17e52b082de7] Injecting network info to xenstore inject_network_info /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1869 2015-08-07 17:23:50.463 DEBUG oslo_concurrency.lockutils [req-db0df955-3b45-4eb8-9421-1d51f0d82a68 tempest-SecurityGroupsTestJSON-162787647 tempest-SecurityGroupsTestJSON-1523642313] Lock "xenstore-2d94b230-ee5f-44bb-9ce8-17e52b082de7" acquired by "update_nwinfo" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:23:50.484 DEBUG nova.virt.xenapi.vm_utils [req-31e6a261-17c8-4515-bad1-97b950dc7d17 tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] VDI e35b03af-1cc0-4009-a39b-587e5a182915 is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:23:50.594 DEBUG nova.virt.xenapi.vm_utils [req-31e6a261-17c8-4515-bad1-97b950dc7d17 tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] VDI 121be489-3d3a-4d7a-8c5d-1c4e0d725a09 is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:23:51.238 DEBUG 
nova.virt.xenapi.vm_utils [req-4408bbee-0f01-405c-aca7-22d3b9daaa9f tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] Created VDI OpaqueRef:7bc903e1-f6d9-c384-e2e2-968a18fa8d19 (config-2, 67108864, False) on OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99. create_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:509 2015-08-07 17:23:51.312 DEBUG nova.virt.xenapi.vm_utils [req-4408bbee-0f01-405c-aca7-22d3b9daaa9f tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] Creating disk-type VBD for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:7bc903e1-f6d9-c384-e2e2-968a18fa8d19 ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:23:51.326 DEBUG nova.virt.xenapi.vm_utils [req-4408bbee-0f01-405c-aca7-22d3b9daaa9f tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] Created VBD OpaqueRef:76f39376-6a19-8875-0479-4d678779a382 for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:7bc903e1-f6d9-c384-e2e2-968a18fa8d19. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:23:51.327 DEBUG nova.virt.xenapi.vm_utils [req-4408bbee-0f01-405c-aca7-22d3b9daaa9f tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] Plugging VBD OpaqueRef:76f39376-6a19-8875-0479-4d678779a382 ... vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2191 2015-08-07 17:23:51.328 DEBUG oslo_concurrency.lockutils [req-4408bbee-0f01-405c-aca7-22d3b9daaa9f tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_plug" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:23:52.251 DEBUG oslo_concurrency.lockutils [req-db0df955-3b45-4eb8-9421-1d51f0d82a68 tempest-SecurityGroupsTestJSON-162787647 tempest-SecurityGroupsTestJSON-1523642313] Lock "xenstore-2d94b230-ee5f-44bb-9ce8-17e52b082de7" released by "update_nwinfo" :: held 1.788s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:23:52.251 DEBUG nova.virt.xenapi.vmops [req-db0df955-3b45-4eb8-9421-1d51f0d82a68 tempest-SecurityGroupsTestJSON-162787647 tempest-SecurityGroupsTestJSON-1523642313] [instance: 2d94b230-ee5f-44bb-9ce8-17e52b082de7] Updating progress to 60 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:23:54.037 DEBUG nova.virt.xenapi.vmops [req-db0df955-3b45-4eb8-9421-1d51f0d82a68 tempest-SecurityGroupsTestJSON-162787647 tempest-SecurityGroupsTestJSON-1523642313] [instance: 2d94b230-ee5f-44bb-9ce8-17e52b082de7] Creating vifs _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1892 2015-08-07 17:23:54.227 DEBUG nova.virt.xenapi.vmops [req-db0df955-3b45-4eb8-9421-1d51f0d82a68 tempest-SecurityGroupsTestJSON-162787647 tempest-SecurityGroupsTestJSON-1523642313] [instance: 2d94b230-ee5f-44bb-9ce8-17e52b082de7] Creating VIF for network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1902 2015-08-07 17:23:54.915 DEBUG nova.virt.xenapi.vmops [req-db0df955-3b45-4eb8-9421-1d51f0d82a68 tempest-SecurityGroupsTestJSON-162787647 tempest-SecurityGroupsTestJSON-1523642313] [instance: 2d94b230-ee5f-44bb-9ce8-17e52b082de7] Created VIF OpaqueRef:4281e1b8-d142-3383-c223-56d9a7e3980d, network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 
_create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1906 2015-08-07 17:23:55.450 DEBUG nova.virt.xenapi.vmops [req-db0df955-3b45-4eb8-9421-1d51f0d82a68 tempest-SecurityGroupsTestJSON-162787647 tempest-SecurityGroupsTestJSON-1523642313] [instance: 2d94b230-ee5f-44bb-9ce8-17e52b082de7] Updating progress to 70 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:23:56.160 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 8.46 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:23:57.073 DEBUG nova.virt.xenapi.vmops [req-db0df955-3b45-4eb8-9421-1d51f0d82a68 tempest-SecurityGroupsTestJSON-162787647 tempest-SecurityGroupsTestJSON-1523642313] [instance: 2d94b230-ee5f-44bb-9ce8-17e52b082de7] Starting instance _start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:327 2015-08-07 17:23:57.371 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:23:57.372 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 25.14 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:23:59.299 DEBUG nova.virt.xenapi.vmops [req-31e6a261-17c8-4515-bad1-97b950dc7d17 tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] [instance: 45057175-41ca-4ad9-96c4-36ae4b86e6d4] Using RAW or VHD, skipping kernel and ramdisk deletion _destroy_kernel_ramdisk /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1456 2015-08-07 17:23:59.323 DEBUG nova.virt.xenapi.vm_utils [req-31e6a261-17c8-4515-bad1-97b950dc7d17 tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] [instance: 45057175-41ca-4ad9-96c4-36ae4b86e6d4] VM destroyed destroy_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:324 2015-08-07 17:23:59.324 DEBUG nova.compute.manager [req-31e6a261-17c8-4515-bad1-97b950dc7d17 tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] [instance: 45057175-41ca-4ad9-96c4-36ae4b86e6d4] Deallocating network for instance _deallocate_network /opt/stack/new/nova/nova/compute/manager.py:1792 2015-08-07 17:24:03.496 DEBUG nova.compute.manager [req-31e6a261-17c8-4515-bad1-97b950dc7d17 tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] [instance: 45057175-41ca-4ad9-96c4-36ae4b86e6d4] terminating bdm BlockDeviceMapping(boot_index=0,connection_info=None,created_at=2015-08-07T17:19:20Z,delete_on_termination=True,deleted=False,deleted_at=None,destination_type='local',device_name='/dev/xvda',device_type='disk',disk_bus=None,guest_format=None,id=17,image_id='a69d8d55-1745-492f-be26-cc75d64fc94d',instance=,instance_uuid=45057175-41ca-4ad9-96c4-36ae4b86e6d4,no_device=False,snapshot_id=None,source_type='image',updated_at=2015-08-07T17:19:24Z,volume_id=None,volume_size=None) _cleanup_volumes /opt/stack/new/nova/nova/compute/manager.py:2272 2015-08-07 17:24:04.675 DEBUG oslo_concurrency.lockutils [req-31e6a261-17c8-4515-bad1-97b950dc7d17 tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] Lock "compute_resources" acquired by "update_usage" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 
17:24:04.676 DEBUG nova.objects.instance [req-31e6a261-17c8-4515-bad1-97b950dc7d17 tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] Lazy-loading `numa_topology' on Instance uuid 45057175-41ca-4ad9-96c4-36ae4b86e6d4 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:24:05.179 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.44 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:24:05.718 DEBUG oslo_concurrency.lockutils [req-31e6a261-17c8-4515-bad1-97b950dc7d17 tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] Lock "compute_resources" released by "update_usage" :: held 1.042s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:24:08.892 DEBUG oslo_concurrency.lockutils [req-4408bbee-0f01-405c-aca7-22d3b9daaa9f tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_plug" :: held 17.564s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:24:08.893 DEBUG nova.virt.xenapi.vm_utils [req-4408bbee-0f01-405c-aca7-22d3b9daaa9f tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] Plugging VBD OpaqueRef:76f39376-6a19-8875-0479-4d678779a382 done. vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2194 2015-08-07 17:24:09.016 DEBUG nova.virt.xenapi.vm_utils [req-4408bbee-0f01-405c-aca7-22d3b9daaa9f tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] VBD OpaqueRef:76f39376-6a19-8875-0479-4d678779a382 plugged as xvdc vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2197 2015-08-07 17:24:09.141 WARNING nova.virt.configdrive [req-4408bbee-0f01-405c-aca7-22d3b9daaa9f tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] The setting "always" will be deprecated in the Liberty version. 
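
The synchronized_plug/synchronized_unplug entries (here held for 17.564s before "VBD ... plugged as xvdc") serialize VBD plug and unplug per VM, presumably to avoid racing plug/unplug calls against the same domain. A bare-bones version of that guard, assuming `session` is an already logged-in XenAPI session and reusing the same name-based lock:

    from oslo_concurrency import lockutils

    def plug_vbd(session, vm_ref, vbd_ref):
        # The lock name mirrors the "xenapi-vbd-OpaqueRef:..." entries above;
        # session is assumed to be a logged-in XenAPI.Session.
        with lockutils.lock('xenapi-vbd-%s' % vm_ref):
            session.xenapi.VBD.plug(vbd_ref)
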
Please use "True" instead 2015-08-07 17:24:09.530 DEBUG nova.objects.instance [req-4408bbee-0f01-405c-aca7-22d3b9daaa9f tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] Lazy-loading `ec2_ids' on Instance uuid 448f07ac-11c1-4844-84f7-c887efb5826a obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:24:09.573 DEBUG oslo_concurrency.processutils [req-4408bbee-0f01-405c-aca7-22d3b9daaa9f tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] Running cmd (subprocess): genisoimage -o /tmp/tmp7zcEVn/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmpPoQIyw execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:24:10.979 DEBUG oslo_concurrency.lockutils [req-31e6a261-17c8-4515-bad1-97b950dc7d17 tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] Lock "45057175-41ca-4ad9-96c4-36ae4b86e6d4" released by "do_terminate_instance" :: held 52.593s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:24:11.128 DEBUG oslo_concurrency.processutils [req-4408bbee-0f01-405c-aca7-22d3b9daaa9f tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] CMD "genisoimage -o /tmp/tmp7zcEVn/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmpPoQIyw" returned: 0 in 1.554s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:24:11.132 DEBUG oslo_concurrency.processutils [req-4408bbee-0f01-405c-aca7-22d3b9daaa9f tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmp7zcEVn/configdrive of=/dev/xvdc oflag=direct,sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:24:15.904 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 8.72 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:24:22.515 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:24:22.516 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Starting heal instance info cache _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5315 2015-08-07 17:24:22.517 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Rebuilding the list of instances to heal _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5319 2015-08-07 17:24:22.653 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: 4bb27132-2638-467d-a5be-49e1ad01b113] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5329 2015-08-07 17:24:22.660 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: 2d94b230-ee5f-44bb-9ce8-17e52b082de7] Skipping network cache update for instance because it is Building. 
_heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5329 2015-08-07 17:24:22.661 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: 448f07ac-11c1-4844-84f7-c887efb5826a] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5329 2015-08-07 17:24:22.661 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Didn't find any instances for network info cache update. _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5386 2015-08-07 17:24:22.665 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 2.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:24:24.665 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:24:24.666 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 2.85 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:24:24.801 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.82 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:24:27.520 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:24:27.521 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 1.99 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:24:27.546 DEBUG nova.virt.xenapi.vmops [req-ea4bd899-411d-4a36-9599-4f405cef238e tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] [instance: 4bb27132-2638-467d-a5be-49e1ad01b113] Waiting for instance state to become running _wait_for_instance_to_start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:777 2015-08-07 17:24:27.695 DEBUG nova.virt.xenapi.vmops [req-ea4bd899-411d-4a36-9599-4f405cef238e tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] [instance: 4bb27132-2638-467d-a5be-49e1ad01b113] Updating progress to 80 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:24:28.406 DEBUG nova.virt.xenapi.vmops [req-ea4bd899-411d-4a36-9599-4f405cef238e tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] [instance: 4bb27132-2638-467d-a5be-49e1ad01b113] Skip agent setup, not enabled. 
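
All of the "Running periodic task ComputeManager._*" entries, and the "Dynamic interval looping call ... sleeping for N seconds" lines between them, are driven by oslo.service: the manager's decorated methods are collected by PeriodicTasks, and a DynamicLoopingCall sleeps for whatever run_periodic_tasks reports as its idle time. A reduced sketch of that wiring (conf is assumed to be an oslo.config ConfigOpts object):

    from oslo_service import loopingcall, periodic_task

    class Manager(periodic_task.PeriodicTasks):
        # Each decorated method shows up as a "Running periodic task ..." entry;
        # spacing is the nominal interval in seconds.
        @periodic_task.periodic_task(spacing=60)
        def _heal_instance_info_cache(self, context):
            pass

    def start_periodic_tasks(conf):
        mgr = Manager(conf)
        # run_periodic_tasks returns how long to sleep, hence the varying
        # "sleeping for N seconds" intervals in the log.
        timer = loopingcall.DynamicLoopingCall(mgr.run_periodic_tasks, context=None)
        return timer.start()
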
_configure_new_instance_with_agent /opt/stack/new/nova/nova/virt/xenapi/vmops.py:788 2015-08-07 17:24:28.407 DEBUG nova.virt.xenapi.vmops [req-ea4bd899-411d-4a36-9599-4f405cef238e tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] [instance: 4bb27132-2638-467d-a5be-49e1ad01b113] Removing hostname from xenstore _remove_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1949 2015-08-07 17:24:28.408 DEBUG oslo_concurrency.lockutils [req-ea4bd899-411d-4a36-9599-4f405cef238e tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] Lock "xenstore-4bb27132-2638-467d-a5be-49e1ad01b113" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:24:28.418 DEBUG oslo_concurrency.lockutils [req-ea4bd899-411d-4a36-9599-4f405cef238e tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] Lock "xenstore-4bb27132-2638-467d-a5be-49e1ad01b113" released by "update_hostname" :: held 0.010s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:24:29.147 DEBUG nova.virt.xenapi.vmops [req-ea4bd899-411d-4a36-9599-4f405cef238e tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] [instance: 4bb27132-2638-467d-a5be-49e1ad01b113] Updating progress to 90 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:24:29.508 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:24:29.509 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 0.01 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:24:29.517 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:24:29.518 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 3.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:24:29.744 DEBUG nova.virt.xenapi.vmops [req-ea4bd899-411d-4a36-9599-4f405cef238e tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] [instance: 4bb27132-2638-467d-a5be-49e1ad01b113] Updating progress to 100 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:24:30.943 DEBUG nova.compute.manager [req-ea4bd899-411d-4a36-9599-4f405cef238e tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] [instance: 4bb27132-2638-467d-a5be-49e1ad01b113] Checking state _get_power_state /opt/stack/new/nova/nova/compute/manager.py:1296 2015-08-07 17:24:31.664 DEBUG nova.compute.utils [req-ea4bd899-411d-4a36-9599-4f405cef238e tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] [instance: 4bb27132-2638-467d-a5be-49e1ad01b113] Unexpected task state: expecting (u'spawning',) but the actual state is deleting notify_about_instance_usage /opt/stack/new/nova/nova/compute/utils.py:283 2015-08-07 
17:24:31.666 DEBUG nova.compute.manager [req-ea4bd899-411d-4a36-9599-4f405cef238e tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] [instance: 4bb27132-2638-467d-a5be-49e1ad01b113] Instance disappeared during build. _do_build_and_run_instance /opt/stack/new/nova/nova/compute/manager.py:1929 2015-08-07 17:24:31.666 DEBUG nova.compute.manager [req-ea4bd899-411d-4a36-9599-4f405cef238e tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] [instance: 4bb27132-2638-467d-a5be-49e1ad01b113] Deallocating network for instance _deallocate_network /opt/stack/new/nova/nova/compute/manager.py:1792 2015-08-07 17:24:32.517 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:24:32.518 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /opt/stack/new/nova/nova/compute/manager.py:5976 2015-08-07 17:24:32.519 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 1.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:24:33.256 DEBUG oslo_concurrency.lockutils [req-ea4bd899-411d-4a36-9599-4f405cef238e tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] Lock "4bb27132-2638-467d-a5be-49e1ad01b113" released by "_locked_do_build_and_run_instance" :: held 309.614s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:24:33.258 DEBUG oslo_concurrency.lockutils [req-81780a41-6d87-4e01-93a9-20ab0a1acf38 tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] Lock "4bb27132-2638-467d-a5be-49e1ad01b113" acquired by "do_terminate_instance" :: waited 109.777s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:24:33.259 DEBUG oslo_concurrency.lockutils [req-81780a41-6d87-4e01-93a9-20ab0a1acf38 tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] Lock "4bb27132-2638-467d-a5be-49e1ad01b113-events" acquired by "_clear_events" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:24:33.259 DEBUG oslo_concurrency.lockutils [req-81780a41-6d87-4e01-93a9-20ab0a1acf38 tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] Lock "4bb27132-2638-467d-a5be-49e1ad01b113-events" released by "_clear_events" :: held 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:24:33.261 INFO nova.compute.manager [req-81780a41-6d87-4e01-93a9-20ab0a1acf38 tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] [instance: 4bb27132-2638-467d-a5be-49e1ad01b113] Terminating instance 2015-08-07 17:24:33.265 INFO nova.virt.xenapi.vmops [req-81780a41-6d87-4e01-93a9-20ab0a1acf38 tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] [instance: 4bb27132-2638-467d-a5be-49e1ad01b113] Destroying VM 2015-08-07 17:24:33.424 DEBUG nova.virt.xenapi.vm_utils [req-81780a41-6d87-4e01-93a9-20ab0a1acf38 tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] 
[instance: 4bb27132-2638-467d-a5be-49e1ad01b113] Shutting down VM (hard) hard_shutdown_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:348 2015-08-07 17:24:33.548 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:24:33.588 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Auditing locally available compute resources for node localhost.localdomain 2015-08-07 17:24:33.933 DEBUG nova.virt.xenapi.host [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Updating host stats update_status /opt/stack/new/nova/nova/virt/xenapi/host.py:234 2015-08-07 17:24:37.026 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" acquired by "do_scan" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:24:37.027 DEBUG nova.virt.xenapi.vm_utils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Scanning SR OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99 do_scan /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1839 2015-08-07 17:24:38.479 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 6.16 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:24:39.284 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" released by "do_scan" :: held 2.258s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:24:41.078 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor: free VCPUs: 1 _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:524 2015-08-07 17:24:41.079 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor: assignable PCI devices: [] _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:532 2015-08-07 17:24:41.079 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor/Node resource view: name=localhost.localdomain free_ram=855MB free_disk=16GB free_vcpus=1 pci_devices=[] _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:546 2015-08-07 17:24:41.080 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "compute_resources" acquired by "_update_available_resource" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:24:41.547 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Total usable vcpus: 8, total allocated vcpus: 7 2015-08-07 17:24:41.548 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Final resource view: name=localhost.localdomain phys_ram=8187MB used_ram=719MB phys_disk=27GB used_disk=0GB total_vcpus=8 used_vcpus=7 pci_stats=None 2015-08-07 17:24:41.947 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Compute_service record updated for devstack:localhost.localdomain 2015-08-07 17:24:41.947 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 
None None] Lock "compute_resources" released by "_update_available_resource" :: held 0.868s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:24:41.948 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:24:41.948 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 7.97 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:24:42.508 DEBUG nova.virt.xenapi.vmops [req-db0df955-3b45-4eb8-9421-1d51f0d82a68 tempest-SecurityGroupsTestJSON-162787647 tempest-SecurityGroupsTestJSON-1523642313] [instance: 2d94b230-ee5f-44bb-9ce8-17e52b082de7] Waiting for instance state to become running _wait_for_instance_to_start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:777 2015-08-07 17:24:42.539 DEBUG nova.virt.xenapi.vmops [req-db0df955-3b45-4eb8-9421-1d51f0d82a68 tempest-SecurityGroupsTestJSON-162787647 tempest-SecurityGroupsTestJSON-1523642313] [instance: 2d94b230-ee5f-44bb-9ce8-17e52b082de7] Updating progress to 80 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:24:43.031 DEBUG nova.virt.xenapi.vmops [req-db0df955-3b45-4eb8-9421-1d51f0d82a68 tempest-SecurityGroupsTestJSON-162787647 tempest-SecurityGroupsTestJSON-1523642313] [instance: 2d94b230-ee5f-44bb-9ce8-17e52b082de7] Skip agent setup, not enabled. _configure_new_instance_with_agent /opt/stack/new/nova/nova/virt/xenapi/vmops.py:788 2015-08-07 17:24:43.031 DEBUG nova.virt.xenapi.vmops [req-db0df955-3b45-4eb8-9421-1d51f0d82a68 tempest-SecurityGroupsTestJSON-162787647 tempest-SecurityGroupsTestJSON-1523642313] [instance: 2d94b230-ee5f-44bb-9ce8-17e52b082de7] Removing hostname from xenstore _remove_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1949 2015-08-07 17:24:43.032 DEBUG oslo_concurrency.lockutils [req-db0df955-3b45-4eb8-9421-1d51f0d82a68 tempest-SecurityGroupsTestJSON-162787647 tempest-SecurityGroupsTestJSON-1523642313] Lock "xenstore-2d94b230-ee5f-44bb-9ce8-17e52b082de7" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:24:43.038 DEBUG oslo_concurrency.lockutils [req-db0df955-3b45-4eb8-9421-1d51f0d82a68 tempest-SecurityGroupsTestJSON-162787647 tempest-SecurityGroupsTestJSON-1523642313] Lock "xenstore-2d94b230-ee5f-44bb-9ce8-17e52b082de7" released by "update_hostname" :: held 0.006s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:24:43.039 DEBUG nova.virt.xenapi.vmops [req-db0df955-3b45-4eb8-9421-1d51f0d82a68 tempest-SecurityGroupsTestJSON-162787647 tempest-SecurityGroupsTestJSON-1523642313] [instance: 2d94b230-ee5f-44bb-9ce8-17e52b082de7] Updating progress to 90 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:24:43.460 DEBUG nova.virt.xenapi.vmops [req-db0df955-3b45-4eb8-9421-1d51f0d82a68 tempest-SecurityGroupsTestJSON-162787647 tempest-SecurityGroupsTestJSON-1523642313] [instance: 2d94b230-ee5f-44bb-9ce8-17e52b082de7] Updating progress to 100 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:24:43.564 DEBUG oslo_concurrency.processutils [req-4408bbee-0f01-405c-aca7-22d3b9daaa9f tempest-ImagesNegativeTestJSON-1431277351 
tempest-ImagesNegativeTestJSON-1581008911] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmp7zcEVn/configdrive of=/dev/xvdc oflag=direct,sync" returned: 0 in 32.432s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:24:43.567 DEBUG oslo_concurrency.processutils [req-4408bbee-0f01-405c-aca7-22d3b9daaa9f tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:24:43.963 DEBUG nova.compute.manager [req-db0df955-3b45-4eb8-9421-1d51f0d82a68 tempest-SecurityGroupsTestJSON-162787647 tempest-SecurityGroupsTestJSON-1523642313] [instance: 2d94b230-ee5f-44bb-9ce8-17e52b082de7] Checking state _get_power_state /opt/stack/new/nova/nova/compute/manager.py:1296 2015-08-07 17:24:44.563 DEBUG oslo_concurrency.processutils [req-4408bbee-0f01-405c-aca7-22d3b9daaa9f tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf sync" returned: 0 in 0.996s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:24:44.564 DEBUG nova.virt.xenapi.vm_utils [req-4408bbee-0f01-405c-aca7-22d3b9daaa9f tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] Destroying VBD for VDI OpaqueRef:7bc903e1-f6d9-c384-e2e2-968a18fa8d19 ... vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2207 2015-08-07 17:24:44.565 DEBUG oslo_concurrency.lockutils [req-4408bbee-0f01-405c-aca7-22d3b9daaa9f tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_unplug" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:24:44.996 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.69 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:24:45.766 DEBUG oslo_concurrency.lockutils [req-db0df955-3b45-4eb8-9421-1d51f0d82a68 tempest-SecurityGroupsTestJSON-162787647 tempest-SecurityGroupsTestJSON-1523642313] Lock "2d94b230-ee5f-44bb-9ce8-17e52b082de7" released by "_locked_do_build_and_run_instance" :: held 151.754s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:24:46.655 INFO nova.compute.manager [req-c0719406-168b-42c3-bde9-7b681f88b0da tempest-SecurityGroupsTestJSON-162787647 tempest-SecurityGroupsTestJSON-1523642313] [instance: 2d94b230-ee5f-44bb-9ce8-17e52b082de7] Rebooting instance 2015-08-07 17:24:46.688 DEBUG oslo_concurrency.lockutils [req-c0719406-168b-42c3-bde9-7b681f88b0da tempest-SecurityGroupsTestJSON-162787647 tempest-SecurityGroupsTestJSON-1523642313] Acquired semaphore "refresh_cache-2d94b230-ee5f-44bb-9ce8-17e52b082de7" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:197 2015-08-07 17:24:46.950 DEBUG nova.network.base_api [req-c0719406-168b-42c3-bde9-7b681f88b0da tempest-SecurityGroupsTestJSON-162787647 tempest-SecurityGroupsTestJSON-1523642313] [instance: 2d94b230-ee5f-44bb-9ce8-17e52b082de7] Updating instance_info_cache with network_info: [VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': 
[FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.11'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:07:59:ae', 'active': False, 'type': u'bridge', 'id': u'c5dcdb81-a2ab-40b3-9886-dfd0525af78c', 'qbg_params': None})] update_instance_cache_with_nw_info /opt/stack/new/nova/nova/network/base_api.py:43 2015-08-07 17:24:47.001 DEBUG oslo_concurrency.lockutils [req-c0719406-168b-42c3-bde9-7b681f88b0da tempest-SecurityGroupsTestJSON-162787647 tempest-SecurityGroupsTestJSON-1523642313] Releasing semaphore "refresh_cache-2d94b230-ee5f-44bb-9ce8-17e52b082de7" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:210 2015-08-07 17:24:47.003 DEBUG nova.compute.manager [req-c0719406-168b-42c3-bde9-7b681f88b0da tempest-SecurityGroupsTestJSON-162787647 tempest-SecurityGroupsTestJSON-1523642313] [instance: 2d94b230-ee5f-44bb-9ce8-17e52b082de7] Checking state _get_power_state /opt/stack/new/nova/nova/compute/manager.py:1296 2015-08-07 17:24:48.461 DEBUG nova.virt.xenapi.vmops [req-81780a41-6d87-4e01-93a9-20ab0a1acf38 tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] [instance: 4bb27132-2638-467d-a5be-49e1ad01b113] Destroying VDIs _destroy_vdis /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1428 2015-08-07 17:24:48.494 DEBUG oslo_concurrency.lockutils [req-4408bbee-0f01-405c-aca7-22d3b9daaa9f tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_unplug" :: held 3.929s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:24:48.527 DEBUG nova.virt.xenapi.vm_utils [req-81780a41-6d87-4e01-93a9-20ab0a1acf38 tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] VDI cf5de1c1-f9e0-4e48-977f-a40a1e5d2875 is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:24:48.536 DEBUG nova.virt.xenapi.vm_utils [req-4408bbee-0f01-405c-aca7-22d3b9daaa9f tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] Destroying VBD for VDI OpaqueRef:7bc903e1-f6d9-c384-e2e2-968a18fa8d19 done. vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2215 2015-08-07 17:24:48.539 DEBUG nova.virt.xenapi.vm_utils [req-4408bbee-0f01-405c-aca7-22d3b9daaa9f tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] Creating disk-type VBD for VM OpaqueRef:1e9bd9ab-9495-bd16-1cea-64bbed200034, VDI OpaqueRef:7bc903e1-f6d9-c384-e2e2-968a18fa8d19 ... 
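
The network_info dumped a few entries above is a list of VIF objects (nova.network.model) that behave like nested dictionaries. With a trimmed-down literal standing in for the real model classes, extracting the fixed IP looks like this:

    # Trimmed-down stand-in for the VIF/Network/Subnet/FixedIP dump above.
    vif = {
        'address': 'fa:16:3e:07:59:ae',
        'network': {
            'bridge': 'vmnet',
            'label': 'private',
            'subnets': [
                {'cidr': '10.1.0.0/20',
                 'gateway': {'address': '10.1.0.1'},
                 'ips': [{'address': '10.1.0.11', 'type': 'fixed'}]},
            ],
        },
    }

    fixed_ips = [ip['address']
                 for subnet in vif['network']['subnets']
                 for ip in subnet['ips']]
    print(fixed_ips)  # ['10.1.0.11']
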
create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:24:48.583 DEBUG nova.virt.xenapi.vm_utils [req-4408bbee-0f01-405c-aca7-22d3b9daaa9f tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] Created VBD OpaqueRef:ec00e851-04d9-88cf-6f88-5574a0ce3b58 for VM OpaqueRef:1e9bd9ab-9495-bd16-1cea-64bbed200034, VDI OpaqueRef:7bc903e1-f6d9-c384-e2e2-968a18fa8d19. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:24:48.584 DEBUG nova.objects.instance [req-4408bbee-0f01-405c-aca7-22d3b9daaa9f tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] Lazy-loading `pci_devices' on Instance uuid 448f07ac-11c1-4844-84f7-c887efb5826a obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:24:48.605 DEBUG nova.virt.xenapi.vm_utils [req-81780a41-6d87-4e01-93a9-20ab0a1acf38 tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] VDI f440fc70-ace8-44b6-af38-d5dedfb5ae62 is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:24:48.745 DEBUG nova.virt.xenapi.vmops [req-4408bbee-0f01-405c-aca7-22d3b9daaa9f tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] [instance: 448f07ac-11c1-4844-84f7-c887efb5826a] Updating progress to 50 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:24:49.728 DEBUG oslo_concurrency.lockutils [req-4408bbee-0f01-405c-aca7-22d3b9daaa9f tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] Lock "xenstore-448f07ac-11c1-4844-84f7-c887efb5826a" acquired by "store_meta" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:24:49.728 DEBUG oslo_concurrency.lockutils [req-4408bbee-0f01-405c-aca7-22d3b9daaa9f tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] Lock "xenstore-448f07ac-11c1-4844-84f7-c887efb5826a" released by "store_meta" :: held 0.001s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:24:49.730 DEBUG oslo_concurrency.lockutils [req-4408bbee-0f01-405c-aca7-22d3b9daaa9f tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] Lock "xenstore-448f07ac-11c1-4844-84f7-c887efb5826a" acquired by "store_auto_disk_config" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:24:49.767 DEBUG oslo_concurrency.lockutils [req-4408bbee-0f01-405c-aca7-22d3b9daaa9f tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] Lock "xenstore-448f07ac-11c1-4844-84f7-c887efb5826a" released by "store_auto_disk_config" :: held 0.036s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:24:49.768 DEBUG nova.virt.xenapi.vmops [req-4408bbee-0f01-405c-aca7-22d3b9daaa9f tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] [instance: 448f07ac-11c1-4844-84f7-c887efb5826a] Injecting hostname (tempest.common.compute-instance-814344876) into xenstore _inject_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1940 2015-08-07 17:24:49.769 DEBUG oslo_concurrency.lockutils [req-4408bbee-0f01-405c-aca7-22d3b9daaa9f tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] Lock "xenstore-448f07ac-11c1-4844-84f7-c887efb5826a" acquired by "update_hostname" :: waited 0.000s 
inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:24:49.796 DEBUG oslo_concurrency.lockutils [req-4408bbee-0f01-405c-aca7-22d3b9daaa9f tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] Lock "xenstore-448f07ac-11c1-4844-84f7-c887efb5826a" released by "update_hostname" :: held 0.027s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:24:49.797 DEBUG nova.virt.xenapi.vmops [req-4408bbee-0f01-405c-aca7-22d3b9daaa9f tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] [instance: 448f07ac-11c1-4844-84f7-c887efb5826a] Injecting network info to xenstore inject_network_info /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1869 2015-08-07 17:24:49.798 DEBUG oslo_concurrency.lockutils [req-4408bbee-0f01-405c-aca7-22d3b9daaa9f tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] Lock "xenstore-448f07ac-11c1-4844-84f7-c887efb5826a" acquired by "update_nwinfo" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:24:49.928 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:24:50.005 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 11.51 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:24:50.287 DEBUG oslo_concurrency.lockutils [req-4408bbee-0f01-405c-aca7-22d3b9daaa9f tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] Lock "xenstore-448f07ac-11c1-4844-84f7-c887efb5826a" released by "update_nwinfo" :: held 0.490s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:24:50.288 DEBUG nova.virt.xenapi.vmops [req-4408bbee-0f01-405c-aca7-22d3b9daaa9f tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] [instance: 448f07ac-11c1-4844-84f7-c887efb5826a] Updating progress to 60 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:24:50.757 DEBUG nova.virt.xenapi.vmops [req-4408bbee-0f01-405c-aca7-22d3b9daaa9f tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] [instance: 448f07ac-11c1-4844-84f7-c887efb5826a] Creating vifs _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1892 2015-08-07 17:24:50.828 DEBUG nova.virt.xenapi.vmops [req-4408bbee-0f01-405c-aca7-22d3b9daaa9f tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] [instance: 448f07ac-11c1-4844-84f7-c887efb5826a] Creating VIF for network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1902 2015-08-07 17:24:50.855 DEBUG nova.virt.xenapi.vmops [req-4408bbee-0f01-405c-aca7-22d3b9daaa9f tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] [instance: 448f07ac-11c1-4844-84f7-c887efb5826a] Created VIF OpaqueRef:1dd3b38c-81c9-693f-ac8f-83875422c4be, network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1906 2015-08-07 17:24:50.884 DEBUG nova.virt.xenapi.vmops [req-4408bbee-0f01-405c-aca7-22d3b9daaa9f tempest-ImagesNegativeTestJSON-1431277351 
tempest-ImagesNegativeTestJSON-1581008911] [instance: 448f07ac-11c1-4844-84f7-c887efb5826a] Updating progress to 70 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:24:51.259 DEBUG nova.virt.xenapi.vmops [req-81780a41-6d87-4e01-93a9-20ab0a1acf38 tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] [instance: 4bb27132-2638-467d-a5be-49e1ad01b113] Using RAW or VHD, skipping kernel and ramdisk deletion _destroy_kernel_ramdisk /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1456 2015-08-07 17:24:51.289 DEBUG nova.virt.xenapi.vm_utils [req-81780a41-6d87-4e01-93a9-20ab0a1acf38 tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] [instance: 4bb27132-2638-467d-a5be-49e1ad01b113] VM destroyed destroy_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:324 2015-08-07 17:24:51.290 DEBUG nova.compute.manager [req-81780a41-6d87-4e01-93a9-20ab0a1acf38 tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] [instance: 4bb27132-2638-467d-a5be-49e1ad01b113] Deallocating network for instance _deallocate_network /opt/stack/new/nova/nova/compute/manager.py:1792 2015-08-07 17:24:51.719 DEBUG nova.virt.xenapi.vmops [req-4408bbee-0f01-405c-aca7-22d3b9daaa9f tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] [instance: 448f07ac-11c1-4844-84f7-c887efb5826a] Starting instance _start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:327 2015-08-07 17:24:53.285 DEBUG nova.compute.manager [req-81780a41-6d87-4e01-93a9-20ab0a1acf38 tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] [instance: 4bb27132-2638-467d-a5be-49e1ad01b113] terminating bdm BlockDeviceMapping(boot_index=0,connection_info=None,created_at=2015-08-07T17:19:22Z,delete_on_termination=True,deleted=False,deleted_at=None,destination_type='local',device_name='/dev/xvda',device_type='disk',disk_bus=None,guest_format=None,id=18,image_id='a69d8d55-1745-492f-be26-cc75d64fc94d',instance=,instance_uuid=4bb27132-2638-467d-a5be-49e1ad01b113,no_device=False,snapshot_id=None,source_type='image',updated_at=2015-08-07T17:19:29Z,volume_id=None,volume_size=None) _cleanup_volumes /opt/stack/new/nova/nova/compute/manager.py:2272 2015-08-07 17:24:53.544 DEBUG oslo_concurrency.lockutils [req-81780a41-6d87-4e01-93a9-20ab0a1acf38 tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] Lock "compute_resources" acquired by "update_usage" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:24:53.545 DEBUG nova.objects.instance [req-81780a41-6d87-4e01-93a9-20ab0a1acf38 tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] Lazy-loading `numa_topology' on Instance uuid 4bb27132-2638-467d-a5be-49e1ad01b113 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:24:53.663 DEBUG oslo_concurrency.lockutils [req-81780a41-6d87-4e01-93a9-20ab0a1acf38 tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] Lock "compute_resources" released by "update_usage" :: held 0.119s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:24:54.260 DEBUG oslo_concurrency.lockutils [req-81780a41-6d87-4e01-93a9-20ab0a1acf38 tempest-ListImageFiltersTestJSON-1638241879 tempest-ListImageFiltersTestJSON-1489545785] Lock "4bb27132-2638-467d-a5be-49e1ad01b113" released by 
"do_terminate_instance" :: held 21.001s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:24:54.763 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.93 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:24:58.080 DEBUG oslo_concurrency.lockutils [req-ff650cdc-8ba6-4826-8f6a-34daac6d8284 tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] Lock "1da4a346-da42-46e7-81c1-b0085c1ca90a" acquired by "_locked_do_build_and_run_instance" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:24:58.659 INFO nova.compute.manager [req-ff650cdc-8ba6-4826-8f6a-34daac6d8284 tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] [instance: 1da4a346-da42-46e7-81c1-b0085c1ca90a] Starting instance... 2015-08-07 17:24:59.238 DEBUG oslo_concurrency.lockutils [req-ff650cdc-8ba6-4826-8f6a-34daac6d8284 tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] Lock "compute_resources" acquired by "instance_claim" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:24:59.240 DEBUG nova.compute.resource_tracker [req-ff650cdc-8ba6-4826-8f6a-34daac6d8284 tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] Memory overhead for 64 MB instance; 5 MB instance_claim /opt/stack/new/nova/nova/compute/resource_tracker.py:126 2015-08-07 17:24:59.248 INFO nova.compute.claims [req-ff650cdc-8ba6-4826-8f6a-34daac6d8284 tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] [instance: 1da4a346-da42-46e7-81c1-b0085c1ca90a] Attempting claim: memory 69 MB, disk 0 GB 2015-08-07 17:24:59.249 INFO nova.compute.claims [req-ff650cdc-8ba6-4826-8f6a-34daac6d8284 tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] [instance: 1da4a346-da42-46e7-81c1-b0085c1ca90a] Total memory: 8187 MB, used: 650.00 MB 2015-08-07 17:24:59.249 INFO nova.compute.claims [req-ff650cdc-8ba6-4826-8f6a-34daac6d8284 tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] [instance: 1da4a346-da42-46e7-81c1-b0085c1ca90a] memory limit: 12280.50 MB, free: 11630.50 MB 2015-08-07 17:24:59.250 INFO nova.compute.claims [req-ff650cdc-8ba6-4826-8f6a-34daac6d8284 tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] [instance: 1da4a346-da42-46e7-81c1-b0085c1ca90a] Total disk: 27 GB, used: 0.00 GB 2015-08-07 17:24:59.251 INFO nova.compute.claims [req-ff650cdc-8ba6-4826-8f6a-34daac6d8284 tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] [instance: 1da4a346-da42-46e7-81c1-b0085c1ca90a] disk limit not specified, defaulting to unlimited 2015-08-07 17:24:59.274 DEBUG nova.compute.resources.vcpu [req-ff650cdc-8ba6-4826-8f6a-34daac6d8284 tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] Total CPUs: 8 VCPUs, used: 2.00 VCPUs test /opt/stack/new/nova/nova/compute/resources/vcpu.py:52 2015-08-07 17:24:59.275 DEBUG nova.compute.resources.vcpu [req-ff650cdc-8ba6-4826-8f6a-34daac6d8284 tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] CPUs limit not specified, defaulting to unlimited test /opt/stack/new/nova/nova/compute/resources/vcpu.py:56 2015-08-07 17:24:59.275 INFO nova.compute.claims [req-ff650cdc-8ba6-4826-8f6a-34daac6d8284 
tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] [instance: 1da4a346-da42-46e7-81c1-b0085c1ca90a] Claim successful 2015-08-07 17:25:00.081 DEBUG oslo_concurrency.lockutils [req-ff650cdc-8ba6-4826-8f6a-34daac6d8284 tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] Lock "compute_resources" released by "instance_claim" :: held 0.842s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:25:00.436 DEBUG oslo_concurrency.lockutils [req-ff650cdc-8ba6-4826-8f6a-34daac6d8284 tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] Lock "compute_resources" acquired by "update_usage" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:25:00.565 DEBUG oslo_concurrency.lockutils [req-ff650cdc-8ba6-4826-8f6a-34daac6d8284 tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] Lock "compute_resources" released by "update_usage" :: held 0.129s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:25:00.567 DEBUG nova.compute.utils [req-ff650cdc-8ba6-4826-8f6a-34daac6d8284 tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] Using /dev/xvd instead of None get_next_device_name /opt/stack/new/nova/nova/compute/utils.py:163 2015-08-07 17:25:00.571 13318 DEBUG nova.compute.manager [-] [instance: 1da4a346-da42-46e7-81c1-b0085c1ca90a] Allocating IP information in the background. _allocate_network_async /opt/stack/new/nova/nova/compute/manager.py:1513 2015-08-07 17:25:00.572 13318 DEBUG oslo_concurrency.lockutils [-] Acquired semaphore "refresh_cache-1da4a346-da42-46e7-81c1-b0085c1ca90a" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:197 2015-08-07 17:25:01.540 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:25:01.650 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 21.86 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:25:02.047 DEBUG nova.virt.xenapi.vmops [req-ff650cdc-8ba6-4826-8f6a-34daac6d8284 tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] [instance: 1da4a346-da42-46e7-81c1-b0085c1ca90a] Block device information present: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} spawn /opt/stack/new/nova/nova/virt/xenapi/vmops.py:408 2015-08-07 17:25:02.173 DEBUG nova.virt.xenapi.vm_utils [req-ff650cdc-8ba6-4826-8f6a-34daac6d8284 tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] Detected vhd format for image a69d8d55-1745-492f-be26-cc75d64fc94d determine_disk_image_type /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1633 2015-08-07 17:25:02.174 DEBUG nova.virt.xenapi.vmops [req-ff650cdc-8ba6-4826-8f6a-34daac6d8284 tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] [instance: 1da4a346-da42-46e7-81c1-b0085c1ca90a] Updating progress to 10 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:25:03.078 DEBUG nova.virt.xenapi.vmops [req-ff650cdc-8ba6-4826-8f6a-34daac6d8284 
tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] [instance: 1da4a346-da42-46e7-81c1-b0085c1ca90a] block device info: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} _connect_cinder_volumes /opt/stack/new/nova/nova/virt/xenapi/vmops.py:376 2015-08-07 17:25:03.252 DEBUG oslo_concurrency.lockutils [req-ff650cdc-8ba6-4826-8f6a-34daac6d8284 tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" acquired by "_create_cached_image_impl" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:25:05.651 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.04 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:25:08.098 DEBUG nova.virt.xenapi.vm_utils [req-ff650cdc-8ba6-4826-8f6a-34daac6d8284 tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] Cloned VDI OpaqueRef:22e245f5-8608-ab1e-5607-20e893308f48 from VDI OpaqueRef:f392c64e-7886-a0cc-d3a0-780674167307 _clone_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:571 2015-08-07 17:25:08.548 13318 DEBUG nova.network.base_api [-] [instance: 1da4a346-da42-46e7-81c1-b0085c1ca90a] Updating instance_info_cache with network_info: [VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.13'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:c7:1e:2f', 'active': False, 'type': u'bridge', 'id': u'b501abb4-30aa-4235-abb8-ff79d2316ad7', 'qbg_params': None})] update_instance_cache_with_nw_info /opt/stack/new/nova/nova/network/base_api.py:43 2015-08-07 17:25:08.606 13318 DEBUG oslo_concurrency.lockutils [-] Releasing semaphore "refresh_cache-1da4a346-da42-46e7-81c1-b0085c1ca90a" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:210 2015-08-07 17:25:08.607 13318 DEBUG nova.compute.manager [-] [instance: 1da4a346-da42-46e7-81c1-b0085c1ca90a] Instance network_info: |[VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.13'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': 
[], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:c7:1e:2f', 'active': False, 'type': u'bridge', 'id': u'b501abb4-30aa-4235-abb8-ff79d2316ad7', 'qbg_params': None})]| _allocate_network_async /opt/stack/new/nova/nova/compute/manager.py:1531 2015-08-07 17:25:10.268 DEBUG oslo_concurrency.lockutils [req-ff650cdc-8ba6-4826-8f6a-34daac6d8284 tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" released by "_create_cached_image_impl" :: held 7.015s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:25:10.269 INFO nova.virt.xenapi.vm_utils [req-ff650cdc-8ba6-4826-8f6a-34daac6d8284 tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] Image creation data, cacheable: True, downloaded: False duration: 7.19 secs for image a69d8d55-1745-492f-be26-cc75d64fc94d 2015-08-07 17:25:12.817 DEBUG nova.virt.xenapi.vmops [req-ff650cdc-8ba6-4826-8f6a-34daac6d8284 tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] [instance: 1da4a346-da42-46e7-81c1-b0085c1ca90a] Updating progress to 20 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:25:13.323 DEBUG nova.virt.xenapi.vmops [req-ff650cdc-8ba6-4826-8f6a-34daac6d8284 tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] [instance: 1da4a346-da42-46e7-81c1-b0085c1ca90a] Updating progress to 30 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:25:13.696 DEBUG nova.virt.xenapi.vmops [req-ff650cdc-8ba6-4826-8f6a-34daac6d8284 tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] [instance: 1da4a346-da42-46e7-81c1-b0085c1ca90a] Using PV kernel: True _create_vm_record /opt/stack/new/nova/nova/virt/xenapi/vmops.py:679 2015-08-07 17:25:13.718 DEBUG nova.virt.xenapi.vm_utils [req-ff650cdc-8ba6-4826-8f6a-34daac6d8284 tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] [instance: 1da4a346-da42-46e7-81c1-b0085c1ca90a] Created VM create_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:312 2015-08-07 17:25:13.719 DEBUG nova.virt.xenapi.vmops [req-ff650cdc-8ba6-4826-8f6a-34daac6d8284 tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] [instance: 1da4a346-da42-46e7-81c1-b0085c1ca90a] Updating progress to 40 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:25:14.359 DEBUG nova.virt.xenapi.vm_utils [req-ff650cdc-8ba6-4826-8f6a-34daac6d8284 tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] Creating disk-type VBD for VM OpaqueRef:c00ea8ac-a649-b72a-aa73-807b719dfdce, VDI OpaqueRef:22e245f5-8608-ab1e-5607-20e893308f48 ... 
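
The resource claim logged above for the 64 MB ServersTestManualDisk flavor works out as follows: 64 MB of guest RAM plus the 5 MB per-instance overhead gives the 69 MB claim, and the 12280.50 MB memory limit is the 8187 MB of host RAM scaled by a 1.5 overcommit ratio (assumed here to be the default ram_allocation_ratio):

    phys_ram_mb = 8187
    used_mb = 650.0
    ram_allocation_ratio = 1.5           # assumed default for this deployment

    claim_mb = 64 + 5                    # flavor RAM + overhead from the log
    memory_limit_mb = phys_ram_mb * ram_allocation_ratio
    free_mb = memory_limit_mb - used_mb
    print(claim_mb, memory_limit_mb, free_mb)   # 69 12280.5 11630.5
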
create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:25:14.401 DEBUG nova.virt.xenapi.vm_utils [req-ff650cdc-8ba6-4826-8f6a-34daac6d8284 tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] Created VBD OpaqueRef:1fd1d2ef-35f4-7a17-fa2c-7b55d1d7f0a5 for VM OpaqueRef:c00ea8ac-a649-b72a-aa73-807b719dfdce, VDI OpaqueRef:22e245f5-8608-ab1e-5607-20e893308f48. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:25:15.024 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.67 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:25:15.758 DEBUG nova.virt.xenapi.vm_utils [req-ff650cdc-8ba6-4826-8f6a-34daac6d8284 tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] Created VDI OpaqueRef:dbfec09c-bd3b-e495-9e9c-3ed378ced522 (config-2, 67108864, False) on OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99. create_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:509 2015-08-07 17:25:15.776 DEBUG nova.virt.xenapi.vm_utils [req-ff650cdc-8ba6-4826-8f6a-34daac6d8284 tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] Creating disk-type VBD for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:dbfec09c-bd3b-e495-9e9c-3ed378ced522 ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:25:15.788 DEBUG nova.virt.xenapi.vm_utils [req-ff650cdc-8ba6-4826-8f6a-34daac6d8284 tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] Created VBD OpaqueRef:bc8a723c-9b7c-0203-63cb-d6a8d6056f58 for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:dbfec09c-bd3b-e495-9e9c-3ed378ced522. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:25:15.788 DEBUG nova.virt.xenapi.vm_utils [req-ff650cdc-8ba6-4826-8f6a-34daac6d8284 tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] Plugging VBD OpaqueRef:bc8a723c-9b7c-0203-63cb-d6a8d6056f58 ... vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2191 2015-08-07 17:25:15.789 DEBUG oslo_concurrency.lockutils [req-ff650cdc-8ba6-4826-8f6a-34daac6d8284 tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_plug" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:25:20.054 DEBUG nova.virt.xenapi.vmops [req-4408bbee-0f01-405c-aca7-22d3b9daaa9f tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] [instance: 448f07ac-11c1-4844-84f7-c887efb5826a] Waiting for instance state to become running _wait_for_instance_to_start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:777 2015-08-07 17:25:20.411 DEBUG nova.virt.xenapi.vmops [req-4408bbee-0f01-405c-aca7-22d3b9daaa9f tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] [instance: 448f07ac-11c1-4844-84f7-c887efb5826a] Updating progress to 80 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:25:21.944 DEBUG nova.virt.xenapi.vmops [req-4408bbee-0f01-405c-aca7-22d3b9daaa9f tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] [instance: 448f07ac-11c1-4844-84f7-c887efb5826a] Skip agent setup, not enabled. 
_configure_new_instance_with_agent /opt/stack/new/nova/nova/virt/xenapi/vmops.py:788 2015-08-07 17:25:21.945 DEBUG nova.virt.xenapi.vmops [req-4408bbee-0f01-405c-aca7-22d3b9daaa9f tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] [instance: 448f07ac-11c1-4844-84f7-c887efb5826a] Removing hostname from xenstore _remove_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1949 2015-08-07 17:25:21.946 DEBUG oslo_concurrency.lockutils [req-4408bbee-0f01-405c-aca7-22d3b9daaa9f tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] Lock "xenstore-448f07ac-11c1-4844-84f7-c887efb5826a" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:25:21.968 DEBUG oslo_concurrency.lockutils [req-4408bbee-0f01-405c-aca7-22d3b9daaa9f tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] Lock "xenstore-448f07ac-11c1-4844-84f7-c887efb5826a" released by "update_hostname" :: held 0.022s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:25:21.969 DEBUG nova.virt.xenapi.vmops [req-4408bbee-0f01-405c-aca7-22d3b9daaa9f tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] [instance: 448f07ac-11c1-4844-84f7-c887efb5826a] Updating progress to 90 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:25:22.956 DEBUG nova.virt.xenapi.vmops [req-4408bbee-0f01-405c-aca7-22d3b9daaa9f tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] [instance: 448f07ac-11c1-4844-84f7-c887efb5826a] Updating progress to 100 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:25:23.515 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:25:23.602 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 0.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:25:23.642 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:25:23.643 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Starting heal instance info cache _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5315 2015-08-07 17:25:23.643 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Rebuilding the list of instances to heal _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5319 2015-08-07 17:25:23.996 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: 448f07ac-11c1-4844-84f7-c887efb5826a] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5329 2015-08-07 17:25:23.997 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: 1da4a346-da42-46e7-81c1-b0085c1ca90a] Skipping network cache update for instance because it is Building. 
_heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5329 2015-08-07 17:25:23.998 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Acquired semaphore "refresh_cache-2d94b230-ee5f-44bb-9ce8-17e52b082de7" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:197 2015-08-07 17:25:23.998 DEBUG nova.objects.instance [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lazy-loading `flavor' on Instance uuid 2d94b230-ee5f-44bb-9ce8-17e52b082de7 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:25:24.026 DEBUG nova.compute.manager [req-4408bbee-0f01-405c-aca7-22d3b9daaa9f tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] [instance: 448f07ac-11c1-4844-84f7-c887efb5826a] Checking state _get_power_state /opt/stack/new/nova/nova/compute/manager.py:1296 2015-08-07 17:25:25.035 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.69 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:25:25.318 DEBUG nova.network.base_api [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: 2d94b230-ee5f-44bb-9ce8-17e52b082de7] Updating instance_info_cache with network_info: [VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.11'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:07:59:ae', 'active': False, 'type': u'bridge', 'id': u'c5dcdb81-a2ab-40b3-9886-dfd0525af78c', 'qbg_params': None})] update_instance_cache_with_nw_info /opt/stack/new/nova/nova/network/base_api.py:43 2015-08-07 17:25:25.761 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Releasing semaphore "refresh_cache-2d94b230-ee5f-44bb-9ce8-17e52b082de7" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:210 2015-08-07 17:25:25.762 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: 2d94b230-ee5f-44bb-9ce8-17e52b082de7] Updated the network info_cache for instance _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5376 2015-08-07 17:25:25.762 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 2.87 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:25:25.915 DEBUG oslo_concurrency.lockutils [req-4408bbee-0f01-405c-aca7-22d3b9daaa9f tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] Lock "448f07ac-11c1-4844-84f7-c887efb5826a" released by "_locked_do_build_and_run_instance" :: held 
159.160s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:25:26.191 DEBUG oslo_concurrency.lockutils [req-ff650cdc-8ba6-4826-8f6a-34daac6d8284 tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_plug" :: held 10.402s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:25:26.191 DEBUG nova.virt.xenapi.vm_utils [req-ff650cdc-8ba6-4826-8f6a-34daac6d8284 tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] Plugging VBD OpaqueRef:bc8a723c-9b7c-0203-63cb-d6a8d6056f58 done. vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2194 2015-08-07 17:25:26.202 DEBUG nova.virt.xenapi.vm_utils [req-ff650cdc-8ba6-4826-8f6a-34daac6d8284 tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] VBD OpaqueRef:bc8a723c-9b7c-0203-63cb-d6a8d6056f58 plugged as xvdc vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2197 2015-08-07 17:25:26.447 WARNING nova.virt.configdrive [req-ff650cdc-8ba6-4826-8f6a-34daac6d8284 tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] The setting "always" will be deprecated in the Liberty version. Please use "True" instead 2015-08-07 17:25:26.448 DEBUG nova.objects.instance [req-ff650cdc-8ba6-4826-8f6a-34daac6d8284 tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] Lazy-loading `ec2_ids' on Instance uuid 1da4a346-da42-46e7-81c1-b0085c1ca90a obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:25:26.519 DEBUG oslo_concurrency.processutils [req-ff650cdc-8ba6-4826-8f6a-34daac6d8284 tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] Running cmd (subprocess): genisoimage -o /tmp/tmpJPZvLI/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmpqqdSsC execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:25:27.024 DEBUG oslo_concurrency.processutils [req-ff650cdc-8ba6-4826-8f6a-34daac6d8284 tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] CMD "genisoimage -o /tmp/tmpJPZvLI/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmpqqdSsC" returned: 0 in 0.505s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:25:27.029 DEBUG oslo_concurrency.processutils [req-ff650cdc-8ba6-4826-8f6a-34daac6d8284 tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmpJPZvLI/configdrive of=/dev/xvdc oflag=direct,sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:25:27.364 DEBUG oslo_concurrency.lockutils [req-5ea7183c-dbf3-47e4-8f29-f0ceb6f8c2ed tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] Lock "448f07ac-11c1-4844-84f7-c887efb5826a" acquired by "do_terminate_instance" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:25:27.375 DEBUG oslo_concurrency.lockutils [req-5ea7183c-dbf3-47e4-8f29-f0ceb6f8c2ed tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] Lock 
"448f07ac-11c1-4844-84f7-c887efb5826a-events" acquired by "_clear_events" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:25:27.376 DEBUG oslo_concurrency.lockutils [req-5ea7183c-dbf3-47e4-8f29-f0ceb6f8c2ed tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] Lock "448f07ac-11c1-4844-84f7-c887efb5826a-events" released by "_clear_events" :: held 0.001s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:25:27.387 INFO nova.compute.manager [req-5ea7183c-dbf3-47e4-8f29-f0ceb6f8c2ed tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] [instance: 448f07ac-11c1-4844-84f7-c887efb5826a] Terminating instance 2015-08-07 17:25:27.395 INFO nova.virt.xenapi.vmops [req-5ea7183c-dbf3-47e4-8f29-f0ceb6f8c2ed tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] [instance: 448f07ac-11c1-4844-84f7-c887efb5826a] Destroying VM 2015-08-07 17:25:27.526 DEBUG nova.virt.xenapi.vm_utils [req-5ea7183c-dbf3-47e4-8f29-f0ceb6f8c2ed tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] [instance: 448f07ac-11c1-4844-84f7-c887efb5826a] Shutting down VM (hard) hard_shutdown_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:348 2015-08-07 17:25:28.638 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:25:28.638 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:25:28.639 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_bandwidth_usage run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:25:28.640 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 1.88 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:25:30.143 DEBUG nova.compute.manager [req-c0719406-168b-42c3-bde9-7b681f88b0da tempest-SecurityGroupsTestJSON-162787647 tempest-SecurityGroupsTestJSON-1523642313] [instance: 2d94b230-ee5f-44bb-9ce8-17e52b082de7] Checking state _get_power_state /opt/stack/new/nova/nova/compute/manager.py:1296 2015-08-07 17:25:30.570 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:25:30.571 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 0.94 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:25:31.511 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:25:31.512 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 
None None] Dynamic interval looping call > sleeping for 1.01 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:25:32.518 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:25:32.519 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /opt/stack/new/nova/nova/compute/manager.py:5976 2015-08-07 17:25:32.520 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 1.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:25:32.932 DEBUG oslo_concurrency.lockutils [req-9bfe5147-538d-4f51-a2c1-9c1ce93b7f80 tempest-SecurityGroupsTestJSON-162787647 tempest-SecurityGroupsTestJSON-1523642313] Lock "2d94b230-ee5f-44bb-9ce8-17e52b082de7" acquired by "do_terminate_instance" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:25:32.933 DEBUG oslo_concurrency.lockutils [req-9bfe5147-538d-4f51-a2c1-9c1ce93b7f80 tempest-SecurityGroupsTestJSON-162787647 tempest-SecurityGroupsTestJSON-1523642313] Lock "2d94b230-ee5f-44bb-9ce8-17e52b082de7-events" acquired by "_clear_events" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:25:32.934 DEBUG oslo_concurrency.lockutils [req-9bfe5147-538d-4f51-a2c1-9c1ce93b7f80 tempest-SecurityGroupsTestJSON-162787647 tempest-SecurityGroupsTestJSON-1523642313] Lock "2d94b230-ee5f-44bb-9ce8-17e52b082de7-events" released by "_clear_events" :: held 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:25:32.936 INFO nova.compute.manager [req-9bfe5147-538d-4f51-a2c1-9c1ce93b7f80 tempest-SecurityGroupsTestJSON-162787647 tempest-SecurityGroupsTestJSON-1523642313] [instance: 2d94b230-ee5f-44bb-9ce8-17e52b082de7] Terminating instance 2015-08-07 17:25:32.937 INFO nova.virt.xenapi.vmops [req-9bfe5147-538d-4f51-a2c1-9c1ce93b7f80 tempest-SecurityGroupsTestJSON-162787647 tempest-SecurityGroupsTestJSON-1523642313] [instance: 2d94b230-ee5f-44bb-9ce8-17e52b082de7] Destroying VM 2015-08-07 17:25:32.955 DEBUG nova.virt.xenapi.vm_utils [req-9bfe5147-538d-4f51-a2c1-9c1ce93b7f80 tempest-SecurityGroupsTestJSON-162787647 tempest-SecurityGroupsTestJSON-1523642313] [instance: 2d94b230-ee5f-44bb-9ce8-17e52b082de7] Shutting down VM (hard) hard_shutdown_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:348 2015-08-07 17:25:33.518 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:25:33.553 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Auditing locally available compute resources for node localhost.localdomain 2015-08-07 17:25:33.554 DEBUG nova.virt.xenapi.host [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Updating host stats update_status /opt/stack/new/nova/nova/virt/xenapi/host.py:234 2015-08-07 17:25:33.971 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock 
"sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" acquired by "do_scan" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:25:33.971 DEBUG nova.virt.xenapi.vm_utils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Scanning SR OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99 do_scan /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1839 2015-08-07 17:25:34.936 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.80 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:25:36.020 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" released by "do_scan" :: held 2.049s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:25:36.743 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor: free VCPUs: 0 _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:524 2015-08-07 17:25:36.751 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor: assignable PCI devices: [] _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:532 2015-08-07 17:25:36.751 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor/Node resource view: name=localhost.localdomain free_ram=855MB free_disk=16GB free_vcpus=0 pci_devices=[] _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:546 2015-08-07 17:25:36.752 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "compute_resources" acquired by "_update_available_resource" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:25:37.217 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Total usable vcpus: 8, total allocated vcpus: 8 2015-08-07 17:25:37.218 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Final resource view: name=localhost.localdomain phys_ram=8187MB used_ram=719MB phys_disk=27GB used_disk=0GB total_vcpus=8 used_vcpus=8 pci_stats=None 2015-08-07 17:25:37.337 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Compute_service record updated for devstack:localhost.localdomain 2015-08-07 17:25:37.338 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "compute_resources" released by "_update_available_resource" :: held 0.586s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:25:37.339 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:25:37.339 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 4.18 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:25:37.774 DEBUG nova.virt.xenapi.vmops [req-5ea7183c-dbf3-47e4-8f29-f0ceb6f8c2ed tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] [instance: 
448f07ac-11c1-4844-84f7-c887efb5826a] Destroying VDIs _destroy_vdis /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1428 2015-08-07 17:25:37.890 DEBUG nova.virt.xenapi.vm_utils [req-5ea7183c-dbf3-47e4-8f29-f0ceb6f8c2ed tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] VDI d87eca56-150e-4d05-988a-9d5a3969f3c7 is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:25:37.940 DEBUG nova.virt.xenapi.vm_utils [req-5ea7183c-dbf3-47e4-8f29-f0ceb6f8c2ed tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] VDI 66789b72-9302-4e54-93db-92e5df272c21 is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:25:39.923 DEBUG nova.virt.xenapi.vmops [req-5ea7183c-dbf3-47e4-8f29-f0ceb6f8c2ed tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] [instance: 448f07ac-11c1-4844-84f7-c887efb5826a] Using RAW or VHD, skipping kernel and ramdisk deletion _destroy_kernel_ramdisk /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1456 2015-08-07 17:25:40.764 DEBUG nova.virt.xenapi.vm_utils [req-5ea7183c-dbf3-47e4-8f29-f0ceb6f8c2ed tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] [instance: 448f07ac-11c1-4844-84f7-c887efb5826a] VM destroyed destroy_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:324 2015-08-07 17:25:40.765 DEBUG nova.compute.manager [req-5ea7183c-dbf3-47e4-8f29-f0ceb6f8c2ed tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] [instance: 448f07ac-11c1-4844-84f7-c887efb5826a] Deallocating network for instance _deallocate_network /opt/stack/new/nova/nova/compute/manager.py:1792 2015-08-07 17:25:41.519 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:25:41.520 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Cleaning up deleted instances _run_pending_deletes /opt/stack/new/nova/nova/compute/manager.py:6276 2015-08-07 17:25:41.733 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] There are 6 instances to clean _run_pending_deletes /opt/stack/new/nova/nova/compute/manager.py:6285 2015-08-07 17:25:41.734 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: 4bb27132-2638-467d-a5be-49e1ad01b113] Instance has had 0 of 5 cleanup attempts _run_pending_deletes /opt/stack/new/nova/nova/compute/manager.py:6293 2015-08-07 17:25:42.047 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: 45057175-41ca-4ad9-96c4-36ae4b86e6d4] Instance has had 0 of 5 cleanup attempts _run_pending_deletes /opt/stack/new/nova/nova/compute/manager.py:6293 2015-08-07 17:25:42.389 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: bcf8fed4-5825-4f82-a05c-9338adad0cda] Instance has had 0 of 5 cleanup attempts _run_pending_deletes /opt/stack/new/nova/nova/compute/manager.py:6293 2015-08-07 17:25:43.106 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: 091727e3-644b-4029-98ad-5a102868d2d5] Instance has had 0 of 5 cleanup attempts _run_pending_deletes /opt/stack/new/nova/nova/compute/manager.py:6293 2015-08-07 17:25:43.811 DEBUG nova.compute.manager 
[req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: fda3bb4d-ccb2-4500-8c24-8c38815626fa] Instance has had 0 of 5 cleanup attempts _run_pending_deletes /opt/stack/new/nova/nova/compute/manager.py:6293 2015-08-07 17:25:44.177 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: cf8ff930-deb3-448f-bcd3-f5a84fdad9b4] Instance has had 0 of 5 cleanup attempts _run_pending_deletes /opt/stack/new/nova/nova/compute/manager.py:6293 2015-08-07 17:25:44.466 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 1.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:25:44.780 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.95 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:25:44.930 DEBUG nova.compute.manager [req-5ea7183c-dbf3-47e4-8f29-f0ceb6f8c2ed tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] [instance: 448f07ac-11c1-4844-84f7-c887efb5826a] terminating bdm BlockDeviceMapping(boot_index=0,connection_info=None,created_at=2015-08-07T17:22:46Z,delete_on_termination=True,deleted=False,deleted_at=None,destination_type='local',device_name='/dev/xvda',device_type='disk',disk_bus=None,guest_format=None,id=20,image_id='a69d8d55-1745-492f-be26-cc75d64fc94d',instance=,instance_uuid=448f07ac-11c1-4844-84f7-c887efb5826a,no_device=False,snapshot_id=None,source_type='image',updated_at=2015-08-07T17:22:52Z,volume_id=None,volume_size=None) _cleanup_volumes /opt/stack/new/nova/nova/compute/manager.py:2272 2015-08-07 17:25:45.209 DEBUG oslo_concurrency.lockutils [req-5ea7183c-dbf3-47e4-8f29-f0ceb6f8c2ed tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] Lock "compute_resources" acquired by "update_usage" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:25:45.210 DEBUG nova.objects.instance [req-5ea7183c-dbf3-47e4-8f29-f0ceb6f8c2ed tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] Lazy-loading `numa_topology' on Instance uuid 448f07ac-11c1-4844-84f7-c887efb5826a obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:25:45.311 DEBUG oslo_concurrency.lockutils [req-5ea7183c-dbf3-47e4-8f29-f0ceb6f8c2ed tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] Lock "compute_resources" released by "update_usage" :: held 0.102s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:25:45.465 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:25:45.468 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 38.02 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:25:45.646 DEBUG oslo_concurrency.lockutils [req-5ea7183c-dbf3-47e4-8f29-f0ceb6f8c2ed tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] Lock "448f07ac-11c1-4844-84f7-c887efb5826a" released by "do_terminate_instance" :: held 18.282s inner 
/usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:25:47.195 DEBUG oslo_concurrency.processutils [req-ff650cdc-8ba6-4826-8f6a-34daac6d8284 tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmpJPZvLI/configdrive of=/dev/xvdc oflag=direct,sync" returned: 0 in 20.166s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:25:47.197 DEBUG oslo_concurrency.processutils [req-ff650cdc-8ba6-4826-8f6a-34daac6d8284 tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:25:48.345 DEBUG oslo_concurrency.processutils [req-ff650cdc-8ba6-4826-8f6a-34daac6d8284 tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf sync" returned: 0 in 1.149s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:25:48.349 DEBUG nova.virt.xenapi.vm_utils [req-ff650cdc-8ba6-4826-8f6a-34daac6d8284 tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] Destroying VBD for VDI OpaqueRef:dbfec09c-bd3b-e495-9e9c-3ed378ced522 ... vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2207 2015-08-07 17:25:48.351 DEBUG oslo_concurrency.lockutils [req-ff650cdc-8ba6-4826-8f6a-34daac6d8284 tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_unplug" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:25:48.589 DEBUG nova.virt.xenapi.vmops [req-9bfe5147-538d-4f51-a2c1-9c1ce93b7f80 tempest-SecurityGroupsTestJSON-162787647 tempest-SecurityGroupsTestJSON-1523642313] [instance: 2d94b230-ee5f-44bb-9ce8-17e52b082de7] Destroying VDIs _destroy_vdis /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1428 2015-08-07 17:25:48.618 DEBUG nova.virt.xenapi.vm_utils [req-9bfe5147-538d-4f51-a2c1-9c1ce93b7f80 tempest-SecurityGroupsTestJSON-162787647 tempest-SecurityGroupsTestJSON-1523642313] VDI 575019c5-6372-4f64-aefd-567c6fc489da is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:25:48.660 DEBUG nova.virt.xenapi.vm_utils [req-9bfe5147-538d-4f51-a2c1-9c1ce93b7f80 tempest-SecurityGroupsTestJSON-162787647 tempest-SecurityGroupsTestJSON-1523642313] VDI 86d5c65d-961d-42b8-8656-47e6953f6fa2 is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:25:49.079 DEBUG oslo_concurrency.lockutils [req-f09963e6-bcc8-4833-9f5f-a22ce128157f tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] Lock "18c3ad3b-8cd4-4e41-b278-93a63e82aac4" acquired by "_locked_do_build_and_run_instance" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:25:49.182 INFO nova.compute.manager [req-f09963e6-bcc8-4833-9f5f-a22ce128157f tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] [instance: 18c3ad3b-8cd4-4e41-b278-93a63e82aac4] Starting instance... 
2015-08-07 17:25:49.560 DEBUG oslo_concurrency.lockutils [req-f09963e6-bcc8-4833-9f5f-a22ce128157f tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] Lock "compute_resources" acquired by "instance_claim" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:25:49.561 DEBUG nova.compute.resource_tracker [req-f09963e6-bcc8-4833-9f5f-a22ce128157f tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] Memory overhead for 64 MB instance; 5 MB instance_claim /opt/stack/new/nova/nova/compute/resource_tracker.py:126 2015-08-07 17:25:49.568 INFO nova.compute.claims [req-f09963e6-bcc8-4833-9f5f-a22ce128157f tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] [instance: 18c3ad3b-8cd4-4e41-b278-93a63e82aac4] Attempting claim: memory 69 MB, disk 0 GB 2015-08-07 17:25:49.569 INFO nova.compute.claims [req-f09963e6-bcc8-4833-9f5f-a22ce128157f tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] [instance: 18c3ad3b-8cd4-4e41-b278-93a63e82aac4] Total memory: 8187 MB, used: 650.00 MB 2015-08-07 17:25:49.570 INFO nova.compute.claims [req-f09963e6-bcc8-4833-9f5f-a22ce128157f tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] [instance: 18c3ad3b-8cd4-4e41-b278-93a63e82aac4] memory limit: 12280.50 MB, free: 11630.50 MB 2015-08-07 17:25:49.570 INFO nova.compute.claims [req-f09963e6-bcc8-4833-9f5f-a22ce128157f tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] [instance: 18c3ad3b-8cd4-4e41-b278-93a63e82aac4] Total disk: 27 GB, used: 0.00 GB 2015-08-07 17:25:49.570 INFO nova.compute.claims [req-f09963e6-bcc8-4833-9f5f-a22ce128157f tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] [instance: 18c3ad3b-8cd4-4e41-b278-93a63e82aac4] disk limit not specified, defaulting to unlimited 2015-08-07 17:25:49.594 DEBUG nova.compute.resources.vcpu [req-f09963e6-bcc8-4833-9f5f-a22ce128157f tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] Total CPUs: 8 VCPUs, used: 2.00 VCPUs test /opt/stack/new/nova/nova/compute/resources/vcpu.py:52 2015-08-07 17:25:49.595 DEBUG nova.compute.resources.vcpu [req-f09963e6-bcc8-4833-9f5f-a22ce128157f tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] CPUs limit not specified, defaulting to unlimited test /opt/stack/new/nova/nova/compute/resources/vcpu.py:56 2015-08-07 17:25:49.595 INFO nova.compute.claims [req-f09963e6-bcc8-4833-9f5f-a22ce128157f tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] [instance: 18c3ad3b-8cd4-4e41-b278-93a63e82aac4] Claim successful 2015-08-07 17:25:50.129 DEBUG oslo_concurrency.lockutils [req-f09963e6-bcc8-4833-9f5f-a22ce128157f tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] Lock "compute_resources" released by "instance_claim" :: held 0.568s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:25:50.444 DEBUG oslo_concurrency.lockutils [req-f09963e6-bcc8-4833-9f5f-a22ce128157f tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] Lock "compute_resources" acquired by "update_usage" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:25:50.560 DEBUG nova.virt.xenapi.vmops [req-9bfe5147-538d-4f51-a2c1-9c1ce93b7f80 
tempest-SecurityGroupsTestJSON-162787647 tempest-SecurityGroupsTestJSON-1523642313] [instance: 2d94b230-ee5f-44bb-9ce8-17e52b082de7] Using RAW or VHD, skipping kernel and ramdisk deletion _destroy_kernel_ramdisk /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1456 2015-08-07 17:25:50.587 DEBUG nova.virt.xenapi.vm_utils [req-9bfe5147-538d-4f51-a2c1-9c1ce93b7f80 tempest-SecurityGroupsTestJSON-162787647 tempest-SecurityGroupsTestJSON-1523642313] [instance: 2d94b230-ee5f-44bb-9ce8-17e52b082de7] VM destroyed destroy_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:324 2015-08-07 17:25:50.588 DEBUG nova.compute.manager [req-9bfe5147-538d-4f51-a2c1-9c1ce93b7f80 tempest-SecurityGroupsTestJSON-162787647 tempest-SecurityGroupsTestJSON-1523642313] [instance: 2d94b230-ee5f-44bb-9ce8-17e52b082de7] Deallocating network for instance _deallocate_network /opt/stack/new/nova/nova/compute/manager.py:1792 2015-08-07 17:25:50.680 DEBUG oslo_concurrency.lockutils [req-ff650cdc-8ba6-4826-8f6a-34daac6d8284 tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_unplug" :: held 2.330s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:25:50.733 DEBUG nova.virt.xenapi.vm_utils [req-ff650cdc-8ba6-4826-8f6a-34daac6d8284 tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] Destroying VBD for VDI OpaqueRef:dbfec09c-bd3b-e495-9e9c-3ed378ced522 done. vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2215 2015-08-07 17:25:50.734 DEBUG nova.virt.xenapi.vm_utils [req-ff650cdc-8ba6-4826-8f6a-34daac6d8284 tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] Creating disk-type VBD for VM OpaqueRef:c00ea8ac-a649-b72a-aa73-807b719dfdce, VDI OpaqueRef:dbfec09c-bd3b-e495-9e9c-3ed378ced522 ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:25:50.743 DEBUG nova.virt.xenapi.vm_utils [req-ff650cdc-8ba6-4826-8f6a-34daac6d8284 tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] Created VBD OpaqueRef:6669066a-5765-94a2-a4bc-fc0d19e599de for VM OpaqueRef:c00ea8ac-a649-b72a-aa73-807b719dfdce, VDI OpaqueRef:dbfec09c-bd3b-e495-9e9c-3ed378ced522. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:25:50.744 DEBUG nova.objects.instance [req-ff650cdc-8ba6-4826-8f6a-34daac6d8284 tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] Lazy-loading `pci_devices' on Instance uuid 1da4a346-da42-46e7-81c1-b0085c1ca90a obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:25:50.838 DEBUG oslo_concurrency.lockutils [req-f09963e6-bcc8-4833-9f5f-a22ce128157f tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] Lock "compute_resources" released by "update_usage" :: held 0.394s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:25:50.840 DEBUG nova.compute.utils [req-f09963e6-bcc8-4833-9f5f-a22ce128157f tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] Using /dev/xvd instead of None get_next_device_name /opt/stack/new/nova/nova/compute/utils.py:163 2015-08-07 17:25:50.850 13318 DEBUG nova.compute.manager [-] [instance: 18c3ad3b-8cd4-4e41-b278-93a63e82aac4] Allocating IP information in the background. 
_allocate_network_async /opt/stack/new/nova/nova/compute/manager.py:1513 2015-08-07 17:25:50.851 13318 DEBUG oslo_concurrency.lockutils [-] Acquired semaphore "refresh_cache-18c3ad3b-8cd4-4e41-b278-93a63e82aac4" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:197 2015-08-07 17:25:50.964 DEBUG nova.virt.xenapi.vmops [req-ff650cdc-8ba6-4826-8f6a-34daac6d8284 tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] [instance: 1da4a346-da42-46e7-81c1-b0085c1ca90a] Updating progress to 50 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:25:51.328 DEBUG oslo_concurrency.lockutils [req-ff650cdc-8ba6-4826-8f6a-34daac6d8284 tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] Lock "xenstore-1da4a346-da42-46e7-81c1-b0085c1ca90a" acquired by "store_meta" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:25:51.387 DEBUG oslo_concurrency.lockutils [req-ff650cdc-8ba6-4826-8f6a-34daac6d8284 tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] Lock "xenstore-1da4a346-da42-46e7-81c1-b0085c1ca90a" released by "store_meta" :: held 0.059s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:25:51.388 DEBUG oslo_concurrency.lockutils [req-ff650cdc-8ba6-4826-8f6a-34daac6d8284 tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] Lock "xenstore-1da4a346-da42-46e7-81c1-b0085c1ca90a" acquired by "store_auto_disk_config" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:25:51.428 DEBUG oslo_concurrency.lockutils [req-ff650cdc-8ba6-4826-8f6a-34daac6d8284 tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] Lock "xenstore-1da4a346-da42-46e7-81c1-b0085c1ca90a" released by "store_auto_disk_config" :: held 0.039s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:25:51.428 DEBUG nova.virt.xenapi.vmops [req-ff650cdc-8ba6-4826-8f6a-34daac6d8284 tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] [instance: 1da4a346-da42-46e7-81c1-b0085c1ca90a] Injecting hostname (tempest-server-1955342466) into xenstore _inject_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1940 2015-08-07 17:25:51.429 DEBUG oslo_concurrency.lockutils [req-ff650cdc-8ba6-4826-8f6a-34daac6d8284 tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] Lock "xenstore-1da4a346-da42-46e7-81c1-b0085c1ca90a" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:25:51.485 DEBUG oslo_concurrency.lockutils [req-ff650cdc-8ba6-4826-8f6a-34daac6d8284 tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] Lock "xenstore-1da4a346-da42-46e7-81c1-b0085c1ca90a" released by "update_hostname" :: held 0.056s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:25:51.486 DEBUG nova.virt.xenapi.vmops [req-ff650cdc-8ba6-4826-8f6a-34daac6d8284 tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] [instance: 1da4a346-da42-46e7-81c1-b0085c1ca90a] Injecting network info to xenstore inject_network_info /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1869 2015-08-07 17:25:51.486 DEBUG oslo_concurrency.lockutils 
[req-ff650cdc-8ba6-4826-8f6a-34daac6d8284 tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] Lock "xenstore-1da4a346-da42-46e7-81c1-b0085c1ca90a" acquired by "update_nwinfo" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:25:51.938 DEBUG nova.virt.xenapi.vmops [req-f09963e6-bcc8-4833-9f5f-a22ce128157f tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] [instance: 18c3ad3b-8cd4-4e41-b278-93a63e82aac4] Block device information present: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} spawn /opt/stack/new/nova/nova/virt/xenapi/vmops.py:408 2015-08-07 17:25:51.958 DEBUG nova.virt.xenapi.vm_utils [req-f09963e6-bcc8-4833-9f5f-a22ce128157f tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] Detected vhd format for image a69d8d55-1745-492f-be26-cc75d64fc94d determine_disk_image_type /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1633 2015-08-07 17:25:51.959 DEBUG nova.virt.xenapi.vmops [req-f09963e6-bcc8-4833-9f5f-a22ce128157f tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] [instance: 18c3ad3b-8cd4-4e41-b278-93a63e82aac4] Updating progress to 10 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:25:52.082 DEBUG oslo_concurrency.lockutils [req-ff650cdc-8ba6-4826-8f6a-34daac6d8284 tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] Lock "xenstore-1da4a346-da42-46e7-81c1-b0085c1ca90a" released by "update_nwinfo" :: held 0.596s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:25:52.083 DEBUG nova.virt.xenapi.vmops [req-ff650cdc-8ba6-4826-8f6a-34daac6d8284 tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] [instance: 1da4a346-da42-46e7-81c1-b0085c1ca90a] Updating progress to 60 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:25:52.410 DEBUG nova.virt.xenapi.vmops [req-f09963e6-bcc8-4833-9f5f-a22ce128157f tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] [instance: 18c3ad3b-8cd4-4e41-b278-93a63e82aac4] block device info: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} _connect_cinder_volumes /opt/stack/new/nova/nova/virt/xenapi/vmops.py:376 2015-08-07 17:25:52.425 DEBUG oslo_concurrency.lockutils [req-f09963e6-bcc8-4833-9f5f-a22ce128157f tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" acquired by "_create_cached_image_impl" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:25:52.642 DEBUG nova.virt.xenapi.vmops [req-ff650cdc-8ba6-4826-8f6a-34daac6d8284 tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] [instance: 1da4a346-da42-46e7-81c1-b0085c1ca90a] Creating vifs _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1892 2015-08-07 17:25:52.670 DEBUG nova.virt.xenapi.vmops [req-ff650cdc-8ba6-4826-8f6a-34daac6d8284 tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] [instance: 1da4a346-da42-46e7-81c1-b0085c1ca90a] Creating VIF for network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1902 2015-08-07 17:25:52.701 DEBUG 
nova.virt.xenapi.vmops [req-ff650cdc-8ba6-4826-8f6a-34daac6d8284 tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] [instance: 1da4a346-da42-46e7-81c1-b0085c1ca90a] Created VIF OpaqueRef:8edd548e-ea3d-4bb1-56ed-067176d481e6, network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1906 2015-08-07 17:25:52.702 DEBUG nova.virt.xenapi.vmops [req-ff650cdc-8ba6-4826-8f6a-34daac6d8284 tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] [instance: 1da4a346-da42-46e7-81c1-b0085c1ca90a] Updating progress to 70 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:25:53.263 DEBUG nova.virt.xenapi.vmops [req-ff650cdc-8ba6-4826-8f6a-34daac6d8284 tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] [instance: 1da4a346-da42-46e7-81c1-b0085c1ca90a] Starting instance _start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:327 2015-08-07 17:25:54.825 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.91 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:25:55.334 DEBUG nova.compute.manager [req-9bfe5147-538d-4f51-a2c1-9c1ce93b7f80 tempest-SecurityGroupsTestJSON-162787647 tempest-SecurityGroupsTestJSON-1523642313] [instance: 2d94b230-ee5f-44bb-9ce8-17e52b082de7] terminating bdm BlockDeviceMapping(boot_index=0,connection_info=None,created_at=2015-08-07T17:22:13Z,delete_on_termination=True,deleted=False,deleted_at=None,destination_type='local',device_name='/dev/xvda',device_type='disk',disk_bus=None,guest_format=None,id=19,image_id='a69d8d55-1745-492f-be26-cc75d64fc94d',instance=,instance_uuid=2d94b230-ee5f-44bb-9ce8-17e52b082de7,no_device=False,snapshot_id=None,source_type='image',updated_at=2015-08-07T17:22:15Z,volume_id=None,volume_size=None) _cleanup_volumes /opt/stack/new/nova/nova/compute/manager.py:2272 2015-08-07 17:25:55.717 DEBUG oslo_concurrency.lockutils [req-9bfe5147-538d-4f51-a2c1-9c1ce93b7f80 tempest-SecurityGroupsTestJSON-162787647 tempest-SecurityGroupsTestJSON-1523642313] Lock "compute_resources" acquired by "update_usage" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:25:55.718 DEBUG nova.objects.instance [req-9bfe5147-538d-4f51-a2c1-9c1ce93b7f80 tempest-SecurityGroupsTestJSON-162787647 tempest-SecurityGroupsTestJSON-1523642313] Lazy-loading `numa_topology' on Instance uuid 2d94b230-ee5f-44bb-9ce8-17e52b082de7 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:25:55.792 DEBUG nova.virt.xenapi.vm_utils [req-f09963e6-bcc8-4833-9f5f-a22ce128157f tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] Cloned VDI OpaqueRef:8fe42f21-b05b-30ac-3e1e-680d01944f7f from VDI OpaqueRef:f392c64e-7886-a0cc-d3a0-780674167307 _clone_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:571 2015-08-07 17:25:55.847 DEBUG oslo_concurrency.lockutils [req-9bfe5147-538d-4f51-a2c1-9c1ce93b7f80 tempest-SecurityGroupsTestJSON-162787647 tempest-SecurityGroupsTestJSON-1523642313] Lock "compute_resources" released by "update_usage" :: held 0.130s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:25:56.361 DEBUG oslo_concurrency.lockutils [req-9bfe5147-538d-4f51-a2c1-9c1ce93b7f80 tempest-SecurityGroupsTestJSON-162787647 tempest-SecurityGroupsTestJSON-1523642313] Lock 
"2d94b230-ee5f-44bb-9ce8-17e52b082de7" released by "do_terminate_instance" :: held 23.428s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:25:58.170 13318 DEBUG nova.network.base_api [-] [instance: 18c3ad3b-8cd4-4e41-b278-93a63e82aac4] Updating instance_info_cache with network_info: [VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.7'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:0b:22:5f', 'active': False, 'type': u'bridge', 'id': u'9d530fd0-08d7-42a6-b175-6f6010a0ad54', 'qbg_params': None})] update_instance_cache_with_nw_info /opt/stack/new/nova/nova/network/base_api.py:43 2015-08-07 17:25:58.211 13318 DEBUG oslo_concurrency.lockutils [-] Releasing semaphore "refresh_cache-18c3ad3b-8cd4-4e41-b278-93a63e82aac4" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:210 2015-08-07 17:25:58.211 13318 DEBUG nova.compute.manager [-] [instance: 18c3ad3b-8cd4-4e41-b278-93a63e82aac4] Instance network_info: |[VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.7'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:0b:22:5f', 'active': False, 'type': u'bridge', 'id': u'9d530fd0-08d7-42a6-b175-6f6010a0ad54', 'qbg_params': None})]| _allocate_network_async /opt/stack/new/nova/nova/compute/manager.py:1531 2015-08-07 17:25:58.280 DEBUG oslo_concurrency.lockutils [req-f09963e6-bcc8-4833-9f5f-a22ce128157f tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" released by "_create_cached_image_impl" :: held 5.854s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:25:58.281 INFO nova.virt.xenapi.vm_utils [req-f09963e6-bcc8-4833-9f5f-a22ce128157f 
tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] Image creation data, cacheable: True, downloaded: False duration: 5.87 secs for image a69d8d55-1745-492f-be26-cc75d64fc94d 2015-08-07 17:25:59.727 DEBUG nova.virt.xenapi.vmops [req-f09963e6-bcc8-4833-9f5f-a22ce128157f tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] [instance: 18c3ad3b-8cd4-4e41-b278-93a63e82aac4] Updating progress to 20 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:26:00.046 DEBUG nova.virt.xenapi.vmops [req-f09963e6-bcc8-4833-9f5f-a22ce128157f tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] [instance: 18c3ad3b-8cd4-4e41-b278-93a63e82aac4] Updating progress to 30 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:26:00.390 DEBUG nova.virt.xenapi.vmops [req-f09963e6-bcc8-4833-9f5f-a22ce128157f tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] [instance: 18c3ad3b-8cd4-4e41-b278-93a63e82aac4] Using PV kernel: True _create_vm_record /opt/stack/new/nova/nova/virt/xenapi/vmops.py:679 2015-08-07 17:26:00.406 DEBUG nova.virt.xenapi.vm_utils [req-f09963e6-bcc8-4833-9f5f-a22ce128157f tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] [instance: 18c3ad3b-8cd4-4e41-b278-93a63e82aac4] Created VM create_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:312 2015-08-07 17:26:00.407 DEBUG nova.virt.xenapi.vmops [req-f09963e6-bcc8-4833-9f5f-a22ce128157f tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] [instance: 18c3ad3b-8cd4-4e41-b278-93a63e82aac4] Updating progress to 40 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:26:00.707 DEBUG nova.virt.xenapi.vm_utils [req-f09963e6-bcc8-4833-9f5f-a22ce128157f tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] Creating disk-type VBD for VM OpaqueRef:65080da7-3267-c7b6-379d-ef255c1a4bf9, VDI OpaqueRef:8fe42f21-b05b-30ac-3e1e-680d01944f7f ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:26:00.720 DEBUG nova.virt.xenapi.vm_utils [req-f09963e6-bcc8-4833-9f5f-a22ce128157f tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] Created VBD OpaqueRef:390754f5-42de-f156-7c9c-5a23370894ed for VM OpaqueRef:65080da7-3267-c7b6-379d-ef255c1a4bf9, VDI OpaqueRef:8fe42f21-b05b-30ac-3e1e-680d01944f7f. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:26:04.120 DEBUG nova.virt.xenapi.vm_utils [req-f09963e6-bcc8-4833-9f5f-a22ce128157f tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] Created VDI OpaqueRef:9ebe5046-c0e9-5212-bbcc-33fd12a08c7c (config-2, 67108864, False) on OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99. create_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:509 2015-08-07 17:26:04.127 DEBUG nova.virt.xenapi.vm_utils [req-f09963e6-bcc8-4833-9f5f-a22ce128157f tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] Creating disk-type VBD for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:9ebe5046-c0e9-5212-bbcc-33fd12a08c7c ... 
create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:26:04.176 DEBUG nova.virt.xenapi.vm_utils [req-f09963e6-bcc8-4833-9f5f-a22ce128157f tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] Created VBD OpaqueRef:ff87842b-ac48-6ed9-73e6-3dc0fb4bd2c0 for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:9ebe5046-c0e9-5212-bbcc-33fd12a08c7c. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:26:04.194 DEBUG nova.virt.xenapi.vm_utils [req-f09963e6-bcc8-4833-9f5f-a22ce128157f tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] Plugging VBD OpaqueRef:ff87842b-ac48-6ed9-73e6-3dc0fb4bd2c0 ... vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2191 2015-08-07 17:26:04.196 DEBUG oslo_concurrency.lockutils [req-f09963e6-bcc8-4833-9f5f-a22ce128157f tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_plug" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:26:05.759 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 8.98 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:26:08.566 DEBUG oslo_concurrency.lockutils [req-f09963e6-bcc8-4833-9f5f-a22ce128157f tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_plug" :: held 4.370s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:26:08.567 DEBUG nova.virt.xenapi.vm_utils [req-f09963e6-bcc8-4833-9f5f-a22ce128157f tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] Plugging VBD OpaqueRef:ff87842b-ac48-6ed9-73e6-3dc0fb4bd2c0 done. vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2194 2015-08-07 17:26:08.573 DEBUG nova.virt.xenapi.vm_utils [req-f09963e6-bcc8-4833-9f5f-a22ce128157f tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] VBD OpaqueRef:ff87842b-ac48-6ed9-73e6-3dc0fb4bd2c0 plugged as xvdc vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2197 2015-08-07 17:26:08.670 WARNING nova.virt.configdrive [req-f09963e6-bcc8-4833-9f5f-a22ce128157f tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] The setting "always" will be deprecated in the Liberty version. 
Please use "True" instead 2015-08-07 17:26:08.671 DEBUG nova.objects.instance [req-f09963e6-bcc8-4833-9f5f-a22ce128157f tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] Lazy-loading `ec2_ids' on Instance uuid 18c3ad3b-8cd4-4e41-b278-93a63e82aac4 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:26:08.714 DEBUG oslo_concurrency.processutils [req-f09963e6-bcc8-4833-9f5f-a22ce128157f tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] Running cmd (subprocess): genisoimage -o /tmp/tmpQBagd_/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmp6_QDFb execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:26:09.113 DEBUG oslo_concurrency.processutils [req-f09963e6-bcc8-4833-9f5f-a22ce128157f tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] CMD "genisoimage -o /tmp/tmpQBagd_/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmp6_QDFb" returned: 0 in 0.399s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:26:09.119 DEBUG oslo_concurrency.processutils [req-f09963e6-bcc8-4833-9f5f-a22ce128157f tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmpQBagd_/configdrive of=/dev/xvdc oflag=direct,sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:26:12.542 DEBUG nova.virt.xenapi.vmops [req-ff650cdc-8ba6-4826-8f6a-34daac6d8284 tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] [instance: 1da4a346-da42-46e7-81c1-b0085c1ca90a] Waiting for instance state to become running _wait_for_instance_to_start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:777 2015-08-07 17:26:12.570 DEBUG nova.virt.xenapi.vmops [req-ff650cdc-8ba6-4826-8f6a-34daac6d8284 tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] [instance: 1da4a346-da42-46e7-81c1-b0085c1ca90a] Updating progress to 80 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:26:13.022 DEBUG nova.virt.xenapi.vmops [req-ff650cdc-8ba6-4826-8f6a-34daac6d8284 tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] [instance: 1da4a346-da42-46e7-81c1-b0085c1ca90a] Skip agent setup, not enabled. 
_configure_new_instance_with_agent /opt/stack/new/nova/nova/virt/xenapi/vmops.py:788 2015-08-07 17:26:13.023 DEBUG nova.virt.xenapi.vmops [req-ff650cdc-8ba6-4826-8f6a-34daac6d8284 tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] [instance: 1da4a346-da42-46e7-81c1-b0085c1ca90a] Removing hostname from xenstore _remove_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1949 2015-08-07 17:26:13.024 DEBUG oslo_concurrency.lockutils [req-ff650cdc-8ba6-4826-8f6a-34daac6d8284 tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] Lock "xenstore-1da4a346-da42-46e7-81c1-b0085c1ca90a" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:26:13.068 DEBUG oslo_concurrency.lockutils [req-ff650cdc-8ba6-4826-8f6a-34daac6d8284 tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] Lock "xenstore-1da4a346-da42-46e7-81c1-b0085c1ca90a" released by "update_hostname" :: held 0.045s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:26:13.069 DEBUG nova.virt.xenapi.vmops [req-ff650cdc-8ba6-4826-8f6a-34daac6d8284 tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] [instance: 1da4a346-da42-46e7-81c1-b0085c1ca90a] Updating progress to 90 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:26:13.521 DEBUG oslo_concurrency.lockutils [req-f6418dc0-90de-47db-819a-f5c9b2aeec01 tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] Lock "790f2093-5329-49fd-a0c7-ab1fe4c523c9" acquired by "_locked_do_build_and_run_instance" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:26:13.601 INFO nova.compute.manager [req-f6418dc0-90de-47db-819a-f5c9b2aeec01 tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] [instance: 790f2093-5329-49fd-a0c7-ab1fe4c523c9] Starting instance... 
2015-08-07 17:26:13.626 DEBUG nova.virt.xenapi.vmops [req-ff650cdc-8ba6-4826-8f6a-34daac6d8284 tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] [instance: 1da4a346-da42-46e7-81c1-b0085c1ca90a] Updating progress to 100 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:26:14.013 DEBUG oslo_concurrency.lockutils [req-f6418dc0-90de-47db-819a-f5c9b2aeec01 tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] Lock "compute_resources" acquired by "instance_claim" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:26:14.014 DEBUG nova.compute.resource_tracker [req-f6418dc0-90de-47db-819a-f5c9b2aeec01 tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] Memory overhead for 64 MB instance; 5 MB instance_claim /opt/stack/new/nova/nova/compute/resource_tracker.py:126 2015-08-07 17:26:14.025 INFO nova.compute.claims [req-f6418dc0-90de-47db-819a-f5c9b2aeec01 tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] [instance: 790f2093-5329-49fd-a0c7-ab1fe4c523c9] Attempting claim: memory 69 MB, disk 0 GB 2015-08-07 17:26:14.026 INFO nova.compute.claims [req-f6418dc0-90de-47db-819a-f5c9b2aeec01 tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] [instance: 790f2093-5329-49fd-a0c7-ab1fe4c523c9] Total memory: 8187 MB, used: 650.00 MB 2015-08-07 17:26:14.026 INFO nova.compute.claims [req-f6418dc0-90de-47db-819a-f5c9b2aeec01 tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] [instance: 790f2093-5329-49fd-a0c7-ab1fe4c523c9] memory limit: 12280.50 MB, free: 11630.50 MB 2015-08-07 17:26:14.027 INFO nova.compute.claims [req-f6418dc0-90de-47db-819a-f5c9b2aeec01 tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] [instance: 790f2093-5329-49fd-a0c7-ab1fe4c523c9] Total disk: 27 GB, used: 0.00 GB 2015-08-07 17:26:14.027 INFO nova.compute.claims [req-f6418dc0-90de-47db-819a-f5c9b2aeec01 tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] [instance: 790f2093-5329-49fd-a0c7-ab1fe4c523c9] disk limit not specified, defaulting to unlimited 2015-08-07 17:26:14.071 DEBUG nova.compute.resources.vcpu [req-f6418dc0-90de-47db-819a-f5c9b2aeec01 tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] Total CPUs: 8 VCPUs, used: 2.00 VCPUs test /opt/stack/new/nova/nova/compute/resources/vcpu.py:52 2015-08-07 17:26:14.084 DEBUG nova.compute.resources.vcpu [req-f6418dc0-90de-47db-819a-f5c9b2aeec01 tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] CPUs limit not specified, defaulting to unlimited test /opt/stack/new/nova/nova/compute/resources/vcpu.py:56 2015-08-07 17:26:14.085 INFO nova.compute.claims [req-f6418dc0-90de-47db-819a-f5c9b2aeec01 tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] [instance: 790f2093-5329-49fd-a0c7-ab1fe4c523c9] Claim successful 2015-08-07 17:26:14.211 DEBUG nova.compute.manager [req-ff650cdc-8ba6-4826-8f6a-34daac6d8284 tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] [instance: 1da4a346-da42-46e7-81c1-b0085c1ca90a] Checking state _get_power_state /opt/stack/new/nova/nova/compute/manager.py:1296 2015-08-07 17:26:14.879 DEBUG oslo_concurrency.lockutils [req-f6418dc0-90de-47db-819a-f5c9b2aeec01 tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] Lock "compute_resources" released by "instance_claim" :: held 0.865s inner 
/usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:26:14.909 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.83 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:26:14.924 DEBUG oslo_concurrency.lockutils [req-ff650cdc-8ba6-4826-8f6a-34daac6d8284 tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] Lock "1da4a346-da42-46e7-81c1-b0085c1ca90a" released by "_locked_do_build_and_run_instance" :: held 76.843s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:26:15.183 DEBUG oslo_concurrency.lockutils [req-f6418dc0-90de-47db-819a-f5c9b2aeec01 tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] Lock "compute_resources" acquired by "update_usage" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:26:15.417 DEBUG oslo_concurrency.lockutils [req-f6418dc0-90de-47db-819a-f5c9b2aeec01 tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] Lock "compute_resources" released by "update_usage" :: held 0.235s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:26:15.418 DEBUG nova.compute.utils [req-f6418dc0-90de-47db-819a-f5c9b2aeec01 tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] Using /dev/xvd instead of None get_next_device_name /opt/stack/new/nova/nova/compute/utils.py:163 2015-08-07 17:26:15.425 13318 DEBUG nova.compute.manager [-] [instance: 790f2093-5329-49fd-a0c7-ab1fe4c523c9] Allocating IP information in the background. _allocate_network_async /opt/stack/new/nova/nova/compute/manager.py:1513 2015-08-07 17:26:15.426 13318 DEBUG oslo_concurrency.lockutils [-] Acquired semaphore "refresh_cache-790f2093-5329-49fd-a0c7-ab1fe4c523c9" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:197 2015-08-07 17:26:16.625 DEBUG nova.virt.xenapi.vmops [req-f6418dc0-90de-47db-819a-f5c9b2aeec01 tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] [instance: 790f2093-5329-49fd-a0c7-ab1fe4c523c9] Block device information present: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} spawn /opt/stack/new/nova/nova/virt/xenapi/vmops.py:408 2015-08-07 17:26:16.648 DEBUG nova.virt.xenapi.vm_utils [req-f6418dc0-90de-47db-819a-f5c9b2aeec01 tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] Detected vhd format for image a69d8d55-1745-492f-be26-cc75d64fc94d determine_disk_image_type /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1633 2015-08-07 17:26:16.649 DEBUG nova.virt.xenapi.vmops [req-f6418dc0-90de-47db-819a-f5c9b2aeec01 tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] [instance: 790f2093-5329-49fd-a0c7-ab1fe4c523c9] Updating progress to 10 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:26:17.726 DEBUG nova.virt.xenapi.vmops [req-f6418dc0-90de-47db-819a-f5c9b2aeec01 tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] [instance: 790f2093-5329-49fd-a0c7-ab1fe4c523c9] block device info: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} _connect_cinder_volumes /opt/stack/new/nova/nova/virt/xenapi/vmops.py:376 2015-08-07 17:26:17.782 DEBUG oslo_concurrency.lockutils [req-f6418dc0-90de-47db-819a-f5c9b2aeec01 tempest-ServersTestJSON-2071665233 
tempest-ServersTestJSON-1606226605] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" acquired by "_create_cached_image_impl" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:26:19.917 13318 DEBUG nova.network.base_api [-] [instance: 790f2093-5329-49fd-a0c7-ab1fe4c523c9] Updating instance_info_cache with network_info: [VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.4'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:a4:ce:44', 'active': False, 'type': u'bridge', 'id': u'819e4860-98f0-46f8-95fb-edec41f36842', 'qbg_params': None})] update_instance_cache_with_nw_info /opt/stack/new/nova/nova/network/base_api.py:43 2015-08-07 17:26:19.946 13318 DEBUG oslo_concurrency.lockutils [-] Releasing semaphore "refresh_cache-790f2093-5329-49fd-a0c7-ab1fe4c523c9" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:210 2015-08-07 17:26:19.946 13318 DEBUG nova.compute.manager [-] [instance: 790f2093-5329-49fd-a0c7-ab1fe4c523c9] Instance network_info: |[VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.4'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:a4:ce:44', 'active': False, 'type': u'bridge', 'id': u'819e4860-98f0-46f8-95fb-edec41f36842', 'qbg_params': None})]| _allocate_network_async /opt/stack/new/nova/nova/compute/manager.py:1531 2015-08-07 17:26:20.511 DEBUG oslo_concurrency.lockutils [req-da7c0e47-5c03-43ed-995f-3e3443707176 tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] Lock "c5d13a83-4e8d-4d99-9630-b5219ac62190" acquired by "_locked_do_build_and_run_instance" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:26:20.599 INFO nova.compute.manager 
[req-da7c0e47-5c03-43ed-995f-3e3443707176 tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] [instance: c5d13a83-4e8d-4d99-9630-b5219ac62190] Starting instance... 2015-08-07 17:26:20.996 DEBUG oslo_concurrency.lockutils [req-da7c0e47-5c03-43ed-995f-3e3443707176 tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] Lock "compute_resources" acquired by "instance_claim" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:26:20.997 DEBUG nova.compute.resource_tracker [req-da7c0e47-5c03-43ed-995f-3e3443707176 tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] Memory overhead for 64 MB instance; 5 MB instance_claim /opt/stack/new/nova/nova/compute/resource_tracker.py:126 2015-08-07 17:26:21.007 INFO nova.compute.claims [req-da7c0e47-5c03-43ed-995f-3e3443707176 tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] [instance: c5d13a83-4e8d-4d99-9630-b5219ac62190] Attempting claim: memory 69 MB, disk 0 GB 2015-08-07 17:26:21.009 INFO nova.compute.claims [req-da7c0e47-5c03-43ed-995f-3e3443707176 tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] [instance: c5d13a83-4e8d-4d99-9630-b5219ac62190] Total memory: 8187 MB, used: 719.00 MB 2015-08-07 17:26:21.009 INFO nova.compute.claims [req-da7c0e47-5c03-43ed-995f-3e3443707176 tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] [instance: c5d13a83-4e8d-4d99-9630-b5219ac62190] memory limit: 12280.50 MB, free: 11561.50 MB 2015-08-07 17:26:21.010 INFO nova.compute.claims [req-da7c0e47-5c03-43ed-995f-3e3443707176 tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] [instance: c5d13a83-4e8d-4d99-9630-b5219ac62190] Total disk: 27 GB, used: 0.00 GB 2015-08-07 17:26:21.010 INFO nova.compute.claims [req-da7c0e47-5c03-43ed-995f-3e3443707176 tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] [instance: c5d13a83-4e8d-4d99-9630-b5219ac62190] disk limit not specified, defaulting to unlimited 2015-08-07 17:26:21.032 DEBUG nova.compute.resources.vcpu [req-da7c0e47-5c03-43ed-995f-3e3443707176 tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] Total CPUs: 8 VCPUs, used: 3.00 VCPUs test /opt/stack/new/nova/nova/compute/resources/vcpu.py:52 2015-08-07 17:26:21.033 DEBUG nova.compute.resources.vcpu [req-da7c0e47-5c03-43ed-995f-3e3443707176 tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] CPUs limit not specified, defaulting to unlimited test /opt/stack/new/nova/nova/compute/resources/vcpu.py:56 2015-08-07 17:26:21.033 INFO nova.compute.claims [req-da7c0e47-5c03-43ed-995f-3e3443707176 tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] [instance: c5d13a83-4e8d-4d99-9630-b5219ac62190] Claim successful 2015-08-07 17:26:21.238 DEBUG nova.virt.xenapi.vm_utils [req-f6418dc0-90de-47db-819a-f5c9b2aeec01 tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] Cloned VDI OpaqueRef:518edd9b-8c33-a46f-c7a8-1b9def3c9506 from VDI OpaqueRef:f392c64e-7886-a0cc-d3a0-780674167307 _clone_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:571 2015-08-07 17:26:21.844 DEBUG oslo_concurrency.lockutils [req-da7c0e47-5c03-43ed-995f-3e3443707176 tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] Lock "compute_resources" released by "instance_claim" :: held 0.848s inner 
/usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:26:21.845 DEBUG oslo_concurrency.lockutils [req-da7c0e47-5c03-43ed-995f-3e3443707176 tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] Lock "710ecfc3-7fd5-410f-b512-206783c2241e" acquired by "_do_validation" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:26:22.013 DEBUG oslo_concurrency.lockutils [req-da7c0e47-5c03-43ed-995f-3e3443707176 tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] Lock "710ecfc3-7fd5-410f-b512-206783c2241e" released by "_do_validation" :: held 0.168s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:26:22.463 DEBUG oslo_concurrency.lockutils [req-da7c0e47-5c03-43ed-995f-3e3443707176 tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] Lock "compute_resources" acquired by "update_usage" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:26:22.570 DEBUG oslo_concurrency.lockutils [req-da7c0e47-5c03-43ed-995f-3e3443707176 tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] Lock "compute_resources" released by "update_usage" :: held 0.108s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:26:22.571 DEBUG nova.compute.utils [req-da7c0e47-5c03-43ed-995f-3e3443707176 tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] Using /dev/xvd instead of None get_next_device_name /opt/stack/new/nova/nova/compute/utils.py:163 2015-08-07 17:26:22.613 13318 DEBUG nova.compute.manager [-] [instance: c5d13a83-4e8d-4d99-9630-b5219ac62190] Allocating IP information in the background. 
_allocate_network_async /opt/stack/new/nova/nova/compute/manager.py:1513 2015-08-07 17:26:22.614 13318 DEBUG oslo_concurrency.lockutils [-] Acquired semaphore "refresh_cache-c5d13a83-4e8d-4d99-9630-b5219ac62190" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:197 2015-08-07 17:26:22.734 DEBUG oslo_concurrency.processutils [req-f09963e6-bcc8-4833-9f5f-a22ce128157f tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmpQBagd_/configdrive of=/dev/xvdc oflag=direct,sync" returned: 0 in 13.614s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:26:22.738 DEBUG oslo_concurrency.processutils [req-f09963e6-bcc8-4833-9f5f-a22ce128157f tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:26:22.968 DEBUG oslo_concurrency.lockutils [req-f6418dc0-90de-47db-819a-f5c9b2aeec01 tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" released by "_create_cached_image_impl" :: held 5.185s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:26:23.022 INFO nova.virt.xenapi.vm_utils [req-f6418dc0-90de-47db-819a-f5c9b2aeec01 tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] Image creation data, cacheable: True, downloaded: False duration: 5.29 secs for image a69d8d55-1745-492f-be26-cc75d64fc94d 2015-08-07 17:26:23.449 DEBUG oslo_concurrency.processutils [req-f09963e6-bcc8-4833-9f5f-a22ce128157f tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf sync" returned: 0 in 0.711s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:26:23.450 DEBUG nova.virt.xenapi.vm_utils [req-f09963e6-bcc8-4833-9f5f-a22ce128157f tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] Destroying VBD for VDI OpaqueRef:9ebe5046-c0e9-5212-bbcc-33fd12a08c7c ... vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2207 2015-08-07 17:26:23.451 DEBUG oslo_concurrency.lockutils [req-f09963e6-bcc8-4833-9f5f-a22ce128157f tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_unplug" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:26:23.492 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:26:23.598 WARNING nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] While synchronizing instance power states, found 4 instances in the database and 2 instances on the hypervisor. 
2015-08-07 17:26:23.601 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Triggering sync for uuid 1da4a346-da42-46e7-81c1-b0085c1ca90a _sync_power_states /opt/stack/new/nova/nova/compute/manager.py:5780 2015-08-07 17:26:23.602 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Triggering sync for uuid 18c3ad3b-8cd4-4e41-b278-93a63e82aac4 _sync_power_states /opt/stack/new/nova/nova/compute/manager.py:5780 2015-08-07 17:26:23.602 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Triggering sync for uuid 790f2093-5329-49fd-a0c7-ab1fe4c523c9 _sync_power_states /opt/stack/new/nova/nova/compute/manager.py:5780 2015-08-07 17:26:23.602 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Triggering sync for uuid c5d13a83-4e8d-4d99-9630-b5219ac62190 _sync_power_states /opt/stack/new/nova/nova/compute/manager.py:5780 2015-08-07 17:26:23.604 13318 DEBUG oslo_concurrency.lockutils [-] Lock "1da4a346-da42-46e7-81c1-b0085c1ca90a" acquired by "query_driver_power_state_and_sync" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:26:23.607 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:26:23.608 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Starting heal instance info cache _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5315 2015-08-07 17:26:23.609 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Rebuilding the list of instances to heal _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5319 2015-08-07 17:26:23.734 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: 18c3ad3b-8cd4-4e41-b278-93a63e82aac4] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5329 2015-08-07 17:26:23.859 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: 790f2093-5329-49fd-a0c7-ab1fe4c523c9] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5329 2015-08-07 17:26:23.859 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: c5d13a83-4e8d-4d99-9630-b5219ac62190] Skipping network cache update for instance because it is Building. 
_heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5329 2015-08-07 17:26:23.861 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Acquired semaphore "refresh_cache-1da4a346-da42-46e7-81c1-b0085c1ca90a" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:197 2015-08-07 17:26:23.862 DEBUG nova.objects.instance [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lazy-loading `flavor' on Instance uuid 1da4a346-da42-46e7-81c1-b0085c1ca90a obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:26:23.867 13318 DEBUG oslo_concurrency.lockutils [-] Lock "1da4a346-da42-46e7-81c1-b0085c1ca90a" released by "query_driver_power_state_and_sync" :: held 0.263s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:26:24.254 DEBUG nova.virt.xenapi.vmops [req-da7c0e47-5c03-43ed-995f-3e3443707176 tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] [instance: c5d13a83-4e8d-4d99-9630-b5219ac62190] Block device information present: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} spawn /opt/stack/new/nova/nova/virt/xenapi/vmops.py:408 2015-08-07 17:26:24.265 DEBUG nova.network.base_api [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: 1da4a346-da42-46e7-81c1-b0085c1ca90a] Updating instance_info_cache with network_info: [VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.13'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:c7:1e:2f', 'active': False, 'type': u'bridge', 'id': u'b501abb4-30aa-4235-abb8-ff79d2316ad7', 'qbg_params': None})] update_instance_cache_with_nw_info /opt/stack/new/nova/nova/network/base_api.py:43 2015-08-07 17:26:24.299 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Releasing semaphore "refresh_cache-1da4a346-da42-46e7-81c1-b0085c1ca90a" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:210 2015-08-07 17:26:24.299 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: 1da4a346-da42-46e7-81c1-b0085c1ca90a] Updated the network info_cache for instance _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5376 2015-08-07 17:26:24.301 DEBUG nova.virt.xenapi.vm_utils [req-da7c0e47-5c03-43ed-995f-3e3443707176 tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] Detected vhd format for image a69d8d55-1745-492f-be26-cc75d64fc94d determine_disk_image_type /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1633 2015-08-07 17:26:24.301 DEBUG nova.virt.xenapi.vmops 
[req-da7c0e47-5c03-43ed-995f-3e3443707176 tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] [instance: c5d13a83-4e8d-4d99-9630-b5219ac62190] Updating progress to 10 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:26:24.328 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 4.02 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:26:25.060 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.68 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:26:25.376 DEBUG nova.virt.xenapi.vmops [req-f6418dc0-90de-47db-819a-f5c9b2aeec01 tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] [instance: 790f2093-5329-49fd-a0c7-ab1fe4c523c9] Updating progress to 20 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:26:25.403 DEBUG nova.virt.xenapi.vmops [req-da7c0e47-5c03-43ed-995f-3e3443707176 tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] [instance: c5d13a83-4e8d-4d99-9630-b5219ac62190] block device info: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} _connect_cinder_volumes /opt/stack/new/nova/nova/virt/xenapi/vmops.py:376 2015-08-07 17:26:25.419 DEBUG oslo_concurrency.lockutils [req-da7c0e47-5c03-43ed-995f-3e3443707176 tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" acquired by "_create_cached_image_impl" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:26:25.990 DEBUG nova.virt.xenapi.vmops [req-f6418dc0-90de-47db-819a-f5c9b2aeec01 tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] [instance: 790f2093-5329-49fd-a0c7-ab1fe4c523c9] Updating progress to 30 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:26:26.287 DEBUG nova.virt.xenapi.vmops [req-f6418dc0-90de-47db-819a-f5c9b2aeec01 tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] [instance: 790f2093-5329-49fd-a0c7-ab1fe4c523c9] Using PV kernel: True _create_vm_record /opt/stack/new/nova/nova/virt/xenapi/vmops.py:679 2015-08-07 17:26:26.317 DEBUG nova.virt.xenapi.vm_utils [req-f6418dc0-90de-47db-819a-f5c9b2aeec01 tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] [instance: 790f2093-5329-49fd-a0c7-ab1fe4c523c9] Created VM create_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:312 2015-08-07 17:26:26.318 DEBUG nova.virt.xenapi.vmops [req-f6418dc0-90de-47db-819a-f5c9b2aeec01 tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] [instance: 790f2093-5329-49fd-a0c7-ab1fe4c523c9] Updating progress to 40 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:26:26.773 DEBUG nova.virt.xenapi.vmops [req-f6418dc0-90de-47db-819a-f5c9b2aeec01 tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] [instance: 790f2093-5329-49fd-a0c7-ab1fe4c523c9] Auto configuring disk, attempting to resize root disk... 
_attach_disks /opt/stack/new/nova/nova/virt/xenapi/vmops.py:721 2015-08-07 17:26:26.775 DEBUG nova.virt.xenapi.vm_utils [req-f6418dc0-90de-47db-819a-f5c9b2aeec01 tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] Skipping auto_config_disk as destination size is 0GB _auto_configure_disk /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:960 2015-08-07 17:26:26.775 DEBUG nova.virt.xenapi.vm_utils [req-f6418dc0-90de-47db-819a-f5c9b2aeec01 tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] Creating disk-type VBD for VM OpaqueRef:7a28be96-dd76-dfd3-9025-4faeca699d6b, VDI OpaqueRef:518edd9b-8c33-a46f-c7a8-1b9def3c9506 ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:26:26.798 DEBUG nova.virt.xenapi.vm_utils [req-f6418dc0-90de-47db-819a-f5c9b2aeec01 tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] Created VBD OpaqueRef:735e8e36-294a-f63b-dbbb-30eb35813cca for VM OpaqueRef:7a28be96-dd76-dfd3-9025-4faeca699d6b, VDI OpaqueRef:518edd9b-8c33-a46f-c7a8-1b9def3c9506. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:26:27.116 DEBUG oslo_concurrency.lockutils [req-f09963e6-bcc8-4833-9f5f-a22ce128157f tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_unplug" :: held 3.665s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:26:27.125 DEBUG nova.virt.xenapi.vm_utils [req-f09963e6-bcc8-4833-9f5f-a22ce128157f tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] Destroying VBD for VDI OpaqueRef:9ebe5046-c0e9-5212-bbcc-33fd12a08c7c done. vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2215 2015-08-07 17:26:27.126 DEBUG nova.virt.xenapi.vm_utils [req-f09963e6-bcc8-4833-9f5f-a22ce128157f tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] Creating disk-type VBD for VM OpaqueRef:65080da7-3267-c7b6-379d-ef255c1a4bf9, VDI OpaqueRef:9ebe5046-c0e9-5212-bbcc-33fd12a08c7c ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:26:27.146 DEBUG nova.virt.xenapi.vm_utils [req-f09963e6-bcc8-4833-9f5f-a22ce128157f tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] Created VBD OpaqueRef:06991580-0100-d22a-e238-2cf7ed5f91b8 for VM OpaqueRef:65080da7-3267-c7b6-379d-ef255c1a4bf9, VDI OpaqueRef:9ebe5046-c0e9-5212-bbcc-33fd12a08c7c. 
create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:26:27.147 DEBUG nova.objects.instance [req-f09963e6-bcc8-4833-9f5f-a22ce128157f tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] Lazy-loading `pci_devices' on Instance uuid 18c3ad3b-8cd4-4e41-b278-93a63e82aac4 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:26:27.260 DEBUG nova.virt.xenapi.vmops [req-f09963e6-bcc8-4833-9f5f-a22ce128157f tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] [instance: 18c3ad3b-8cd4-4e41-b278-93a63e82aac4] Updating progress to 50 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:26:27.569 DEBUG nova.virt.xenapi.vm_utils [req-da7c0e47-5c03-43ed-995f-3e3443707176 tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] Cloned VDI OpaqueRef:eb55e87c-dffe-9f88-516f-9dec2ffb8cb5 from VDI OpaqueRef:f392c64e-7886-a0cc-d3a0-780674167307 _clone_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:571 2015-08-07 17:26:27.594 DEBUG oslo_concurrency.lockutils [req-f09963e6-bcc8-4833-9f5f-a22ce128157f tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] Lock "xenstore-18c3ad3b-8cd4-4e41-b278-93a63e82aac4" acquired by "store_meta" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:26:27.595 DEBUG oslo_concurrency.lockutils [req-f09963e6-bcc8-4833-9f5f-a22ce128157f tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] Lock "xenstore-18c3ad3b-8cd4-4e41-b278-93a63e82aac4" released by "store_meta" :: held 0.001s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:26:27.596 DEBUG oslo_concurrency.lockutils [req-f09963e6-bcc8-4833-9f5f-a22ce128157f tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] Lock "xenstore-18c3ad3b-8cd4-4e41-b278-93a63e82aac4" acquired by "store_auto_disk_config" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:26:27.635 DEBUG oslo_concurrency.lockutils [req-f09963e6-bcc8-4833-9f5f-a22ce128157f tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] Lock "xenstore-18c3ad3b-8cd4-4e41-b278-93a63e82aac4" released by "store_auto_disk_config" :: held 0.039s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:26:27.635 DEBUG nova.virt.xenapi.vmops [req-f09963e6-bcc8-4833-9f5f-a22ce128157f tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] [instance: 18c3ad3b-8cd4-4e41-b278-93a63e82aac4] Injecting hostname (tempest.common.compute-instance-1496865155) into xenstore _inject_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1940 2015-08-07 17:26:27.636 DEBUG oslo_concurrency.lockutils [req-f09963e6-bcc8-4833-9f5f-a22ce128157f tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] Lock "xenstore-18c3ad3b-8cd4-4e41-b278-93a63e82aac4" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:26:27.644 DEBUG oslo_concurrency.lockutils [req-f09963e6-bcc8-4833-9f5f-a22ce128157f tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] Lock "xenstore-18c3ad3b-8cd4-4e41-b278-93a63e82aac4" released by "update_hostname" :: held 0.008s inner 
/usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:26:27.645 DEBUG nova.virt.xenapi.vmops [req-f09963e6-bcc8-4833-9f5f-a22ce128157f tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] [instance: 18c3ad3b-8cd4-4e41-b278-93a63e82aac4] Injecting network info to xenstore inject_network_info /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1869 2015-08-07 17:26:27.645 DEBUG oslo_concurrency.lockutils [req-f09963e6-bcc8-4833-9f5f-a22ce128157f tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] Lock "xenstore-18c3ad3b-8cd4-4e41-b278-93a63e82aac4" acquired by "update_nwinfo" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:26:27.995 13318 DEBUG nova.network.base_api [-] [instance: c5d13a83-4e8d-4d99-9630-b5219ac62190] Updating instance_info_cache with network_info: [VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.11'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:80:1d:c6', 'active': False, 'type': u'bridge', 'id': u'4d9d612b-2736-4a4b-98ce-bfc72683a948', 'qbg_params': None})] update_instance_cache_with_nw_info /opt/stack/new/nova/nova/network/base_api.py:43 2015-08-07 17:26:28.019 DEBUG oslo_concurrency.lockutils [req-f09963e6-bcc8-4833-9f5f-a22ce128157f tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] Lock "xenstore-18c3ad3b-8cd4-4e41-b278-93a63e82aac4" released by "update_nwinfo" :: held 0.373s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:26:28.020 DEBUG nova.virt.xenapi.vmops [req-f09963e6-bcc8-4833-9f5f-a22ce128157f tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] [instance: 18c3ad3b-8cd4-4e41-b278-93a63e82aac4] Updating progress to 60 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:26:28.028 13318 DEBUG oslo_concurrency.lockutils [-] Releasing semaphore "refresh_cache-c5d13a83-4e8d-4d99-9630-b5219ac62190" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:210 2015-08-07 17:26:28.029 13318 DEBUG nova.compute.manager [-] [instance: c5d13a83-4e8d-4d99-9630-b5219ac62190] Instance network_info: |[VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.11'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': 
[], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:80:1d:c6', 'active': False, 'type': u'bridge', 'id': u'4d9d612b-2736-4a4b-98ce-bfc72683a948', 'qbg_params': None})]| _allocate_network_async /opt/stack/new/nova/nova/compute/manager.py:1531 2015-08-07 17:26:28.355 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:26:28.356 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:26:28.357 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 3.15 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:26:28.470 DEBUG nova.virt.xenapi.vmops [req-f09963e6-bcc8-4833-9f5f-a22ce128157f tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] [instance: 18c3ad3b-8cd4-4e41-b278-93a63e82aac4] Creating vifs _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1892 2015-08-07 17:26:28.480 DEBUG nova.virt.xenapi.vmops [req-f09963e6-bcc8-4833-9f5f-a22ce128157f tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] [instance: 18c3ad3b-8cd4-4e41-b278-93a63e82aac4] Creating VIF for network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1902 2015-08-07 17:26:28.490 DEBUG nova.virt.xenapi.vmops [req-f09963e6-bcc8-4833-9f5f-a22ce128157f tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] [instance: 18c3ad3b-8cd4-4e41-b278-93a63e82aac4] Created VIF OpaqueRef:0ff4bedd-9dfb-df06-318a-0d54d96e2c50, network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1906 2015-08-07 17:26:28.491 DEBUG nova.virt.xenapi.vmops [req-f09963e6-bcc8-4833-9f5f-a22ce128157f tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] [instance: 18c3ad3b-8cd4-4e41-b278-93a63e82aac4] Updating progress to 70 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:26:28.765 DEBUG oslo_concurrency.lockutils [req-da7c0e47-5c03-43ed-995f-3e3443707176 tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" released by "_create_cached_image_impl" :: held 3.345s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:26:28.766 INFO nova.virt.xenapi.vm_utils [req-da7c0e47-5c03-43ed-995f-3e3443707176 tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] Image creation data, 
cacheable: True, downloaded: False duration: 3.36 secs for image a69d8d55-1745-492f-be26-cc75d64fc94d 2015-08-07 17:26:29.031 DEBUG nova.virt.xenapi.vmops [req-f09963e6-bcc8-4833-9f5f-a22ce128157f tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] [instance: 18c3ad3b-8cd4-4e41-b278-93a63e82aac4] Starting instance _start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:327 2015-08-07 17:26:29.805 DEBUG nova.virt.xenapi.vmops [req-da7c0e47-5c03-43ed-995f-3e3443707176 tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] [instance: c5d13a83-4e8d-4d99-9630-b5219ac62190] Updating progress to 20 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:26:30.152 DEBUG nova.virt.xenapi.vmops [req-da7c0e47-5c03-43ed-995f-3e3443707176 tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] [instance: c5d13a83-4e8d-4d99-9630-b5219ac62190] Updating progress to 30 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:26:30.617 DEBUG nova.virt.xenapi.vmops [req-da7c0e47-5c03-43ed-995f-3e3443707176 tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] [instance: c5d13a83-4e8d-4d99-9630-b5219ac62190] Using PV kernel: True _create_vm_record /opt/stack/new/nova/nova/virt/xenapi/vmops.py:679 2015-08-07 17:26:30.663 DEBUG nova.virt.xenapi.vm_utils [req-da7c0e47-5c03-43ed-995f-3e3443707176 tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] [instance: c5d13a83-4e8d-4d99-9630-b5219ac62190] Created VM create_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:312 2015-08-07 17:26:30.670 DEBUG nova.virt.xenapi.vmops [req-da7c0e47-5c03-43ed-995f-3e3443707176 tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] [instance: c5d13a83-4e8d-4d99-9630-b5219ac62190] Updating progress to 40 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:26:30.981 DEBUG nova.virt.xenapi.vm_utils [req-da7c0e47-5c03-43ed-995f-3e3443707176 tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] Creating disk-type VBD for VM OpaqueRef:f4bd33b2-17c4-7e44-2971-b980ce16edb0, VDI OpaqueRef:eb55e87c-dffe-9f88-516f-9dec2ffb8cb5 ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:26:30.994 DEBUG nova.virt.xenapi.vm_utils [req-da7c0e47-5c03-43ed-995f-3e3443707176 tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] Created VBD OpaqueRef:8abc1ced-edb6-e611-0a09-95b4ddc2bd82 for VM OpaqueRef:f4bd33b2-17c4-7e44-2971-b980ce16edb0, VDI OpaqueRef:eb55e87c-dffe-9f88-516f-9dec2ffb8cb5. 
create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:26:31.511 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:26:31.511 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 1.01 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:26:31.583 DEBUG nova.virt.xenapi.vm_utils [req-da7c0e47-5c03-43ed-995f-3e3443707176 tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] Created VDI OpaqueRef:84b5d776-d995-36ac-f46d-900416afb6fb (config-2, 67108864, False) on OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99. create_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:509 2015-08-07 17:26:31.587 DEBUG nova.virt.xenapi.vm_utils [req-da7c0e47-5c03-43ed-995f-3e3443707176 tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] Creating disk-type VBD for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:84b5d776-d995-36ac-f46d-900416afb6fb ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:26:31.606 DEBUG nova.virt.xenapi.vm_utils [req-da7c0e47-5c03-43ed-995f-3e3443707176 tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] Created VBD OpaqueRef:55847293-5b45-8cd8-c149-7db4092556a4 for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:84b5d776-d995-36ac-f46d-900416afb6fb. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:26:31.607 DEBUG nova.virt.xenapi.vm_utils [req-da7c0e47-5c03-43ed-995f-3e3443707176 tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] Plugging VBD OpaqueRef:55847293-5b45-8cd8-c149-7db4092556a4 ... vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2191 2015-08-07 17:26:31.607 DEBUG oslo_concurrency.lockutils [req-da7c0e47-5c03-43ed-995f-3e3443707176 tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_plug" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:26:32.519 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:26:32.520 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 2.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:26:34.520 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:26:34.520 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /opt/stack/new/nova/nova/compute/manager.py:5976 2015-08-07 17:26:34.521 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 1.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:26:34.882 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.86 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:26:35.505 DEBUG oslo_concurrency.lockutils [req-da7c0e47-5c03-43ed-995f-3e3443707176 tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_plug" :: held 3.897s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:26:35.505 DEBUG nova.virt.xenapi.vm_utils [req-da7c0e47-5c03-43ed-995f-3e3443707176 tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] Plugging VBD OpaqueRef:55847293-5b45-8cd8-c149-7db4092556a4 done. vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2194 2015-08-07 17:26:35.514 DEBUG nova.virt.xenapi.vm_utils [req-da7c0e47-5c03-43ed-995f-3e3443707176 tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] VBD OpaqueRef:55847293-5b45-8cd8-c149-7db4092556a4 plugged as xvdc vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2197 2015-08-07 17:26:35.523 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:26:35.553 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Auditing locally available compute resources for node localhost.localdomain 2015-08-07 17:26:35.554 DEBUG nova.virt.xenapi.host [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Updating host stats update_status /opt/stack/new/nova/nova/virt/xenapi/host.py:234 2015-08-07 17:26:35.612 WARNING nova.virt.configdrive [req-da7c0e47-5c03-43ed-995f-3e3443707176 tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] The setting "always" will be deprecated in the Liberty version. 
Please use "True" instead 2015-08-07 17:26:35.615 DEBUG nova.objects.instance [req-da7c0e47-5c03-43ed-995f-3e3443707176 tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] Lazy-loading `ec2_ids' on Instance uuid c5d13a83-4e8d-4d99-9630-b5219ac62190 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:26:35.675 DEBUG oslo_concurrency.processutils [req-da7c0e47-5c03-43ed-995f-3e3443707176 tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] Running cmd (subprocess): genisoimage -o /tmp/tmpolQpBR/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmpjFuRdm execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:26:35.976 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" acquired by "do_scan" :: waited 0.001s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:26:35.978 DEBUG nova.virt.xenapi.vm_utils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Scanning SR OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99 do_scan /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1839 2015-08-07 17:26:35.999 DEBUG oslo_concurrency.processutils [req-da7c0e47-5c03-43ed-995f-3e3443707176 tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] CMD "genisoimage -o /tmp/tmpolQpBR/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmpjFuRdm" returned: 0 in 0.324s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:26:36.004 DEBUG oslo_concurrency.processutils [req-da7c0e47-5c03-43ed-995f-3e3443707176 tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmpolQpBR/configdrive of=/dev/xvdc oflag=direct,sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:26:36.725 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" released by "do_scan" :: held 0.749s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:26:37.054 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor: free VCPUs: 1 _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:524 2015-08-07 17:26:37.055 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor: assignable PCI devices: [] _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:532 2015-08-07 17:26:37.055 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor/Node resource view: name=localhost.localdomain free_ram=855MB free_disk=16GB free_vcpus=1 pci_devices=[] _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:546 2015-08-07 17:26:37.056 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "compute_resources" acquired by "_update_available_resource" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:26:37.437 INFO 
nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Total usable vcpus: 8, total allocated vcpus: 7 2015-08-07 17:26:37.438 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Final resource view: name=localhost.localdomain phys_ram=8187MB used_ram=788MB phys_disk=27GB used_disk=0GB total_vcpus=8 used_vcpus=7 pci_stats=None 2015-08-07 17:26:37.568 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Compute_service record updated for devstack:localhost.localdomain 2015-08-07 17:26:37.569 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "compute_resources" released by "_update_available_resource" :: held 0.513s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:26:37.570 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:26:37.571 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 7.99 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:26:42.700 DEBUG nova.virt.xenapi.vmops [req-f09963e6-bcc8-4833-9f5f-a22ce128157f tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] [instance: 18c3ad3b-8cd4-4e41-b278-93a63e82aac4] Waiting for instance state to become running _wait_for_instance_to_start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:777 2015-08-07 17:26:42.852 DEBUG nova.virt.xenapi.vmops [req-f09963e6-bcc8-4833-9f5f-a22ce128157f tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] [instance: 18c3ad3b-8cd4-4e41-b278-93a63e82aac4] Updating progress to 80 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:26:43.346 DEBUG nova.virt.xenapi.vmops [req-f09963e6-bcc8-4833-9f5f-a22ce128157f tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] [instance: 18c3ad3b-8cd4-4e41-b278-93a63e82aac4] Skip agent setup, not enabled. 
_configure_new_instance_with_agent /opt/stack/new/nova/nova/virt/xenapi/vmops.py:788 2015-08-07 17:26:43.346 DEBUG nova.virt.xenapi.vmops [req-f09963e6-bcc8-4833-9f5f-a22ce128157f tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] [instance: 18c3ad3b-8cd4-4e41-b278-93a63e82aac4] Removing hostname from xenstore _remove_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1949 2015-08-07 17:26:43.347 DEBUG oslo_concurrency.lockutils [req-f09963e6-bcc8-4833-9f5f-a22ce128157f tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] Lock "xenstore-18c3ad3b-8cd4-4e41-b278-93a63e82aac4" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:26:43.353 DEBUG oslo_concurrency.lockutils [req-f09963e6-bcc8-4833-9f5f-a22ce128157f tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] Lock "xenstore-18c3ad3b-8cd4-4e41-b278-93a63e82aac4" released by "update_hostname" :: held 0.006s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:26:43.354 DEBUG nova.virt.xenapi.vmops [req-f09963e6-bcc8-4833-9f5f-a22ce128157f tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] [instance: 18c3ad3b-8cd4-4e41-b278-93a63e82aac4] Updating progress to 90 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:26:43.510 DEBUG nova.virt.xenapi.vm_utils [req-f6418dc0-90de-47db-819a-f5c9b2aeec01 tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] Created VDI OpaqueRef:e55a84b6-e0be-4073-a2bf-ee7f10700667 (config-2, 67108864, False) on OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99. create_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:509 2015-08-07 17:26:43.515 DEBUG nova.virt.xenapi.vm_utils [req-f6418dc0-90de-47db-819a-f5c9b2aeec01 tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] Creating disk-type VBD for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:e55a84b6-e0be-4073-a2bf-ee7f10700667 ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:26:43.529 DEBUG nova.virt.xenapi.vm_utils [req-f6418dc0-90de-47db-819a-f5c9b2aeec01 tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] Created VBD OpaqueRef:83d1e849-d49c-172b-49f7-c48cdf3af78d for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:e55a84b6-e0be-4073-a2bf-ee7f10700667. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:26:43.530 DEBUG nova.virt.xenapi.vm_utils [req-f6418dc0-90de-47db-819a-f5c9b2aeec01 tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] Plugging VBD OpaqueRef:83d1e849-d49c-172b-49f7-c48cdf3af78d ... 
vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2191 2015-08-07 17:26:43.530 DEBUG oslo_concurrency.lockutils [req-f6418dc0-90de-47db-819a-f5c9b2aeec01 tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_plug" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:26:43.756 DEBUG nova.virt.xenapi.vmops [req-f09963e6-bcc8-4833-9f5f-a22ce128157f tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] [instance: 18c3ad3b-8cd4-4e41-b278-93a63e82aac4] Updating progress to 100 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:26:44.226 DEBUG nova.compute.manager [req-f09963e6-bcc8-4833-9f5f-a22ce128157f tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] [instance: 18c3ad3b-8cd4-4e41-b278-93a63e82aac4] Checking state _get_power_state /opt/stack/new/nova/nova/compute/manager.py:1296 2015-08-07 17:26:44.788 DEBUG oslo_concurrency.lockutils [req-f09963e6-bcc8-4833-9f5f-a22ce128157f tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] Lock "18c3ad3b-8cd4-4e41-b278-93a63e82aac4" released by "_locked_do_build_and_run_instance" :: held 55.709s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:26:44.790 13318 DEBUG oslo_concurrency.lockutils [-] Lock "18c3ad3b-8cd4-4e41-b278-93a63e82aac4" acquired by "query_driver_power_state_and_sync" :: waited 21.184s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:26:44.790 13318 INFO nova.compute.manager [-] [instance: 18c3ad3b-8cd4-4e41-b278-93a63e82aac4] During sync_power_state the instance has a pending task (spawning). Skip. 
2015-08-07 17:26:44.790 13318 DEBUG oslo_concurrency.lockutils [-] Lock "18c3ad3b-8cd4-4e41-b278-93a63e82aac4" released by "query_driver_power_state_and_sync" :: held 0.001s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:26:44.823 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.92 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:26:45.566 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:26:45.567 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 37.95 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:26:46.238 DEBUG oslo_concurrency.lockutils [req-276c17a3-08d3-4598-9069-f23a424011a9 tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] Lock "18c3ad3b-8cd4-4e41-b278-93a63e82aac4" acquired by "do_stop_instance" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:26:46.239 DEBUG nova.compute.manager [req-276c17a3-08d3-4598-9069-f23a424011a9 tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] [instance: 18c3ad3b-8cd4-4e41-b278-93a63e82aac4] Checking state _get_power_state /opt/stack/new/nova/nova/compute/manager.py:1296 2015-08-07 17:26:46.244 DEBUG oslo_concurrency.lockutils [req-f6418dc0-90de-47db-819a-f5c9b2aeec01 tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_plug" :: held 2.713s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:26:46.244 DEBUG nova.virt.xenapi.vm_utils [req-f6418dc0-90de-47db-819a-f5c9b2aeec01 tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] Plugging VBD OpaqueRef:83d1e849-d49c-172b-49f7-c48cdf3af78d done. 
vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2194 2015-08-07 17:26:46.249 DEBUG nova.virt.xenapi.vm_utils [req-f6418dc0-90de-47db-819a-f5c9b2aeec01 tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] VBD OpaqueRef:83d1e849-d49c-172b-49f7-c48cdf3af78d plugged as xvdd vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2197 2015-08-07 17:26:46.271 DEBUG nova.compute.manager [req-276c17a3-08d3-4598-9069-f23a424011a9 tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] [instance: 18c3ad3b-8cd4-4e41-b278-93a63e82aac4] Stopping instance; current vm_state: active, current task_state: powering-off, current DB power_state: 1, current VM power_state: 1 do_stop_instance /opt/stack/new/nova/nova/compute/manager.py:2404 2015-08-07 17:26:46.284 DEBUG nova.virt.xenapi.vm_utils [req-276c17a3-08d3-4598-9069-f23a424011a9 tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] [instance: 18c3ad3b-8cd4-4e41-b278-93a63e82aac4] Shutting down VM (hard) hard_shutdown_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:348 2015-08-07 17:26:46.342 WARNING nova.virt.configdrive [req-f6418dc0-90de-47db-819a-f5c9b2aeec01 tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] The setting "always" will be deprecated in the Liberty version. Please use "True" instead 2015-08-07 17:26:46.342 DEBUG nova.objects.instance [req-f6418dc0-90de-47db-819a-f5c9b2aeec01 tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] Lazy-loading `ec2_ids' on Instance uuid 790f2093-5329-49fd-a0c7-ab1fe4c523c9 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:26:46.403 DEBUG oslo_concurrency.processutils [req-f6418dc0-90de-47db-819a-f5c9b2aeec01 tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] Running cmd (subprocess): genisoimage -o /tmp/tmp2u6x01/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmpgmuUt6 execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:26:46.660 DEBUG oslo_concurrency.processutils [req-f6418dc0-90de-47db-819a-f5c9b2aeec01 tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] CMD "genisoimage -o /tmp/tmp2u6x01/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmpgmuUt6" returned: 0 in 0.256s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:26:46.666 DEBUG oslo_concurrency.processutils [req-f6418dc0-90de-47db-819a-f5c9b2aeec01 tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmp2u6x01/configdrive of=/dev/xvdd oflag=direct,sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:26:47.844 DEBUG oslo_concurrency.processutils [req-da7c0e47-5c03-43ed-995f-3e3443707176 tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmpolQpBR/configdrive of=/dev/xvdc oflag=direct,sync" returned: 0 in 11.840s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:26:47.849 DEBUG oslo_concurrency.processutils [req-da7c0e47-5c03-43ed-995f-3e3443707176 tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] 
Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:26:48.841 DEBUG oslo_concurrency.processutils [req-da7c0e47-5c03-43ed-995f-3e3443707176 tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf sync" returned: 0 in 0.992s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:26:48.846 DEBUG nova.virt.xenapi.vm_utils [req-da7c0e47-5c03-43ed-995f-3e3443707176 tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] Destroying VBD for VDI OpaqueRef:84b5d776-d995-36ac-f46d-900416afb6fb ... vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2207 2015-08-07 17:26:48.847 DEBUG oslo_concurrency.lockutils [req-da7c0e47-5c03-43ed-995f-3e3443707176 tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_unplug" :: waited 0.001s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:26:50.686 DEBUG oslo_concurrency.lockutils [req-da7c0e47-5c03-43ed-995f-3e3443707176 tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_unplug" :: held 1.839s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:26:50.697 DEBUG nova.virt.xenapi.vm_utils [req-da7c0e47-5c03-43ed-995f-3e3443707176 tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] Destroying VBD for VDI OpaqueRef:84b5d776-d995-36ac-f46d-900416afb6fb done. vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2215 2015-08-07 17:26:50.698 DEBUG nova.virt.xenapi.vm_utils [req-da7c0e47-5c03-43ed-995f-3e3443707176 tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] Creating disk-type VBD for VM OpaqueRef:f4bd33b2-17c4-7e44-2971-b980ce16edb0, VDI OpaqueRef:84b5d776-d995-36ac-f46d-900416afb6fb ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:26:50.710 DEBUG nova.virt.xenapi.vm_utils [req-da7c0e47-5c03-43ed-995f-3e3443707176 tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] Created VBD OpaqueRef:9185fa44-e854-b306-5baa-8751fe237235 for VM OpaqueRef:f4bd33b2-17c4-7e44-2971-b980ce16edb0, VDI OpaqueRef:84b5d776-d995-36ac-f46d-900416afb6fb. 
create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:26:50.711 DEBUG nova.objects.instance [req-da7c0e47-5c03-43ed-995f-3e3443707176 tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] Lazy-loading `pci_devices' on Instance uuid c5d13a83-4e8d-4d99-9630-b5219ac62190 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:26:50.881 DEBUG nova.virt.xenapi.vmops [req-da7c0e47-5c03-43ed-995f-3e3443707176 tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] [instance: c5d13a83-4e8d-4d99-9630-b5219ac62190] Updating progress to 50 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:26:51.424 DEBUG oslo_concurrency.lockutils [req-da7c0e47-5c03-43ed-995f-3e3443707176 tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] Lock "xenstore-c5d13a83-4e8d-4d99-9630-b5219ac62190" acquired by "store_meta" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:26:51.425 DEBUG oslo_concurrency.lockutils [req-da7c0e47-5c03-43ed-995f-3e3443707176 tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] Lock "xenstore-c5d13a83-4e8d-4d99-9630-b5219ac62190" released by "store_meta" :: held 0.001s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:26:51.425 DEBUG oslo_concurrency.lockutils [req-da7c0e47-5c03-43ed-995f-3e3443707176 tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] Lock "xenstore-c5d13a83-4e8d-4d99-9630-b5219ac62190" acquired by "store_auto_disk_config" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:26:51.440 DEBUG oslo_concurrency.lockutils [req-da7c0e47-5c03-43ed-995f-3e3443707176 tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] Lock "xenstore-c5d13a83-4e8d-4d99-9630-b5219ac62190" released by "store_auto_disk_config" :: held 0.014s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:26:51.441 DEBUG nova.virt.xenapi.vmops [req-da7c0e47-5c03-43ed-995f-3e3443707176 tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] [instance: c5d13a83-4e8d-4d99-9630-b5219ac62190] Injecting hostname (tempest.common.compute-instance-1210849909) into xenstore _inject_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1940 2015-08-07 17:26:51.441 DEBUG oslo_concurrency.lockutils [req-da7c0e47-5c03-43ed-995f-3e3443707176 tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] Lock "xenstore-c5d13a83-4e8d-4d99-9630-b5219ac62190" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:26:51.463 DEBUG oslo_concurrency.lockutils [req-da7c0e47-5c03-43ed-995f-3e3443707176 tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] Lock "xenstore-c5d13a83-4e8d-4d99-9630-b5219ac62190" released by "update_hostname" :: held 0.022s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:26:51.464 DEBUG nova.virt.xenapi.vmops [req-da7c0e47-5c03-43ed-995f-3e3443707176 tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] [instance: c5d13a83-4e8d-4d99-9630-b5219ac62190] Injecting network info to xenstore inject_network_info 
/opt/stack/new/nova/nova/virt/xenapi/vmops.py:1869 2015-08-07 17:26:51.465 DEBUG oslo_concurrency.lockutils [req-da7c0e47-5c03-43ed-995f-3e3443707176 tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] Lock "xenstore-c5d13a83-4e8d-4d99-9630-b5219ac62190" acquired by "update_nwinfo" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:26:51.748 DEBUG oslo_concurrency.lockutils [req-da7c0e47-5c03-43ed-995f-3e3443707176 tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] Lock "xenstore-c5d13a83-4e8d-4d99-9630-b5219ac62190" released by "update_nwinfo" :: held 0.283s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:26:51.749 DEBUG nova.virt.xenapi.vmops [req-da7c0e47-5c03-43ed-995f-3e3443707176 tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] [instance: c5d13a83-4e8d-4d99-9630-b5219ac62190] Updating progress to 60 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:26:52.476 DEBUG nova.virt.xenapi.vmops [req-da7c0e47-5c03-43ed-995f-3e3443707176 tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] [instance: c5d13a83-4e8d-4d99-9630-b5219ac62190] Creating vifs _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1892 2015-08-07 17:26:52.494 DEBUG nova.virt.xenapi.vmops [req-da7c0e47-5c03-43ed-995f-3e3443707176 tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] [instance: c5d13a83-4e8d-4d99-9630-b5219ac62190] Creating VIF for network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1902 2015-08-07 17:26:52.504 DEBUG nova.virt.xenapi.vmops [req-da7c0e47-5c03-43ed-995f-3e3443707176 tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] [instance: c5d13a83-4e8d-4d99-9630-b5219ac62190] Created VIF OpaqueRef:bcc20b73-e99d-a998-cf54-16bafe35ab47, network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1906 2015-08-07 17:26:52.505 DEBUG nova.virt.xenapi.vmops [req-da7c0e47-5c03-43ed-995f-3e3443707176 tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] [instance: c5d13a83-4e8d-4d99-9630-b5219ac62190] Updating progress to 70 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:26:52.667 DEBUG nova.compute.manager [req-276c17a3-08d3-4598-9069-f23a424011a9 tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] [instance: 18c3ad3b-8cd4-4e41-b278-93a63e82aac4] Checking state _get_power_state /opt/stack/new/nova/nova/compute/manager.py:1296 2015-08-07 17:26:52.917 DEBUG nova.virt.xenapi.vmops [req-da7c0e47-5c03-43ed-995f-3e3443707176 tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] [instance: c5d13a83-4e8d-4d99-9630-b5219ac62190] Starting instance _start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:327 2015-08-07 17:26:53.235 DEBUG oslo_concurrency.lockutils [req-276c17a3-08d3-4598-9069-f23a424011a9 tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] Lock "18c3ad3b-8cd4-4e41-b278-93a63e82aac4" released by "do_stop_instance" :: held 6.997s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:26:54.486 DEBUG nova.compute.manager [req-4d5bc629-8a54-4cf7-9a3d-f9947c915db2 
tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] [instance: 18c3ad3b-8cd4-4e41-b278-93a63e82aac4] Checking state _get_power_state /opt/stack/new/nova/nova/compute/manager.py:1296 2015-08-07 17:26:54.757 INFO nova.compute.manager [req-4d5bc629-8a54-4cf7-9a3d-f9947c915db2 tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] [instance: 18c3ad3b-8cd4-4e41-b278-93a63e82aac4] instance snapshotting 2015-08-07 17:26:54.757 WARNING nova.compute.manager [req-4d5bc629-8a54-4cf7-9a3d-f9947c915db2 tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] [instance: 18c3ad3b-8cd4-4e41-b278-93a63e82aac4] trying to snapshot a non-running instance: (state: 4 expected: 1) 2015-08-07 17:26:54.774 DEBUG nova.virt.xenapi.vm_utils [req-4d5bc629-8a54-4cf7-9a3d-f9947c915db2 tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] [instance: 18c3ad3b-8cd4-4e41-b278-93a63e82aac4] Starting snapshot for VM _snapshot_attached_here_impl /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:714 2015-08-07 17:26:54.847 DEBUG oslo_concurrency.lockutils [req-4d5bc629-8a54-4cf7-9a3d-f9947c915db2 tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" acquired by "do_scan" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:26:54.928 DEBUG nova.virt.xenapi.vm_utils [req-4d5bc629-8a54-4cf7-9a3d-f9947c915db2 tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] Scanning SR OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99 do_scan /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1839 2015-08-07 17:26:54.975 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.77 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:26:56.103 DEBUG oslo_concurrency.lockutils [req-4d5bc629-8a54-4cf7-9a3d-f9947c915db2 tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" released by "do_scan" :: held 1.256s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:26:56.117 DEBUG nova.virt.xenapi.vm_utils [req-4d5bc629-8a54-4cf7-9a3d-f9947c915db2 tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] VHD 268b7ab5-137a-43f6-a9bc-60a2e0b7331e has parent 4027f457-a9bb-499a-8844-79fc67f11377 _get_vhd_parent_uuid /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2006 2015-08-07 17:26:56.153 DEBUG nova.virt.xenapi.vm_utils [req-4d5bc629-8a54-4cf7-9a3d-f9947c915db2 tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] VHD 73185015-1e85-49cb-ad30-f5f20d9b37e0 has parent 4027f457-a9bb-499a-8844-79fc67f11377 _get_vhd_parent_uuid /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2006 2015-08-07 17:26:56.160 DEBUG nova.virt.xenapi.vm_utils [req-4d5bc629-8a54-4cf7-9a3d-f9947c915db2 tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] VHD 983a9ff2-683b-4b37-95bd-465b43748114 has parent 4027f457-a9bb-499a-8844-79fc67f11377 _get_vhd_parent_uuid /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2006 2015-08-07 17:26:56.179 DEBUG nova.virt.xenapi.vm_utils [req-4d5bc629-8a54-4cf7-9a3d-f9947c915db2 tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] 
VHD 268b7ab5-137a-43f6-a9bc-60a2e0b7331e has parent 4027f457-a9bb-499a-8844-79fc67f11377 _get_vhd_parent_uuid /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2006 2015-08-07 17:26:56.209 DEBUG nova.virt.xenapi.vm_utils [req-4d5bc629-8a54-4cf7-9a3d-f9947c915db2 tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] VHD 2851ab9b-7e9b-4c49-bb23-d31b117db817 has parent 4027f457-a9bb-499a-8844-79fc67f11377 _get_vhd_parent_uuid /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2006 2015-08-07 17:26:56.222 DEBUG nova.virt.xenapi.vm_utils [req-4d5bc629-8a54-4cf7-9a3d-f9947c915db2 tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] VHD adc68f5c-f9ed-4dfe-862f-8738fb519f84 has parent 4027f457-a9bb-499a-8844-79fc67f11377 _get_vhd_parent_uuid /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2006 2015-08-07 17:26:56.234 DEBUG nova.virt.xenapi.vm_utils [req-4d5bc629-8a54-4cf7-9a3d-f9947c915db2 tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] [instance: 18c3ad3b-8cd4-4e41-b278-93a63e82aac4] No snapshots to remove. _delete_snapshots_in_vdi_chain /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:677 2015-08-07 17:26:58.110 DEBUG oslo_concurrency.processutils [req-f6418dc0-90de-47db-819a-f5c9b2aeec01 tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmp2u6x01/configdrive of=/dev/xvdd oflag=direct,sync" returned: 0 in 11.445s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:26:58.112 DEBUG oslo_concurrency.processutils [req-f6418dc0-90de-47db-819a-f5c9b2aeec01 tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:26:59.248 DEBUG oslo_concurrency.processutils [req-f6418dc0-90de-47db-819a-f5c9b2aeec01 tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf sync" returned: 0 in 1.135s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:26:59.252 DEBUG nova.virt.xenapi.vm_utils [req-f6418dc0-90de-47db-819a-f5c9b2aeec01 tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] Destroying VBD for VDI OpaqueRef:e55a84b6-e0be-4073-a2bf-ee7f10700667 ... vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2207 2015-08-07 17:26:59.254 DEBUG oslo_concurrency.lockutils [req-f6418dc0-90de-47db-819a-f5c9b2aeec01 tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_unplug" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:27:00.065 DEBUG nova.virt.xenapi.vm_utils [req-4d5bc629-8a54-4cf7-9a3d-f9947c915db2 tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] [instance: 18c3ad3b-8cd4-4e41-b278-93a63e82aac4] Parent has other children, coalesce is unlikely. 
_wait_for_vhd_coalesce /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2092 2015-08-07 17:27:00.087 DEBUG oslo_concurrency.lockutils [req-4d5bc629-8a54-4cf7-9a3d-f9947c915db2 tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" acquired by "do_scan" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:27:00.088 DEBUG nova.virt.xenapi.vm_utils [req-4d5bc629-8a54-4cf7-9a3d-f9947c915db2 tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] Scanning SR OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99 do_scan /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1839 2015-08-07 17:27:01.223 DEBUG oslo_concurrency.lockutils [req-4d5bc629-8a54-4cf7-9a3d-f9947c915db2 tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" released by "do_scan" :: held 1.136s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:27:01.248 DEBUG nova.virt.xenapi.vm_utils [req-4d5bc629-8a54-4cf7-9a3d-f9947c915db2 tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] VHD 0a523267-43a8-42b7-bd41-e074e02049f1 has parent 62d7ae43-d5cc-40e6-b807-de033e9c58d8 _get_vhd_parent_uuid /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2006 2015-08-07 17:27:01.282 DEBUG nova.virt.xenapi.vm_utils [req-4d5bc629-8a54-4cf7-9a3d-f9947c915db2 tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] VHD 62d7ae43-d5cc-40e6-b807-de033e9c58d8 has parent 4027f457-a9bb-499a-8844-79fc67f11377 _get_vhd_parent_uuid /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2006 2015-08-07 17:27:01.319 DEBUG oslo_concurrency.lockutils [req-f6418dc0-90de-47db-819a-f5c9b2aeec01 tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_unplug" :: held 2.065s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:27:01.327 DEBUG nova.virt.xenapi.vm_utils [req-f6418dc0-90de-47db-819a-f5c9b2aeec01 tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] Destroying VBD for VDI OpaqueRef:e55a84b6-e0be-4073-a2bf-ee7f10700667 done. vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2215 2015-08-07 17:27:01.328 DEBUG nova.virt.xenapi.vm_utils [req-f6418dc0-90de-47db-819a-f5c9b2aeec01 tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] Creating disk-type VBD for VM OpaqueRef:7a28be96-dd76-dfd3-9025-4faeca699d6b, VDI OpaqueRef:e55a84b6-e0be-4073-a2bf-ee7f10700667 ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:27:01.337 DEBUG nova.virt.xenapi.vm_utils [req-f6418dc0-90de-47db-819a-f5c9b2aeec01 tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] Created VBD OpaqueRef:12d54793-6d79-570b-1f16-285a8cbdf9f5 for VM OpaqueRef:7a28be96-dd76-dfd3-9025-4faeca699d6b, VDI OpaqueRef:e55a84b6-e0be-4073-a2bf-ee7f10700667. 
create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:27:01.338 DEBUG nova.objects.instance [req-f6418dc0-90de-47db-819a-f5c9b2aeec01 tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] Lazy-loading `pci_devices' on Instance uuid 790f2093-5329-49fd-a0c7-ab1fe4c523c9 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:27:01.562 DEBUG nova.virt.xenapi.vmops [req-f6418dc0-90de-47db-819a-f5c9b2aeec01 tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] [instance: 790f2093-5329-49fd-a0c7-ab1fe4c523c9] Updating progress to 50 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:27:02.255 DEBUG nova.virt.xenapi.client.session [req-4d5bc629-8a54-4cf7-9a3d-f9947c915db2 tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] glance.upload_vhd attempt 1/1, callback_result: 192.168.33.1 call_plugin_serialized_with_retry /opt/stack/new/nova/nova/virt/xenapi/client/session.py:245 2015-08-07 17:27:02.483 DEBUG oslo_concurrency.lockutils [req-f6418dc0-90de-47db-819a-f5c9b2aeec01 tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] Lock "xenstore-790f2093-5329-49fd-a0c7-ab1fe4c523c9" acquired by "store_meta" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:27:02.496 DEBUG oslo_concurrency.lockutils [req-f6418dc0-90de-47db-819a-f5c9b2aeec01 tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] Lock "xenstore-790f2093-5329-49fd-a0c7-ab1fe4c523c9" released by "store_meta" :: held 0.012s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:27:02.497 DEBUG oslo_concurrency.lockutils [req-f6418dc0-90de-47db-819a-f5c9b2aeec01 tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] Lock "xenstore-790f2093-5329-49fd-a0c7-ab1fe4c523c9" acquired by "store_auto_disk_config" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:27:02.512 DEBUG oslo_concurrency.lockutils [req-f6418dc0-90de-47db-819a-f5c9b2aeec01 tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] Lock "xenstore-790f2093-5329-49fd-a0c7-ab1fe4c523c9" released by "store_auto_disk_config" :: held 0.015s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:27:02.514 DEBUG nova.virt.xenapi.vmops [req-f6418dc0-90de-47db-819a-f5c9b2aeec01 tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] [instance: 790f2093-5329-49fd-a0c7-ab1fe4c523c9] Injecting hostname (tempest-server-416290216) into xenstore _inject_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1940 2015-08-07 17:27:02.515 DEBUG oslo_concurrency.lockutils [req-f6418dc0-90de-47db-819a-f5c9b2aeec01 tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] Lock "xenstore-790f2093-5329-49fd-a0c7-ab1fe4c523c9" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:27:02.523 DEBUG oslo_concurrency.lockutils [req-f6418dc0-90de-47db-819a-f5c9b2aeec01 tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] Lock "xenstore-790f2093-5329-49fd-a0c7-ab1fe4c523c9" released by "update_hostname" :: held 0.009s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:27:02.524 DEBUG nova.virt.xenapi.vmops 
[req-f6418dc0-90de-47db-819a-f5c9b2aeec01 tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] [instance: 790f2093-5329-49fd-a0c7-ab1fe4c523c9] Injecting network info to xenstore inject_network_info /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1869 2015-08-07 17:27:02.525 DEBUG oslo_concurrency.lockutils [req-f6418dc0-90de-47db-819a-f5c9b2aeec01 tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] Lock "xenstore-790f2093-5329-49fd-a0c7-ab1fe4c523c9" acquired by "update_nwinfo" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:27:02.949 DEBUG oslo_concurrency.lockutils [req-f6418dc0-90de-47db-819a-f5c9b2aeec01 tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] Lock "xenstore-790f2093-5329-49fd-a0c7-ab1fe4c523c9" released by "update_nwinfo" :: held 0.424s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:27:02.961 DEBUG nova.virt.xenapi.vmops [req-f6418dc0-90de-47db-819a-f5c9b2aeec01 tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] [instance: 790f2093-5329-49fd-a0c7-ab1fe4c523c9] Updating progress to 60 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:27:03.401 DEBUG nova.virt.xenapi.vmops [req-f6418dc0-90de-47db-819a-f5c9b2aeec01 tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] [instance: 790f2093-5329-49fd-a0c7-ab1fe4c523c9] Creating vifs _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1892 2015-08-07 17:27:03.408 DEBUG nova.virt.xenapi.vmops [req-f6418dc0-90de-47db-819a-f5c9b2aeec01 tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] [instance: 790f2093-5329-49fd-a0c7-ab1fe4c523c9] Creating VIF for network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1902 2015-08-07 17:27:03.419 DEBUG nova.virt.xenapi.vmops [req-f6418dc0-90de-47db-819a-f5c9b2aeec01 tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] [instance: 790f2093-5329-49fd-a0c7-ab1fe4c523c9] Created VIF OpaqueRef:56240ecb-4619-d4ca-e7e3-ead0baa615e2, network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1906 2015-08-07 17:27:03.420 DEBUG nova.virt.xenapi.vmops [req-f6418dc0-90de-47db-819a-f5c9b2aeec01 tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] [instance: 790f2093-5329-49fd-a0c7-ab1fe4c523c9] Updating progress to 70 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:27:03.847 DEBUG nova.virt.xenapi.vmops [req-f6418dc0-90de-47db-819a-f5c9b2aeec01 tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] [instance: 790f2093-5329-49fd-a0c7-ab1fe4c523c9] Starting instance _start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:327 2015-08-07 17:27:04.904 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.84 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:27:10.518 DEBUG nova.virt.xenapi.vmops [req-4d5bc629-8a54-4cf7-9a3d-f9947c915db2 tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] [instance: 18c3ad3b-8cd4-4e41-b278-93a63e82aac4] Finished snapshot and upload for VM, duration: 15.74 secs for image 83ea94ae-82e0-4363-8054-c9cef3f7aacb snapshot /opt/stack/new/nova/nova/virt/xenapi/vmops.py:894 2015-08-07 17:27:11.422 DEBUG 
nova.virt.xenapi.vmops [req-da7c0e47-5c03-43ed-995f-3e3443707176 tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] [instance: c5d13a83-4e8d-4d99-9630-b5219ac62190] Waiting for instance state to become running _wait_for_instance_to_start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:777 2015-08-07 17:27:11.490 DEBUG nova.virt.xenapi.vmops [req-da7c0e47-5c03-43ed-995f-3e3443707176 tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] [instance: c5d13a83-4e8d-4d99-9630-b5219ac62190] Updating progress to 80 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:27:11.920 DEBUG nova.virt.xenapi.vmops [req-da7c0e47-5c03-43ed-995f-3e3443707176 tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] [instance: c5d13a83-4e8d-4d99-9630-b5219ac62190] Skip agent setup, not enabled. _configure_new_instance_with_agent /opt/stack/new/nova/nova/virt/xenapi/vmops.py:788 2015-08-07 17:27:11.921 DEBUG nova.virt.xenapi.vmops [req-da7c0e47-5c03-43ed-995f-3e3443707176 tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] [instance: c5d13a83-4e8d-4d99-9630-b5219ac62190] Removing hostname from xenstore _remove_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1949 2015-08-07 17:27:11.922 DEBUG oslo_concurrency.lockutils [req-da7c0e47-5c03-43ed-995f-3e3443707176 tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] Lock "xenstore-c5d13a83-4e8d-4d99-9630-b5219ac62190" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:27:11.951 DEBUG oslo_concurrency.lockutils [req-da7c0e47-5c03-43ed-995f-3e3443707176 tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] Lock "xenstore-c5d13a83-4e8d-4d99-9630-b5219ac62190" released by "update_hostname" :: held 0.029s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:27:11.952 DEBUG nova.virt.xenapi.vmops [req-da7c0e47-5c03-43ed-995f-3e3443707176 tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] [instance: c5d13a83-4e8d-4d99-9630-b5219ac62190] Updating progress to 90 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:27:12.266 DEBUG oslo_concurrency.lockutils [req-46710dba-0f3e-4bc1-8c7b-307e8ca969d9 tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] Lock "18c3ad3b-8cd4-4e41-b278-93a63e82aac4" acquired by "do_terminate_instance" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:27:12.267 DEBUG oslo_concurrency.lockutils [req-46710dba-0f3e-4bc1-8c7b-307e8ca969d9 tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] Lock "18c3ad3b-8cd4-4e41-b278-93a63e82aac4-events" acquired by "_clear_events" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:27:12.267 DEBUG oslo_concurrency.lockutils [req-46710dba-0f3e-4bc1-8c7b-307e8ca969d9 tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] Lock "18c3ad3b-8cd4-4e41-b278-93a63e82aac4-events" released by "_clear_events" :: held 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:27:12.270 INFO nova.compute.manager [req-46710dba-0f3e-4bc1-8c7b-307e8ca969d9 
tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] [instance: 18c3ad3b-8cd4-4e41-b278-93a63e82aac4] Terminating instance 2015-08-07 17:27:12.273 INFO nova.virt.xenapi.vmops [req-46710dba-0f3e-4bc1-8c7b-307e8ca969d9 tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] [instance: 18c3ad3b-8cd4-4e41-b278-93a63e82aac4] Destroying VM 2015-08-07 17:27:12.482 WARNING nova.virt.xenapi.vm_utils [req-46710dba-0f3e-4bc1-8c7b-307e8ca969d9 tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] [instance: 18c3ad3b-8cd4-4e41-b278-93a63e82aac4] VM already halted, skipping shutdown... 2015-08-07 17:27:12.539 DEBUG nova.virt.xenapi.vmops [req-46710dba-0f3e-4bc1-8c7b-307e8ca969d9 tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] [instance: 18c3ad3b-8cd4-4e41-b278-93a63e82aac4] Destroying VDIs _destroy_vdis /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1428 2015-08-07 17:27:12.555 DEBUG nova.virt.xenapi.vm_utils [req-46710dba-0f3e-4bc1-8c7b-307e8ca969d9 tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] VDI 268b7ab5-137a-43f6-a9bc-60a2e0b7331e is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:27:12.580 DEBUG nova.virt.xenapi.vmops [req-da7c0e47-5c03-43ed-995f-3e3443707176 tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] [instance: c5d13a83-4e8d-4d99-9630-b5219ac62190] Updating progress to 100 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:27:12.594 DEBUG nova.virt.xenapi.vm_utils [req-46710dba-0f3e-4bc1-8c7b-307e8ca969d9 tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] VDI 25850648-e2ed-4d43-a81b-14e0760ce76f is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:27:13.802 DEBUG nova.compute.manager [req-da7c0e47-5c03-43ed-995f-3e3443707176 tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] [instance: c5d13a83-4e8d-4d99-9630-b5219ac62190] Checking state _get_power_state /opt/stack/new/nova/nova/compute/manager.py:1296 2015-08-07 17:27:14.709 DEBUG oslo_concurrency.lockutils [req-da7c0e47-5c03-43ed-995f-3e3443707176 tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] Lock "c5d13a83-4e8d-4d99-9630-b5219ac62190" released by "_locked_do_build_and_run_instance" :: held 54.196s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:27:14.710 13318 DEBUG oslo_concurrency.lockutils [-] Lock "c5d13a83-4e8d-4d99-9630-b5219ac62190" acquired by "query_driver_power_state_and_sync" :: waited 51.102s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:27:14.710 13318 INFO nova.compute.manager [-] [instance: c5d13a83-4e8d-4d99-9630-b5219ac62190] During sync_power_state the instance has a pending task (block_device_mapping). Skip. 
2015-08-07 17:27:14.710 13318 DEBUG oslo_concurrency.lockutils [-] Lock "c5d13a83-4e8d-4d99-9630-b5219ac62190" released by "query_driver_power_state_and_sync" :: held 0.001s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:27:14.946 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.80 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:27:15.132 DEBUG nova.virt.xenapi.vmops [req-46710dba-0f3e-4bc1-8c7b-307e8ca969d9 tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] [instance: 18c3ad3b-8cd4-4e41-b278-93a63e82aac4] Using RAW or VHD, skipping kernel and ramdisk deletion _destroy_kernel_ramdisk /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1456 2015-08-07 17:27:15.174 DEBUG nova.virt.xenapi.vm_utils [req-46710dba-0f3e-4bc1-8c7b-307e8ca969d9 tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] [instance: 18c3ad3b-8cd4-4e41-b278-93a63e82aac4] VM destroyed destroy_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:324 2015-08-07 17:27:15.175 DEBUG nova.compute.manager [req-46710dba-0f3e-4bc1-8c7b-307e8ca969d9 tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] [instance: 18c3ad3b-8cd4-4e41-b278-93a63e82aac4] Deallocating network for instance _deallocate_network /opt/stack/new/nova/nova/compute/manager.py:1792 2015-08-07 17:27:17.575 DEBUG oslo_concurrency.lockutils [req-7ae80b1f-28a9-4a22-88ed-c1587863a9cc tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] Lock "1da4a346-da42-46e7-81c1-b0085c1ca90a" acquired by "do_terminate_instance" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:27:17.576 DEBUG oslo_concurrency.lockutils [req-7ae80b1f-28a9-4a22-88ed-c1587863a9cc tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] Lock "1da4a346-da42-46e7-81c1-b0085c1ca90a-events" acquired by "_clear_events" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:27:17.576 DEBUG oslo_concurrency.lockutils [req-7ae80b1f-28a9-4a22-88ed-c1587863a9cc tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] Lock "1da4a346-da42-46e7-81c1-b0085c1ca90a-events" released by "_clear_events" :: held 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:27:17.578 INFO nova.compute.manager [req-7ae80b1f-28a9-4a22-88ed-c1587863a9cc tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] [instance: 1da4a346-da42-46e7-81c1-b0085c1ca90a] Terminating instance 2015-08-07 17:27:17.580 INFO nova.virt.xenapi.vmops [req-7ae80b1f-28a9-4a22-88ed-c1587863a9cc tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] [instance: 1da4a346-da42-46e7-81c1-b0085c1ca90a] Destroying VM 2015-08-07 17:27:17.659 DEBUG nova.virt.xenapi.vm_utils [req-7ae80b1f-28a9-4a22-88ed-c1587863a9cc tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] [instance: 1da4a346-da42-46e7-81c1-b0085c1ca90a] Shutting down VM (hard) hard_shutdown_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:348 2015-08-07 17:27:19.105 DEBUG oslo_concurrency.lockutils [req-d800e70e-d508-4a79-ae96-d50e6e6abb7c tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] Lock "c5d13a83-4e8d-4d99-9630-b5219ac62190" 
acquired by "do_terminate_instance" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:27:19.106 DEBUG oslo_concurrency.lockutils [req-d800e70e-d508-4a79-ae96-d50e6e6abb7c tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] Lock "c5d13a83-4e8d-4d99-9630-b5219ac62190-events" acquired by "_clear_events" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:27:19.107 DEBUG oslo_concurrency.lockutils [req-d800e70e-d508-4a79-ae96-d50e6e6abb7c tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] Lock "c5d13a83-4e8d-4d99-9630-b5219ac62190-events" released by "_clear_events" :: held 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:27:19.109 INFO nova.compute.manager [req-d800e70e-d508-4a79-ae96-d50e6e6abb7c tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] [instance: c5d13a83-4e8d-4d99-9630-b5219ac62190] Terminating instance 2015-08-07 17:27:19.110 INFO nova.virt.xenapi.vmops [req-d800e70e-d508-4a79-ae96-d50e6e6abb7c tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] [instance: c5d13a83-4e8d-4d99-9630-b5219ac62190] Destroying VM 2015-08-07 17:27:19.181 DEBUG nova.virt.xenapi.vm_utils [req-d800e70e-d508-4a79-ae96-d50e6e6abb7c tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] [instance: c5d13a83-4e8d-4d99-9630-b5219ac62190] Shutting down VM (hard) hard_shutdown_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:348 2015-08-07 17:27:19.770 DEBUG nova.compute.manager [req-46710dba-0f3e-4bc1-8c7b-307e8ca969d9 tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] [instance: 18c3ad3b-8cd4-4e41-b278-93a63e82aac4] terminating bdm BlockDeviceMapping(boot_index=0,connection_info=None,created_at=2015-08-07T17:25:48Z,delete_on_termination=True,deleted=False,deleted_at=None,destination_type='local',device_name='/dev/xvda',device_type='disk',disk_bus=None,guest_format=None,id=22,image_id='a69d8d55-1745-492f-be26-cc75d64fc94d',instance=,instance_uuid=18c3ad3b-8cd4-4e41-b278-93a63e82aac4,no_device=False,snapshot_id=None,source_type='image',updated_at=2015-08-07T17:25:50Z,volume_id=None,volume_size=None) _cleanup_volumes /opt/stack/new/nova/nova/compute/manager.py:2272 2015-08-07 17:27:20.040 DEBUG oslo_concurrency.lockutils [req-46710dba-0f3e-4bc1-8c7b-307e8ca969d9 tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] Lock "compute_resources" acquired by "update_usage" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:27:20.041 DEBUG nova.objects.instance [req-46710dba-0f3e-4bc1-8c7b-307e8ca969d9 tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] Lazy-loading `numa_topology' on Instance uuid 18c3ad3b-8cd4-4e41-b278-93a63e82aac4 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:27:20.180 DEBUG oslo_concurrency.lockutils [req-46710dba-0f3e-4bc1-8c7b-307e8ca969d9 tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] Lock "compute_resources" released by "update_usage" :: held 0.140s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:27:20.772 DEBUG oslo_concurrency.lockutils [req-46710dba-0f3e-4bc1-8c7b-307e8ca969d9 
tempest-ImagesNegativeTestJSON-1431277351 tempest-ImagesNegativeTestJSON-1581008911] Lock "18c3ad3b-8cd4-4e41-b278-93a63e82aac4" released by "do_terminate_instance" :: held 8.505s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:27:23.518 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:27:23.519 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Starting heal instance info cache _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5315 2015-08-07 17:27:23.520 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Rebuilding the list of instances to heal _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5319 2015-08-07 17:27:23.594 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: 1da4a346-da42-46e7-81c1-b0085c1ca90a] Skipping network cache update for instance because it is being deleted. _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5333 2015-08-07 17:27:23.595 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: 790f2093-5329-49fd-a0c7-ab1fe4c523c9] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5329 2015-08-07 17:27:23.595 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: c5d13a83-4e8d-4d99-9630-b5219ac62190] Skipping network cache update for instance because it is being deleted. _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5333 2015-08-07 17:27:23.596 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Didn't find any instances for network info cache update. 
_heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5386 2015-08-07 17:27:23.596 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 2.91 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:27:25.344 DEBUG nova.virt.xenapi.vmops [req-f6418dc0-90de-47db-819a-f5c9b2aeec01 tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] [instance: 790f2093-5329-49fd-a0c7-ab1fe4c523c9] Waiting for instance state to become running _wait_for_instance_to_start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:777 2015-08-07 17:27:25.387 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.36 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:27:25.539 DEBUG nova.virt.xenapi.vmops [req-f6418dc0-90de-47db-819a-f5c9b2aeec01 tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] [instance: 790f2093-5329-49fd-a0c7-ab1fe4c523c9] Updating progress to 80 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:27:25.591 DEBUG nova.virt.xenapi.vmops [req-7ae80b1f-28a9-4a22-88ed-c1587863a9cc tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] [instance: 1da4a346-da42-46e7-81c1-b0085c1ca90a] Destroying VDIs _destroy_vdis /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1428 2015-08-07 17:27:25.623 DEBUG nova.virt.xenapi.vm_utils [req-7ae80b1f-28a9-4a22-88ed-c1587863a9cc tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] VDI 2851ab9b-7e9b-4c49-bb23-d31b117db817 is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:27:25.637 DEBUG nova.virt.xenapi.vm_utils [req-7ae80b1f-28a9-4a22-88ed-c1587863a9cc tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] VDI b221e05c-28b4-4e03-b1ad-d21c2a68f036 is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:27:26.511 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:27:26.646 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 2.01 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:27:27.267 DEBUG nova.virt.xenapi.vmops [req-f6418dc0-90de-47db-819a-f5c9b2aeec01 tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] [instance: 790f2093-5329-49fd-a0c7-ab1fe4c523c9] Skip agent setup, not enabled. 
_configure_new_instance_with_agent /opt/stack/new/nova/nova/virt/xenapi/vmops.py:788 2015-08-07 17:27:27.268 DEBUG nova.virt.xenapi.vmops [req-f6418dc0-90de-47db-819a-f5c9b2aeec01 tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] [instance: 790f2093-5329-49fd-a0c7-ab1fe4c523c9] Removing hostname from xenstore _remove_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1949 2015-08-07 17:27:27.269 DEBUG oslo_concurrency.lockutils [req-f6418dc0-90de-47db-819a-f5c9b2aeec01 tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] Lock "xenstore-790f2093-5329-49fd-a0c7-ab1fe4c523c9" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:27:27.275 DEBUG oslo_concurrency.lockutils [req-f6418dc0-90de-47db-819a-f5c9b2aeec01 tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] Lock "xenstore-790f2093-5329-49fd-a0c7-ab1fe4c523c9" released by "update_hostname" :: held 0.007s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:27:27.276 DEBUG nova.virt.xenapi.vmops [req-f6418dc0-90de-47db-819a-f5c9b2aeec01 tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] [instance: 790f2093-5329-49fd-a0c7-ab1fe4c523c9] Updating progress to 90 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:27:27.930 DEBUG nova.virt.xenapi.vmops [req-f6418dc0-90de-47db-819a-f5c9b2aeec01 tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] [instance: 790f2093-5329-49fd-a0c7-ab1fe4c523c9] Updating progress to 100 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:27:28.067 DEBUG nova.virt.xenapi.vmops [req-7ae80b1f-28a9-4a22-88ed-c1587863a9cc tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] [instance: 1da4a346-da42-46e7-81c1-b0085c1ca90a] Using RAW or VHD, skipping kernel and ramdisk deletion _destroy_kernel_ramdisk /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1456 2015-08-07 17:27:28.101 DEBUG nova.virt.xenapi.vm_utils [req-7ae80b1f-28a9-4a22-88ed-c1587863a9cc tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] [instance: 1da4a346-da42-46e7-81c1-b0085c1ca90a] VM destroyed destroy_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:324 2015-08-07 17:27:28.102 DEBUG nova.compute.manager [req-7ae80b1f-28a9-4a22-88ed-c1587863a9cc tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] [instance: 1da4a346-da42-46e7-81c1-b0085c1ca90a] Deallocating network for instance _deallocate_network /opt/stack/new/nova/nova/compute/manager.py:1792 2015-08-07 17:27:28.376 DEBUG nova.compute.manager [req-f6418dc0-90de-47db-819a-f5c9b2aeec01 tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] [instance: 790f2093-5329-49fd-a0c7-ab1fe4c523c9] Checking state _get_power_state /opt/stack/new/nova/nova/compute/manager.py:1296 2015-08-07 17:27:28.392 DEBUG nova.virt.xenapi.vmops [req-d800e70e-d508-4a79-ae96-d50e6e6abb7c tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] [instance: c5d13a83-4e8d-4d99-9630-b5219ac62190] Destroying VDIs _destroy_vdis /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1428 2015-08-07 17:27:28.423 DEBUG nova.virt.xenapi.vm_utils [req-d800e70e-d508-4a79-ae96-d50e6e6abb7c tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] VDI 73185015-1e85-49cb-ad30-f5f20d9b37e0 is still 
available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:27:28.437 DEBUG nova.virt.xenapi.vm_utils [req-d800e70e-d508-4a79-ae96-d50e6e6abb7c tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] VDI 9bfd8f29-0926-4b36-9058-5c7403d1011a is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:27:28.655 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:27:28.656 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:27:28.657 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 3.85 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:27:28.877 DEBUG oslo_concurrency.lockutils [req-f6418dc0-90de-47db-819a-f5c9b2aeec01 tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] Lock "790f2093-5329-49fd-a0c7-ab1fe4c523c9" released by "_locked_do_build_and_run_instance" :: held 75.354s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:27:28.878 13318 DEBUG oslo_concurrency.lockutils [-] Lock "790f2093-5329-49fd-a0c7-ab1fe4c523c9" acquired by "query_driver_power_state_and_sync" :: waited 65.271s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:27:28.878 13318 INFO nova.compute.manager [-] [instance: 790f2093-5329-49fd-a0c7-ab1fe4c523c9] During sync_power_state the instance has a pending task (spawning). Skip. 
2015-08-07 17:27:28.878 13318 DEBUG oslo_concurrency.lockutils [-] Lock "790f2093-5329-49fd-a0c7-ab1fe4c523c9" released by "query_driver_power_state_and_sync" :: held 0.001s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:27:29.401 DEBUG nova.virt.xenapi.vmops [req-d800e70e-d508-4a79-ae96-d50e6e6abb7c tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] [instance: c5d13a83-4e8d-4d99-9630-b5219ac62190] Using RAW or VHD, skipping kernel and ramdisk deletion _destroy_kernel_ramdisk /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1456 2015-08-07 17:27:29.415 DEBUG nova.virt.xenapi.vm_utils [req-d800e70e-d508-4a79-ae96-d50e6e6abb7c tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] [instance: c5d13a83-4e8d-4d99-9630-b5219ac62190] VM destroyed destroy_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:324 2015-08-07 17:27:29.415 DEBUG nova.compute.manager [req-d800e70e-d508-4a79-ae96-d50e6e6abb7c tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] [instance: c5d13a83-4e8d-4d99-9630-b5219ac62190] Deallocating network for instance _deallocate_network /opt/stack/new/nova/nova/compute/manager.py:1792 2015-08-07 17:27:30.068 DEBUG nova.compute.manager [req-7ae80b1f-28a9-4a22-88ed-c1587863a9cc tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] [instance: 1da4a346-da42-46e7-81c1-b0085c1ca90a] terminating bdm BlockDeviceMapping(boot_index=0,connection_info=None,created_at=2015-08-07T17:24:57Z,delete_on_termination=True,deleted=False,deleted_at=None,destination_type='local',device_name='/dev/xvda',device_type='disk',disk_bus=None,guest_format=None,id=21,image_id='a69d8d55-1745-492f-be26-cc75d64fc94d',instance=,instance_uuid=1da4a346-da42-46e7-81c1-b0085c1ca90a,no_device=False,snapshot_id=None,source_type='image',updated_at=2015-08-07T17:25:00Z,volume_id=None,volume_size=None) _cleanup_volumes /opt/stack/new/nova/nova/compute/manager.py:2272 2015-08-07 17:27:30.322 DEBUG oslo_concurrency.lockutils [req-7ae80b1f-28a9-4a22-88ed-c1587863a9cc tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] Lock "compute_resources" acquired by "update_usage" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:27:30.323 DEBUG nova.objects.instance [req-7ae80b1f-28a9-4a22-88ed-c1587863a9cc tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] Lazy-loading `numa_topology' on Instance uuid 1da4a346-da42-46e7-81c1-b0085c1ca90a obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:27:30.506 DEBUG oslo_concurrency.lockutils [req-7ae80b1f-28a9-4a22-88ed-c1587863a9cc tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] Lock "compute_resources" released by "update_usage" :: held 0.184s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:27:30.892 DEBUG oslo_concurrency.lockutils [req-7ae80b1f-28a9-4a22-88ed-c1587863a9cc tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] Lock "1da4a346-da42-46e7-81c1-b0085c1ca90a" released by "do_terminate_instance" :: held 13.317s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:27:31.362 DEBUG oslo_concurrency.lockutils [req-a6873810-8063-4f91-a363-7c3bebddad17 tempest-ImagesOneServerTestJSON-803357424 tempest-ImagesOneServerTestJSON-1410599508] Lock 
"1cff7b75-3705-4f14-9e5a-546d1797c17f" acquired by "_locked_do_build_and_run_instance" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:27:31.549 INFO nova.compute.manager [req-a6873810-8063-4f91-a363-7c3bebddad17 tempest-ImagesOneServerTestJSON-803357424 tempest-ImagesOneServerTestJSON-1410599508] [instance: 1cff7b75-3705-4f14-9e5a-546d1797c17f] Starting instance... 2015-08-07 17:27:31.981 DEBUG oslo_concurrency.lockutils [req-a6873810-8063-4f91-a363-7c3bebddad17 tempest-ImagesOneServerTestJSON-803357424 tempest-ImagesOneServerTestJSON-1410599508] Lock "compute_resources" acquired by "instance_claim" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:27:31.982 DEBUG nova.compute.resource_tracker [req-a6873810-8063-4f91-a363-7c3bebddad17 tempest-ImagesOneServerTestJSON-803357424 tempest-ImagesOneServerTestJSON-1410599508] Memory overhead for 64 MB instance; 5 MB instance_claim /opt/stack/new/nova/nova/compute/resource_tracker.py:126 2015-08-07 17:27:31.994 INFO nova.compute.claims [req-a6873810-8063-4f91-a363-7c3bebddad17 tempest-ImagesOneServerTestJSON-803357424 tempest-ImagesOneServerTestJSON-1410599508] [instance: 1cff7b75-3705-4f14-9e5a-546d1797c17f] Attempting claim: memory 69 MB, disk 0 GB 2015-08-07 17:27:31.995 INFO nova.compute.claims [req-a6873810-8063-4f91-a363-7c3bebddad17 tempest-ImagesOneServerTestJSON-803357424 tempest-ImagesOneServerTestJSON-1410599508] [instance: 1cff7b75-3705-4f14-9e5a-546d1797c17f] Total memory: 8187 MB, used: 650.00 MB 2015-08-07 17:27:31.996 INFO nova.compute.claims [req-a6873810-8063-4f91-a363-7c3bebddad17 tempest-ImagesOneServerTestJSON-803357424 tempest-ImagesOneServerTestJSON-1410599508] [instance: 1cff7b75-3705-4f14-9e5a-546d1797c17f] memory limit: 12280.50 MB, free: 11630.50 MB 2015-08-07 17:27:31.996 INFO nova.compute.claims [req-a6873810-8063-4f91-a363-7c3bebddad17 tempest-ImagesOneServerTestJSON-803357424 tempest-ImagesOneServerTestJSON-1410599508] [instance: 1cff7b75-3705-4f14-9e5a-546d1797c17f] Total disk: 27 GB, used: 0.00 GB 2015-08-07 17:27:31.998 INFO nova.compute.claims [req-a6873810-8063-4f91-a363-7c3bebddad17 tempest-ImagesOneServerTestJSON-803357424 tempest-ImagesOneServerTestJSON-1410599508] [instance: 1cff7b75-3705-4f14-9e5a-546d1797c17f] disk limit not specified, defaulting to unlimited 2015-08-07 17:27:32.051 DEBUG nova.compute.resources.vcpu [req-a6873810-8063-4f91-a363-7c3bebddad17 tempest-ImagesOneServerTestJSON-803357424 tempest-ImagesOneServerTestJSON-1410599508] Total CPUs: 8 VCPUs, used: 2.00 VCPUs test /opt/stack/new/nova/nova/compute/resources/vcpu.py:52 2015-08-07 17:27:32.053 DEBUG nova.compute.resources.vcpu [req-a6873810-8063-4f91-a363-7c3bebddad17 tempest-ImagesOneServerTestJSON-803357424 tempest-ImagesOneServerTestJSON-1410599508] CPUs limit not specified, defaulting to unlimited test /opt/stack/new/nova/nova/compute/resources/vcpu.py:56 2015-08-07 17:27:32.054 INFO nova.compute.claims [req-a6873810-8063-4f91-a363-7c3bebddad17 tempest-ImagesOneServerTestJSON-803357424 tempest-ImagesOneServerTestJSON-1410599508] [instance: 1cff7b75-3705-4f14-9e5a-546d1797c17f] Claim successful 2015-08-07 17:27:32.162 DEBUG nova.compute.manager [req-d800e70e-d508-4a79-ae96-d50e6e6abb7c tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] [instance: c5d13a83-4e8d-4d99-9630-b5219ac62190] terminating bdm 
BlockDeviceMapping(boot_index=0,connection_info=None,created_at=2015-08-07T17:26:19Z,delete_on_termination=True,deleted=False,deleted_at=None,destination_type='local',device_name='/dev/xvda',device_type='disk',disk_bus=None,guest_format=None,id=24,image_id='a69d8d55-1745-492f-be26-cc75d64fc94d',instance=,instance_uuid=c5d13a83-4e8d-4d99-9630-b5219ac62190,no_device=False,snapshot_id=None,source_type='image',updated_at=2015-08-07T17:26:22Z,volume_id=None,volume_size=None) _cleanup_volumes /opt/stack/new/nova/nova/compute/manager.py:2272 2015-08-07 17:27:32.514 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:27:32.515 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 1.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:27:32.586 DEBUG oslo_concurrency.lockutils [req-a6873810-8063-4f91-a363-7c3bebddad17 tempest-ImagesOneServerTestJSON-803357424 tempest-ImagesOneServerTestJSON-1410599508] Lock "compute_resources" released by "instance_claim" :: held 0.605s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:27:32.593 DEBUG oslo_concurrency.lockutils [req-d800e70e-d508-4a79-ae96-d50e6e6abb7c tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] Lock "compute_resources" acquired by "update_usage" :: waited 0.105s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:27:32.595 DEBUG nova.objects.instance [req-d800e70e-d508-4a79-ae96-d50e6e6abb7c tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] Lazy-loading `numa_topology' on Instance uuid c5d13a83-4e8d-4d99-9630-b5219ac62190 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:27:32.680 DEBUG oslo_concurrency.lockutils [req-762da9e3-09ac-433d-8e65-e7265e76831b tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] Lock "a820417c-8cc3-4f15-b639-5d258d9e6d1b" acquired by "_locked_do_build_and_run_instance" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:27:32.706 DEBUG oslo_concurrency.lockutils [req-d800e70e-d508-4a79-ae96-d50e6e6abb7c tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] Lock "compute_resources" released by "update_usage" :: held 0.114s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:27:32.732 INFO nova.compute.manager [req-762da9e3-09ac-433d-8e65-e7265e76831b tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] [instance: a820417c-8cc3-4f15-b639-5d258d9e6d1b] Starting instance... 
2015-08-07 17:27:32.980 DEBUG oslo_concurrency.lockutils [req-762da9e3-09ac-433d-8e65-e7265e76831b tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] Lock "compute_resources" acquired by "instance_claim" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:27:32.987 DEBUG nova.compute.resource_tracker [req-762da9e3-09ac-433d-8e65-e7265e76831b tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] Memory overhead for 64 MB instance; 5 MB instance_claim /opt/stack/new/nova/nova/compute/resource_tracker.py:126 2015-08-07 17:27:32.996 INFO nova.compute.claims [req-762da9e3-09ac-433d-8e65-e7265e76831b tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] [instance: a820417c-8cc3-4f15-b639-5d258d9e6d1b] Attempting claim: memory 69 MB, disk 0 GB 2015-08-07 17:27:32.997 INFO nova.compute.claims [req-762da9e3-09ac-433d-8e65-e7265e76831b tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] [instance: a820417c-8cc3-4f15-b639-5d258d9e6d1b] Total memory: 8187 MB, used: 650.00 MB 2015-08-07 17:27:32.997 INFO nova.compute.claims [req-762da9e3-09ac-433d-8e65-e7265e76831b tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] [instance: a820417c-8cc3-4f15-b639-5d258d9e6d1b] memory limit: 12280.50 MB, free: 11630.50 MB 2015-08-07 17:27:32.998 INFO nova.compute.claims [req-762da9e3-09ac-433d-8e65-e7265e76831b tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] [instance: a820417c-8cc3-4f15-b639-5d258d9e6d1b] Total disk: 27 GB, used: 0.00 GB 2015-08-07 17:27:32.998 INFO nova.compute.claims [req-762da9e3-09ac-433d-8e65-e7265e76831b tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] [instance: a820417c-8cc3-4f15-b639-5d258d9e6d1b] disk limit not specified, defaulting to unlimited 2015-08-07 17:27:33.054 DEBUG nova.compute.resources.vcpu [req-762da9e3-09ac-433d-8e65-e7265e76831b tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] Total CPUs: 8 VCPUs, used: 2.00 VCPUs test /opt/stack/new/nova/nova/compute/resources/vcpu.py:52 2015-08-07 17:27:33.055 DEBUG nova.compute.resources.vcpu [req-762da9e3-09ac-433d-8e65-e7265e76831b tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] CPUs limit not specified, defaulting to unlimited test /opt/stack/new/nova/nova/compute/resources/vcpu.py:56 2015-08-07 17:27:33.055 INFO nova.compute.claims [req-762da9e3-09ac-433d-8e65-e7265e76831b tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] [instance: a820417c-8cc3-4f15-b639-5d258d9e6d1b] Claim successful 2015-08-07 17:27:33.271 DEBUG oslo_concurrency.lockutils [req-d800e70e-d508-4a79-ae96-d50e6e6abb7c tempest-ServersTestManualDisk-613306625 tempest-ServersTestManualDisk-1538886283] Lock "c5d13a83-4e8d-4d99-9630-b5219ac62190" released by "do_terminate_instance" :: held 14.166s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:27:33.448 DEBUG oslo_concurrency.lockutils [req-762da9e3-09ac-433d-8e65-e7265e76831b tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] Lock "compute_resources" released by "instance_claim" :: held 0.468s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:27:33.448 DEBUG oslo_concurrency.lockutils [req-762da9e3-09ac-433d-8e65-e7265e76831b tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] Lock "a86bc444-c132-44e2-8eb3-be6aa73029aa" acquired by 
"_do_validation" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:27:33.451 DEBUG oslo_concurrency.lockutils [req-a6873810-8063-4f91-a363-7c3bebddad17 tempest-ImagesOneServerTestJSON-803357424 tempest-ImagesOneServerTestJSON-1410599508] Lock "compute_resources" acquired by "update_usage" :: waited 0.345s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:27:33.520 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:27:33.521 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 1.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:27:33.560 DEBUG oslo_concurrency.lockutils [req-762da9e3-09ac-433d-8e65-e7265e76831b tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] Lock "a86bc444-c132-44e2-8eb3-be6aa73029aa" released by "_do_validation" :: held 0.112s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:27:33.588 DEBUG oslo_concurrency.lockutils [req-a6873810-8063-4f91-a363-7c3bebddad17 tempest-ImagesOneServerTestJSON-803357424 tempest-ImagesOneServerTestJSON-1410599508] Lock "compute_resources" released by "update_usage" :: held 0.137s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:27:33.589 DEBUG nova.compute.utils [req-a6873810-8063-4f91-a363-7c3bebddad17 tempest-ImagesOneServerTestJSON-803357424 tempest-ImagesOneServerTestJSON-1410599508] Using /dev/xvd instead of None get_next_device_name /opt/stack/new/nova/nova/compute/utils.py:163 2015-08-07 17:27:33.594 13318 DEBUG nova.compute.manager [-] [instance: 1cff7b75-3705-4f14-9e5a-546d1797c17f] Allocating IP information in the background. _allocate_network_async /opt/stack/new/nova/nova/compute/manager.py:1513 2015-08-07 17:27:33.595 13318 DEBUG oslo_concurrency.lockutils [-] Acquired semaphore "refresh_cache-1cff7b75-3705-4f14-9e5a-546d1797c17f" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:197 2015-08-07 17:27:33.857 DEBUG oslo_concurrency.lockutils [req-762da9e3-09ac-433d-8e65-e7265e76831b tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] Lock "compute_resources" acquired by "update_usage" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:27:34.165 DEBUG oslo_concurrency.lockutils [req-762da9e3-09ac-433d-8e65-e7265e76831b tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] Lock "compute_resources" released by "update_usage" :: held 0.307s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:27:34.174 DEBUG nova.compute.utils [req-762da9e3-09ac-433d-8e65-e7265e76831b tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] Using /dev/xvd instead of None get_next_device_name /opt/stack/new/nova/nova/compute/utils.py:163 2015-08-07 17:27:34.179 13318 DEBUG nova.compute.manager [-] [instance: a820417c-8cc3-4f15-b639-5d258d9e6d1b] Allocating IP information in the background. 
_allocate_network_async /opt/stack/new/nova/nova/compute/manager.py:1513 2015-08-07 17:27:34.180 13318 DEBUG oslo_concurrency.lockutils [-] Acquired semaphore "refresh_cache-a820417c-8cc3-4f15-b639-5d258d9e6d1b" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:197 2015-08-07 17:27:34.522 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:27:34.523 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /opt/stack/new/nova/nova/compute/manager.py:5976 2015-08-07 17:27:34.523 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 1.99 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:27:34.740 DEBUG nova.virt.xenapi.vmops [req-a6873810-8063-4f91-a363-7c3bebddad17 tempest-ImagesOneServerTestJSON-803357424 tempest-ImagesOneServerTestJSON-1410599508] [instance: 1cff7b75-3705-4f14-9e5a-546d1797c17f] Block device information present: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} spawn /opt/stack/new/nova/nova/virt/xenapi/vmops.py:408 2015-08-07 17:27:34.769 DEBUG nova.virt.xenapi.vm_utils [req-a6873810-8063-4f91-a363-7c3bebddad17 tempest-ImagesOneServerTestJSON-803357424 tempest-ImagesOneServerTestJSON-1410599508] Detected vhd format for image a69d8d55-1745-492f-be26-cc75d64fc94d determine_disk_image_type /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1633 2015-08-07 17:27:34.769 DEBUG nova.virt.xenapi.vmops [req-a6873810-8063-4f91-a363-7c3bebddad17 tempest-ImagesOneServerTestJSON-803357424 tempest-ImagesOneServerTestJSON-1410599508] [instance: 1cff7b75-3705-4f14-9e5a-546d1797c17f] Updating progress to 10 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:27:34.910 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.84 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:27:35.075 DEBUG nova.virt.xenapi.vmops [req-a6873810-8063-4f91-a363-7c3bebddad17 tempest-ImagesOneServerTestJSON-803357424 tempest-ImagesOneServerTestJSON-1410599508] [instance: 1cff7b75-3705-4f14-9e5a-546d1797c17f] block device info: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} _connect_cinder_volumes /opt/stack/new/nova/nova/virt/xenapi/vmops.py:376 2015-08-07 17:27:35.090 DEBUG oslo_concurrency.lockutils [req-a6873810-8063-4f91-a363-7c3bebddad17 tempest-ImagesOneServerTestJSON-803357424 tempest-ImagesOneServerTestJSON-1410599508] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" acquired by "_create_cached_image_impl" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:27:35.268 DEBUG nova.virt.xenapi.vmops [req-762da9e3-09ac-433d-8e65-e7265e76831b tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] [instance: a820417c-8cc3-4f15-b639-5d258d9e6d1b] Block device information present: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} spawn /opt/stack/new/nova/nova/virt/xenapi/vmops.py:408 2015-08-07 17:27:35.286 DEBUG 
nova.virt.xenapi.vm_utils [req-762da9e3-09ac-433d-8e65-e7265e76831b tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] Detected vhd format for image a69d8d55-1745-492f-be26-cc75d64fc94d determine_disk_image_type /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1633 2015-08-07 17:27:35.289 DEBUG nova.virt.xenapi.vmops [req-762da9e3-09ac-433d-8e65-e7265e76831b tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] [instance: a820417c-8cc3-4f15-b639-5d258d9e6d1b] Updating progress to 10 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:27:35.648 DEBUG nova.virt.xenapi.vmops [req-762da9e3-09ac-433d-8e65-e7265e76831b tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] [instance: a820417c-8cc3-4f15-b639-5d258d9e6d1b] block device info: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} _connect_cinder_volumes /opt/stack/new/nova/nova/virt/xenapi/vmops.py:376 2015-08-07 17:27:36.519 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:27:36.520 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 1.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:27:36.621 DEBUG nova.virt.xenapi.vm_utils [req-a6873810-8063-4f91-a363-7c3bebddad17 tempest-ImagesOneServerTestJSON-803357424 tempest-ImagesOneServerTestJSON-1410599508] Cloned VDI OpaqueRef:aa0585ae-d6f7-1a31-f81d-715070dd524a from VDI OpaqueRef:f392c64e-7886-a0cc-d3a0-780674167307 _clone_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:571 2015-08-07 17:27:37.520 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:27:37.560 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Auditing locally available compute resources for node localhost.localdomain 2015-08-07 17:27:37.560 DEBUG nova.virt.xenapi.host [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Updating host stats update_status /opt/stack/new/nova/nova/virt/xenapi/host.py:234 2015-08-07 17:27:37.618 DEBUG oslo_concurrency.lockutils [req-a6873810-8063-4f91-a363-7c3bebddad17 tempest-ImagesOneServerTestJSON-803357424 tempest-ImagesOneServerTestJSON-1410599508] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" released by "_create_cached_image_impl" :: held 2.528s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:27:37.619 INFO nova.virt.xenapi.vm_utils [req-a6873810-8063-4f91-a363-7c3bebddad17 tempest-ImagesOneServerTestJSON-803357424 tempest-ImagesOneServerTestJSON-1410599508] Image creation data, cacheable: True, downloaded: False duration: 2.54 secs for image a69d8d55-1745-492f-be26-cc75d64fc94d 2015-08-07 17:27:37.620 DEBUG oslo_concurrency.lockutils [req-762da9e3-09ac-433d-8e65-e7265e76831b tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" acquired by "_create_cached_image_impl" :: waited 1.958s inner 
/usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:27:38.056 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" acquired by "do_scan" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:27:38.056 DEBUG nova.virt.xenapi.vm_utils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Scanning SR OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99 do_scan /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1839 2015-08-07 17:27:38.133 13318 DEBUG nova.network.base_api [-] [instance: 1cff7b75-3705-4f14-9e5a-546d1797c17f] Updating instance_info_cache with network_info: [VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.7'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:20:e1:42', 'active': False, 'type': u'bridge', 'id': u'53bcce2a-273e-48e6-ac2c-ff94a5dbcb4c', 'qbg_params': None})] update_instance_cache_with_nw_info /opt/stack/new/nova/nova/network/base_api.py:43 2015-08-07 17:27:38.171 13318 DEBUG oslo_concurrency.lockutils [-] Releasing semaphore "refresh_cache-1cff7b75-3705-4f14-9e5a-546d1797c17f" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:210 2015-08-07 17:27:38.171 13318 DEBUG nova.compute.manager [-] [instance: 1cff7b75-3705-4f14-9e5a-546d1797c17f] Instance network_info: |[VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.7'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:20:e1:42', 'active': False, 'type': u'bridge', 'id': u'53bcce2a-273e-48e6-ac2c-ff94a5dbcb4c', 'qbg_params': None})]| _allocate_network_async /opt/stack/new/nova/nova/compute/manager.py:1531 2015-08-07 17:27:39.037 13318 DEBUG nova.network.base_api [-] 
[instance: a820417c-8cc3-4f15-b639-5d258d9e6d1b] Updating instance_info_cache with network_info: [VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.11'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:ee:94:a4', 'active': False, 'type': u'bridge', 'id': u'77528922-443a-4137-80e5-7a31f5a70115', 'qbg_params': None})] update_instance_cache_with_nw_info /opt/stack/new/nova/nova/network/base_api.py:43 2015-08-07 17:27:39.067 13318 DEBUG oslo_concurrency.lockutils [-] Releasing semaphore "refresh_cache-a820417c-8cc3-4f15-b639-5d258d9e6d1b" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:210 2015-08-07 17:27:39.067 13318 DEBUG nova.compute.manager [-] [instance: a820417c-8cc3-4f15-b639-5d258d9e6d1b] Instance network_info: |[VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.11'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:ee:94:a4', 'active': False, 'type': u'bridge', 'id': u'77528922-443a-4137-80e5-7a31f5a70115', 'qbg_params': None})]| _allocate_network_async /opt/stack/new/nova/nova/compute/manager.py:1531 2015-08-07 17:27:39.583 DEBUG nova.virt.xenapi.vm_utils [req-762da9e3-09ac-433d-8e65-e7265e76831b tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] Cloned VDI OpaqueRef:6f1d4888-42a9-381a-b48e-ffc7c9d9a2e1 from VDI OpaqueRef:f392c64e-7886-a0cc-d3a0-780674167307 _clone_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:571 2015-08-07 17:27:39.863 DEBUG nova.virt.xenapi.vmops [req-a6873810-8063-4f91-a363-7c3bebddad17 tempest-ImagesOneServerTestJSON-803357424 tempest-ImagesOneServerTestJSON-1410599508] [instance: 1cff7b75-3705-4f14-9e5a-546d1797c17f] Updating progress to 20 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:27:40.136 DEBUG 
nova.virt.xenapi.vmops [req-a6873810-8063-4f91-a363-7c3bebddad17 tempest-ImagesOneServerTestJSON-803357424 tempest-ImagesOneServerTestJSON-1410599508] [instance: 1cff7b75-3705-4f14-9e5a-546d1797c17f] Updating progress to 30 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:27:40.419 DEBUG nova.virt.xenapi.vmops [req-a6873810-8063-4f91-a363-7c3bebddad17 tempest-ImagesOneServerTestJSON-803357424 tempest-ImagesOneServerTestJSON-1410599508] [instance: 1cff7b75-3705-4f14-9e5a-546d1797c17f] Using PV kernel: True _create_vm_record /opt/stack/new/nova/nova/virt/xenapi/vmops.py:679 2015-08-07 17:27:40.423 DEBUG oslo_concurrency.lockutils [req-762da9e3-09ac-433d-8e65-e7265e76831b tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" released by "_create_cached_image_impl" :: held 2.804s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:27:40.424 INFO nova.virt.xenapi.vm_utils [req-762da9e3-09ac-433d-8e65-e7265e76831b tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] Image creation data, cacheable: True, downloaded: False duration: 4.78 secs for image a69d8d55-1745-492f-be26-cc75d64fc94d 2015-08-07 17:27:40.435 DEBUG nova.virt.xenapi.vm_utils [req-a6873810-8063-4f91-a363-7c3bebddad17 tempest-ImagesOneServerTestJSON-803357424 tempest-ImagesOneServerTestJSON-1410599508] [instance: 1cff7b75-3705-4f14-9e5a-546d1797c17f] Created VM create_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:312 2015-08-07 17:27:40.436 DEBUG nova.virt.xenapi.vmops [req-a6873810-8063-4f91-a363-7c3bebddad17 tempest-ImagesOneServerTestJSON-803357424 tempest-ImagesOneServerTestJSON-1410599508] [instance: 1cff7b75-3705-4f14-9e5a-546d1797c17f] Updating progress to 40 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:27:40.659 DEBUG nova.virt.xenapi.vm_utils [req-a6873810-8063-4f91-a363-7c3bebddad17 tempest-ImagesOneServerTestJSON-803357424 tempest-ImagesOneServerTestJSON-1410599508] Creating disk-type VBD for VM OpaqueRef:7d23e26f-73c7-ce96-da79-ca5438ca5212, VDI OpaqueRef:aa0585ae-d6f7-1a31-f81d-715070dd524a ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:27:40.672 DEBUG nova.virt.xenapi.vm_utils [req-a6873810-8063-4f91-a363-7c3bebddad17 tempest-ImagesOneServerTestJSON-803357424 tempest-ImagesOneServerTestJSON-1410599508] Created VBD OpaqueRef:49bdf516-70a2-a777-aceb-291bf7f78ad0 for VM OpaqueRef:7d23e26f-73c7-ce96-da79-ca5438ca5212, VDI OpaqueRef:aa0585ae-d6f7-1a31-f81d-715070dd524a. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:27:41.065 DEBUG nova.virt.xenapi.vm_utils [req-a6873810-8063-4f91-a363-7c3bebddad17 tempest-ImagesOneServerTestJSON-803357424 tempest-ImagesOneServerTestJSON-1410599508] Created VDI OpaqueRef:4afff6ca-5c7e-b312-abef-1c931b2e9d0d (config-2, 67108864, False) on OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99. create_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:509 2015-08-07 17:27:41.069 DEBUG nova.virt.xenapi.vm_utils [req-a6873810-8063-4f91-a363-7c3bebddad17 tempest-ImagesOneServerTestJSON-803357424 tempest-ImagesOneServerTestJSON-1410599508] Creating disk-type VBD for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:4afff6ca-5c7e-b312-abef-1c931b2e9d0d ... 
create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:27:41.083 DEBUG nova.virt.xenapi.vm_utils [req-a6873810-8063-4f91-a363-7c3bebddad17 tempest-ImagesOneServerTestJSON-803357424 tempest-ImagesOneServerTestJSON-1410599508] Created VBD OpaqueRef:9ddcf6fe-33c2-259c-b3b6-b999fb2357b1 for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:4afff6ca-5c7e-b312-abef-1c931b2e9d0d. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:27:41.084 DEBUG nova.virt.xenapi.vm_utils [req-a6873810-8063-4f91-a363-7c3bebddad17 tempest-ImagesOneServerTestJSON-803357424 tempest-ImagesOneServerTestJSON-1410599508] Plugging VBD OpaqueRef:9ddcf6fe-33c2-259c-b3b6-b999fb2357b1 ... vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2191 2015-08-07 17:27:41.085 DEBUG oslo_concurrency.lockutils [req-a6873810-8063-4f91-a363-7c3bebddad17 tempest-ImagesOneServerTestJSON-803357424 tempest-ImagesOneServerTestJSON-1410599508] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_plug" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:27:41.210 DEBUG nova.virt.xenapi.vmops [req-762da9e3-09ac-433d-8e65-e7265e76831b tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] [instance: a820417c-8cc3-4f15-b639-5d258d9e6d1b] Updating progress to 20 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:27:41.483 DEBUG nova.virt.xenapi.vmops [req-762da9e3-09ac-433d-8e65-e7265e76831b tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] [instance: a820417c-8cc3-4f15-b639-5d258d9e6d1b] Updating progress to 30 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:27:41.743 DEBUG nova.virt.xenapi.vmops [req-762da9e3-09ac-433d-8e65-e7265e76831b tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] [instance: a820417c-8cc3-4f15-b639-5d258d9e6d1b] Using PV kernel: True _create_vm_record /opt/stack/new/nova/nova/virt/xenapi/vmops.py:679 2015-08-07 17:27:41.755 DEBUG nova.virt.xenapi.vm_utils [req-762da9e3-09ac-433d-8e65-e7265e76831b tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] [instance: a820417c-8cc3-4f15-b639-5d258d9e6d1b] Created VM create_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:312 2015-08-07 17:27:41.756 DEBUG nova.virt.xenapi.vmops [req-762da9e3-09ac-433d-8e65-e7265e76831b tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] [instance: a820417c-8cc3-4f15-b639-5d258d9e6d1b] Updating progress to 40 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:27:41.962 DEBUG oslo_concurrency.lockutils [req-47267924-bdb6-4287-b402-ee88badbfeac tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Lock "8e5ee78c-20e2-4483-ab75-b109bb2fdca6" acquired by "_locked_do_build_and_run_instance" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:27:42.144 INFO nova.compute.manager [req-47267924-bdb6-4287-b402-ee88badbfeac tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] Starting instance... 
2015-08-07 17:27:42.285 DEBUG nova.virt.xenapi.vm_utils [req-762da9e3-09ac-433d-8e65-e7265e76831b tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] Creating disk-type VBD for VM OpaqueRef:b1b42299-98f2-f647-7fb2-b388ce70c103, VDI OpaqueRef:6f1d4888-42a9-381a-b48e-ffc7c9d9a2e1 ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:27:42.294 DEBUG nova.virt.xenapi.vm_utils [req-762da9e3-09ac-433d-8e65-e7265e76831b tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] Created VBD OpaqueRef:3c588944-9951-dee5-d53d-506799544c10 for VM OpaqueRef:b1b42299-98f2-f647-7fb2-b388ce70c103, VDI OpaqueRef:6f1d4888-42a9-381a-b48e-ffc7c9d9a2e1. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:27:42.460 DEBUG oslo_concurrency.lockutils [req-47267924-bdb6-4287-b402-ee88badbfeac tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Lock "compute_resources" acquired by "instance_claim" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:27:42.461 DEBUG nova.compute.resource_tracker [req-47267924-bdb6-4287-b402-ee88badbfeac tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Memory overhead for 64 MB instance; 5 MB instance_claim /opt/stack/new/nova/nova/compute/resource_tracker.py:126 2015-08-07 17:27:42.469 INFO nova.compute.claims [req-47267924-bdb6-4287-b402-ee88badbfeac tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] Attempting claim: memory 69 MB, disk 0 GB 2015-08-07 17:27:42.470 INFO nova.compute.claims [req-47267924-bdb6-4287-b402-ee88badbfeac tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] Total memory: 8187 MB, used: 719.00 MB 2015-08-07 17:27:42.470 INFO nova.compute.claims [req-47267924-bdb6-4287-b402-ee88badbfeac tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] memory limit: 12280.50 MB, free: 11561.50 MB 2015-08-07 17:27:42.471 INFO nova.compute.claims [req-47267924-bdb6-4287-b402-ee88badbfeac tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] Total disk: 27 GB, used: 0.00 GB 2015-08-07 17:27:42.471 INFO nova.compute.claims [req-47267924-bdb6-4287-b402-ee88badbfeac tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] disk limit not specified, defaulting to unlimited 2015-08-07 17:27:42.495 DEBUG nova.compute.resources.vcpu [req-47267924-bdb6-4287-b402-ee88badbfeac tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Total CPUs: 8 VCPUs, used: 3.00 VCPUs test /opt/stack/new/nova/nova/compute/resources/vcpu.py:52 2015-08-07 17:27:42.496 DEBUG nova.compute.resources.vcpu [req-47267924-bdb6-4287-b402-ee88badbfeac tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] CPUs limit not specified, defaulting to unlimited test /opt/stack/new/nova/nova/compute/resources/vcpu.py:56 2015-08-07 17:27:42.497 INFO nova.compute.claims [req-47267924-bdb6-4287-b402-ee88badbfeac tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] [instance: 
8e5ee78c-20e2-4483-ab75-b109bb2fdca6] Claim successful 2015-08-07 17:27:42.855 DEBUG nova.virt.xenapi.vm_utils [req-762da9e3-09ac-433d-8e65-e7265e76831b tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] Created VDI OpaqueRef:9eb7ed2e-8a93-e886-af2b-ed951d25c1c3 (config-2, 67108864, False) on OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99. create_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:509 2015-08-07 17:27:42.857 DEBUG oslo_concurrency.lockutils [req-a6873810-8063-4f91-a363-7c3bebddad17 tempest-ImagesOneServerTestJSON-803357424 tempest-ImagesOneServerTestJSON-1410599508] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_plug" :: held 1.772s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:27:42.858 DEBUG nova.virt.xenapi.vm_utils [req-a6873810-8063-4f91-a363-7c3bebddad17 tempest-ImagesOneServerTestJSON-803357424 tempest-ImagesOneServerTestJSON-1410599508] Plugging VBD OpaqueRef:9ddcf6fe-33c2-259c-b3b6-b999fb2357b1 done. vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2194 2015-08-07 17:27:42.871 DEBUG nova.virt.xenapi.vm_utils [req-762da9e3-09ac-433d-8e65-e7265e76831b tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] Creating disk-type VBD for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:9eb7ed2e-8a93-e886-af2b-ed951d25c1c3 ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:27:42.873 DEBUG nova.virt.xenapi.vm_utils [req-a6873810-8063-4f91-a363-7c3bebddad17 tempest-ImagesOneServerTestJSON-803357424 tempest-ImagesOneServerTestJSON-1410599508] VBD OpaqueRef:9ddcf6fe-33c2-259c-b3b6-b999fb2357b1 plugged as xvdc vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2197 2015-08-07 17:27:42.883 DEBUG nova.virt.xenapi.vm_utils [req-762da9e3-09ac-433d-8e65-e7265e76831b tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] Created VBD OpaqueRef:9aceab49-12b9-4d35-0e31-e5010a114cf8 for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:9eb7ed2e-8a93-e886-af2b-ed951d25c1c3. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:27:42.884 DEBUG nova.virt.xenapi.vm_utils [req-762da9e3-09ac-433d-8e65-e7265e76831b tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] Plugging VBD OpaqueRef:9aceab49-12b9-4d35-0e31-e5010a114cf8 ... vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2191 2015-08-07 17:27:42.885 DEBUG oslo_concurrency.lockutils [req-762da9e3-09ac-433d-8e65-e7265e76831b tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_plug" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:27:42.890 DEBUG oslo_concurrency.lockutils [req-47267924-bdb6-4287-b402-ee88badbfeac tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Lock "compute_resources" released by "instance_claim" :: held 0.430s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:27:42.959 WARNING nova.virt.configdrive [req-a6873810-8063-4f91-a363-7c3bebddad17 tempest-ImagesOneServerTestJSON-803357424 tempest-ImagesOneServerTestJSON-1410599508] The setting "always" will be deprecated in the Liberty version. 
Please use "True" instead 2015-08-07 17:27:42.959 DEBUG nova.objects.instance [req-a6873810-8063-4f91-a363-7c3bebddad17 tempest-ImagesOneServerTestJSON-803357424 tempest-ImagesOneServerTestJSON-1410599508] Lazy-loading `ec2_ids' on Instance uuid 1cff7b75-3705-4f14-9e5a-546d1797c17f obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:27:42.998 DEBUG oslo_concurrency.processutils [req-a6873810-8063-4f91-a363-7c3bebddad17 tempest-ImagesOneServerTestJSON-803357424 tempest-ImagesOneServerTestJSON-1410599508] Running cmd (subprocess): genisoimage -o /tmp/tmpSIcy1m/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmpnvLENj execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:27:43.113 DEBUG oslo_concurrency.processutils [req-a6873810-8063-4f91-a363-7c3bebddad17 tempest-ImagesOneServerTestJSON-803357424 tempest-ImagesOneServerTestJSON-1410599508] CMD "genisoimage -o /tmp/tmpSIcy1m/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmpnvLENj" returned: 0 in 0.115s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:27:43.125 DEBUG oslo_concurrency.processutils [req-a6873810-8063-4f91-a363-7c3bebddad17 tempest-ImagesOneServerTestJSON-803357424 tempest-ImagesOneServerTestJSON-1410599508] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmpSIcy1m/configdrive of=/dev/xvdc oflag=direct,sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:27:43.242 DEBUG oslo_concurrency.lockutils [req-47267924-bdb6-4287-b402-ee88badbfeac tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Lock "compute_resources" acquired by "update_usage" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:27:43.378 DEBUG oslo_concurrency.lockutils [req-47267924-bdb6-4287-b402-ee88badbfeac tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Lock "compute_resources" released by "update_usage" :: held 0.136s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:27:43.380 DEBUG nova.compute.utils [req-47267924-bdb6-4287-b402-ee88badbfeac tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Using /dev/xvd instead of None get_next_device_name /opt/stack/new/nova/nova/compute/utils.py:163 2015-08-07 17:27:43.387 13318 DEBUG nova.compute.manager [-] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] Allocating IP information in the background. 
_allocate_network_async /opt/stack/new/nova/nova/compute/manager.py:1513 2015-08-07 17:27:43.389 13318 DEBUG oslo_concurrency.lockutils [-] Acquired semaphore "refresh_cache-8e5ee78c-20e2-4483-ab75-b109bb2fdca6" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:197 2015-08-07 17:27:44.178 DEBUG nova.virt.xenapi.vmops [req-47267924-bdb6-4287-b402-ee88badbfeac tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] Block device information present: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} spawn /opt/stack/new/nova/nova/virt/xenapi/vmops.py:408 2015-08-07 17:27:44.201 DEBUG nova.virt.xenapi.vm_utils [req-47267924-bdb6-4287-b402-ee88badbfeac tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Detected vhd format for image a69d8d55-1745-492f-be26-cc75d64fc94d determine_disk_image_type /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1633 2015-08-07 17:27:44.203 DEBUG nova.virt.xenapi.vmops [req-47267924-bdb6-4287-b402-ee88badbfeac tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] Updating progress to 10 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:27:44.926 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.83 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:27:44.953 DEBUG nova.virt.xenapi.vmops [req-47267924-bdb6-4287-b402-ee88badbfeac tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] block device info: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} _connect_cinder_volumes /opt/stack/new/nova/nova/virt/xenapi/vmops.py:376 2015-08-07 17:27:44.970 DEBUG oslo_concurrency.lockutils [req-47267924-bdb6-4287-b402-ee88badbfeac tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" acquired by "_create_cached_image_impl" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:27:44.994 DEBUG oslo_concurrency.lockutils [req-762da9e3-09ac-433d-8e65-e7265e76831b tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_plug" :: held 2.110s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:27:44.996 DEBUG nova.virt.xenapi.vm_utils [req-762da9e3-09ac-433d-8e65-e7265e76831b tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] Plugging VBD OpaqueRef:9aceab49-12b9-4d35-0e31-e5010a114cf8 done. 
vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2194 2015-08-07 17:27:45.002 DEBUG nova.virt.xenapi.vm_utils [req-762da9e3-09ac-433d-8e65-e7265e76831b tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] VBD OpaqueRef:9aceab49-12b9-4d35-0e31-e5010a114cf8 plugged as xvdd vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2197 2015-08-07 17:27:45.106 WARNING nova.virt.configdrive [req-762da9e3-09ac-433d-8e65-e7265e76831b tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] The setting "always" will be deprecated in the Liberty version. Please use "True" instead 2015-08-07 17:27:45.107 DEBUG nova.objects.instance [req-762da9e3-09ac-433d-8e65-e7265e76831b tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] Lazy-loading `ec2_ids' on Instance uuid a820417c-8cc3-4f15-b639-5d258d9e6d1b obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:27:45.160 DEBUG oslo_concurrency.processutils [req-762da9e3-09ac-433d-8e65-e7265e76831b tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] Running cmd (subprocess): genisoimage -o /tmp/tmpVwPe3O/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmpLkeed1 execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:27:45.296 DEBUG oslo_concurrency.processutils [req-762da9e3-09ac-433d-8e65-e7265e76831b tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] CMD "genisoimage -o /tmp/tmpVwPe3O/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmpLkeed1" returned: 0 in 0.136s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:27:45.302 DEBUG oslo_concurrency.processutils [req-762da9e3-09ac-433d-8e65-e7265e76831b tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmpVwPe3O/configdrive of=/dev/xvdd oflag=direct,sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:27:46.651 13318 DEBUG nova.network.base_api [-] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] Updating instance_info_cache with network_info: [VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.12'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:87:f6:bc', 'active': False, 'type': u'bridge', 'id': u'0082deb6-303e-4fa7-969e-93e52d4e94f6', 'qbg_params': None})] update_instance_cache_with_nw_info 
/opt/stack/new/nova/nova/network/base_api.py:43 2015-08-07 17:27:46.690 13318 DEBUG oslo_concurrency.lockutils [-] Releasing semaphore "refresh_cache-8e5ee78c-20e2-4483-ab75-b109bb2fdca6" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:210 2015-08-07 17:27:46.691 13318 DEBUG nova.compute.manager [-] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] Instance network_info: |[VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.12'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:87:f6:bc', 'active': False, 'type': u'bridge', 'id': u'0082deb6-303e-4fa7-969e-93e52d4e94f6', 'qbg_params': None})]| _allocate_network_async /opt/stack/new/nova/nova/compute/manager.py:1531 2015-08-07 17:27:48.828 DEBUG nova.virt.xenapi.vm_utils [req-47267924-bdb6-4287-b402-ee88badbfeac tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Cloned VDI OpaqueRef:42fd3244-a209-672e-8b69-7da6404c2ea2 from VDI OpaqueRef:f392c64e-7886-a0cc-d3a0-780674167307 _clone_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:571 2015-08-07 17:27:49.864 DEBUG oslo_concurrency.lockutils [req-47267924-bdb6-4287-b402-ee88badbfeac tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" released by "_create_cached_image_impl" :: held 4.894s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:27:49.865 INFO nova.virt.xenapi.vm_utils [req-47267924-bdb6-4287-b402-ee88badbfeac tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Image creation data, cacheable: True, downloaded: False duration: 4.91 secs for image a69d8d55-1745-492f-be26-cc75d64fc94d 2015-08-07 17:27:50.789 DEBUG nova.virt.xenapi.vmops [req-47267924-bdb6-4287-b402-ee88badbfeac tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] Updating progress to 20 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:27:51.071 DEBUG nova.virt.xenapi.vmops [req-47267924-bdb6-4287-b402-ee88badbfeac tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] Updating progress to 30 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:27:51.455 DEBUG nova.virt.xenapi.vmops [req-47267924-bdb6-4287-b402-ee88badbfeac tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] Using PV 
kernel: True _create_vm_record /opt/stack/new/nova/nova/virt/xenapi/vmops.py:679 2015-08-07 17:27:51.469 DEBUG nova.virt.xenapi.vm_utils [req-47267924-bdb6-4287-b402-ee88badbfeac tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] Created VM create_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:312 2015-08-07 17:27:51.470 DEBUG nova.virt.xenapi.vmops [req-47267924-bdb6-4287-b402-ee88badbfeac tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] Updating progress to 40 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:27:51.994 DEBUG nova.virt.xenapi.vm_utils [req-47267924-bdb6-4287-b402-ee88badbfeac tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Creating disk-type VBD for VM OpaqueRef:3a672dd2-81cd-b514-2105-de769e3305a8, VDI OpaqueRef:42fd3244-a209-672e-8b69-7da6404c2ea2 ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:27:52.004 DEBUG nova.virt.xenapi.vm_utils [req-47267924-bdb6-4287-b402-ee88badbfeac tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Created VBD OpaqueRef:d9420a1e-5091-6459-a397-ebd1fbbe6244 for VM OpaqueRef:3a672dd2-81cd-b514-2105-de769e3305a8, VDI OpaqueRef:42fd3244-a209-672e-8b69-7da6404c2ea2. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:27:53.166 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" released by "do_scan" :: held 15.110s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:27:53.442 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor: free VCPUs: 1 _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:524 2015-08-07 17:27:53.442 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor: assignable PCI devices: [] _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:532 2015-08-07 17:27:53.443 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor/Node resource view: name=localhost.localdomain free_ram=922MB free_disk=16GB free_vcpus=1 pci_devices=[] _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:546 2015-08-07 17:27:53.444 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "compute_resources" acquired by "_update_available_resource" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:27:53.844 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Total usable vcpus: 8, total allocated vcpus: 7 2015-08-07 17:27:53.845 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Final resource view: name=localhost.localdomain phys_ram=8187MB used_ram=788MB phys_disk=27GB used_disk=0GB total_vcpus=8 used_vcpus=7 pci_stats=None 2015-08-07 17:27:54.189 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Compute_service record updated for devstack:localhost.localdomain 2015-08-07 17:27:54.189 DEBUG oslo_concurrency.lockutils 
[req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "compute_resources" released by "_update_available_resource" :: held 0.746s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:27:54.192 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 8.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:27:54.937 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.82 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:27:58.259 DEBUG oslo_concurrency.processutils [req-a6873810-8063-4f91-a363-7c3bebddad17 tempest-ImagesOneServerTestJSON-803357424 tempest-ImagesOneServerTestJSON-1410599508] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmpSIcy1m/configdrive of=/dev/xvdc oflag=direct,sync" returned: 0 in 15.134s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:27:58.261 DEBUG oslo_concurrency.processutils [req-a6873810-8063-4f91-a363-7c3bebddad17 tempest-ImagesOneServerTestJSON-803357424 tempest-ImagesOneServerTestJSON-1410599508] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:27:58.705 DEBUG oslo_concurrency.processutils [req-762da9e3-09ac-433d-8e65-e7265e76831b tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmpVwPe3O/configdrive of=/dev/xvdd oflag=direct,sync" returned: 0 in 13.402s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:27:58.710 DEBUG oslo_concurrency.processutils [req-762da9e3-09ac-433d-8e65-e7265e76831b tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:27:58.893 DEBUG oslo_concurrency.processutils [req-a6873810-8063-4f91-a363-7c3bebddad17 tempest-ImagesOneServerTestJSON-803357424 tempest-ImagesOneServerTestJSON-1410599508] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf sync" returned: 0 in 0.631s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:27:58.896 DEBUG nova.virt.xenapi.vm_utils [req-a6873810-8063-4f91-a363-7c3bebddad17 tempest-ImagesOneServerTestJSON-803357424 tempest-ImagesOneServerTestJSON-1410599508] Destroying VBD for VDI OpaqueRef:4afff6ca-5c7e-b312-abef-1c931b2e9d0d ... 
vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2207 2015-08-07 17:27:58.898 DEBUG oslo_concurrency.lockutils [req-a6873810-8063-4f91-a363-7c3bebddad17 tempest-ImagesOneServerTestJSON-803357424 tempest-ImagesOneServerTestJSON-1410599508] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_unplug" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:27:59.099 DEBUG oslo_concurrency.processutils [req-762da9e3-09ac-433d-8e65-e7265e76831b tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf sync" returned: 0 in 0.390s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:27:59.100 DEBUG nova.virt.xenapi.vm_utils [req-762da9e3-09ac-433d-8e65-e7265e76831b tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] Destroying VBD for VDI OpaqueRef:9eb7ed2e-8a93-e886-af2b-ed951d25c1c3 ... vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2207 2015-08-07 17:27:59.695 DEBUG oslo_concurrency.lockutils [req-a6873810-8063-4f91-a363-7c3bebddad17 tempest-ImagesOneServerTestJSON-803357424 tempest-ImagesOneServerTestJSON-1410599508] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_unplug" :: held 0.797s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:27:59.696 DEBUG oslo_concurrency.lockutils [req-762da9e3-09ac-433d-8e65-e7265e76831b tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_unplug" :: waited 0.596s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:27:59.705 DEBUG nova.virt.xenapi.vm_utils [req-a6873810-8063-4f91-a363-7c3bebddad17 tempest-ImagesOneServerTestJSON-803357424 tempest-ImagesOneServerTestJSON-1410599508] Destroying VBD for VDI OpaqueRef:4afff6ca-5c7e-b312-abef-1c931b2e9d0d done. vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2215 2015-08-07 17:27:59.706 DEBUG nova.virt.xenapi.vm_utils [req-a6873810-8063-4f91-a363-7c3bebddad17 tempest-ImagesOneServerTestJSON-803357424 tempest-ImagesOneServerTestJSON-1410599508] Creating disk-type VBD for VM OpaqueRef:7d23e26f-73c7-ce96-da79-ca5438ca5212, VDI OpaqueRef:4afff6ca-5c7e-b312-abef-1c931b2e9d0d ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:27:59.729 DEBUG nova.virt.xenapi.vm_utils [req-a6873810-8063-4f91-a363-7c3bebddad17 tempest-ImagesOneServerTestJSON-803357424 tempest-ImagesOneServerTestJSON-1410599508] Created VBD OpaqueRef:ba9ca5bf-4f05-9d4e-6ccf-25c8830e23c1 for VM OpaqueRef:7d23e26f-73c7-ce96-da79-ca5438ca5212, VDI OpaqueRef:4afff6ca-5c7e-b312-abef-1c931b2e9d0d. 
create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:27:59.730 DEBUG nova.objects.instance [req-a6873810-8063-4f91-a363-7c3bebddad17 tempest-ImagesOneServerTestJSON-803357424 tempest-ImagesOneServerTestJSON-1410599508] Lazy-loading `pci_devices' on Instance uuid 1cff7b75-3705-4f14-9e5a-546d1797c17f obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:27:59.896 DEBUG nova.virt.xenapi.vmops [req-a6873810-8063-4f91-a363-7c3bebddad17 tempest-ImagesOneServerTestJSON-803357424 tempest-ImagesOneServerTestJSON-1410599508] [instance: 1cff7b75-3705-4f14-9e5a-546d1797c17f] Updating progress to 50 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:28:00.224 DEBUG oslo_concurrency.lockutils [req-a6873810-8063-4f91-a363-7c3bebddad17 tempest-ImagesOneServerTestJSON-803357424 tempest-ImagesOneServerTestJSON-1410599508] Lock "xenstore-1cff7b75-3705-4f14-9e5a-546d1797c17f" acquired by "store_meta" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:28:00.225 DEBUG oslo_concurrency.lockutils [req-a6873810-8063-4f91-a363-7c3bebddad17 tempest-ImagesOneServerTestJSON-803357424 tempest-ImagesOneServerTestJSON-1410599508] Lock "xenstore-1cff7b75-3705-4f14-9e5a-546d1797c17f" released by "store_meta" :: held 0.001s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:28:00.225 DEBUG oslo_concurrency.lockutils [req-a6873810-8063-4f91-a363-7c3bebddad17 tempest-ImagesOneServerTestJSON-803357424 tempest-ImagesOneServerTestJSON-1410599508] Lock "xenstore-1cff7b75-3705-4f14-9e5a-546d1797c17f" acquired by "store_auto_disk_config" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:28:00.236 DEBUG oslo_concurrency.lockutils [req-a6873810-8063-4f91-a363-7c3bebddad17 tempest-ImagesOneServerTestJSON-803357424 tempest-ImagesOneServerTestJSON-1410599508] Lock "xenstore-1cff7b75-3705-4f14-9e5a-546d1797c17f" released by "store_auto_disk_config" :: held 0.011s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:28:00.237 DEBUG nova.virt.xenapi.vmops [req-a6873810-8063-4f91-a363-7c3bebddad17 tempest-ImagesOneServerTestJSON-803357424 tempest-ImagesOneServerTestJSON-1410599508] [instance: 1cff7b75-3705-4f14-9e5a-546d1797c17f] Injecting hostname (tempest.common.compute-instance-445081443) into xenstore _inject_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1940 2015-08-07 17:28:00.238 DEBUG oslo_concurrency.lockutils [req-a6873810-8063-4f91-a363-7c3bebddad17 tempest-ImagesOneServerTestJSON-803357424 tempest-ImagesOneServerTestJSON-1410599508] Lock "xenstore-1cff7b75-3705-4f14-9e5a-546d1797c17f" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:28:00.249 DEBUG oslo_concurrency.lockutils [req-a6873810-8063-4f91-a363-7c3bebddad17 tempest-ImagesOneServerTestJSON-803357424 tempest-ImagesOneServerTestJSON-1410599508] Lock "xenstore-1cff7b75-3705-4f14-9e5a-546d1797c17f" released by "update_hostname" :: held 0.011s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:28:00.250 DEBUG nova.virt.xenapi.vmops [req-a6873810-8063-4f91-a363-7c3bebddad17 tempest-ImagesOneServerTestJSON-803357424 tempest-ImagesOneServerTestJSON-1410599508] [instance: 1cff7b75-3705-4f14-9e5a-546d1797c17f] Injecting network info to xenstore 
inject_network_info /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1869 2015-08-07 17:28:00.250 DEBUG oslo_concurrency.lockutils [req-a6873810-8063-4f91-a363-7c3bebddad17 tempest-ImagesOneServerTestJSON-803357424 tempest-ImagesOneServerTestJSON-1410599508] Lock "xenstore-1cff7b75-3705-4f14-9e5a-546d1797c17f" acquired by "update_nwinfo" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:28:00.442 DEBUG nova.virt.xenapi.vm_utils [req-47267924-bdb6-4287-b402-ee88badbfeac tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Created VDI OpaqueRef:0b0d4ad3-6862-0d3f-b7af-b5cd5a3afc4c (config-2, 67108864, False) on OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99. create_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:509 2015-08-07 17:28:00.447 DEBUG nova.virt.xenapi.vm_utils [req-47267924-bdb6-4287-b402-ee88badbfeac tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Creating disk-type VBD for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:0b0d4ad3-6862-0d3f-b7af-b5cd5a3afc4c ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:28:00.452 DEBUG oslo_concurrency.lockutils [req-a6873810-8063-4f91-a363-7c3bebddad17 tempest-ImagesOneServerTestJSON-803357424 tempest-ImagesOneServerTestJSON-1410599508] Lock "xenstore-1cff7b75-3705-4f14-9e5a-546d1797c17f" released by "update_nwinfo" :: held 0.202s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:28:00.453 DEBUG nova.virt.xenapi.vmops [req-a6873810-8063-4f91-a363-7c3bebddad17 tempest-ImagesOneServerTestJSON-803357424 tempest-ImagesOneServerTestJSON-1410599508] [instance: 1cff7b75-3705-4f14-9e5a-546d1797c17f] Updating progress to 60 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:28:00.462 DEBUG nova.virt.xenapi.vm_utils [req-47267924-bdb6-4287-b402-ee88badbfeac tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Created VBD OpaqueRef:b04fb926-72d8-1b37-1c66-552f9d2b0f7d for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:0b0d4ad3-6862-0d3f-b7af-b5cd5a3afc4c. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:28:00.463 DEBUG nova.virt.xenapi.vm_utils [req-47267924-bdb6-4287-b402-ee88badbfeac tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Plugging VBD OpaqueRef:b04fb926-72d8-1b37-1c66-552f9d2b0f7d ... 
vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2191 2015-08-07 17:28:00.648 DEBUG oslo_concurrency.lockutils [req-762da9e3-09ac-433d-8e65-e7265e76831b tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_unplug" :: held 0.952s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:28:00.651 DEBUG oslo_concurrency.lockutils [req-47267924-bdb6-4287-b402-ee88badbfeac tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_plug" :: waited 0.186s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:28:00.676 DEBUG nova.virt.xenapi.vm_utils [req-762da9e3-09ac-433d-8e65-e7265e76831b tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] Destroying VBD for VDI OpaqueRef:9eb7ed2e-8a93-e886-af2b-ed951d25c1c3 done. vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2215 2015-08-07 17:28:00.677 DEBUG nova.virt.xenapi.vm_utils [req-762da9e3-09ac-433d-8e65-e7265e76831b tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] Creating disk-type VBD for VM OpaqueRef:b1b42299-98f2-f647-7fb2-b388ce70c103, VDI OpaqueRef:9eb7ed2e-8a93-e886-af2b-ed951d25c1c3 ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:28:00.691 DEBUG nova.virt.xenapi.vm_utils [req-762da9e3-09ac-433d-8e65-e7265e76831b tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] Created VBD OpaqueRef:f4ae17e9-bfa4-d378-da7d-d5e85f2087e7 for VM OpaqueRef:b1b42299-98f2-f647-7fb2-b388ce70c103, VDI OpaqueRef:9eb7ed2e-8a93-e886-af2b-ed951d25c1c3. 
create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:28:00.692 DEBUG nova.objects.instance [req-762da9e3-09ac-433d-8e65-e7265e76831b tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] Lazy-loading `pci_devices' on Instance uuid a820417c-8cc3-4f15-b639-5d258d9e6d1b obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:28:00.704 DEBUG nova.virt.xenapi.vmops [req-a6873810-8063-4f91-a363-7c3bebddad17 tempest-ImagesOneServerTestJSON-803357424 tempest-ImagesOneServerTestJSON-1410599508] [instance: 1cff7b75-3705-4f14-9e5a-546d1797c17f] Creating vifs _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1892 2015-08-07 17:28:00.715 DEBUG nova.virt.xenapi.vmops [req-a6873810-8063-4f91-a363-7c3bebddad17 tempest-ImagesOneServerTestJSON-803357424 tempest-ImagesOneServerTestJSON-1410599508] [instance: 1cff7b75-3705-4f14-9e5a-546d1797c17f] Creating VIF for network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1902 2015-08-07 17:28:00.738 DEBUG nova.virt.xenapi.vmops [req-a6873810-8063-4f91-a363-7c3bebddad17 tempest-ImagesOneServerTestJSON-803357424 tempest-ImagesOneServerTestJSON-1410599508] [instance: 1cff7b75-3705-4f14-9e5a-546d1797c17f] Created VIF OpaqueRef:68a2cf25-3119-287c-febb-21a78edc2a34, network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1906 2015-08-07 17:28:00.739 DEBUG nova.virt.xenapi.vmops [req-a6873810-8063-4f91-a363-7c3bebddad17 tempest-ImagesOneServerTestJSON-803357424 tempest-ImagesOneServerTestJSON-1410599508] [instance: 1cff7b75-3705-4f14-9e5a-546d1797c17f] Updating progress to 70 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:28:00.836 DEBUG nova.virt.xenapi.vmops [req-762da9e3-09ac-433d-8e65-e7265e76831b tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] [instance: a820417c-8cc3-4f15-b639-5d258d9e6d1b] Updating progress to 50 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:28:01.122 DEBUG nova.virt.xenapi.vmops [req-a6873810-8063-4f91-a363-7c3bebddad17 tempest-ImagesOneServerTestJSON-803357424 tempest-ImagesOneServerTestJSON-1410599508] [instance: 1cff7b75-3705-4f14-9e5a-546d1797c17f] Starting instance _start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:327 2015-08-07 17:28:01.148 DEBUG oslo_concurrency.lockutils [req-762da9e3-09ac-433d-8e65-e7265e76831b tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] Lock "xenstore-a820417c-8cc3-4f15-b639-5d258d9e6d1b" acquired by "store_meta" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:28:01.149 DEBUG oslo_concurrency.lockutils [req-762da9e3-09ac-433d-8e65-e7265e76831b tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] Lock "xenstore-a820417c-8cc3-4f15-b639-5d258d9e6d1b" released by "store_meta" :: held 0.001s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:28:01.151 DEBUG oslo_concurrency.lockutils [req-762da9e3-09ac-433d-8e65-e7265e76831b tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] Lock "xenstore-a820417c-8cc3-4f15-b639-5d258d9e6d1b" acquired by "store_auto_disk_config" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:28:01.195 DEBUG oslo_concurrency.lockutils [req-762da9e3-09ac-433d-8e65-e7265e76831b 
tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] Lock "xenstore-a820417c-8cc3-4f15-b639-5d258d9e6d1b" released by "store_auto_disk_config" :: held 0.044s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:28:01.196 DEBUG nova.virt.xenapi.vmops [req-762da9e3-09ac-433d-8e65-e7265e76831b tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] [instance: a820417c-8cc3-4f15-b639-5d258d9e6d1b] Injecting hostname (tempest.common.compute-instance-1922481953) into xenstore _inject_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1940 2015-08-07 17:28:01.196 DEBUG oslo_concurrency.lockutils [req-762da9e3-09ac-433d-8e65-e7265e76831b tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] Lock "xenstore-a820417c-8cc3-4f15-b639-5d258d9e6d1b" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:28:01.217 DEBUG oslo_concurrency.lockutils [req-762da9e3-09ac-433d-8e65-e7265e76831b tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] Lock "xenstore-a820417c-8cc3-4f15-b639-5d258d9e6d1b" released by "update_hostname" :: held 0.020s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:28:01.217 DEBUG nova.virt.xenapi.vmops [req-762da9e3-09ac-433d-8e65-e7265e76831b tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] [instance: a820417c-8cc3-4f15-b639-5d258d9e6d1b] Injecting network info to xenstore inject_network_info /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1869 2015-08-07 17:28:01.218 DEBUG oslo_concurrency.lockutils [req-762da9e3-09ac-433d-8e65-e7265e76831b tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] Lock "xenstore-a820417c-8cc3-4f15-b639-5d258d9e6d1b" acquired by "update_nwinfo" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:28:01.649 DEBUG oslo_concurrency.lockutils [req-762da9e3-09ac-433d-8e65-e7265e76831b tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] Lock "xenstore-a820417c-8cc3-4f15-b639-5d258d9e6d1b" released by "update_nwinfo" :: held 0.431s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:28:01.650 DEBUG nova.virt.xenapi.vmops [req-762da9e3-09ac-433d-8e65-e7265e76831b tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] [instance: a820417c-8cc3-4f15-b639-5d258d9e6d1b] Updating progress to 60 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:28:02.192 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:28:02.194 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 23.32 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:28:02.473 DEBUG nova.virt.xenapi.vmops [req-762da9e3-09ac-433d-8e65-e7265e76831b tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] [instance: a820417c-8cc3-4f15-b639-5d258d9e6d1b] Creating vifs _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1892 2015-08-07 17:28:02.486 DEBUG nova.virt.xenapi.vmops [req-762da9e3-09ac-433d-8e65-e7265e76831b 
tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] [instance: a820417c-8cc3-4f15-b639-5d258d9e6d1b] Creating VIF for network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1902 2015-08-07 17:28:02.526 DEBUG nova.virt.xenapi.vmops [req-762da9e3-09ac-433d-8e65-e7265e76831b tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] [instance: a820417c-8cc3-4f15-b639-5d258d9e6d1b] Created VIF OpaqueRef:4b843d25-f6de-78c3-2829-d814d6ad57c9, network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1906 2015-08-07 17:28:02.527 DEBUG nova.virt.xenapi.vmops [req-762da9e3-09ac-433d-8e65-e7265e76831b tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] [instance: a820417c-8cc3-4f15-b639-5d258d9e6d1b] Updating progress to 70 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:28:02.888 DEBUG nova.virt.xenapi.vmops [req-762da9e3-09ac-433d-8e65-e7265e76831b tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] [instance: a820417c-8cc3-4f15-b639-5d258d9e6d1b] Starting instance _start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:327 2015-08-07 17:28:04.068 DEBUG oslo_concurrency.lockutils [req-47267924-bdb6-4287-b402-ee88badbfeac tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_plug" :: held 3.417s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:28:04.068 DEBUG nova.virt.xenapi.vm_utils [req-47267924-bdb6-4287-b402-ee88badbfeac tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Plugging VBD OpaqueRef:b04fb926-72d8-1b37-1c66-552f9d2b0f7d done. vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2194 2015-08-07 17:28:04.080 DEBUG nova.virt.xenapi.vm_utils [req-47267924-bdb6-4287-b402-ee88badbfeac tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] VBD OpaqueRef:b04fb926-72d8-1b37-1c66-552f9d2b0f7d plugged as xvdc vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2197 2015-08-07 17:28:04.185 WARNING nova.virt.configdrive [req-47267924-bdb6-4287-b402-ee88badbfeac tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] The setting "always" will be deprecated in the Liberty version. 
Please use "True" instead 2015-08-07 17:28:04.186 DEBUG nova.objects.instance [req-47267924-bdb6-4287-b402-ee88badbfeac tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Lazy-loading `ec2_ids' on Instance uuid 8e5ee78c-20e2-4483-ab75-b109bb2fdca6 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:28:04.227 DEBUG oslo_concurrency.processutils [req-47267924-bdb6-4287-b402-ee88badbfeac tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Running cmd (subprocess): genisoimage -o /tmp/tmp68h3h_/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmptsuWZL execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:28:04.327 DEBUG oslo_concurrency.processutils [req-47267924-bdb6-4287-b402-ee88badbfeac tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] CMD "genisoimage -o /tmp/tmp68h3h_/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmptsuWZL" returned: 0 in 0.100s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:28:04.335 DEBUG oslo_concurrency.processutils [req-47267924-bdb6-4287-b402-ee88badbfeac tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmp68h3h_/configdrive of=/dev/xvdc oflag=direct,sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:28:05.073 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.72 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:28:14.668 DEBUG oslo_concurrency.processutils [req-47267924-bdb6-4287-b402-ee88badbfeac tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmp68h3h_/configdrive of=/dev/xvdc oflag=direct,sync" returned: 0 in 10.333s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:28:14.670 DEBUG oslo_concurrency.processutils [req-47267924-bdb6-4287-b402-ee88badbfeac tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:28:15.014 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.81 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:28:15.366 DEBUG nova.virt.xenapi.vmops [req-a6873810-8063-4f91-a363-7c3bebddad17 tempest-ImagesOneServerTestJSON-803357424 tempest-ImagesOneServerTestJSON-1410599508] [instance: 1cff7b75-3705-4f14-9e5a-546d1797c17f] Waiting for instance state to become running _wait_for_instance_to_start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:777 2015-08-07 17:28:15.426 DEBUG nova.virt.xenapi.vmops [req-a6873810-8063-4f91-a363-7c3bebddad17 tempest-ImagesOneServerTestJSON-803357424 tempest-ImagesOneServerTestJSON-1410599508] [instance: 1cff7b75-3705-4f14-9e5a-546d1797c17f] Updating progress to 80 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:28:15.464 DEBUG 
oslo_concurrency.processutils [req-47267924-bdb6-4287-b402-ee88badbfeac tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf sync" returned: 0 in 0.794s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:28:15.466 DEBUG nova.virt.xenapi.vm_utils [req-47267924-bdb6-4287-b402-ee88badbfeac tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Destroying VBD for VDI OpaqueRef:0b0d4ad3-6862-0d3f-b7af-b5cd5a3afc4c ... vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2207 2015-08-07 17:28:15.467 DEBUG oslo_concurrency.lockutils [req-47267924-bdb6-4287-b402-ee88badbfeac tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_unplug" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:28:15.857 DEBUG nova.virt.xenapi.vmops [req-a6873810-8063-4f91-a363-7c3bebddad17 tempest-ImagesOneServerTestJSON-803357424 tempest-ImagesOneServerTestJSON-1410599508] [instance: 1cff7b75-3705-4f14-9e5a-546d1797c17f] Skip agent setup, not enabled. _configure_new_instance_with_agent /opt/stack/new/nova/nova/virt/xenapi/vmops.py:788 2015-08-07 17:28:15.858 DEBUG nova.virt.xenapi.vmops [req-a6873810-8063-4f91-a363-7c3bebddad17 tempest-ImagesOneServerTestJSON-803357424 tempest-ImagesOneServerTestJSON-1410599508] [instance: 1cff7b75-3705-4f14-9e5a-546d1797c17f] Removing hostname from xenstore _remove_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1949 2015-08-07 17:28:15.862 DEBUG oslo_concurrency.lockutils [req-a6873810-8063-4f91-a363-7c3bebddad17 tempest-ImagesOneServerTestJSON-803357424 tempest-ImagesOneServerTestJSON-1410599508] Lock "xenstore-1cff7b75-3705-4f14-9e5a-546d1797c17f" acquired by "update_hostname" :: waited 0.003s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:28:15.872 DEBUG oslo_concurrency.lockutils [req-a6873810-8063-4f91-a363-7c3bebddad17 tempest-ImagesOneServerTestJSON-803357424 tempest-ImagesOneServerTestJSON-1410599508] Lock "xenstore-1cff7b75-3705-4f14-9e5a-546d1797c17f" released by "update_hostname" :: held 0.010s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:28:15.879 DEBUG nova.virt.xenapi.vmops [req-a6873810-8063-4f91-a363-7c3bebddad17 tempest-ImagesOneServerTestJSON-803357424 tempest-ImagesOneServerTestJSON-1410599508] [instance: 1cff7b75-3705-4f14-9e5a-546d1797c17f] Updating progress to 90 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:28:16.467 DEBUG nova.virt.xenapi.vmops [req-a6873810-8063-4f91-a363-7c3bebddad17 tempest-ImagesOneServerTestJSON-803357424 tempest-ImagesOneServerTestJSON-1410599508] [instance: 1cff7b75-3705-4f14-9e5a-546d1797c17f] Updating progress to 100 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:28:16.852 DEBUG nova.virt.xenapi.vmops [req-762da9e3-09ac-433d-8e65-e7265e76831b tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] [instance: a820417c-8cc3-4f15-b639-5d258d9e6d1b] Waiting for instance state to become running _wait_for_instance_to_start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:777 2015-08-07 17:28:16.932 DEBUG nova.virt.xenapi.vmops [req-762da9e3-09ac-433d-8e65-e7265e76831b 
tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] [instance: a820417c-8cc3-4f15-b639-5d258d9e6d1b] Updating progress to 80 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:28:17.174 DEBUG nova.compute.manager [req-a6873810-8063-4f91-a363-7c3bebddad17 tempest-ImagesOneServerTestJSON-803357424 tempest-ImagesOneServerTestJSON-1410599508] [instance: 1cff7b75-3705-4f14-9e5a-546d1797c17f] Checking state _get_power_state /opt/stack/new/nova/nova/compute/manager.py:1296 2015-08-07 17:28:17.764 DEBUG nova.virt.xenapi.vmops [req-762da9e3-09ac-433d-8e65-e7265e76831b tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] [instance: a820417c-8cc3-4f15-b639-5d258d9e6d1b] Skip agent setup, not enabled. _configure_new_instance_with_agent /opt/stack/new/nova/nova/virt/xenapi/vmops.py:788 2015-08-07 17:28:17.765 DEBUG nova.virt.xenapi.vmops [req-762da9e3-09ac-433d-8e65-e7265e76831b tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] [instance: a820417c-8cc3-4f15-b639-5d258d9e6d1b] Removing hostname from xenstore _remove_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1949 2015-08-07 17:28:17.766 DEBUG oslo_concurrency.lockutils [req-762da9e3-09ac-433d-8e65-e7265e76831b tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] Lock "xenstore-a820417c-8cc3-4f15-b639-5d258d9e6d1b" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:28:17.826 DEBUG oslo_concurrency.lockutils [req-762da9e3-09ac-433d-8e65-e7265e76831b tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] Lock "xenstore-a820417c-8cc3-4f15-b639-5d258d9e6d1b" released by "update_hostname" :: held 0.060s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:28:17.827 DEBUG nova.virt.xenapi.vmops [req-762da9e3-09ac-433d-8e65-e7265e76831b tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] [instance: a820417c-8cc3-4f15-b639-5d258d9e6d1b] Updating progress to 90 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:28:17.902 DEBUG oslo_concurrency.lockutils [req-a6873810-8063-4f91-a363-7c3bebddad17 tempest-ImagesOneServerTestJSON-803357424 tempest-ImagesOneServerTestJSON-1410599508] Lock "1cff7b75-3705-4f14-9e5a-546d1797c17f" released by "_locked_do_build_and_run_instance" :: held 46.538s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:28:18.100 DEBUG nova.virt.xenapi.vmops [req-762da9e3-09ac-433d-8e65-e7265e76831b tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] [instance: a820417c-8cc3-4f15-b639-5d258d9e6d1b] Updating progress to 100 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:28:18.198 DEBUG oslo_concurrency.lockutils [req-47267924-bdb6-4287-b402-ee88badbfeac tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_unplug" :: held 2.731s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:28:18.210 DEBUG nova.virt.xenapi.vm_utils [req-47267924-bdb6-4287-b402-ee88badbfeac tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Destroying VBD for VDI OpaqueRef:0b0d4ad3-6862-0d3f-b7af-b5cd5a3afc4c done. 
vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2215 2015-08-07 17:28:18.211 DEBUG nova.virt.xenapi.vm_utils [req-47267924-bdb6-4287-b402-ee88badbfeac tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Creating disk-type VBD for VM OpaqueRef:3a672dd2-81cd-b514-2105-de769e3305a8, VDI OpaqueRef:0b0d4ad3-6862-0d3f-b7af-b5cd5a3afc4c ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:28:18.223 DEBUG nova.virt.xenapi.vm_utils [req-47267924-bdb6-4287-b402-ee88badbfeac tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Created VBD OpaqueRef:f8f51f05-c369-9dad-1257-29d8428fac2c for VM OpaqueRef:3a672dd2-81cd-b514-2105-de769e3305a8, VDI OpaqueRef:0b0d4ad3-6862-0d3f-b7af-b5cd5a3afc4c. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:28:18.223 DEBUG nova.objects.instance [req-47267924-bdb6-4287-b402-ee88badbfeac tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Lazy-loading `pci_devices' on Instance uuid 8e5ee78c-20e2-4483-ab75-b109bb2fdca6 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:28:18.392 DEBUG nova.compute.manager [req-762da9e3-09ac-433d-8e65-e7265e76831b tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] [instance: a820417c-8cc3-4f15-b639-5d258d9e6d1b] Checking state _get_power_state /opt/stack/new/nova/nova/compute/manager.py:1296 2015-08-07 17:28:18.399 DEBUG nova.virt.xenapi.vmops [req-47267924-bdb6-4287-b402-ee88badbfeac tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] Updating progress to 50 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:28:18.743 DEBUG oslo_concurrency.lockutils [req-47267924-bdb6-4287-b402-ee88badbfeac tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Lock "xenstore-8e5ee78c-20e2-4483-ab75-b109bb2fdca6" acquired by "store_meta" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:28:18.744 DEBUG oslo_concurrency.lockutils [req-47267924-bdb6-4287-b402-ee88badbfeac tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Lock "xenstore-8e5ee78c-20e2-4483-ab75-b109bb2fdca6" released by "store_meta" :: held 0.001s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:28:18.745 DEBUG oslo_concurrency.lockutils [req-47267924-bdb6-4287-b402-ee88badbfeac tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Lock "xenstore-8e5ee78c-20e2-4483-ab75-b109bb2fdca6" acquired by "store_auto_disk_config" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:28:18.759 DEBUG oslo_concurrency.lockutils [req-47267924-bdb6-4287-b402-ee88badbfeac tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Lock "xenstore-8e5ee78c-20e2-4483-ab75-b109bb2fdca6" released by "store_auto_disk_config" :: held 0.014s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:28:18.760 DEBUG nova.virt.xenapi.vmops [req-47267924-bdb6-4287-b402-ee88badbfeac tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] Injecting hostname 
(tempest.common.compute-instance-738647160) into xenstore _inject_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1940 2015-08-07 17:28:18.764 DEBUG oslo_concurrency.lockutils [req-47267924-bdb6-4287-b402-ee88badbfeac tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Lock "xenstore-8e5ee78c-20e2-4483-ab75-b109bb2fdca6" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:28:18.774 DEBUG oslo_concurrency.lockutils [req-47267924-bdb6-4287-b402-ee88badbfeac tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Lock "xenstore-8e5ee78c-20e2-4483-ab75-b109bb2fdca6" released by "update_hostname" :: held 0.010s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:28:18.781 DEBUG nova.virt.xenapi.vmops [req-47267924-bdb6-4287-b402-ee88badbfeac tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] Injecting network info to xenstore inject_network_info /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1869 2015-08-07 17:28:18.784 DEBUG oslo_concurrency.lockutils [req-47267924-bdb6-4287-b402-ee88badbfeac tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Lock "xenstore-8e5ee78c-20e2-4483-ab75-b109bb2fdca6" acquired by "update_nwinfo" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:28:18.905 DEBUG oslo_concurrency.lockutils [req-762da9e3-09ac-433d-8e65-e7265e76831b tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] Lock "a820417c-8cc3-4f15-b639-5d258d9e6d1b" released by "_locked_do_build_and_run_instance" :: held 46.224s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:28:19.070 DEBUG oslo_concurrency.lockutils [req-47267924-bdb6-4287-b402-ee88badbfeac tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Lock "xenstore-8e5ee78c-20e2-4483-ab75-b109bb2fdca6" released by "update_nwinfo" :: held 0.286s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:28:19.071 DEBUG nova.virt.xenapi.vmops [req-47267924-bdb6-4287-b402-ee88badbfeac tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] Updating progress to 60 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:28:19.342 DEBUG nova.virt.xenapi.vmops [req-47267924-bdb6-4287-b402-ee88badbfeac tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] Creating vifs _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1892 2015-08-07 17:28:19.350 DEBUG nova.virt.xenapi.vmops [req-47267924-bdb6-4287-b402-ee88badbfeac tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] Creating VIF for network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1902 2015-08-07 17:28:19.358 DEBUG nova.virt.xenapi.vmops [req-47267924-bdb6-4287-b402-ee88badbfeac tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] Created VIF 
OpaqueRef:1ed42344-072c-7d65-1336-480935f31be3, network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1906 2015-08-07 17:28:19.359 DEBUG nova.virt.xenapi.vmops [req-47267924-bdb6-4287-b402-ee88badbfeac tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] Updating progress to 70 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:28:19.885 DEBUG nova.virt.xenapi.vmops [req-47267924-bdb6-4287-b402-ee88badbfeac tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] Starting instance _start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:327 2015-08-07 17:28:20.997 DEBUG nova.compute.manager [req-becbc58d-92fb-4b14-bde2-af9649ef9c1f tempest-ImagesOneServerTestJSON-803357424 tempest-ImagesOneServerTestJSON-1410599508] [instance: 1cff7b75-3705-4f14-9e5a-546d1797c17f] Checking state _get_power_state /opt/stack/new/nova/nova/compute/manager.py:1296 2015-08-07 17:28:21.238 INFO nova.compute.manager [req-becbc58d-92fb-4b14-bde2-af9649ef9c1f tempest-ImagesOneServerTestJSON-803357424 tempest-ImagesOneServerTestJSON-1410599508] [instance: 1cff7b75-3705-4f14-9e5a-546d1797c17f] instance snapshotting 2015-08-07 17:28:21.244 DEBUG nova.virt.xenapi.vm_utils [req-becbc58d-92fb-4b14-bde2-af9649ef9c1f tempest-ImagesOneServerTestJSON-803357424 tempest-ImagesOneServerTestJSON-1410599508] [instance: 1cff7b75-3705-4f14-9e5a-546d1797c17f] Starting snapshot for VM _snapshot_attached_here_impl /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:714 2015-08-07 17:28:21.251 DEBUG oslo_concurrency.lockutils [req-d293069d-fb13-47cb-81d1-7873d6ef176a tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] Lock "790f2093-5329-49fd-a0c7-ab1fe4c523c9" acquired by "do_terminate_instance" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:28:21.252 DEBUG oslo_concurrency.lockutils [req-d293069d-fb13-47cb-81d1-7873d6ef176a tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] Lock "790f2093-5329-49fd-a0c7-ab1fe4c523c9-events" acquired by "_clear_events" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:28:21.253 DEBUG oslo_concurrency.lockutils [req-d293069d-fb13-47cb-81d1-7873d6ef176a tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] Lock "790f2093-5329-49fd-a0c7-ab1fe4c523c9-events" released by "_clear_events" :: held 0.001s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:28:21.255 INFO nova.compute.manager [req-d293069d-fb13-47cb-81d1-7873d6ef176a tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] [instance: 790f2093-5329-49fd-a0c7-ab1fe4c523c9] Terminating instance 2015-08-07 17:28:21.261 INFO nova.virt.xenapi.vmops [req-d293069d-fb13-47cb-81d1-7873d6ef176a tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] [instance: 790f2093-5329-49fd-a0c7-ab1fe4c523c9] Destroying VM 2015-08-07 17:28:21.274 DEBUG nova.virt.xenapi.vm_utils [req-d293069d-fb13-47cb-81d1-7873d6ef176a tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] [instance: 790f2093-5329-49fd-a0c7-ab1fe4c523c9] Shutting down VM (hard) hard_shutdown_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:348 2015-08-07 17:28:21.298 DEBUG 
oslo_concurrency.lockutils [req-becbc58d-92fb-4b14-bde2-af9649ef9c1f tempest-ImagesOneServerTestJSON-803357424 tempest-ImagesOneServerTestJSON-1410599508] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" acquired by "do_scan" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:28:21.299 DEBUG nova.virt.xenapi.vm_utils [req-becbc58d-92fb-4b14-bde2-af9649ef9c1f tempest-ImagesOneServerTestJSON-803357424 tempest-ImagesOneServerTestJSON-1410599508] Scanning SR OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99 do_scan /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1839 2015-08-07 17:28:22.013 DEBUG oslo_concurrency.lockutils [req-f881acd5-c2cc-44fd-a7c3-e619bff8e4f9 tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] Lock "a820417c-8cc3-4f15-b639-5d258d9e6d1b" acquired by "do_terminate_instance" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:28:22.014 DEBUG oslo_concurrency.lockutils [req-f881acd5-c2cc-44fd-a7c3-e619bff8e4f9 tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] Lock "a820417c-8cc3-4f15-b639-5d258d9e6d1b-events" acquired by "_clear_events" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:28:22.014 DEBUG oslo_concurrency.lockutils [req-f881acd5-c2cc-44fd-a7c3-e619bff8e4f9 tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] Lock "a820417c-8cc3-4f15-b639-5d258d9e6d1b-events" released by "_clear_events" :: held 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:28:22.016 INFO nova.compute.manager [req-f881acd5-c2cc-44fd-a7c3-e619bff8e4f9 tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] [instance: a820417c-8cc3-4f15-b639-5d258d9e6d1b] Terminating instance 2015-08-07 17:28:22.018 INFO nova.virt.xenapi.vmops [req-f881acd5-c2cc-44fd-a7c3-e619bff8e4f9 tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] [instance: a820417c-8cc3-4f15-b639-5d258d9e6d1b] Destroying VM 2015-08-07 17:28:22.028 DEBUG nova.virt.xenapi.vm_utils [req-f881acd5-c2cc-44fd-a7c3-e619bff8e4f9 tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] [instance: a820417c-8cc3-4f15-b639-5d258d9e6d1b] Shutting down VM (hard) hard_shutdown_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:348 2015-08-07 17:28:22.489 DEBUG oslo_concurrency.lockutils [req-becbc58d-92fb-4b14-bde2-af9649ef9c1f tempest-ImagesOneServerTestJSON-803357424 tempest-ImagesOneServerTestJSON-1410599508] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" released by "do_scan" :: held 1.190s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:28:22.501 DEBUG nova.virt.xenapi.vm_utils [req-becbc58d-92fb-4b14-bde2-af9649ef9c1f tempest-ImagesOneServerTestJSON-803357424 tempest-ImagesOneServerTestJSON-1410599508] VHD 3927ae0f-627f-4fa4-89d1-46266f19f9fe has parent 4027f457-a9bb-499a-8844-79fc67f11377 _get_vhd_parent_uuid /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2006 2015-08-07 17:28:22.532 DEBUG nova.virt.xenapi.vm_utils [req-becbc58d-92fb-4b14-bde2-af9649ef9c1f tempest-ImagesOneServerTestJSON-803357424 tempest-ImagesOneServerTestJSON-1410599508] VHD 0fcf28ae-7bfd-4d8a-8f0a-0f2f6dce627b has parent 4027f457-a9bb-499a-8844-79fc67f11377 _get_vhd_parent_uuid /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2006 2015-08-07 17:28:22.556 DEBUG 
nova.virt.xenapi.vm_utils [req-becbc58d-92fb-4b14-bde2-af9649ef9c1f tempest-ImagesOneServerTestJSON-803357424 tempest-ImagesOneServerTestJSON-1410599508] VHD 273c724d-e228-4091-8dda-e60ea172ccd7 has parent 4027f457-a9bb-499a-8844-79fc67f11377 _get_vhd_parent_uuid /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2006 2015-08-07 17:28:22.564 DEBUG nova.virt.xenapi.vm_utils [req-becbc58d-92fb-4b14-bde2-af9649ef9c1f tempest-ImagesOneServerTestJSON-803357424 tempest-ImagesOneServerTestJSON-1410599508] VHD 3927ae0f-627f-4fa4-89d1-46266f19f9fe has parent 4027f457-a9bb-499a-8844-79fc67f11377 _get_vhd_parent_uuid /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2006 2015-08-07 17:28:22.575 DEBUG nova.virt.xenapi.vm_utils [req-becbc58d-92fb-4b14-bde2-af9649ef9c1f tempest-ImagesOneServerTestJSON-803357424 tempest-ImagesOneServerTestJSON-1410599508] VHD 983a9ff2-683b-4b37-95bd-465b43748114 has parent 4027f457-a9bb-499a-8844-79fc67f11377 _get_vhd_parent_uuid /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2006 2015-08-07 17:28:22.590 DEBUG nova.virt.xenapi.vm_utils [req-becbc58d-92fb-4b14-bde2-af9649ef9c1f tempest-ImagesOneServerTestJSON-803357424 tempest-ImagesOneServerTestJSON-1410599508] VHD adc68f5c-f9ed-4dfe-862f-8738fb519f84 has parent 4027f457-a9bb-499a-8844-79fc67f11377 _get_vhd_parent_uuid /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2006 2015-08-07 17:28:22.603 DEBUG nova.virt.xenapi.vm_utils [req-becbc58d-92fb-4b14-bde2-af9649ef9c1f tempest-ImagesOneServerTestJSON-803357424 tempest-ImagesOneServerTestJSON-1410599508] [instance: 1cff7b75-3705-4f14-9e5a-546d1797c17f] No snapshots to remove. _delete_snapshots_in_vdi_chain /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:677 2015-08-07 17:28:24.894 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.93 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:28:25.282 DEBUG nova.virt.xenapi.vm_utils [req-becbc58d-92fb-4b14-bde2-af9649ef9c1f tempest-ImagesOneServerTestJSON-803357424 tempest-ImagesOneServerTestJSON-1410599508] [instance: 1cff7b75-3705-4f14-9e5a-546d1797c17f] Parent has other children, coalesce is unlikely. 
_wait_for_vhd_coalesce /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2092 2015-08-07 17:28:25.297 DEBUG oslo_concurrency.lockutils [req-becbc58d-92fb-4b14-bde2-af9649ef9c1f tempest-ImagesOneServerTestJSON-803357424 tempest-ImagesOneServerTestJSON-1410599508] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" acquired by "do_scan" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:28:25.299 DEBUG nova.virt.xenapi.vm_utils [req-becbc58d-92fb-4b14-bde2-af9649ef9c1f tempest-ImagesOneServerTestJSON-803357424 tempest-ImagesOneServerTestJSON-1410599508] Scanning SR OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99 do_scan /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1839 2015-08-07 17:28:25.520 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:28:25.521 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Starting heal instance info cache _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5315 2015-08-07 17:28:25.521 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Rebuilding the list of instances to heal _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5319 2015-08-07 17:28:25.598 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: 790f2093-5329-49fd-a0c7-ab1fe4c523c9] Skipping network cache update for instance because it is being deleted. _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5333 2015-08-07 17:28:25.599 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: a820417c-8cc3-4f15-b639-5d258d9e6d1b] Skipping network cache update for instance because it is being deleted. _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5333 2015-08-07 17:28:25.599 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] Skipping network cache update for instance because it is Building. 
_heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5329 2015-08-07 17:28:25.601 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Acquired semaphore "refresh_cache-1cff7b75-3705-4f14-9e5a-546d1797c17f" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:197 2015-08-07 17:28:25.602 DEBUG nova.objects.instance [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lazy-loading `flavor' on Instance uuid 1cff7b75-3705-4f14-9e5a-546d1797c17f obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:28:25.923 DEBUG nova.network.base_api [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: 1cff7b75-3705-4f14-9e5a-546d1797c17f] Updating instance_info_cache with network_info: [VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.7'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:20:e1:42', 'active': False, 'type': u'bridge', 'id': u'53bcce2a-273e-48e6-ac2c-ff94a5dbcb4c', 'qbg_params': None})] update_instance_cache_with_nw_info /opt/stack/new/nova/nova/network/base_api.py:43 2015-08-07 17:28:25.978 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Releasing semaphore "refresh_cache-1cff7b75-3705-4f14-9e5a-546d1797c17f" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:210 2015-08-07 17:28:25.979 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: 1cff7b75-3705-4f14-9e5a-546d1797c17f] Updated the network info_cache for instance _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5376 2015-08-07 17:28:25.980 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 3.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:28:26.356 DEBUG oslo_concurrency.lockutils [req-becbc58d-92fb-4b14-bde2-af9649ef9c1f tempest-ImagesOneServerTestJSON-803357424 tempest-ImagesOneServerTestJSON-1410599508] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" released by "do_scan" :: held 1.059s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:28:26.370 DEBUG nova.virt.xenapi.vm_utils [req-becbc58d-92fb-4b14-bde2-af9649ef9c1f tempest-ImagesOneServerTestJSON-803357424 tempest-ImagesOneServerTestJSON-1410599508] VHD 63da33a7-dedb-4757-9dd9-110cb0af2a87 has parent cb0c7a09-9e0a-43d0-81da-1741b6ef562a _get_vhd_parent_uuid /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2006 2015-08-07 17:28:26.381 DEBUG nova.virt.xenapi.vm_utils 
[req-becbc58d-92fb-4b14-bde2-af9649ef9c1f tempest-ImagesOneServerTestJSON-803357424 tempest-ImagesOneServerTestJSON-1410599508] VHD cb0c7a09-9e0a-43d0-81da-1741b6ef562a has parent 4027f457-a9bb-499a-8844-79fc67f11377 _get_vhd_parent_uuid /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2006 2015-08-07 17:28:26.754 DEBUG nova.virt.xenapi.client.session [req-becbc58d-92fb-4b14-bde2-af9649ef9c1f tempest-ImagesOneServerTestJSON-803357424 tempest-ImagesOneServerTestJSON-1410599508] glance.upload_vhd attempt 1/1, callback_result: 192.168.33.1 call_plugin_serialized_with_retry /opt/stack/new/nova/nova/virt/xenapi/client/session.py:245 2015-08-07 17:28:28.980 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:28:28.980 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:28:28.981 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 3.53 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:28:29.210 DEBUG nova.virt.xenapi.vmops [req-d293069d-fb13-47cb-81d1-7873d6ef176a tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] [instance: 790f2093-5329-49fd-a0c7-ab1fe4c523c9] Destroying VDIs _destroy_vdis /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1428 2015-08-07 17:28:29.221 DEBUG nova.virt.xenapi.vm_utils [req-d293069d-fb13-47cb-81d1-7873d6ef176a tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] VDI 983a9ff2-683b-4b37-95bd-465b43748114 is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:28:29.241 DEBUG nova.virt.xenapi.vm_utils [req-d293069d-fb13-47cb-81d1-7873d6ef176a tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] VDI 71ad5b41-d267-47c8-9bd0-e2e25ca54286 is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:28:30.193 DEBUG nova.virt.xenapi.vmops [req-d293069d-fb13-47cb-81d1-7873d6ef176a tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] [instance: 790f2093-5329-49fd-a0c7-ab1fe4c523c9] Using RAW or VHD, skipping kernel and ramdisk deletion _destroy_kernel_ramdisk /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1456 2015-08-07 17:28:30.214 DEBUG nova.virt.xenapi.vm_utils [req-d293069d-fb13-47cb-81d1-7873d6ef176a tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] [instance: 790f2093-5329-49fd-a0c7-ab1fe4c523c9] VM destroyed destroy_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:324 2015-08-07 17:28:30.215 DEBUG nova.compute.manager [req-d293069d-fb13-47cb-81d1-7873d6ef176a tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] [instance: 790f2093-5329-49fd-a0c7-ab1fe4c523c9] Deallocating network for instance _deallocate_network /opt/stack/new/nova/nova/compute/manager.py:1792 2015-08-07 17:28:32.394 DEBUG nova.virt.xenapi.vmops [req-becbc58d-92fb-4b14-bde2-af9649ef9c1f tempest-ImagesOneServerTestJSON-803357424 tempest-ImagesOneServerTestJSON-1410599508] [instance: 1cff7b75-3705-4f14-9e5a-546d1797c17f] Finished snapshot and upload for VM, duration: 11.15 secs for image 
f701288d-afb1-47a4-9228-b4324849b7cb snapshot /opt/stack/new/nova/nova/virt/xenapi/vmops.py:894 2015-08-07 17:28:32.512 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:28:32.513 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 3.01 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:28:32.621 DEBUG nova.virt.xenapi.vmops [req-47267924-bdb6-4287-b402-ee88badbfeac tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] Waiting for instance state to become running _wait_for_instance_to_start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:777 2015-08-07 17:28:32.760 DEBUG nova.virt.xenapi.vmops [req-47267924-bdb6-4287-b402-ee88badbfeac tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] Updating progress to 80 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:28:33.342 DEBUG nova.virt.xenapi.vmops [req-47267924-bdb6-4287-b402-ee88badbfeac tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] Skip agent setup, not enabled. _configure_new_instance_with_agent /opt/stack/new/nova/nova/virt/xenapi/vmops.py:788 2015-08-07 17:28:33.343 DEBUG nova.virt.xenapi.vmops [req-47267924-bdb6-4287-b402-ee88badbfeac tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] Removing hostname from xenstore _remove_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1949 2015-08-07 17:28:33.343 DEBUG oslo_concurrency.lockutils [req-47267924-bdb6-4287-b402-ee88badbfeac tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Lock "xenstore-8e5ee78c-20e2-4483-ab75-b109bb2fdca6" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:28:33.355 DEBUG oslo_concurrency.lockutils [req-47267924-bdb6-4287-b402-ee88badbfeac tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Lock "xenstore-8e5ee78c-20e2-4483-ab75-b109bb2fdca6" released by "update_hostname" :: held 0.011s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:28:33.355 DEBUG nova.virt.xenapi.vmops [req-47267924-bdb6-4287-b402-ee88badbfeac tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] Updating progress to 90 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:28:33.462 DEBUG nova.compute.manager [req-d293069d-fb13-47cb-81d1-7873d6ef176a tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] [instance: 790f2093-5329-49fd-a0c7-ab1fe4c523c9] terminating bdm 
BlockDeviceMapping(boot_index=0,connection_info=None,created_at=2015-08-07T17:26:12Z,delete_on_termination=True,deleted=False,deleted_at=None,destination_type='local',device_name='/dev/xvda',device_type='disk',disk_bus=None,guest_format=None,id=23,image_id='a69d8d55-1745-492f-be26-cc75d64fc94d',instance=,instance_uuid=790f2093-5329-49fd-a0c7-ab1fe4c523c9,no_device=False,snapshot_id=None,source_type='image',updated_at=2015-08-07T17:26:15Z,volume_id=None,volume_size=None) _cleanup_volumes /opt/stack/new/nova/nova/compute/manager.py:2272 2015-08-07 17:28:33.768 DEBUG nova.virt.xenapi.vmops [req-47267924-bdb6-4287-b402-ee88badbfeac tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] Updating progress to 100 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:28:33.801 DEBUG oslo_concurrency.lockutils [req-d293069d-fb13-47cb-81d1-7873d6ef176a tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] Lock "compute_resources" acquired by "update_usage" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:28:33.802 DEBUG nova.objects.instance [req-d293069d-fb13-47cb-81d1-7873d6ef176a tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] Lazy-loading `numa_topology' on Instance uuid 790f2093-5329-49fd-a0c7-ab1fe4c523c9 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:28:33.967 DEBUG nova.virt.xenapi.vmops [req-f881acd5-c2cc-44fd-a7c3-e619bff8e4f9 tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] [instance: a820417c-8cc3-4f15-b639-5d258d9e6d1b] Destroying VDIs _destroy_vdis /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1428 2015-08-07 17:28:33.992 DEBUG nova.virt.xenapi.vm_utils [req-f881acd5-c2cc-44fd-a7c3-e619bff8e4f9 tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] VDI 273c724d-e228-4091-8dda-e60ea172ccd7 is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:28:34.017 DEBUG nova.virt.xenapi.vm_utils [req-f881acd5-c2cc-44fd-a7c3-e619bff8e4f9 tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] VDI c382b83d-2aba-4f9b-9c5e-1fe6fe13fc42 is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:28:34.204 DEBUG oslo_concurrency.lockutils [req-d293069d-fb13-47cb-81d1-7873d6ef176a tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] Lock "compute_resources" released by "update_usage" :: held 0.403s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:28:34.453 DEBUG nova.compute.manager [req-47267924-bdb6-4287-b402-ee88badbfeac tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] Checking state _get_power_state /opt/stack/new/nova/nova/compute/manager.py:1296 2015-08-07 17:28:34.734 DEBUG oslo_concurrency.lockutils [req-d293069d-fb13-47cb-81d1-7873d6ef176a tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] Lock "790f2093-5329-49fd-a0c7-ab1fe4c523c9" released by "do_terminate_instance" :: held 13.484s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:28:34.941 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.90 seconds _run_loop 
/usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:28:35.111 DEBUG oslo_concurrency.lockutils [req-47267924-bdb6-4287-b402-ee88badbfeac tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Lock "8e5ee78c-20e2-4483-ab75-b109bb2fdca6" released by "_locked_do_build_and_run_instance" :: held 53.148s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:28:35.320 DEBUG nova.virt.xenapi.vmops [req-f881acd5-c2cc-44fd-a7c3-e619bff8e4f9 tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] [instance: a820417c-8cc3-4f15-b639-5d258d9e6d1b] Using RAW or VHD, skipping kernel and ramdisk deletion _destroy_kernel_ramdisk /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1456 2015-08-07 17:28:35.358 DEBUG nova.virt.xenapi.vm_utils [req-f881acd5-c2cc-44fd-a7c3-e619bff8e4f9 tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] [instance: a820417c-8cc3-4f15-b639-5d258d9e6d1b] VM destroyed destroy_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:324 2015-08-07 17:28:35.358 DEBUG nova.compute.manager [req-f881acd5-c2cc-44fd-a7c3-e619bff8e4f9 tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] [instance: a820417c-8cc3-4f15-b639-5d258d9e6d1b] Deallocating network for instance _deallocate_network /opt/stack/new/nova/nova/compute/manager.py:1792 2015-08-07 17:28:35.526 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:28:35.528 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 0.99 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:28:36.464 DEBUG nova.compute.manager [req-c0cfed15-976d-49b3-b6e6-e8048cd30998 tempest-ImagesOneServerTestJSON-803357424 tempest-ImagesOneServerTestJSON-1410599508] [instance: 1cff7b75-3705-4f14-9e5a-546d1797c17f] Checking state _get_power_state /opt/stack/new/nova/nova/compute/manager.py:1296 2015-08-07 17:28:36.520 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:28:36.521 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /opt/stack/new/nova/nova/compute/manager.py:5976 2015-08-07 17:28:36.521 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:28:36.522 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 3.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:28:36.688 INFO nova.compute.manager [req-c0cfed15-976d-49b3-b6e6-e8048cd30998 tempest-ImagesOneServerTestJSON-803357424 tempest-ImagesOneServerTestJSON-1410599508] [instance: 1cff7b75-3705-4f14-9e5a-546d1797c17f] instance snapshotting 2015-08-07 17:28:36.694 DEBUG nova.virt.xenapi.vm_utils [req-c0cfed15-976d-49b3-b6e6-e8048cd30998 tempest-ImagesOneServerTestJSON-803357424 tempest-ImagesOneServerTestJSON-1410599508] [instance: 1cff7b75-3705-4f14-9e5a-546d1797c17f] Starting snapshot for VM _snapshot_attached_here_impl /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:714 2015-08-07 17:28:36.711 DEBUG oslo_concurrency.lockutils [req-c0cfed15-976d-49b3-b6e6-e8048cd30998 tempest-ImagesOneServerTestJSON-803357424 tempest-ImagesOneServerTestJSON-1410599508] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" acquired by "do_scan" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:28:36.712 DEBUG nova.virt.xenapi.vm_utils [req-c0cfed15-976d-49b3-b6e6-e8048cd30998 tempest-ImagesOneServerTestJSON-803357424 tempest-ImagesOneServerTestJSON-1410599508] Scanning SR OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99 do_scan /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1839 2015-08-07 17:28:37.014 INFO nova.compute.manager [req-f65183f1-f89d-40cc-8186-2cbc8ce07f97 tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] Rebuilding instance 2015-08-07 17:28:37.100 DEBUG nova.compute.manager [req-f65183f1-f89d-40cc-8186-2cbc8ce07f97 tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] Checking state _get_power_state /opt/stack/new/nova/nova/compute/manager.py:1296 2015-08-07 17:28:37.320 DEBUG oslo_concurrency.lockutils [req-c0cfed15-976d-49b3-b6e6-e8048cd30998 tempest-ImagesOneServerTestJSON-803357424 tempest-ImagesOneServerTestJSON-1410599508] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" released by "do_scan" :: held 0.609s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:28:37.331 DEBUG nova.virt.xenapi.vm_utils [req-c0cfed15-976d-49b3-b6e6-e8048cd30998 tempest-ImagesOneServerTestJSON-803357424 tempest-ImagesOneServerTestJSON-1410599508] VHD 3927ae0f-627f-4fa4-89d1-46266f19f9fe has parent cb0c7a09-9e0a-43d0-81da-1741b6ef562a _get_vhd_parent_uuid /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2006 2015-08-07 17:28:37.343 DEBUG nova.virt.xenapi.vm_utils [req-c0cfed15-976d-49b3-b6e6-e8048cd30998 tempest-ImagesOneServerTestJSON-803357424 tempest-ImagesOneServerTestJSON-1410599508] VHD cb0c7a09-9e0a-43d0-81da-1741b6ef562a has parent 4027f457-a9bb-499a-8844-79fc67f11377 _get_vhd_parent_uuid /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2006 2015-08-07 17:28:37.352 INFO nova.virt.xenapi.vmops 
[req-f65183f1-f89d-40cc-8186-2cbc8ce07f97 tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] Destroying VM 2015-08-07 17:28:37.397 DEBUG nova.virt.xenapi.vm_utils [req-f65183f1-f89d-40cc-8186-2cbc8ce07f97 tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] Shutting down VM (hard) hard_shutdown_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:348 2015-08-07 17:28:37.414 DEBUG nova.virt.xenapi.vm_utils [req-c0cfed15-976d-49b3-b6e6-e8048cd30998 tempest-ImagesOneServerTestJSON-803357424 tempest-ImagesOneServerTestJSON-1410599508] VHD 0fcf28ae-7bfd-4d8a-8f0a-0f2f6dce627b has parent 4027f457-a9bb-499a-8844-79fc67f11377 _get_vhd_parent_uuid /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2006 2015-08-07 17:28:37.429 DEBUG nova.virt.xenapi.vm_utils [req-c0cfed15-976d-49b3-b6e6-e8048cd30998 tempest-ImagesOneServerTestJSON-803357424 tempest-ImagesOneServerTestJSON-1410599508] VHD 3927ae0f-627f-4fa4-89d1-46266f19f9fe has parent cb0c7a09-9e0a-43d0-81da-1741b6ef562a _get_vhd_parent_uuid /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2006 2015-08-07 17:28:37.447 DEBUG nova.virt.xenapi.vm_utils [req-c0cfed15-976d-49b3-b6e6-e8048cd30998 tempest-ImagesOneServerTestJSON-803357424 tempest-ImagesOneServerTestJSON-1410599508] VHD adc68f5c-f9ed-4dfe-862f-8738fb519f84 has parent 4027f457-a9bb-499a-8844-79fc67f11377 _get_vhd_parent_uuid /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2006 2015-08-07 17:28:37.488 DEBUG nova.virt.xenapi.vm_utils [req-c0cfed15-976d-49b3-b6e6-e8048cd30998 tempest-ImagesOneServerTestJSON-803357424 tempest-ImagesOneServerTestJSON-1410599508] [instance: 1cff7b75-3705-4f14-9e5a-546d1797c17f] No snapshots to remove. 
_delete_snapshots_in_vdi_chain /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:677 2015-08-07 17:28:37.802 DEBUG nova.compute.manager [req-f881acd5-c2cc-44fd-a7c3-e619bff8e4f9 tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] [instance: a820417c-8cc3-4f15-b639-5d258d9e6d1b] terminating bdm BlockDeviceMapping(boot_index=0,connection_info=None,created_at=2015-08-07T17:27:31Z,delete_on_termination=True,deleted=False,deleted_at=None,destination_type='local',device_name='/dev/xvda',device_type='disk',disk_bus=None,guest_format=None,id=26,image_id='a69d8d55-1745-492f-be26-cc75d64fc94d',instance=,instance_uuid=a820417c-8cc3-4f15-b639-5d258d9e6d1b,no_device=False,snapshot_id=None,source_type='image',updated_at=2015-08-07T17:27:34Z,volume_id=None,volume_size=None) _cleanup_volumes /opt/stack/new/nova/nova/compute/manager.py:2272 2015-08-07 17:28:38.097 DEBUG oslo_concurrency.lockutils [req-f881acd5-c2cc-44fd-a7c3-e619bff8e4f9 tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] Lock "compute_resources" acquired by "update_usage" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:28:38.098 DEBUG nova.objects.instance [req-f881acd5-c2cc-44fd-a7c3-e619bff8e4f9 tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] Lazy-loading `numa_topology' on Instance uuid a820417c-8cc3-4f15-b639-5d258d9e6d1b obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:28:38.206 DEBUG oslo_concurrency.lockutils [req-f881acd5-c2cc-44fd-a7c3-e619bff8e4f9 tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] Lock "compute_resources" released by "update_usage" :: held 0.109s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:28:38.550 DEBUG oslo_concurrency.lockutils [req-f881acd5-c2cc-44fd-a7c3-e619bff8e4f9 tempest-ServersTestJSON-2071665233 tempest-ServersTestJSON-1606226605] Lock "a820417c-8cc3-4f15-b639-5d258d9e6d1b" released by "do_terminate_instance" :: held 16.538s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:28:39.521 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:28:39.554 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Auditing locally available compute resources for node localhost.localdomain 2015-08-07 17:28:39.555 DEBUG nova.virt.xenapi.host [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Updating host stats update_status /opt/stack/new/nova/nova/virt/xenapi/host.py:234 2015-08-07 17:28:39.833 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" acquired by "do_scan" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:28:39.834 DEBUG nova.virt.xenapi.vm_utils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Scanning SR OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99 do_scan /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1839 2015-08-07 17:28:40.559 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" released by "do_scan" :: held 0.726s inner 
/usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:28:40.560 DEBUG oslo_concurrency.lockutils [req-c0cfed15-976d-49b3-b6e6-e8048cd30998 tempest-ImagesOneServerTestJSON-803357424 tempest-ImagesOneServerTestJSON-1410599508] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" acquired by "do_scan" :: waited 0.384s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:28:40.561 DEBUG nova.virt.xenapi.vm_utils [req-c0cfed15-976d-49b3-b6e6-e8048cd30998 tempest-ImagesOneServerTestJSON-803357424 tempest-ImagesOneServerTestJSON-1410599508] Scanning SR OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99 do_scan /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1839 2015-08-07 17:28:40.992 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor: free VCPUs: 0 _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:524 2015-08-07 17:28:40.993 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor: assignable PCI devices: [] _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:532 2015-08-07 17:28:40.993 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor/Node resource view: name=localhost.localdomain free_ram=855MB free_disk=16GB free_vcpus=0 pci_devices=[] _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:546 2015-08-07 17:28:40.994 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "compute_resources" acquired by "_update_available_resource" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:28:41.366 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Total usable vcpus: 8, total allocated vcpus: 8 2015-08-07 17:28:41.367 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Final resource view: name=localhost.localdomain phys_ram=8187MB used_ram=650MB phys_disk=27GB used_disk=0GB total_vcpus=8 used_vcpus=8 pci_stats=None 2015-08-07 17:28:41.378 DEBUG oslo_concurrency.lockutils [req-c0cfed15-976d-49b3-b6e6-e8048cd30998 tempest-ImagesOneServerTestJSON-803357424 tempest-ImagesOneServerTestJSON-1410599508] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" released by "do_scan" :: held 0.818s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:28:41.386 DEBUG nova.virt.xenapi.vm_utils [req-c0cfed15-976d-49b3-b6e6-e8048cd30998 tempest-ImagesOneServerTestJSON-803357424 tempest-ImagesOneServerTestJSON-1410599508] VHD 3927ae0f-627f-4fa4-89d1-46266f19f9fe has parent 640bde54-7586-4fbc-9366-51dba5b04944 _get_vhd_parent_uuid /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2006 2015-08-07 17:28:41.387 DEBUG nova.virt.xenapi.vm_utils [req-c0cfed15-976d-49b3-b6e6-e8048cd30998 tempest-ImagesOneServerTestJSON-803357424 tempest-ImagesOneServerTestJSON-1410599508] [instance: 1cff7b75-3705-4f14-9e5a-546d1797c17f] Parent 640bde54-7586-4fbc-9366-51dba5b04944 not yet in parent list ['cb0c7a09-9e0a-43d0-81da-1741b6ef562a', '4027f457-a9bb-499a-8844-79fc67f11377'], waiting for coalesce... 
_wait_for_vhd_coalesce /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2115 2015-08-07 17:28:41.452 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Compute_service record updated for devstack:localhost.localdomain 2015-08-07 17:28:41.453 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "compute_resources" released by "_update_available_resource" :: held 0.460s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:28:41.454 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 7.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:28:45.025 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.81 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:28:45.803 DEBUG oslo_concurrency.lockutils [req-53303822-dea8-4356-8123-1123d7cc9c42 tempest-DeleteServersAdminTestJSON-513154376 tempest-DeleteServersAdminTestJSON-120490888] Lock "7c6651a6-8cc4-4ac9-851c-637fafaf8705" acquired by "_locked_do_build_and_run_instance" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:28:45.884 INFO nova.compute.manager [req-53303822-dea8-4356-8123-1123d7cc9c42 tempest-DeleteServersAdminTestJSON-513154376 tempest-DeleteServersAdminTestJSON-120490888] [instance: 7c6651a6-8cc4-4ac9-851c-637fafaf8705] Starting instance... 2015-08-07 17:28:46.147 DEBUG oslo_concurrency.lockutils [req-53303822-dea8-4356-8123-1123d7cc9c42 tempest-DeleteServersAdminTestJSON-513154376 tempest-DeleteServersAdminTestJSON-120490888] Lock "compute_resources" acquired by "instance_claim" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:28:46.148 DEBUG nova.compute.resource_tracker [req-53303822-dea8-4356-8123-1123d7cc9c42 tempest-DeleteServersAdminTestJSON-513154376 tempest-DeleteServersAdminTestJSON-120490888] Memory overhead for 64 MB instance; 5 MB instance_claim /opt/stack/new/nova/nova/compute/resource_tracker.py:126 2015-08-07 17:28:46.157 INFO nova.compute.claims [req-53303822-dea8-4356-8123-1123d7cc9c42 tempest-DeleteServersAdminTestJSON-513154376 tempest-DeleteServersAdminTestJSON-120490888] [instance: 7c6651a6-8cc4-4ac9-851c-637fafaf8705] Attempting claim: memory 69 MB, disk 0 GB 2015-08-07 17:28:46.158 INFO nova.compute.claims [req-53303822-dea8-4356-8123-1123d7cc9c42 tempest-DeleteServersAdminTestJSON-513154376 tempest-DeleteServersAdminTestJSON-120490888] [instance: 7c6651a6-8cc4-4ac9-851c-637fafaf8705] Total memory: 8187 MB, used: 650.00 MB 2015-08-07 17:28:46.158 INFO nova.compute.claims [req-53303822-dea8-4356-8123-1123d7cc9c42 tempest-DeleteServersAdminTestJSON-513154376 tempest-DeleteServersAdminTestJSON-120490888] [instance: 7c6651a6-8cc4-4ac9-851c-637fafaf8705] memory limit: 12280.50 MB, free: 11630.50 MB 2015-08-07 17:28:46.159 INFO nova.compute.claims [req-53303822-dea8-4356-8123-1123d7cc9c42 tempest-DeleteServersAdminTestJSON-513154376 tempest-DeleteServersAdminTestJSON-120490888] [instance: 7c6651a6-8cc4-4ac9-851c-637fafaf8705] Total disk: 27 GB, used: 0.00 GB 2015-08-07 17:28:46.159 INFO nova.compute.claims [req-53303822-dea8-4356-8123-1123d7cc9c42 tempest-DeleteServersAdminTestJSON-513154376 tempest-DeleteServersAdminTestJSON-120490888] [instance: 
7c6651a6-8cc4-4ac9-851c-637fafaf8705] disk limit not specified, defaulting to unlimited 2015-08-07 17:28:46.187 DEBUG nova.compute.resources.vcpu [req-53303822-dea8-4356-8123-1123d7cc9c42 tempest-DeleteServersAdminTestJSON-513154376 tempest-DeleteServersAdminTestJSON-120490888] Total CPUs: 8 VCPUs, used: 2.00 VCPUs test /opt/stack/new/nova/nova/compute/resources/vcpu.py:52 2015-08-07 17:28:46.189 DEBUG nova.compute.resources.vcpu [req-53303822-dea8-4356-8123-1123d7cc9c42 tempest-DeleteServersAdminTestJSON-513154376 tempest-DeleteServersAdminTestJSON-120490888] CPUs limit not specified, defaulting to unlimited test /opt/stack/new/nova/nova/compute/resources/vcpu.py:56 2015-08-07 17:28:46.189 INFO nova.compute.claims [req-53303822-dea8-4356-8123-1123d7cc9c42 tempest-DeleteServersAdminTestJSON-513154376 tempest-DeleteServersAdminTestJSON-120490888] [instance: 7c6651a6-8cc4-4ac9-851c-637fafaf8705] Claim successful 2015-08-07 17:28:46.388 DEBUG oslo_concurrency.lockutils [req-c0cfed15-976d-49b3-b6e6-e8048cd30998 tempest-ImagesOneServerTestJSON-803357424 tempest-ImagesOneServerTestJSON-1410599508] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" acquired by "do_scan" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:28:46.388 DEBUG nova.virt.xenapi.vm_utils [req-c0cfed15-976d-49b3-b6e6-e8048cd30998 tempest-ImagesOneServerTestJSON-803357424 tempest-ImagesOneServerTestJSON-1410599508] Scanning SR OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99 do_scan /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1839 2015-08-07 17:28:46.608 DEBUG oslo_concurrency.lockutils [req-53303822-dea8-4356-8123-1123d7cc9c42 tempest-DeleteServersAdminTestJSON-513154376 tempest-DeleteServersAdminTestJSON-120490888] Lock "compute_resources" released by "instance_claim" :: held 0.461s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:28:46.856 DEBUG oslo_concurrency.lockutils [req-53303822-dea8-4356-8123-1123d7cc9c42 tempest-DeleteServersAdminTestJSON-513154376 tempest-DeleteServersAdminTestJSON-120490888] Lock "compute_resources" acquired by "update_usage" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:28:46.962 DEBUG oslo_concurrency.lockutils [req-c0cfed15-976d-49b3-b6e6-e8048cd30998 tempest-ImagesOneServerTestJSON-803357424 tempest-ImagesOneServerTestJSON-1410599508] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" released by "do_scan" :: held 0.574s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:28:46.967 DEBUG nova.virt.xenapi.vm_utils [req-c0cfed15-976d-49b3-b6e6-e8048cd30998 tempest-ImagesOneServerTestJSON-803357424 tempest-ImagesOneServerTestJSON-1410599508] VHD 3927ae0f-627f-4fa4-89d1-46266f19f9fe has parent cb0c7a09-9e0a-43d0-81da-1741b6ef562a _get_vhd_parent_uuid /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2006 2015-08-07 17:28:46.968 DEBUG nova.virt.xenapi.vm_utils [req-c0cfed15-976d-49b3-b6e6-e8048cd30998 tempest-ImagesOneServerTestJSON-803357424 tempest-ImagesOneServerTestJSON-1410599508] [instance: 1cff7b75-3705-4f14-9e5a-546d1797c17f] Coalesce detected, because parent is: cb0c7a09-9e0a-43d0-81da-1741b6ef562a _wait_for_vhd_coalesce /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2118 2015-08-07 17:28:46.978 DEBUG oslo_concurrency.lockutils [req-53303822-dea8-4356-8123-1123d7cc9c42 tempest-DeleteServersAdminTestJSON-513154376 
tempest-DeleteServersAdminTestJSON-120490888] Lock "compute_resources" released by "update_usage" :: held 0.122s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:28:46.979 DEBUG nova.compute.utils [req-53303822-dea8-4356-8123-1123d7cc9c42 tempest-DeleteServersAdminTestJSON-513154376 tempest-DeleteServersAdminTestJSON-120490888] Using /dev/xvd instead of None get_next_device_name /opt/stack/new/nova/nova/compute/utils.py:163 2015-08-07 17:28:46.984 13318 DEBUG nova.compute.manager [-] [instance: 7c6651a6-8cc4-4ac9-851c-637fafaf8705] Allocating IP information in the background. _allocate_network_async /opt/stack/new/nova/nova/compute/manager.py:1513 2015-08-07 17:28:46.985 13318 DEBUG oslo_concurrency.lockutils [-] Acquired semaphore "refresh_cache-7c6651a6-8cc4-4ac9-851c-637fafaf8705" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:197 2015-08-07 17:28:46.989 DEBUG oslo_concurrency.lockutils [req-c0cfed15-976d-49b3-b6e6-e8048cd30998 tempest-ImagesOneServerTestJSON-803357424 tempest-ImagesOneServerTestJSON-1410599508] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" acquired by "do_scan" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:28:46.990 DEBUG nova.virt.xenapi.vm_utils [req-c0cfed15-976d-49b3-b6e6-e8048cd30998 tempest-ImagesOneServerTestJSON-803357424 tempest-ImagesOneServerTestJSON-1410599508] Scanning SR OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99 do_scan /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1839 2015-08-07 17:28:47.214 DEBUG nova.virt.xenapi.vmops [req-f65183f1-f89d-40cc-8186-2cbc8ce07f97 tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] Destroying VDIs _destroy_vdis /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1428 2015-08-07 17:28:47.245 DEBUG nova.virt.xenapi.vm_utils [req-f65183f1-f89d-40cc-8186-2cbc8ce07f97 tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] VDI 0fcf28ae-7bfd-4d8a-8f0a-0f2f6dce627b is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:28:47.306 DEBUG nova.virt.xenapi.vm_utils [req-f65183f1-f89d-40cc-8186-2cbc8ce07f97 tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] VDI 4332092c-ded0-4752-9f07-d6231a91e1cc is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:28:47.661 DEBUG oslo_concurrency.lockutils [req-c0cfed15-976d-49b3-b6e6-e8048cd30998 tempest-ImagesOneServerTestJSON-803357424 tempest-ImagesOneServerTestJSON-1410599508] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" released by "do_scan" :: held 0.672s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:28:47.670 DEBUG nova.virt.xenapi.vmops [req-53303822-dea8-4356-8123-1123d7cc9c42 tempest-DeleteServersAdminTestJSON-513154376 tempest-DeleteServersAdminTestJSON-120490888] [instance: 7c6651a6-8cc4-4ac9-851c-637fafaf8705] Block device information present: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} spawn /opt/stack/new/nova/nova/virt/xenapi/vmops.py:408 2015-08-07 17:28:47.683 DEBUG nova.virt.xenapi.vm_utils [req-c0cfed15-976d-49b3-b6e6-e8048cd30998 tempest-ImagesOneServerTestJSON-803357424 tempest-ImagesOneServerTestJSON-1410599508] VHD 5cfe2f58-234b-4cf0-a848-2e06ef6f2d53 has parent 
cb0c7a09-9e0a-43d0-81da-1741b6ef562a _get_vhd_parent_uuid /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2006 2015-08-07 17:28:47.703 DEBUG nova.virt.xenapi.vm_utils [req-53303822-dea8-4356-8123-1123d7cc9c42 tempest-DeleteServersAdminTestJSON-513154376 tempest-DeleteServersAdminTestJSON-120490888] Detected vhd format for image a69d8d55-1745-492f-be26-cc75d64fc94d determine_disk_image_type /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1633 2015-08-07 17:28:47.704 DEBUG nova.virt.xenapi.vmops [req-53303822-dea8-4356-8123-1123d7cc9c42 tempest-DeleteServersAdminTestJSON-513154376 tempest-DeleteServersAdminTestJSON-120490888] [instance: 7c6651a6-8cc4-4ac9-851c-637fafaf8705] Updating progress to 10 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:28:47.720 DEBUG nova.virt.xenapi.vm_utils [req-c0cfed15-976d-49b3-b6e6-e8048cd30998 tempest-ImagesOneServerTestJSON-803357424 tempest-ImagesOneServerTestJSON-1410599508] VHD cb0c7a09-9e0a-43d0-81da-1741b6ef562a has parent 4027f457-a9bb-499a-8844-79fc67f11377 _get_vhd_parent_uuid /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2006 2015-08-07 17:28:47.960 DEBUG nova.virt.xenapi.client.session [req-c0cfed15-976d-49b3-b6e6-e8048cd30998 tempest-ImagesOneServerTestJSON-803357424 tempest-ImagesOneServerTestJSON-1410599508] glance.upload_vhd attempt 1/1, callback_result: 192.168.33.1 call_plugin_serialized_with_retry /opt/stack/new/nova/nova/virt/xenapi/client/session.py:245 2015-08-07 17:28:48.018 DEBUG nova.virt.xenapi.vmops [req-53303822-dea8-4356-8123-1123d7cc9c42 tempest-DeleteServersAdminTestJSON-513154376 tempest-DeleteServersAdminTestJSON-120490888] [instance: 7c6651a6-8cc4-4ac9-851c-637fafaf8705] block device info: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} _connect_cinder_volumes /opt/stack/new/nova/nova/virt/xenapi/vmops.py:376 2015-08-07 17:28:48.033 DEBUG oslo_concurrency.lockutils [req-53303822-dea8-4356-8123-1123d7cc9c42 tempest-DeleteServersAdminTestJSON-513154376 tempest-DeleteServersAdminTestJSON-120490888] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" acquired by "_create_cached_image_impl" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:28:48.453 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:28:48.458 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 39.05 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:28:49.576 DEBUG nova.virt.xenapi.vm_utils [req-53303822-dea8-4356-8123-1123d7cc9c42 tempest-DeleteServersAdminTestJSON-513154376 tempest-DeleteServersAdminTestJSON-120490888] Cloned VDI OpaqueRef:4cb16f13-df5c-4a3b-2c50-8ad1c9bb4328 from VDI OpaqueRef:f392c64e-7886-a0cc-d3a0-780674167307 _clone_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:571 2015-08-07 17:28:49.873 13318 DEBUG nova.network.base_api [-] [instance: 7c6651a6-8cc4-4ac9-851c-637fafaf8705] Updating instance_info_cache with network_info: [VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': 
u'10.1.0.4'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:63:ff:eb', 'active': False, 'type': u'bridge', 'id': u'f8400f71-ee0f-4bd8-b048-9b5c2f5906d4', 'qbg_params': None})] update_instance_cache_with_nw_info /opt/stack/new/nova/nova/network/base_api.py:43 2015-08-07 17:28:49.905 13318 DEBUG oslo_concurrency.lockutils [-] Releasing semaphore "refresh_cache-7c6651a6-8cc4-4ac9-851c-637fafaf8705" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:210 2015-08-07 17:28:49.906 13318 DEBUG nova.compute.manager [-] [instance: 7c6651a6-8cc4-4ac9-851c-637fafaf8705] Instance network_info: |[VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.4'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:63:ff:eb', 'active': False, 'type': u'bridge', 'id': u'f8400f71-ee0f-4bd8-b048-9b5c2f5906d4', 'qbg_params': None})]| _allocate_network_async /opt/stack/new/nova/nova/compute/manager.py:1531 2015-08-07 17:28:50.418 DEBUG oslo_concurrency.lockutils [req-53303822-dea8-4356-8123-1123d7cc9c42 tempest-DeleteServersAdminTestJSON-513154376 tempest-DeleteServersAdminTestJSON-120490888] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" released by "_create_cached_image_impl" :: held 2.385s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:28:50.419 INFO nova.virt.xenapi.vm_utils [req-53303822-dea8-4356-8123-1123d7cc9c42 tempest-DeleteServersAdminTestJSON-513154376 tempest-DeleteServersAdminTestJSON-120490888] Image creation data, cacheable: True, downloaded: False duration: 2.40 secs for image a69d8d55-1745-492f-be26-cc75d64fc94d 2015-08-07 17:28:51.307 DEBUG nova.virt.xenapi.vmops [req-53303822-dea8-4356-8123-1123d7cc9c42 tempest-DeleteServersAdminTestJSON-513154376 tempest-DeleteServersAdminTestJSON-120490888] [instance: 7c6651a6-8cc4-4ac9-851c-637fafaf8705] Updating progress to 20 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 
17:28:51.559 DEBUG nova.virt.xenapi.vmops [req-53303822-dea8-4356-8123-1123d7cc9c42 tempest-DeleteServersAdminTestJSON-513154376 tempest-DeleteServersAdminTestJSON-120490888] [instance: 7c6651a6-8cc4-4ac9-851c-637fafaf8705] Updating progress to 30 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:28:51.823 DEBUG nova.virt.xenapi.vmops [req-53303822-dea8-4356-8123-1123d7cc9c42 tempest-DeleteServersAdminTestJSON-513154376 tempest-DeleteServersAdminTestJSON-120490888] [instance: 7c6651a6-8cc4-4ac9-851c-637fafaf8705] Using PV kernel: True _create_vm_record /opt/stack/new/nova/nova/virt/xenapi/vmops.py:679 2015-08-07 17:28:51.840 DEBUG nova.virt.xenapi.vm_utils [req-53303822-dea8-4356-8123-1123d7cc9c42 tempest-DeleteServersAdminTestJSON-513154376 tempest-DeleteServersAdminTestJSON-120490888] [instance: 7c6651a6-8cc4-4ac9-851c-637fafaf8705] Created VM create_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:312 2015-08-07 17:28:51.840 DEBUG nova.virt.xenapi.vmops [req-53303822-dea8-4356-8123-1123d7cc9c42 tempest-DeleteServersAdminTestJSON-513154376 tempest-DeleteServersAdminTestJSON-120490888] [instance: 7c6651a6-8cc4-4ac9-851c-637fafaf8705] Updating progress to 40 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:28:52.405 DEBUG nova.virt.xenapi.vm_utils [req-53303822-dea8-4356-8123-1123d7cc9c42 tempest-DeleteServersAdminTestJSON-513154376 tempest-DeleteServersAdminTestJSON-120490888] Creating disk-type VBD for VM OpaqueRef:0d5ef13b-535a-aa3b-26dd-13faab3fa2a1, VDI OpaqueRef:4cb16f13-df5c-4a3b-2c50-8ad1c9bb4328 ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:28:52.418 DEBUG nova.virt.xenapi.vm_utils [req-53303822-dea8-4356-8123-1123d7cc9c42 tempest-DeleteServersAdminTestJSON-513154376 tempest-DeleteServersAdminTestJSON-120490888] Created VBD OpaqueRef:57642942-46f3-244a-9973-f49cd9d5bf2e for VM OpaqueRef:0d5ef13b-535a-aa3b-26dd-13faab3fa2a1, VDI OpaqueRef:4cb16f13-df5c-4a3b-2c50-8ad1c9bb4328. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:28:52.812 DEBUG nova.virt.xenapi.vm_utils [req-53303822-dea8-4356-8123-1123d7cc9c42 tempest-DeleteServersAdminTestJSON-513154376 tempest-DeleteServersAdminTestJSON-120490888] Created VDI OpaqueRef:38a7cfbd-f40e-13bc-ec38-0e7fc3282f2f (config-2, 67108864, False) on OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99. create_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:509 2015-08-07 17:28:52.815 DEBUG nova.virt.xenapi.vm_utils [req-53303822-dea8-4356-8123-1123d7cc9c42 tempest-DeleteServersAdminTestJSON-513154376 tempest-DeleteServersAdminTestJSON-120490888] Creating disk-type VBD for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:38a7cfbd-f40e-13bc-ec38-0e7fc3282f2f ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:28:52.833 DEBUG nova.virt.xenapi.vm_utils [req-53303822-dea8-4356-8123-1123d7cc9c42 tempest-DeleteServersAdminTestJSON-513154376 tempest-DeleteServersAdminTestJSON-120490888] Created VBD OpaqueRef:2d751a08-c2f0-0744-bf11-0bfb4215f173 for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:38a7cfbd-f40e-13bc-ec38-0e7fc3282f2f. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:28:52.834 DEBUG nova.virt.xenapi.vm_utils [req-53303822-dea8-4356-8123-1123d7cc9c42 tempest-DeleteServersAdminTestJSON-513154376 tempest-DeleteServersAdminTestJSON-120490888] Plugging VBD OpaqueRef:2d751a08-c2f0-0744-bf11-0bfb4215f173 ... 
vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2191 2015-08-07 17:28:52.834 DEBUG oslo_concurrency.lockutils [req-53303822-dea8-4356-8123-1123d7cc9c42 tempest-DeleteServersAdminTestJSON-513154376 tempest-DeleteServersAdminTestJSON-120490888] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_plug" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:28:54.159 DEBUG oslo_concurrency.lockutils [req-53303822-dea8-4356-8123-1123d7cc9c42 tempest-DeleteServersAdminTestJSON-513154376 tempest-DeleteServersAdminTestJSON-120490888] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_plug" :: held 1.325s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:28:54.160 DEBUG nova.virt.xenapi.vm_utils [req-53303822-dea8-4356-8123-1123d7cc9c42 tempest-DeleteServersAdminTestJSON-513154376 tempest-DeleteServersAdminTestJSON-120490888] Plugging VBD OpaqueRef:2d751a08-c2f0-0744-bf11-0bfb4215f173 done. vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2194 2015-08-07 17:28:54.164 DEBUG nova.virt.xenapi.vm_utils [req-53303822-dea8-4356-8123-1123d7cc9c42 tempest-DeleteServersAdminTestJSON-513154376 tempest-DeleteServersAdminTestJSON-120490888] VBD OpaqueRef:2d751a08-c2f0-0744-bf11-0bfb4215f173 plugged as xvdc vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2197 2015-08-07 17:28:54.251 WARNING nova.virt.configdrive [req-53303822-dea8-4356-8123-1123d7cc9c42 tempest-DeleteServersAdminTestJSON-513154376 tempest-DeleteServersAdminTestJSON-120490888] The setting "always" will be deprecated in the Liberty version. Please use "True" instead 2015-08-07 17:28:54.252 DEBUG nova.objects.instance [req-53303822-dea8-4356-8123-1123d7cc9c42 tempest-DeleteServersAdminTestJSON-513154376 tempest-DeleteServersAdminTestJSON-120490888] Lazy-loading `ec2_ids' on Instance uuid 7c6651a6-8cc4-4ac9-851c-637fafaf8705 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:28:54.291 DEBUG oslo_concurrency.processutils [req-53303822-dea8-4356-8123-1123d7cc9c42 tempest-DeleteServersAdminTestJSON-513154376 tempest-DeleteServersAdminTestJSON-120490888] Running cmd (subprocess): genisoimage -o /tmp/tmpVBAPm0/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmpq07bFX execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:28:54.407 DEBUG oslo_concurrency.processutils [req-53303822-dea8-4356-8123-1123d7cc9c42 tempest-DeleteServersAdminTestJSON-513154376 tempest-DeleteServersAdminTestJSON-120490888] CMD "genisoimage -o /tmp/tmpVBAPm0/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmpq07bFX" returned: 0 in 0.115s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:28:54.413 DEBUG oslo_concurrency.processutils [req-53303822-dea8-4356-8123-1123d7cc9c42 tempest-DeleteServersAdminTestJSON-513154376 tempest-DeleteServersAdminTestJSON-120490888] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmpVBAPm0/configdrive of=/dev/xvdc oflag=direct,sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:28:55.123 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.72 seconds 
_run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:28:58.052 DEBUG nova.virt.xenapi.vmops [req-f65183f1-f89d-40cc-8186-2cbc8ce07f97 tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] Using RAW or VHD, skipping kernel and ramdisk deletion _destroy_kernel_ramdisk /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1456 2015-08-07 17:28:58.068 DEBUG nova.virt.xenapi.vm_utils [req-f65183f1-f89d-40cc-8186-2cbc8ce07f97 tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] VM destroyed destroy_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:324 2015-08-07 17:28:58.753 DEBUG nova.virt.xenapi.vmops [req-f65183f1-f89d-40cc-8186-2cbc8ce07f97 tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] Block device information present: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} spawn /opt/stack/new/nova/nova/virt/xenapi/vmops.py:408 2015-08-07 17:28:58.775 DEBUG nova.virt.xenapi.vm_utils [req-f65183f1-f89d-40cc-8186-2cbc8ce07f97 tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Detected vhd format for image a69d8d55-1745-492f-be26-cc75d64fc94d determine_disk_image_type /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1633 2015-08-07 17:28:58.775 DEBUG nova.virt.xenapi.vmops [req-f65183f1-f89d-40cc-8186-2cbc8ce07f97 tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] Updating progress to 10 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:28:59.050 DEBUG nova.virt.xenapi.vmops [req-f65183f1-f89d-40cc-8186-2cbc8ce07f97 tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] block device info: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} _connect_cinder_volumes /opt/stack/new/nova/nova/virt/xenapi/vmops.py:376 2015-08-07 17:28:59.062 DEBUG oslo_concurrency.lockutils [req-f65183f1-f89d-40cc-8186-2cbc8ce07f97 tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" acquired by "_create_cached_image_impl" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:29:02.144 DEBUG nova.virt.xenapi.vm_utils [req-f65183f1-f89d-40cc-8186-2cbc8ce07f97 tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Cloned VDI OpaqueRef:e4633006-a493-e26c-bedd-93fb0b893632 from VDI OpaqueRef:f392c64e-7886-a0cc-d3a0-780674167307 _clone_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:571 2015-08-07 17:29:02.623 DEBUG oslo_concurrency.processutils [req-53303822-dea8-4356-8123-1123d7cc9c42 tempest-DeleteServersAdminTestJSON-513154376 tempest-DeleteServersAdminTestJSON-120490888] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmpVBAPm0/configdrive of=/dev/xvdc oflag=direct,sync" returned: 0 in 8.209s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:29:02.625 DEBUG oslo_concurrency.processutils [req-53303822-dea8-4356-8123-1123d7cc9c42 
tempest-DeleteServersAdminTestJSON-513154376 tempest-DeleteServersAdminTestJSON-120490888] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:29:03.271 DEBUG oslo_concurrency.processutils [req-53303822-dea8-4356-8123-1123d7cc9c42 tempest-DeleteServersAdminTestJSON-513154376 tempest-DeleteServersAdminTestJSON-120490888] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf sync" returned: 0 in 0.646s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:29:03.274 DEBUG nova.virt.xenapi.vm_utils [req-53303822-dea8-4356-8123-1123d7cc9c42 tempest-DeleteServersAdminTestJSON-513154376 tempest-DeleteServersAdminTestJSON-120490888] Destroying VBD for VDI OpaqueRef:38a7cfbd-f40e-13bc-ec38-0e7fc3282f2f ... vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2207 2015-08-07 17:29:03.275 DEBUG oslo_concurrency.lockutils [req-53303822-dea8-4356-8123-1123d7cc9c42 tempest-DeleteServersAdminTestJSON-513154376 tempest-DeleteServersAdminTestJSON-120490888] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_unplug" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:29:03.335 DEBUG oslo_concurrency.lockutils [req-f65183f1-f89d-40cc-8186-2cbc8ce07f97 tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" released by "_create_cached_image_impl" :: held 4.273s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:29:03.336 INFO nova.virt.xenapi.vm_utils [req-f65183f1-f89d-40cc-8186-2cbc8ce07f97 tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Image creation data, cacheable: True, downloaded: False duration: 4.29 secs for image a69d8d55-1745-492f-be26-cc75d64fc94d 2015-08-07 17:29:04.338 DEBUG nova.virt.xenapi.vmops [req-f65183f1-f89d-40cc-8186-2cbc8ce07f97 tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] Updating progress to 20 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:29:04.421 DEBUG oslo_concurrency.lockutils [req-53303822-dea8-4356-8123-1123d7cc9c42 tempest-DeleteServersAdminTestJSON-513154376 tempest-DeleteServersAdminTestJSON-120490888] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_unplug" :: held 1.146s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:29:04.429 DEBUG nova.virt.xenapi.vm_utils [req-53303822-dea8-4356-8123-1123d7cc9c42 tempest-DeleteServersAdminTestJSON-513154376 tempest-DeleteServersAdminTestJSON-120490888] Destroying VBD for VDI OpaqueRef:38a7cfbd-f40e-13bc-ec38-0e7fc3282f2f done. vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2215 2015-08-07 17:29:04.430 DEBUG nova.virt.xenapi.vm_utils [req-53303822-dea8-4356-8123-1123d7cc9c42 tempest-DeleteServersAdminTestJSON-513154376 tempest-DeleteServersAdminTestJSON-120490888] Creating disk-type VBD for VM OpaqueRef:0d5ef13b-535a-aa3b-26dd-13faab3fa2a1, VDI OpaqueRef:38a7cfbd-f40e-13bc-ec38-0e7fc3282f2f ... 
create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:29:04.440 DEBUG nova.virt.xenapi.vm_utils [req-53303822-dea8-4356-8123-1123d7cc9c42 tempest-DeleteServersAdminTestJSON-513154376 tempest-DeleteServersAdminTestJSON-120490888] Created VBD OpaqueRef:de6056a8-268b-6f6d-de8b-0b1fa0f22608 for VM OpaqueRef:0d5ef13b-535a-aa3b-26dd-13faab3fa2a1, VDI OpaqueRef:38a7cfbd-f40e-13bc-ec38-0e7fc3282f2f. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:29:04.442 DEBUG nova.objects.instance [req-53303822-dea8-4356-8123-1123d7cc9c42 tempest-DeleteServersAdminTestJSON-513154376 tempest-DeleteServersAdminTestJSON-120490888] Lazy-loading `pci_devices' on Instance uuid 7c6651a6-8cc4-4ac9-851c-637fafaf8705 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:29:04.550 DEBUG nova.virt.xenapi.vmops [req-f65183f1-f89d-40cc-8186-2cbc8ce07f97 tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] Updating progress to 30 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:29:04.568 DEBUG nova.virt.xenapi.vmops [req-53303822-dea8-4356-8123-1123d7cc9c42 tempest-DeleteServersAdminTestJSON-513154376 tempest-DeleteServersAdminTestJSON-120490888] [instance: 7c6651a6-8cc4-4ac9-851c-637fafaf8705] Updating progress to 50 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:29:04.749 DEBUG nova.virt.xenapi.vmops [req-f65183f1-f89d-40cc-8186-2cbc8ce07f97 tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] Using PV kernel: True _create_vm_record /opt/stack/new/nova/nova/virt/xenapi/vmops.py:679 2015-08-07 17:29:04.765 DEBUG nova.virt.xenapi.vm_utils [req-f65183f1-f89d-40cc-8186-2cbc8ce07f97 tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] Created VM create_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:312 2015-08-07 17:29:04.766 DEBUG nova.virt.xenapi.vmops [req-f65183f1-f89d-40cc-8186-2cbc8ce07f97 tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] Updating progress to 40 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:29:04.870 DEBUG oslo_concurrency.lockutils [req-53303822-dea8-4356-8123-1123d7cc9c42 tempest-DeleteServersAdminTestJSON-513154376 tempest-DeleteServersAdminTestJSON-120490888] Lock "xenstore-7c6651a6-8cc4-4ac9-851c-637fafaf8705" acquired by "store_meta" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:29:04.871 DEBUG oslo_concurrency.lockutils [req-53303822-dea8-4356-8123-1123d7cc9c42 tempest-DeleteServersAdminTestJSON-513154376 tempest-DeleteServersAdminTestJSON-120490888] Lock "xenstore-7c6651a6-8cc4-4ac9-851c-637fafaf8705" released by "store_meta" :: held 0.001s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:29:04.871 DEBUG oslo_concurrency.lockutils [req-53303822-dea8-4356-8123-1123d7cc9c42 tempest-DeleteServersAdminTestJSON-513154376 tempest-DeleteServersAdminTestJSON-120490888] Lock "xenstore-7c6651a6-8cc4-4ac9-851c-637fafaf8705" acquired by "store_auto_disk_config" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 
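The repeated 'Lock "xenstore-..." acquired by "..." :: waited 0.000s' / 'released by "..." :: held ...s' pairs above (store_meta, store_auto_disk_config, update_hostname, update_nwinfo, and the synchronized_plug/synchronized_unplug VBD operations) are emitted by oslo.concurrency's lockutils wrapper, which is what the inner .../oslo_concurrency/lockutils.py:251 and :262 references point at. A minimal sketch of that pattern, assuming only oslo.concurrency and using the lock name seen in the log as an illustration (this is not Nova's actual code):

    from oslo_concurrency import lockutils

    # Illustrative lock name; in the log above it is "xenstore-" plus the
    # instance UUID, e.g. xenstore-7c6651a6-8cc4-4ac9-851c-637fafaf8705.
    XENSTORE_LOCK = 'xenstore-7c6651a6-8cc4-4ac9-851c-637fafaf8705'

    @lockutils.synchronized(XENSTORE_LOCK)
    def store_auto_disk_config(session, vm_ref, value):
        # The body runs only while the named in-process lock is held; the
        # lockutils wrapper logs the "acquired by ... :: waited" and
        # "released by ... :: held" lines shown in the log.
        pass

    # The context-manager form gives the same serialization:
    # with lockutils.lock(XENSTORE_LOCK):
    #     ...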
2015-08-07 17:29:04.889 DEBUG oslo_concurrency.lockutils [req-53303822-dea8-4356-8123-1123d7cc9c42 tempest-DeleteServersAdminTestJSON-513154376 tempest-DeleteServersAdminTestJSON-120490888] Lock "xenstore-7c6651a6-8cc4-4ac9-851c-637fafaf8705" released by "store_auto_disk_config" :: held 0.017s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:29:04.890 DEBUG nova.virt.xenapi.vmops [req-53303822-dea8-4356-8123-1123d7cc9c42 tempest-DeleteServersAdminTestJSON-513154376 tempest-DeleteServersAdminTestJSON-120490888] [instance: 7c6651a6-8cc4-4ac9-851c-637fafaf8705] Injecting hostname (tempest.common.compute-instance-1903460814) into xenstore _inject_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1940 2015-08-07 17:29:04.891 DEBUG oslo_concurrency.lockutils [req-53303822-dea8-4356-8123-1123d7cc9c42 tempest-DeleteServersAdminTestJSON-513154376 tempest-DeleteServersAdminTestJSON-120490888] Lock "xenstore-7c6651a6-8cc4-4ac9-851c-637fafaf8705" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:29:04.909 DEBUG oslo_concurrency.lockutils [req-53303822-dea8-4356-8123-1123d7cc9c42 tempest-DeleteServersAdminTestJSON-513154376 tempest-DeleteServersAdminTestJSON-120490888] Lock "xenstore-7c6651a6-8cc4-4ac9-851c-637fafaf8705" released by "update_hostname" :: held 0.018s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:29:04.910 DEBUG nova.virt.xenapi.vmops [req-53303822-dea8-4356-8123-1123d7cc9c42 tempest-DeleteServersAdminTestJSON-513154376 tempest-DeleteServersAdminTestJSON-120490888] [instance: 7c6651a6-8cc4-4ac9-851c-637fafaf8705] Injecting network info to xenstore inject_network_info /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1869 2015-08-07 17:29:04.911 DEBUG oslo_concurrency.lockutils [req-53303822-dea8-4356-8123-1123d7cc9c42 tempest-DeleteServersAdminTestJSON-513154376 tempest-DeleteServersAdminTestJSON-120490888] Lock "xenstore-7c6651a6-8cc4-4ac9-851c-637fafaf8705" acquired by "update_nwinfo" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:29:04.923 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.92 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:29:05.004 DEBUG nova.virt.xenapi.vmops [req-f65183f1-f89d-40cc-8186-2cbc8ce07f97 tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] Auto configuring disk, attempting to resize root disk... _attach_disks /opt/stack/new/nova/nova/virt/xenapi/vmops.py:721 2015-08-07 17:29:05.005 DEBUG nova.virt.xenapi.vm_utils [req-f65183f1-f89d-40cc-8186-2cbc8ce07f97 tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Skipping auto_config_disk as destination size is 0GB _auto_configure_disk /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:960 2015-08-07 17:29:05.005 DEBUG nova.virt.xenapi.vm_utils [req-f65183f1-f89d-40cc-8186-2cbc8ce07f97 tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Creating disk-type VBD for VM OpaqueRef:50852854-b7ff-b8c5-45ad-e52af66db69f, VDI OpaqueRef:e4633006-a493-e26c-bedd-93fb0b893632 ... 
create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:29:05.013 DEBUG nova.virt.xenapi.vm_utils [req-f65183f1-f89d-40cc-8186-2cbc8ce07f97 tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Created VBD OpaqueRef:6071a7a7-3ab9-9025-d907-eada0cb1eb31 for VM OpaqueRef:50852854-b7ff-b8c5-45ad-e52af66db69f, VDI OpaqueRef:e4633006-a493-e26c-bedd-93fb0b893632. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:29:05.158 DEBUG oslo_concurrency.lockutils [req-53303822-dea8-4356-8123-1123d7cc9c42 tempest-DeleteServersAdminTestJSON-513154376 tempest-DeleteServersAdminTestJSON-120490888] Lock "xenstore-7c6651a6-8cc4-4ac9-851c-637fafaf8705" released by "update_nwinfo" :: held 0.247s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:29:05.159 DEBUG nova.virt.xenapi.vmops [req-53303822-dea8-4356-8123-1123d7cc9c42 tempest-DeleteServersAdminTestJSON-513154376 tempest-DeleteServersAdminTestJSON-120490888] [instance: 7c6651a6-8cc4-4ac9-851c-637fafaf8705] Updating progress to 60 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:29:05.421 DEBUG nova.virt.xenapi.vmops [req-53303822-dea8-4356-8123-1123d7cc9c42 tempest-DeleteServersAdminTestJSON-513154376 tempest-DeleteServersAdminTestJSON-120490888] [instance: 7c6651a6-8cc4-4ac9-851c-637fafaf8705] Creating vifs _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1892 2015-08-07 17:29:05.434 DEBUG nova.virt.xenapi.vmops [req-53303822-dea8-4356-8123-1123d7cc9c42 tempest-DeleteServersAdminTestJSON-513154376 tempest-DeleteServersAdminTestJSON-120490888] [instance: 7c6651a6-8cc4-4ac9-851c-637fafaf8705] Creating VIF for network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1902 2015-08-07 17:29:05.444 DEBUG nova.virt.xenapi.vmops [req-53303822-dea8-4356-8123-1123d7cc9c42 tempest-DeleteServersAdminTestJSON-513154376 tempest-DeleteServersAdminTestJSON-120490888] [instance: 7c6651a6-8cc4-4ac9-851c-637fafaf8705] Created VIF OpaqueRef:f2f2849c-4868-40d6-c78b-760b05a166fb, network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1906 2015-08-07 17:29:05.445 DEBUG nova.virt.xenapi.vmops [req-53303822-dea8-4356-8123-1123d7cc9c42 tempest-DeleteServersAdminTestJSON-513154376 tempest-DeleteServersAdminTestJSON-120490888] [instance: 7c6651a6-8cc4-4ac9-851c-637fafaf8705] Updating progress to 70 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:29:05.474 DEBUG nova.virt.xenapi.vm_utils [req-f65183f1-f89d-40cc-8186-2cbc8ce07f97 tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Created VDI OpaqueRef:b47eefaa-a33b-bf92-d6ee-95bbf2ed69d1 (config-2, 67108864, False) on OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99. create_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:509 2015-08-07 17:29:05.484 DEBUG nova.virt.xenapi.vm_utils [req-f65183f1-f89d-40cc-8186-2cbc8ce07f97 tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Creating disk-type VBD for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:b47eefaa-a33b-bf92-d6ee-95bbf2ed69d1 ... 
create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:29:05.499 DEBUG nova.virt.xenapi.vm_utils [req-f65183f1-f89d-40cc-8186-2cbc8ce07f97 tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Created VBD OpaqueRef:2d597014-b1dd-493a-bb2d-ddae0b5269e6 for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:b47eefaa-a33b-bf92-d6ee-95bbf2ed69d1. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:29:05.499 DEBUG nova.virt.xenapi.vm_utils [req-f65183f1-f89d-40cc-8186-2cbc8ce07f97 tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Plugging VBD OpaqueRef:2d597014-b1dd-493a-bb2d-ddae0b5269e6 ... vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2191 2015-08-07 17:29:05.500 DEBUG oslo_concurrency.lockutils [req-f65183f1-f89d-40cc-8186-2cbc8ce07f97 tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_plug" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:29:05.732 DEBUG nova.virt.xenapi.vmops [req-53303822-dea8-4356-8123-1123d7cc9c42 tempest-DeleteServersAdminTestJSON-513154376 tempest-DeleteServersAdminTestJSON-120490888] [instance: 7c6651a6-8cc4-4ac9-851c-637fafaf8705] Starting instance _start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:327 2015-08-07 17:29:07.733 DEBUG oslo_concurrency.lockutils [req-f65183f1-f89d-40cc-8186-2cbc8ce07f97 tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_plug" :: held 2.233s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:29:07.734 DEBUG nova.virt.xenapi.vm_utils [req-f65183f1-f89d-40cc-8186-2cbc8ce07f97 tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Plugging VBD OpaqueRef:2d597014-b1dd-493a-bb2d-ddae0b5269e6 done. vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2194 2015-08-07 17:29:07.738 DEBUG nova.virt.xenapi.vm_utils [req-f65183f1-f89d-40cc-8186-2cbc8ce07f97 tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] VBD OpaqueRef:2d597014-b1dd-493a-bb2d-ddae0b5269e6 plugged as xvdc vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2197 2015-08-07 17:29:07.856 WARNING nova.virt.configdrive [req-f65183f1-f89d-40cc-8186-2cbc8ce07f97 tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] The setting "always" will be deprecated in the Liberty version. 
Please use "True" instead 2015-08-07 17:29:07.857 DEBUG nova.objects.instance [req-f65183f1-f89d-40cc-8186-2cbc8ce07f97 tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Lazy-loading `ec2_ids' on Instance uuid 8e5ee78c-20e2-4483-ab75-b109bb2fdca6 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:29:07.896 DEBUG oslo_concurrency.processutils [req-f65183f1-f89d-40cc-8186-2cbc8ce07f97 tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Running cmd (subprocess): genisoimage -o /tmp/tmpFXHa0L/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmpHgBy2I execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:29:08.052 DEBUG oslo_concurrency.processutils [req-f65183f1-f89d-40cc-8186-2cbc8ce07f97 tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] CMD "genisoimage -o /tmp/tmpFXHa0L/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmpHgBy2I" returned: 0 in 0.156s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:29:08.058 DEBUG oslo_concurrency.processutils [req-f65183f1-f89d-40cc-8186-2cbc8ce07f97 tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmpFXHa0L/configdrive of=/dev/xvdc oflag=direct,sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:29:11.404 DEBUG nova.virt.xenapi.vmops [req-c0cfed15-976d-49b3-b6e6-e8048cd30998 tempest-ImagesOneServerTestJSON-803357424 tempest-ImagesOneServerTestJSON-1410599508] [instance: 1cff7b75-3705-4f14-9e5a-546d1797c17f] Finished snapshot and upload for VM, duration: 34.71 secs for image 4e3f1ef7-737d-4be3-9609-2f9f376419ec snapshot /opt/stack/new/nova/nova/virt/xenapi/vmops.py:894 2015-08-07 17:29:14.254 DEBUG oslo_concurrency.lockutils [req-7d306f79-9992-4fe6-81fd-82bcc1911158 tempest-ImagesOneServerTestJSON-803357424 tempest-ImagesOneServerTestJSON-1410599508] Lock "1cff7b75-3705-4f14-9e5a-546d1797c17f" acquired by "do_terminate_instance" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:29:14.255 DEBUG oslo_concurrency.lockutils [req-7d306f79-9992-4fe6-81fd-82bcc1911158 tempest-ImagesOneServerTestJSON-803357424 tempest-ImagesOneServerTestJSON-1410599508] Lock "1cff7b75-3705-4f14-9e5a-546d1797c17f-events" acquired by "_clear_events" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:29:14.256 DEBUG oslo_concurrency.lockutils [req-7d306f79-9992-4fe6-81fd-82bcc1911158 tempest-ImagesOneServerTestJSON-803357424 tempest-ImagesOneServerTestJSON-1410599508] Lock "1cff7b75-3705-4f14-9e5a-546d1797c17f-events" released by "_clear_events" :: held 0.001s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:29:14.259 INFO nova.compute.manager [req-7d306f79-9992-4fe6-81fd-82bcc1911158 tempest-ImagesOneServerTestJSON-803357424 tempest-ImagesOneServerTestJSON-1410599508] [instance: 1cff7b75-3705-4f14-9e5a-546d1797c17f] Terminating instance 2015-08-07 17:29:14.261 INFO nova.virt.xenapi.vmops [req-7d306f79-9992-4fe6-81fd-82bcc1911158 tempest-ImagesOneServerTestJSON-803357424 
tempest-ImagesOneServerTestJSON-1410599508] [instance: 1cff7b75-3705-4f14-9e5a-546d1797c17f] Destroying VM 2015-08-07 17:29:14.273 DEBUG nova.virt.xenapi.vm_utils [req-7d306f79-9992-4fe6-81fd-82bcc1911158 tempest-ImagesOneServerTestJSON-803357424 tempest-ImagesOneServerTestJSON-1410599508] [instance: 1cff7b75-3705-4f14-9e5a-546d1797c17f] Shutting down VM (hard) hard_shutdown_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:348 2015-08-07 17:29:14.919 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.92 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:29:16.787 DEBUG oslo_concurrency.processutils [req-f65183f1-f89d-40cc-8186-2cbc8ce07f97 tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmpFXHa0L/configdrive of=/dev/xvdc oflag=direct,sync" returned: 0 in 8.728s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:29:16.817 DEBUG oslo_concurrency.processutils [req-f65183f1-f89d-40cc-8186-2cbc8ce07f97 tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:29:17.509 DEBUG nova.virt.xenapi.vmops [req-53303822-dea8-4356-8123-1123d7cc9c42 tempest-DeleteServersAdminTestJSON-513154376 tempest-DeleteServersAdminTestJSON-120490888] [instance: 7c6651a6-8cc4-4ac9-851c-637fafaf8705] Waiting for instance state to become running _wait_for_instance_to_start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:777 2015-08-07 17:29:17.832 DEBUG nova.virt.xenapi.vmops [req-53303822-dea8-4356-8123-1123d7cc9c42 tempest-DeleteServersAdminTestJSON-513154376 tempest-DeleteServersAdminTestJSON-120490888] [instance: 7c6651a6-8cc4-4ac9-851c-637fafaf8705] Updating progress to 80 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:29:18.684 DEBUG oslo_concurrency.processutils [req-f65183f1-f89d-40cc-8186-2cbc8ce07f97 tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf sync" returned: 0 in 1.868s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:29:18.691 DEBUG nova.virt.xenapi.vm_utils [req-f65183f1-f89d-40cc-8186-2cbc8ce07f97 tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Destroying VBD for VDI OpaqueRef:b47eefaa-a33b-bf92-d6ee-95bbf2ed69d1 ... vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2207 2015-08-07 17:29:18.693 DEBUG oslo_concurrency.lockutils [req-f65183f1-f89d-40cc-8186-2cbc8ce07f97 tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_unplug" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:29:19.441 DEBUG nova.virt.xenapi.vmops [req-53303822-dea8-4356-8123-1123d7cc9c42 tempest-DeleteServersAdminTestJSON-513154376 tempest-DeleteServersAdminTestJSON-120490888] [instance: 7c6651a6-8cc4-4ac9-851c-637fafaf8705] Skip agent setup, not enabled. 
_configure_new_instance_with_agent /opt/stack/new/nova/nova/virt/xenapi/vmops.py:788 2015-08-07 17:29:19.442 DEBUG nova.virt.xenapi.vmops [req-53303822-dea8-4356-8123-1123d7cc9c42 tempest-DeleteServersAdminTestJSON-513154376 tempest-DeleteServersAdminTestJSON-120490888] [instance: 7c6651a6-8cc4-4ac9-851c-637fafaf8705] Removing hostname from xenstore _remove_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1949 2015-08-07 17:29:19.443 DEBUG oslo_concurrency.lockutils [req-53303822-dea8-4356-8123-1123d7cc9c42 tempest-DeleteServersAdminTestJSON-513154376 tempest-DeleteServersAdminTestJSON-120490888] Lock "xenstore-7c6651a6-8cc4-4ac9-851c-637fafaf8705" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:29:19.450 DEBUG oslo_concurrency.lockutils [req-53303822-dea8-4356-8123-1123d7cc9c42 tempest-DeleteServersAdminTestJSON-513154376 tempest-DeleteServersAdminTestJSON-120490888] Lock "xenstore-7c6651a6-8cc4-4ac9-851c-637fafaf8705" released by "update_hostname" :: held 0.007s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:29:19.451 DEBUG nova.virt.xenapi.vmops [req-53303822-dea8-4356-8123-1123d7cc9c42 tempest-DeleteServersAdminTestJSON-513154376 tempest-DeleteServersAdminTestJSON-120490888] [instance: 7c6651a6-8cc4-4ac9-851c-637fafaf8705] Updating progress to 90 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:29:19.791 DEBUG nova.virt.xenapi.vmops [req-53303822-dea8-4356-8123-1123d7cc9c42 tempest-DeleteServersAdminTestJSON-513154376 tempest-DeleteServersAdminTestJSON-120490888] [instance: 7c6651a6-8cc4-4ac9-851c-637fafaf8705] Updating progress to 100 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:29:20.036 DEBUG nova.compute.manager [req-53303822-dea8-4356-8123-1123d7cc9c42 tempest-DeleteServersAdminTestJSON-513154376 tempest-DeleteServersAdminTestJSON-120490888] [instance: 7c6651a6-8cc4-4ac9-851c-637fafaf8705] Checking state _get_power_state /opt/stack/new/nova/nova/compute/manager.py:1296 2015-08-07 17:29:20.156 DEBUG oslo_concurrency.lockutils [req-f65183f1-f89d-40cc-8186-2cbc8ce07f97 tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_unplug" :: held 1.464s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:29:20.165 DEBUG nova.virt.xenapi.vm_utils [req-f65183f1-f89d-40cc-8186-2cbc8ce07f97 tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Destroying VBD for VDI OpaqueRef:b47eefaa-a33b-bf92-d6ee-95bbf2ed69d1 done. vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2215 2015-08-07 17:29:20.166 DEBUG nova.virt.xenapi.vm_utils [req-f65183f1-f89d-40cc-8186-2cbc8ce07f97 tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Creating disk-type VBD for VM OpaqueRef:50852854-b7ff-b8c5-45ad-e52af66db69f, VDI OpaqueRef:b47eefaa-a33b-bf92-d6ee-95bbf2ed69d1 ... 
create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:29:20.174 DEBUG nova.virt.xenapi.vm_utils [req-f65183f1-f89d-40cc-8186-2cbc8ce07f97 tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Created VBD OpaqueRef:5d79c330-17c1-8294-e816-b977b952badb for VM OpaqueRef:50852854-b7ff-b8c5-45ad-e52af66db69f, VDI OpaqueRef:b47eefaa-a33b-bf92-d6ee-95bbf2ed69d1. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:29:20.176 DEBUG nova.objects.instance [req-f65183f1-f89d-40cc-8186-2cbc8ce07f97 tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Lazy-loading `pci_devices' on Instance uuid 8e5ee78c-20e2-4483-ab75-b109bb2fdca6 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:29:20.289 DEBUG nova.virt.xenapi.vmops [req-f65183f1-f89d-40cc-8186-2cbc8ce07f97 tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] Updating progress to 50 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:29:20.432 DEBUG oslo_concurrency.lockutils [req-53303822-dea8-4356-8123-1123d7cc9c42 tempest-DeleteServersAdminTestJSON-513154376 tempest-DeleteServersAdminTestJSON-120490888] Lock "7c6651a6-8cc4-4ac9-851c-637fafaf8705" released by "_locked_do_build_and_run_instance" :: held 34.629s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:29:20.527 DEBUG oslo_concurrency.lockutils [req-f65183f1-f89d-40cc-8186-2cbc8ce07f97 tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Lock "xenstore-8e5ee78c-20e2-4483-ab75-b109bb2fdca6" acquired by "store_meta" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:29:20.528 DEBUG oslo_concurrency.lockutils [req-f65183f1-f89d-40cc-8186-2cbc8ce07f97 tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Lock "xenstore-8e5ee78c-20e2-4483-ab75-b109bb2fdca6" released by "store_meta" :: held 0.001s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:29:20.528 DEBUG oslo_concurrency.lockutils [req-f65183f1-f89d-40cc-8186-2cbc8ce07f97 tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Lock "xenstore-8e5ee78c-20e2-4483-ab75-b109bb2fdca6" acquired by "store_auto_disk_config" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:29:20.536 DEBUG oslo_concurrency.lockutils [req-f65183f1-f89d-40cc-8186-2cbc8ce07f97 tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Lock "xenstore-8e5ee78c-20e2-4483-ab75-b109bb2fdca6" released by "store_auto_disk_config" :: held 0.008s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:29:20.537 DEBUG nova.virt.xenapi.vmops [req-f65183f1-f89d-40cc-8186-2cbc8ce07f97 tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] Injecting hostname (tempest.common.compute-instance-738647160) into xenstore _inject_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1940 2015-08-07 17:29:20.538 DEBUG oslo_concurrency.lockutils [req-f65183f1-f89d-40cc-8186-2cbc8ce07f97 tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] 
Lock "xenstore-8e5ee78c-20e2-4483-ab75-b109bb2fdca6" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:29:20.546 DEBUG oslo_concurrency.lockutils [req-f65183f1-f89d-40cc-8186-2cbc8ce07f97 tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Lock "xenstore-8e5ee78c-20e2-4483-ab75-b109bb2fdca6" released by "update_hostname" :: held 0.008s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:29:20.547 DEBUG nova.virt.xenapi.vmops [req-f65183f1-f89d-40cc-8186-2cbc8ce07f97 tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] Injecting network info to xenstore inject_network_info /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1869 2015-08-07 17:29:20.548 DEBUG oslo_concurrency.lockutils [req-f65183f1-f89d-40cc-8186-2cbc8ce07f97 tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Lock "xenstore-8e5ee78c-20e2-4483-ab75-b109bb2fdca6" acquired by "update_nwinfo" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:29:20.733 DEBUG oslo_concurrency.lockutils [req-f65183f1-f89d-40cc-8186-2cbc8ce07f97 tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Lock "xenstore-8e5ee78c-20e2-4483-ab75-b109bb2fdca6" released by "update_nwinfo" :: held 0.186s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:29:20.734 DEBUG nova.virt.xenapi.vmops [req-f65183f1-f89d-40cc-8186-2cbc8ce07f97 tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] Updating progress to 60 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:29:20.952 DEBUG nova.virt.xenapi.vmops [req-f65183f1-f89d-40cc-8186-2cbc8ce07f97 tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] Creating vifs _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1892 2015-08-07 17:29:20.963 DEBUG nova.virt.xenapi.vmops [req-f65183f1-f89d-40cc-8186-2cbc8ce07f97 tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] Creating VIF for network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1902 2015-08-07 17:29:20.971 DEBUG nova.virt.xenapi.vmops [req-f65183f1-f89d-40cc-8186-2cbc8ce07f97 tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] Created VIF OpaqueRef:2efbf307-36be-cfcd-64e1-eab95affaaea, network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1906 2015-08-07 17:29:20.972 DEBUG nova.virt.xenapi.vmops [req-f65183f1-f89d-40cc-8186-2cbc8ce07f97 tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] Updating progress to 70 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:29:21.179 DEBUG nova.virt.xenapi.vmops [req-f65183f1-f89d-40cc-8186-2cbc8ce07f97 tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] 
[instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] Starting instance _start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:327 2015-08-07 17:29:22.352 DEBUG oslo_concurrency.lockutils [req-02432b64-24e2-442a-bc20-56f8549bd7ca tempest-DeleteServersAdminTestJSON-2048663257 tempest-DeleteServersAdminTestJSON-2144420331] Lock "7c6651a6-8cc4-4ac9-851c-637fafaf8705" acquired by "do_terminate_instance" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:29:22.354 DEBUG oslo_concurrency.lockutils [req-02432b64-24e2-442a-bc20-56f8549bd7ca tempest-DeleteServersAdminTestJSON-2048663257 tempest-DeleteServersAdminTestJSON-2144420331] Lock "7c6651a6-8cc4-4ac9-851c-637fafaf8705-events" acquired by "_clear_events" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:29:22.354 DEBUG oslo_concurrency.lockutils [req-02432b64-24e2-442a-bc20-56f8549bd7ca tempest-DeleteServersAdminTestJSON-2048663257 tempest-DeleteServersAdminTestJSON-2144420331] Lock "7c6651a6-8cc4-4ac9-851c-637fafaf8705-events" released by "_clear_events" :: held 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:29:22.356 INFO nova.compute.manager [req-02432b64-24e2-442a-bc20-56f8549bd7ca tempest-DeleteServersAdminTestJSON-2048663257 tempest-DeleteServersAdminTestJSON-2144420331] [instance: 7c6651a6-8cc4-4ac9-851c-637fafaf8705] Terminating instance 2015-08-07 17:29:22.358 INFO nova.virt.xenapi.vmops [req-02432b64-24e2-442a-bc20-56f8549bd7ca tempest-DeleteServersAdminTestJSON-2048663257 tempest-DeleteServersAdminTestJSON-2144420331] [instance: 7c6651a6-8cc4-4ac9-851c-637fafaf8705] Destroying VM 2015-08-07 17:29:22.367 DEBUG nova.virt.xenapi.vm_utils [req-02432b64-24e2-442a-bc20-56f8549bd7ca tempest-DeleteServersAdminTestJSON-2048663257 tempest-DeleteServersAdminTestJSON-2144420331] [instance: 7c6651a6-8cc4-4ac9-851c-637fafaf8705] Shutting down VM (hard) hard_shutdown_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:348 2015-08-07 17:29:24.783 DEBUG nova.virt.xenapi.vmops [req-7d306f79-9992-4fe6-81fd-82bcc1911158 tempest-ImagesOneServerTestJSON-803357424 tempest-ImagesOneServerTestJSON-1410599508] [instance: 1cff7b75-3705-4f14-9e5a-546d1797c17f] Destroying VDIs _destroy_vdis /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1428 2015-08-07 17:29:24.794 DEBUG nova.virt.xenapi.vm_utils [req-7d306f79-9992-4fe6-81fd-82bcc1911158 tempest-ImagesOneServerTestJSON-803357424 tempest-ImagesOneServerTestJSON-1410599508] VDI 3927ae0f-627f-4fa4-89d1-46266f19f9fe is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:29:24.807 DEBUG nova.virt.xenapi.vm_utils [req-7d306f79-9992-4fe6-81fd-82bcc1911158 tempest-ImagesOneServerTestJSON-803357424 tempest-ImagesOneServerTestJSON-1410599508] VDI 442e9646-bc80-4e8b-a567-13cc23079969 is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:29:25.023 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.82 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:29:25.958 DEBUG nova.virt.xenapi.vmops [req-7d306f79-9992-4fe6-81fd-82bcc1911158 tempest-ImagesOneServerTestJSON-803357424 tempest-ImagesOneServerTestJSON-1410599508] [instance: 1cff7b75-3705-4f14-9e5a-546d1797c17f] Using RAW or VHD, skipping kernel and ramdisk deletion _destroy_kernel_ramdisk 
/opt/stack/new/nova/nova/virt/xenapi/vmops.py:1456 2015-08-07 17:29:25.976 DEBUG nova.virt.xenapi.vm_utils [req-7d306f79-9992-4fe6-81fd-82bcc1911158 tempest-ImagesOneServerTestJSON-803357424 tempest-ImagesOneServerTestJSON-1410599508] [instance: 1cff7b75-3705-4f14-9e5a-546d1797c17f] VM destroyed destroy_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:324 2015-08-07 17:29:25.977 DEBUG nova.compute.manager [req-7d306f79-9992-4fe6-81fd-82bcc1911158 tempest-ImagesOneServerTestJSON-803357424 tempest-ImagesOneServerTestJSON-1410599508] [instance: 1cff7b75-3705-4f14-9e5a-546d1797c17f] Deallocating network for instance _deallocate_network /opt/stack/new/nova/nova/compute/manager.py:1792 2015-08-07 17:29:26.160 DEBUG nova.virt.xenapi.vmops [req-02432b64-24e2-442a-bc20-56f8549bd7ca tempest-DeleteServersAdminTestJSON-2048663257 tempest-DeleteServersAdminTestJSON-2144420331] [instance: 7c6651a6-8cc4-4ac9-851c-637fafaf8705] Destroying VDIs _destroy_vdis /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1428 2015-08-07 17:29:26.171 DEBUG nova.virt.xenapi.vm_utils [req-02432b64-24e2-442a-bc20-56f8549bd7ca tempest-DeleteServersAdminTestJSON-2048663257 tempest-DeleteServersAdminTestJSON-2144420331] VDI 20e597d4-1788-4b71-bdb6-90200f19975b is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:29:26.186 DEBUG nova.virt.xenapi.vm_utils [req-02432b64-24e2-442a-bc20-56f8549bd7ca tempest-DeleteServersAdminTestJSON-2048663257 tempest-DeleteServersAdminTestJSON-2144420331] VDI ad433b69-7324-4795-8268-61b32d1183e5 is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:29:27.359 DEBUG nova.virt.xenapi.vmops [req-02432b64-24e2-442a-bc20-56f8549bd7ca tempest-DeleteServersAdminTestJSON-2048663257 tempest-DeleteServersAdminTestJSON-2144420331] [instance: 7c6651a6-8cc4-4ac9-851c-637fafaf8705] Using RAW or VHD, skipping kernel and ramdisk deletion _destroy_kernel_ramdisk /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1456 2015-08-07 17:29:27.378 DEBUG nova.virt.xenapi.vm_utils [req-02432b64-24e2-442a-bc20-56f8549bd7ca tempest-DeleteServersAdminTestJSON-2048663257 tempest-DeleteServersAdminTestJSON-2144420331] [instance: 7c6651a6-8cc4-4ac9-851c-637fafaf8705] VM destroyed destroy_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:324 2015-08-07 17:29:27.379 DEBUG nova.compute.manager [req-02432b64-24e2-442a-bc20-56f8549bd7ca tempest-DeleteServersAdminTestJSON-2048663257 tempest-DeleteServersAdminTestJSON-2144420331] [instance: 7c6651a6-8cc4-4ac9-851c-637fafaf8705] Deallocating network for instance _deallocate_network /opt/stack/new/nova/nova/compute/manager.py:1792 2015-08-07 17:29:27.514 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:29:27.598 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 0.01 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:29:27.605 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:29:27.606 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None 
None] Starting heal instance info cache _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5315 2015-08-07 17:29:27.606 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Rebuilding the list of instances to heal _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5319 2015-08-07 17:29:27.684 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: 1cff7b75-3705-4f14-9e5a-546d1797c17f] Skipping network cache update for instance because it is being deleted. _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5333 2015-08-07 17:29:27.685 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: 7c6651a6-8cc4-4ac9-851c-637fafaf8705] Skipping network cache update for instance because it is being deleted. _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5333 2015-08-07 17:29:27.686 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Acquired semaphore "refresh_cache-8e5ee78c-20e2-4483-ab75-b109bb2fdca6" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:197 2015-08-07 17:29:27.687 DEBUG nova.objects.instance [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lazy-loading `flavor' on Instance uuid 8e5ee78c-20e2-4483-ab75-b109bb2fdca6 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:29:28.089 DEBUG nova.network.base_api [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] Updating instance_info_cache with network_info: [VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.12'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:87:f6:bc', 'active': False, 'type': u'bridge', 'id': u'0082deb6-303e-4fa7-969e-93e52d4e94f6', 'qbg_params': None})] update_instance_cache_with_nw_info /opt/stack/new/nova/nova/network/base_api.py:43 2015-08-07 17:29:28.120 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Releasing semaphore "refresh_cache-8e5ee78c-20e2-4483-ab75-b109bb2fdca6" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:210 2015-08-07 17:29:28.121 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] Updated the network info_cache for instance _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5376 2015-08-07 17:29:28.122 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 0.91 seconds _run_loop 
/usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:29:28.140 DEBUG nova.compute.manager [req-7d306f79-9992-4fe6-81fd-82bcc1911158 tempest-ImagesOneServerTestJSON-803357424 tempest-ImagesOneServerTestJSON-1410599508] [instance: 1cff7b75-3705-4f14-9e5a-546d1797c17f] terminating bdm BlockDeviceMapping(boot_index=0,connection_info=None,created_at=2015-08-07T17:27:30Z,delete_on_termination=True,deleted=False,deleted_at=None,destination_type='local',device_name='/dev/xvda',device_type='disk',disk_bus=None,guest_format=None,id=25,image_id='a69d8d55-1745-492f-be26-cc75d64fc94d',instance=,instance_uuid=1cff7b75-3705-4f14-9e5a-546d1797c17f,no_device=False,snapshot_id=None,source_type='image',updated_at=2015-08-07T17:27:33Z,volume_id=None,volume_size=None) _cleanup_volumes /opt/stack/new/nova/nova/compute/manager.py:2272 2015-08-07 17:29:28.414 DEBUG oslo_concurrency.lockutils [req-7d306f79-9992-4fe6-81fd-82bcc1911158 tempest-ImagesOneServerTestJSON-803357424 tempest-ImagesOneServerTestJSON-1410599508] Lock "compute_resources" acquired by "update_usage" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:29:28.416 DEBUG nova.objects.instance [req-7d306f79-9992-4fe6-81fd-82bcc1911158 tempest-ImagesOneServerTestJSON-803357424 tempest-ImagesOneServerTestJSON-1410599508] Lazy-loading `numa_topology' on Instance uuid 1cff7b75-3705-4f14-9e5a-546d1797c17f obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:29:28.574 DEBUG oslo_concurrency.lockutils [req-7d306f79-9992-4fe6-81fd-82bcc1911158 tempest-ImagesOneServerTestJSON-803357424 tempest-ImagesOneServerTestJSON-1410599508] Lock "compute_resources" released by "update_usage" :: held 0.160s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:29:29.037 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:29:29.038 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:29:29.039 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 3.47 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:29:29.069 DEBUG oslo_concurrency.lockutils [req-7d306f79-9992-4fe6-81fd-82bcc1911158 tempest-ImagesOneServerTestJSON-803357424 tempest-ImagesOneServerTestJSON-1410599508] Lock "1cff7b75-3705-4f14-9e5a-546d1797c17f" released by "do_terminate_instance" :: held 14.816s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:29:29.679 DEBUG nova.compute.manager [req-02432b64-24e2-442a-bc20-56f8549bd7ca tempest-DeleteServersAdminTestJSON-2048663257 tempest-DeleteServersAdminTestJSON-2144420331] [instance: 7c6651a6-8cc4-4ac9-851c-637fafaf8705] terminating bdm 
BlockDeviceMapping(boot_index=0,connection_info=None,created_at=2015-08-07T17:28:45Z,delete_on_termination=True,deleted=False,deleted_at=None,destination_type='local',device_name='/dev/xvda',device_type='disk',disk_bus=None,guest_format=None,id=28,image_id='a69d8d55-1745-492f-be26-cc75d64fc94d',instance=,instance_uuid=7c6651a6-8cc4-4ac9-851c-637fafaf8705,no_device=False,snapshot_id=None,source_type='image',updated_at=2015-08-07T17:28:46Z,volume_id=None,volume_size=None) _cleanup_volumes /opt/stack/new/nova/nova/compute/manager.py:2272 2015-08-07 17:29:29.930 DEBUG nova.virt.xenapi.vmops [req-f65183f1-f89d-40cc-8186-2cbc8ce07f97 tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] Waiting for instance state to become running _wait_for_instance_to_start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:777 2015-08-07 17:29:29.948 DEBUG nova.virt.xenapi.vmops [req-f65183f1-f89d-40cc-8186-2cbc8ce07f97 tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] Updating progress to 80 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:29:30.227 DEBUG oslo_concurrency.lockutils [req-02432b64-24e2-442a-bc20-56f8549bd7ca tempest-DeleteServersAdminTestJSON-2048663257 tempest-DeleteServersAdminTestJSON-2144420331] Lock "compute_resources" acquired by "update_usage" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:29:30.228 DEBUG nova.objects.instance [req-02432b64-24e2-442a-bc20-56f8549bd7ca tempest-DeleteServersAdminTestJSON-2048663257 tempest-DeleteServersAdminTestJSON-2144420331] Lazy-loading `numa_topology' on Instance uuid 7c6651a6-8cc4-4ac9-851c-637fafaf8705 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:29:30.321 DEBUG oslo_concurrency.lockutils [req-02432b64-24e2-442a-bc20-56f8549bd7ca tempest-DeleteServersAdminTestJSON-2048663257 tempest-DeleteServersAdminTestJSON-2144420331] Lock "compute_resources" released by "update_usage" :: held 0.095s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:29:30.347 DEBUG nova.virt.xenapi.vmops [req-f65183f1-f89d-40cc-8186-2cbc8ce07f97 tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] Skip agent setup, not enabled. 
_configure_new_instance_with_agent /opt/stack/new/nova/nova/virt/xenapi/vmops.py:788 2015-08-07 17:29:30.348 DEBUG nova.virt.xenapi.vmops [req-f65183f1-f89d-40cc-8186-2cbc8ce07f97 tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] Removing hostname from xenstore _remove_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1949 2015-08-07 17:29:30.348 DEBUG oslo_concurrency.lockutils [req-f65183f1-f89d-40cc-8186-2cbc8ce07f97 tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Lock "xenstore-8e5ee78c-20e2-4483-ab75-b109bb2fdca6" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:29:30.359 DEBUG oslo_concurrency.lockutils [req-f65183f1-f89d-40cc-8186-2cbc8ce07f97 tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Lock "xenstore-8e5ee78c-20e2-4483-ab75-b109bb2fdca6" released by "update_hostname" :: held 0.011s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:29:30.360 DEBUG nova.virt.xenapi.vmops [req-f65183f1-f89d-40cc-8186-2cbc8ce07f97 tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] Updating progress to 90 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:29:30.681 DEBUG nova.virt.xenapi.vmops [req-f65183f1-f89d-40cc-8186-2cbc8ce07f97 tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] Updating progress to 100 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:29:30.792 DEBUG oslo_concurrency.lockutils [req-02432b64-24e2-442a-bc20-56f8549bd7ca tempest-DeleteServersAdminTestJSON-2048663257 tempest-DeleteServersAdminTestJSON-2144420331] Lock "7c6651a6-8cc4-4ac9-851c-637fafaf8705" released by "do_terminate_instance" :: held 8.440s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:29:30.925 DEBUG nova.compute.manager [req-f65183f1-f89d-40cc-8186-2cbc8ce07f97 tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] Checking state _get_power_state /opt/stack/new/nova/nova/compute/manager.py:1296 2015-08-07 17:29:32.513 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:29:32.514 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 4.01 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:29:33.317 DEBUG oslo_concurrency.lockutils [req-1d62a709-9d20-42a8-abb5-d359d1683b09 tempest-DeleteServersAdminTestJSON-513154376 tempest-DeleteServersAdminTestJSON-120490888] Lock "c87575af-66d8-459c-a310-79d46c0ace86" acquired by "_locked_do_build_and_run_instance" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:29:33.381 INFO nova.compute.manager [req-1d62a709-9d20-42a8-abb5-d359d1683b09 tempest-DeleteServersAdminTestJSON-513154376 
tempest-DeleteServersAdminTestJSON-120490888] [instance: c87575af-66d8-459c-a310-79d46c0ace86] Starting instance... 2015-08-07 17:29:33.430 INFO nova.compute.manager [req-0d9a51c0-e53b-45f5-b9d9-cf09a2fb2073 tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] Rebuilding instance 2015-08-07 17:29:33.673 DEBUG oslo_concurrency.lockutils [req-1d62a709-9d20-42a8-abb5-d359d1683b09 tempest-DeleteServersAdminTestJSON-513154376 tempest-DeleteServersAdminTestJSON-120490888] Lock "compute_resources" acquired by "instance_claim" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:29:33.674 DEBUG nova.compute.resource_tracker [req-1d62a709-9d20-42a8-abb5-d359d1683b09 tempest-DeleteServersAdminTestJSON-513154376 tempest-DeleteServersAdminTestJSON-120490888] Memory overhead for 64 MB instance; 5 MB instance_claim /opt/stack/new/nova/nova/compute/resource_tracker.py:126 2015-08-07 17:29:33.681 INFO nova.compute.claims [req-1d62a709-9d20-42a8-abb5-d359d1683b09 tempest-DeleteServersAdminTestJSON-513154376 tempest-DeleteServersAdminTestJSON-120490888] [instance: c87575af-66d8-459c-a310-79d46c0ace86] Attempting claim: memory 69 MB, disk 0 GB 2015-08-07 17:29:33.682 INFO nova.compute.claims [req-1d62a709-9d20-42a8-abb5-d359d1683b09 tempest-DeleteServersAdminTestJSON-513154376 tempest-DeleteServersAdminTestJSON-120490888] [instance: c87575af-66d8-459c-a310-79d46c0ace86] Total memory: 8187 MB, used: 581.00 MB 2015-08-07 17:29:33.683 INFO nova.compute.claims [req-1d62a709-9d20-42a8-abb5-d359d1683b09 tempest-DeleteServersAdminTestJSON-513154376 tempest-DeleteServersAdminTestJSON-120490888] [instance: c87575af-66d8-459c-a310-79d46c0ace86] memory limit: 12280.50 MB, free: 11699.50 MB 2015-08-07 17:29:33.683 INFO nova.compute.claims [req-1d62a709-9d20-42a8-abb5-d359d1683b09 tempest-DeleteServersAdminTestJSON-513154376 tempest-DeleteServersAdminTestJSON-120490888] [instance: c87575af-66d8-459c-a310-79d46c0ace86] Total disk: 27 GB, used: 0.00 GB 2015-08-07 17:29:33.683 INFO nova.compute.claims [req-1d62a709-9d20-42a8-abb5-d359d1683b09 tempest-DeleteServersAdminTestJSON-513154376 tempest-DeleteServersAdminTestJSON-120490888] [instance: c87575af-66d8-459c-a310-79d46c0ace86] disk limit not specified, defaulting to unlimited 2015-08-07 17:29:33.730 DEBUG nova.compute.resources.vcpu [req-1d62a709-9d20-42a8-abb5-d359d1683b09 tempest-DeleteServersAdminTestJSON-513154376 tempest-DeleteServersAdminTestJSON-120490888] Total CPUs: 8 VCPUs, used: 1.00 VCPUs test /opt/stack/new/nova/nova/compute/resources/vcpu.py:52 2015-08-07 17:29:33.731 DEBUG nova.compute.resources.vcpu [req-1d62a709-9d20-42a8-abb5-d359d1683b09 tempest-DeleteServersAdminTestJSON-513154376 tempest-DeleteServersAdminTestJSON-120490888] CPUs limit not specified, defaulting to unlimited test /opt/stack/new/nova/nova/compute/resources/vcpu.py:56 2015-08-07 17:29:33.732 INFO nova.compute.claims [req-1d62a709-9d20-42a8-abb5-d359d1683b09 tempest-DeleteServersAdminTestJSON-513154376 tempest-DeleteServersAdminTestJSON-120490888] [instance: c87575af-66d8-459c-a310-79d46c0ace86] Claim successful 2015-08-07 17:29:34.008 DEBUG nova.compute.manager [req-0d9a51c0-e53b-45f5-b9d9-cf09a2fb2073 tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] Checking state _get_power_state /opt/stack/new/nova/nova/compute/manager.py:1296 2015-08-07 17:29:34.258 INFO 
nova.virt.xenapi.vmops [req-0d9a51c0-e53b-45f5-b9d9-cf09a2fb2073 tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] Destroying VM 2015-08-07 17:29:34.294 DEBUG nova.virt.xenapi.vm_utils [req-0d9a51c0-e53b-45f5-b9d9-cf09a2fb2073 tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] Shutting down VM (hard) hard_shutdown_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:348 2015-08-07 17:29:34.307 DEBUG oslo_concurrency.lockutils [req-1d62a709-9d20-42a8-abb5-d359d1683b09 tempest-DeleteServersAdminTestJSON-513154376 tempest-DeleteServersAdminTestJSON-120490888] Lock "compute_resources" released by "instance_claim" :: held 0.634s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:29:35.036 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.81 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:29:35.106 DEBUG oslo_concurrency.lockutils [req-1d62a709-9d20-42a8-abb5-d359d1683b09 tempest-DeleteServersAdminTestJSON-513154376 tempest-DeleteServersAdminTestJSON-120490888] Lock "compute_resources" acquired by "update_usage" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:29:35.245 DEBUG oslo_concurrency.lockutils [req-1d62a709-9d20-42a8-abb5-d359d1683b09 tempest-DeleteServersAdminTestJSON-513154376 tempest-DeleteServersAdminTestJSON-120490888] Lock "compute_resources" released by "update_usage" :: held 0.139s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:29:35.246 DEBUG nova.compute.utils [req-1d62a709-9d20-42a8-abb5-d359d1683b09 tempest-DeleteServersAdminTestJSON-513154376 tempest-DeleteServersAdminTestJSON-120490888] Using /dev/xvd instead of None get_next_device_name /opt/stack/new/nova/nova/compute/utils.py:163 2015-08-07 17:29:35.250 13318 DEBUG nova.compute.manager [-] [instance: c87575af-66d8-459c-a310-79d46c0ace86] Allocating IP information in the background. 
_allocate_network_async /opt/stack/new/nova/nova/compute/manager.py:1513 2015-08-07 17:29:35.251 13318 DEBUG oslo_concurrency.lockutils [-] Acquired semaphore "refresh_cache-c87575af-66d8-459c-a310-79d46c0ace86" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:197 2015-08-07 17:29:35.879 DEBUG nova.virt.xenapi.vmops [req-1d62a709-9d20-42a8-abb5-d359d1683b09 tempest-DeleteServersAdminTestJSON-513154376 tempest-DeleteServersAdminTestJSON-120490888] [instance: c87575af-66d8-459c-a310-79d46c0ace86] Block device information present: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} spawn /opt/stack/new/nova/nova/virt/xenapi/vmops.py:408 2015-08-07 17:29:35.895 DEBUG nova.virt.xenapi.vm_utils [req-1d62a709-9d20-42a8-abb5-d359d1683b09 tempest-DeleteServersAdminTestJSON-513154376 tempest-DeleteServersAdminTestJSON-120490888] Detected vhd format for image a69d8d55-1745-492f-be26-cc75d64fc94d determine_disk_image_type /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1633 2015-08-07 17:29:35.896 DEBUG nova.virt.xenapi.vmops [req-1d62a709-9d20-42a8-abb5-d359d1683b09 tempest-DeleteServersAdminTestJSON-513154376 tempest-DeleteServersAdminTestJSON-120490888] [instance: c87575af-66d8-459c-a310-79d46c0ace86] Updating progress to 10 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:29:36.183 DEBUG nova.virt.xenapi.vmops [req-1d62a709-9d20-42a8-abb5-d359d1683b09 tempest-DeleteServersAdminTestJSON-513154376 tempest-DeleteServersAdminTestJSON-120490888] [instance: c87575af-66d8-459c-a310-79d46c0ace86] block device info: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} _connect_cinder_volumes /opt/stack/new/nova/nova/virt/xenapi/vmops.py:376 2015-08-07 17:29:36.208 DEBUG oslo_concurrency.lockutils [req-1d62a709-9d20-42a8-abb5-d359d1683b09 tempest-DeleteServersAdminTestJSON-513154376 tempest-DeleteServersAdminTestJSON-120490888] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" acquired by "_create_cached_image_impl" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:29:36.521 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:29:36.522 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 1.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:29:36.668 DEBUG oslo_concurrency.lockutils [req-40b9b2e7-2859-4bc3-8922-3efdd0516513 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] Lock "8f22b72a-a408-4796-8637-4dedc84a367a" acquired by "_locked_do_build_and_run_instance" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:29:36.755 INFO nova.compute.manager [req-40b9b2e7-2859-4bc3-8922-3efdd0516513 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] [instance: 8f22b72a-a408-4796-8637-4dedc84a367a] Starting instance... 
2015-08-07 17:29:37.157 DEBUG oslo_concurrency.lockutils [req-40b9b2e7-2859-4bc3-8922-3efdd0516513 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] Lock "compute_resources" acquired by "instance_claim" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:29:37.158 DEBUG nova.compute.resource_tracker [req-40b9b2e7-2859-4bc3-8922-3efdd0516513 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] Memory overhead for 64 MB instance; 5 MB instance_claim /opt/stack/new/nova/nova/compute/resource_tracker.py:126 2015-08-07 17:29:37.181 INFO nova.compute.claims [req-40b9b2e7-2859-4bc3-8922-3efdd0516513 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] [instance: 8f22b72a-a408-4796-8637-4dedc84a367a] Attempting claim: memory 69 MB, disk 0 GB 2015-08-07 17:29:37.181 INFO nova.compute.claims [req-40b9b2e7-2859-4bc3-8922-3efdd0516513 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] [instance: 8f22b72a-a408-4796-8637-4dedc84a367a] Total memory: 8187 MB, used: 650.00 MB 2015-08-07 17:29:37.182 INFO nova.compute.claims [req-40b9b2e7-2859-4bc3-8922-3efdd0516513 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] [instance: 8f22b72a-a408-4796-8637-4dedc84a367a] memory limit: 12280.50 MB, free: 11630.50 MB 2015-08-07 17:29:37.182 INFO nova.compute.claims [req-40b9b2e7-2859-4bc3-8922-3efdd0516513 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] [instance: 8f22b72a-a408-4796-8637-4dedc84a367a] Total disk: 27 GB, used: 0.00 GB 2015-08-07 17:29:37.183 INFO nova.compute.claims [req-40b9b2e7-2859-4bc3-8922-3efdd0516513 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] [instance: 8f22b72a-a408-4796-8637-4dedc84a367a] disk limit not specified, defaulting to unlimited 2015-08-07 17:29:37.227 DEBUG nova.compute.resources.vcpu [req-40b9b2e7-2859-4bc3-8922-3efdd0516513 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] Total CPUs: 8 VCPUs, used: 2.00 VCPUs test /opt/stack/new/nova/nova/compute/resources/vcpu.py:52 2015-08-07 17:29:37.228 DEBUG nova.compute.resources.vcpu [req-40b9b2e7-2859-4bc3-8922-3efdd0516513 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] CPUs limit not specified, defaulting to unlimited test /opt/stack/new/nova/nova/compute/resources/vcpu.py:56 2015-08-07 17:29:37.229 INFO nova.compute.claims [req-40b9b2e7-2859-4bc3-8922-3efdd0516513 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] [instance: 8f22b72a-a408-4796-8637-4dedc84a367a] Claim successful 2015-08-07 17:29:37.521 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:29:37.522 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /opt/stack/new/nova/nova/compute/manager.py:5976 2015-08-07 17:29:37.523 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 1.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:29:37.799 DEBUG oslo_concurrency.lockutils [req-40b9b2e7-2859-4bc3-8922-3efdd0516513 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] Lock "compute_resources" released by "instance_claim" :: held 0.642s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:29:38.029 DEBUG nova.virt.xenapi.vm_utils [req-1d62a709-9d20-42a8-abb5-d359d1683b09 tempest-DeleteServersAdminTestJSON-513154376 tempest-DeleteServersAdminTestJSON-120490888] Cloned VDI OpaqueRef:0dce079c-0348-25ac-d130-787a94880eaa from VDI OpaqueRef:f392c64e-7886-a0cc-d3a0-780674167307 _clone_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:571 2015-08-07 17:29:38.528 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:29:38.633 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 2.99 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:29:38.723 DEBUG oslo_concurrency.lockutils [req-40b9b2e7-2859-4bc3-8922-3efdd0516513 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] Lock "compute_resources" acquired by "update_usage" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:29:39.035 DEBUG oslo_concurrency.lockutils [req-40b9b2e7-2859-4bc3-8922-3efdd0516513 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] Lock "compute_resources" released by "update_usage" :: held 0.312s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:29:39.036 DEBUG nova.compute.utils [req-40b9b2e7-2859-4bc3-8922-3efdd0516513 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] Using /dev/xvd instead of None get_next_device_name /opt/stack/new/nova/nova/compute/utils.py:163 2015-08-07 17:29:39.040 13318 DEBUG nova.compute.manager [-] [instance: 8f22b72a-a408-4796-8637-4dedc84a367a] Allocating IP information in the background. 
_allocate_network_async /opt/stack/new/nova/nova/compute/manager.py:1513 2015-08-07 17:29:39.041 13318 DEBUG oslo_concurrency.lockutils [-] Acquired semaphore "refresh_cache-8f22b72a-a408-4796-8637-4dedc84a367a" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:197 2015-08-07 17:29:39.422 13318 DEBUG nova.network.base_api [-] [instance: c87575af-66d8-459c-a310-79d46c0ace86] Updating instance_info_cache with network_info: [VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.2'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:ca:fa:7e', 'active': False, 'type': u'bridge', 'id': u'36590f3d-3c8a-469d-97b3-e91130703d92', 'qbg_params': None})] update_instance_cache_with_nw_info /opt/stack/new/nova/nova/network/base_api.py:43 2015-08-07 17:29:39.488 13318 DEBUG oslo_concurrency.lockutils [-] Releasing semaphore "refresh_cache-c87575af-66d8-459c-a310-79d46c0ace86" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:210 2015-08-07 17:29:39.620 13318 DEBUG nova.compute.manager [-] [instance: c87575af-66d8-459c-a310-79d46c0ace86] Instance network_info: |[VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.2'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:ca:fa:7e', 'active': False, 'type': u'bridge', 'id': u'36590f3d-3c8a-469d-97b3-e91130703d92', 'qbg_params': None})]| _allocate_network_async /opt/stack/new/nova/nova/compute/manager.py:1531 2015-08-07 17:29:39.711 DEBUG nova.virt.xenapi.vmops [req-0d9a51c0-e53b-45f5-b9d9-cf09a2fb2073 tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] Destroying VDIs _destroy_vdis /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1428 2015-08-07 17:29:39.748 DEBUG nova.virt.xenapi.vm_utils 
[req-0d9a51c0-e53b-45f5-b9d9-cf09a2fb2073 tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] VDI 6c7ca38a-84b1-4005-afd9-b74f105d8e1b is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:29:39.812 DEBUG nova.virt.xenapi.vm_utils [req-0d9a51c0-e53b-45f5-b9d9-cf09a2fb2073 tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] VDI c208de1b-e7d1-4009-aeac-f4fe472c37bd is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:29:40.131 DEBUG nova.virt.xenapi.vmops [req-40b9b2e7-2859-4bc3-8922-3efdd0516513 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] [instance: 8f22b72a-a408-4796-8637-4dedc84a367a] Block device information present: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} spawn /opt/stack/new/nova/nova/virt/xenapi/vmops.py:408 2015-08-07 17:29:40.158 DEBUG nova.virt.xenapi.vm_utils [req-40b9b2e7-2859-4bc3-8922-3efdd0516513 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] Detected vhd format for image a69d8d55-1745-492f-be26-cc75d64fc94d determine_disk_image_type /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1633 2015-08-07 17:29:40.159 DEBUG nova.virt.xenapi.vmops [req-40b9b2e7-2859-4bc3-8922-3efdd0516513 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] [instance: 8f22b72a-a408-4796-8637-4dedc84a367a] Updating progress to 10 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:29:40.181 DEBUG oslo_concurrency.lockutils [req-1d62a709-9d20-42a8-abb5-d359d1683b09 tempest-DeleteServersAdminTestJSON-513154376 tempest-DeleteServersAdminTestJSON-120490888] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" released by "_create_cached_image_impl" :: held 3.973s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:29:40.182 INFO nova.virt.xenapi.vm_utils [req-1d62a709-9d20-42a8-abb5-d359d1683b09 tempest-DeleteServersAdminTestJSON-513154376 tempest-DeleteServersAdminTestJSON-120490888] Image creation data, cacheable: True, downloaded: False duration: 4.00 secs for image a69d8d55-1745-492f-be26-cc75d64fc94d 2015-08-07 17:29:40.452 DEBUG nova.virt.xenapi.vmops [req-40b9b2e7-2859-4bc3-8922-3efdd0516513 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] [instance: 8f22b72a-a408-4796-8637-4dedc84a367a] block device info: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} _connect_cinder_volumes /opt/stack/new/nova/nova/virt/xenapi/vmops.py:376 2015-08-07 17:29:40.469 DEBUG oslo_concurrency.lockutils [req-40b9b2e7-2859-4bc3-8922-3efdd0516513 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" acquired by "_create_cached_image_impl" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:29:40.937 DEBUG nova.virt.xenapi.vmops [req-0d9a51c0-e53b-45f5-b9d9-cf09a2fb2073 tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] Using RAW or VHD, skipping kernel and ramdisk deletion _destroy_kernel_ramdisk 
/opt/stack/new/nova/nova/virt/xenapi/vmops.py:1456 2015-08-07 17:29:40.953 DEBUG nova.virt.xenapi.vm_utils [req-0d9a51c0-e53b-45f5-b9d9-cf09a2fb2073 tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] VM destroyed destroy_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:324 2015-08-07 17:29:41.299 DEBUG nova.virt.xenapi.vmops [req-1d62a709-9d20-42a8-abb5-d359d1683b09 tempest-DeleteServersAdminTestJSON-513154376 tempest-DeleteServersAdminTestJSON-120490888] [instance: c87575af-66d8-459c-a310-79d46c0ace86] Updating progress to 20 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:29:41.621 DEBUG nova.virt.xenapi.vmops [req-1d62a709-9d20-42a8-abb5-d359d1683b09 tempest-DeleteServersAdminTestJSON-513154376 tempest-DeleteServersAdminTestJSON-120490888] [instance: c87575af-66d8-459c-a310-79d46c0ace86] Updating progress to 30 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:29:41.629 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:29:41.658 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Auditing locally available compute resources for node localhost.localdomain 2015-08-07 17:29:41.659 DEBUG nova.virt.xenapi.host [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Updating host stats update_status /opt/stack/new/nova/nova/virt/xenapi/host.py:234 2015-08-07 17:29:41.753 DEBUG nova.virt.xenapi.vmops [req-0d9a51c0-e53b-45f5-b9d9-cf09a2fb2073 tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] Block device information present: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} spawn /opt/stack/new/nova/nova/virt/xenapi/vmops.py:408 2015-08-07 17:29:41.781 DEBUG nova.virt.xenapi.vm_utils [req-0d9a51c0-e53b-45f5-b9d9-cf09a2fb2073 tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Detected vhd format for image a69d8d55-1745-492f-be26-cc75d64fc94d determine_disk_image_type /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1633 2015-08-07 17:29:41.782 DEBUG nova.virt.xenapi.vmops [req-0d9a51c0-e53b-45f5-b9d9-cf09a2fb2073 tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] Updating progress to 10 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:29:41.947 DEBUG nova.virt.xenapi.vmops [req-1d62a709-9d20-42a8-abb5-d359d1683b09 tempest-DeleteServersAdminTestJSON-513154376 tempest-DeleteServersAdminTestJSON-120490888] [instance: c87575af-66d8-459c-a310-79d46c0ace86] Using PV kernel: True _create_vm_record /opt/stack/new/nova/nova/virt/xenapi/vmops.py:679 2015-08-07 17:29:41.962 DEBUG nova.virt.xenapi.vm_utils [req-1d62a709-9d20-42a8-abb5-d359d1683b09 tempest-DeleteServersAdminTestJSON-513154376 tempest-DeleteServersAdminTestJSON-120490888] [instance: c87575af-66d8-459c-a310-79d46c0ace86] Created VM create_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:312 2015-08-07 17:29:41.963 DEBUG nova.virt.xenapi.vmops [req-1d62a709-9d20-42a8-abb5-d359d1683b09 tempest-DeleteServersAdminTestJSON-513154376 
tempest-DeleteServersAdminTestJSON-120490888] [instance: c87575af-66d8-459c-a310-79d46c0ace86] Updating progress to 40 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:29:41.985 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" acquired by "do_scan" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:29:41.986 DEBUG nova.virt.xenapi.vm_utils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Scanning SR OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99 do_scan /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1839 2015-08-07 17:29:42.062 DEBUG nova.virt.xenapi.vmops [req-0d9a51c0-e53b-45f5-b9d9-cf09a2fb2073 tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] block device info: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} _connect_cinder_volumes /opt/stack/new/nova/nova/virt/xenapi/vmops.py:376 2015-08-07 17:29:42.318 DEBUG nova.virt.xenapi.vm_utils [req-1d62a709-9d20-42a8-abb5-d359d1683b09 tempest-DeleteServersAdminTestJSON-513154376 tempest-DeleteServersAdminTestJSON-120490888] Creating disk-type VBD for VM OpaqueRef:00ee5929-3890-2beb-178d-1cafac99967f, VDI OpaqueRef:0dce079c-0348-25ac-d130-787a94880eaa ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:29:42.332 DEBUG nova.virt.xenapi.vm_utils [req-1d62a709-9d20-42a8-abb5-d359d1683b09 tempest-DeleteServersAdminTestJSON-513154376 tempest-DeleteServersAdminTestJSON-120490888] Created VBD OpaqueRef:4bff4f2f-0cfc-d876-7ee1-db6a39f8eaec for VM OpaqueRef:00ee5929-3890-2beb-178d-1cafac99967f, VDI OpaqueRef:0dce079c-0348-25ac-d130-787a94880eaa. 
create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:29:42.478 13318 DEBUG nova.network.base_api [-] [instance: 8f22b72a-a408-4796-8637-4dedc84a367a] Updating instance_info_cache with network_info: [VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.4'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:6b:dc:61', 'active': False, 'type': u'bridge', 'id': u'5ca5375c-0fad-45ff-8eb7-5fbb9bf8ba34', 'qbg_params': None})] update_instance_cache_with_nw_info /opt/stack/new/nova/nova/network/base_api.py:43 2015-08-07 17:29:42.511 13318 DEBUG oslo_concurrency.lockutils [-] Releasing semaphore "refresh_cache-8f22b72a-a408-4796-8637-4dedc84a367a" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:210 2015-08-07 17:29:42.511 13318 DEBUG nova.compute.manager [-] [instance: 8f22b72a-a408-4796-8637-4dedc84a367a] Instance network_info: |[VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.4'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:6b:dc:61', 'active': False, 'type': u'bridge', 'id': u'5ca5375c-0fad-45ff-8eb7-5fbb9bf8ba34', 'qbg_params': None})]| _allocate_network_async /opt/stack/new/nova/nova/compute/manager.py:1531 2015-08-07 17:29:42.599 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" released by "do_scan" :: held 0.614s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:29:42.833 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor: free VCPUs: 2 _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:524 2015-08-07 17:29:42.834 DEBUG nova.compute.resource_tracker 
[req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor: assignable PCI devices: [] _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:532 2015-08-07 17:29:42.834 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor/Node resource view: name=localhost.localdomain free_ram=989MB free_disk=16GB free_vcpus=2 pci_devices=[] _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:546 2015-08-07 17:29:42.835 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "compute_resources" acquired by "_update_available_resource" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:29:43.121 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Total usable vcpus: 8, total allocated vcpus: 6 2015-08-07 17:29:43.121 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Final resource view: name=localhost.localdomain phys_ram=8187MB used_ram=719MB phys_disk=27GB used_disk=0GB total_vcpus=8 used_vcpus=6 pci_stats=None 2015-08-07 17:29:43.208 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Compute_service record updated for devstack:localhost.localdomain 2015-08-07 17:29:43.208 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "compute_resources" released by "_update_available_resource" :: held 0.374s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:29:43.209 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 4.89 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:29:44.942 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.90 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:29:48.101 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:29:48.102 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 40.42 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:29:53.563 DEBUG nova.virt.xenapi.vm_utils [req-40b9b2e7-2859-4bc3-8922-3efdd0516513 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] Cloned VDI OpaqueRef:8d5b5439-e2f5-b27c-2777-00a6575f547f from VDI OpaqueRef:f392c64e-7886-a0cc-d3a0-780674167307 _clone_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:571 2015-08-07 17:29:54.256 DEBUG oslo_concurrency.lockutils [req-40b9b2e7-2859-4bc3-8922-3efdd0516513 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" released by "_create_cached_image_impl" :: held 13.787s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:29:54.257 INFO nova.virt.xenapi.vm_utils [req-40b9b2e7-2859-4bc3-8922-3efdd0516513 tempest-ImagesOneServerNegativeTestJSON-221560100 
tempest-ImagesOneServerNegativeTestJSON-346190170] Image creation data, cacheable: True, downloaded: False duration: 13.80 secs for image a69d8d55-1745-492f-be26-cc75d64fc94d 2015-08-07 17:29:54.257 DEBUG oslo_concurrency.lockutils [req-0d9a51c0-e53b-45f5-b9d9-cf09a2fb2073 tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" acquired by "_create_cached_image_impl" :: waited 12.176s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:29:54.918 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.93 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:29:55.491 DEBUG nova.virt.xenapi.vm_utils [req-0d9a51c0-e53b-45f5-b9d9-cf09a2fb2073 tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Cloned VDI OpaqueRef:849cd50a-608f-511b-fecc-b50725e4abeb from VDI OpaqueRef:f392c64e-7886-a0cc-d3a0-780674167307 _clone_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:571 2015-08-07 17:29:55.608 DEBUG nova.virt.xenapi.vmops [req-40b9b2e7-2859-4bc3-8922-3efdd0516513 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] [instance: 8f22b72a-a408-4796-8637-4dedc84a367a] Updating progress to 20 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:29:55.899 DEBUG nova.virt.xenapi.vmops [req-40b9b2e7-2859-4bc3-8922-3efdd0516513 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] [instance: 8f22b72a-a408-4796-8637-4dedc84a367a] Updating progress to 30 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:29:56.134 DEBUG nova.virt.xenapi.vmops [req-40b9b2e7-2859-4bc3-8922-3efdd0516513 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] [instance: 8f22b72a-a408-4796-8637-4dedc84a367a] Using PV kernel: True _create_vm_record /opt/stack/new/nova/nova/virt/xenapi/vmops.py:679 2015-08-07 17:29:56.147 DEBUG nova.virt.xenapi.vm_utils [req-40b9b2e7-2859-4bc3-8922-3efdd0516513 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] [instance: 8f22b72a-a408-4796-8637-4dedc84a367a] Created VM create_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:312 2015-08-07 17:29:56.148 DEBUG nova.virt.xenapi.vmops [req-40b9b2e7-2859-4bc3-8922-3efdd0516513 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] [instance: 8f22b72a-a408-4796-8637-4dedc84a367a] Updating progress to 40 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:29:56.244 DEBUG oslo_concurrency.lockutils [req-0d9a51c0-e53b-45f5-b9d9-cf09a2fb2073 tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" released by "_create_cached_image_impl" :: held 1.987s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:29:56.245 INFO nova.virt.xenapi.vm_utils [req-0d9a51c0-e53b-45f5-b9d9-cf09a2fb2073 tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Image creation data, cacheable: True, downloaded: False duration: 14.18 secs for image a69d8d55-1745-492f-be26-cc75d64fc94d 2015-08-07 17:29:56.385 DEBUG 
nova.virt.xenapi.vm_utils [req-40b9b2e7-2859-4bc3-8922-3efdd0516513 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] Creating disk-type VBD for VM OpaqueRef:2b0db404-c434-9191-1bbc-d79a19320b28, VDI OpaqueRef:8d5b5439-e2f5-b27c-2777-00a6575f547f ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:29:56.396 DEBUG nova.virt.xenapi.vm_utils [req-40b9b2e7-2859-4bc3-8922-3efdd0516513 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] Created VBD OpaqueRef:42b9cffa-343e-3dd5-0027-6af388ec03c3 for VM OpaqueRef:2b0db404-c434-9191-1bbc-d79a19320b28, VDI OpaqueRef:8d5b5439-e2f5-b27c-2777-00a6575f547f. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:29:56.798 DEBUG nova.virt.xenapi.vm_utils [req-40b9b2e7-2859-4bc3-8922-3efdd0516513 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] Created VDI OpaqueRef:6d7afc04-b931-9b6e-e232-2c83032518d1 (config-2, 67108864, False) on OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99. create_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:509 2015-08-07 17:29:56.802 DEBUG nova.virt.xenapi.vm_utils [req-40b9b2e7-2859-4bc3-8922-3efdd0516513 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] Creating disk-type VBD for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:6d7afc04-b931-9b6e-e232-2c83032518d1 ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:29:56.813 DEBUG nova.virt.xenapi.vm_utils [req-40b9b2e7-2859-4bc3-8922-3efdd0516513 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] Created VBD OpaqueRef:b1b0ad1a-76a3-ac4f-ad8e-8aa050afbc3f for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:6d7afc04-b931-9b6e-e232-2c83032518d1. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:29:56.814 DEBUG nova.virt.xenapi.vm_utils [req-40b9b2e7-2859-4bc3-8922-3efdd0516513 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] Plugging VBD OpaqueRef:b1b0ad1a-76a3-ac4f-ad8e-8aa050afbc3f ... 
vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2191 2015-08-07 17:29:56.815 DEBUG oslo_concurrency.lockutils [req-40b9b2e7-2859-4bc3-8922-3efdd0516513 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_plug" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:29:57.006 DEBUG nova.virt.xenapi.vmops [req-0d9a51c0-e53b-45f5-b9d9-cf09a2fb2073 tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] Updating progress to 20 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:29:57.220 DEBUG nova.virt.xenapi.vmops [req-0d9a51c0-e53b-45f5-b9d9-cf09a2fb2073 tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] Updating progress to 30 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:29:57.437 DEBUG nova.virt.xenapi.vmops [req-0d9a51c0-e53b-45f5-b9d9-cf09a2fb2073 tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] Using PV kernel: True _create_vm_record /opt/stack/new/nova/nova/virt/xenapi/vmops.py:679 2015-08-07 17:29:57.451 DEBUG nova.virt.xenapi.vm_utils [req-0d9a51c0-e53b-45f5-b9d9-cf09a2fb2073 tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] Created VM create_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:312 2015-08-07 17:29:57.451 DEBUG nova.virt.xenapi.vmops [req-0d9a51c0-e53b-45f5-b9d9-cf09a2fb2073 tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] Updating progress to 40 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:29:57.936 DEBUG nova.virt.xenapi.vm_utils [req-0d9a51c0-e53b-45f5-b9d9-cf09a2fb2073 tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Creating disk-type VBD for VM OpaqueRef:5ecdef34-da4a-36cc-1f29-bf95b8447516, VDI OpaqueRef:849cd50a-608f-511b-fecc-b50725e4abeb ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:29:57.945 DEBUG nova.virt.xenapi.vm_utils [req-0d9a51c0-e53b-45f5-b9d9-cf09a2fb2073 tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Created VBD OpaqueRef:60f82a30-6632-d159-a2e8-b41028f6bef5 for VM OpaqueRef:5ecdef34-da4a-36cc-1f29-bf95b8447516, VDI OpaqueRef:849cd50a-608f-511b-fecc-b50725e4abeb. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:29:58.251 DEBUG oslo_concurrency.lockutils [req-40b9b2e7-2859-4bc3-8922-3efdd0516513 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_plug" :: held 1.436s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:29:58.252 DEBUG nova.virt.xenapi.vm_utils [req-40b9b2e7-2859-4bc3-8922-3efdd0516513 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] Plugging VBD OpaqueRef:b1b0ad1a-76a3-ac4f-ad8e-8aa050afbc3f done. 
vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2194 2015-08-07 17:29:58.256 DEBUG nova.virt.xenapi.vm_utils [req-40b9b2e7-2859-4bc3-8922-3efdd0516513 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] VBD OpaqueRef:b1b0ad1a-76a3-ac4f-ad8e-8aa050afbc3f plugged as xvdc vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2197 2015-08-07 17:29:58.344 WARNING nova.virt.configdrive [req-40b9b2e7-2859-4bc3-8922-3efdd0516513 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] The setting "always" will be deprecated in the Liberty version. Please use "True" instead 2015-08-07 17:29:58.345 DEBUG nova.objects.instance [req-40b9b2e7-2859-4bc3-8922-3efdd0516513 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] Lazy-loading `ec2_ids' on Instance uuid 8f22b72a-a408-4796-8637-4dedc84a367a obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:29:58.375 DEBUG nova.virt.xenapi.vm_utils [req-0d9a51c0-e53b-45f5-b9d9-cf09a2fb2073 tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Created VDI OpaqueRef:781793ba-5879-7738-c338-3cc127ed641d (config-2, 67108864, False) on OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99. create_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:509 2015-08-07 17:29:58.384 DEBUG oslo_concurrency.processutils [req-40b9b2e7-2859-4bc3-8922-3efdd0516513 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] Running cmd (subprocess): genisoimage -o /tmp/tmpR9quJo/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmpmsr4S0 execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:29:58.465 DEBUG nova.virt.xenapi.vm_utils [req-0d9a51c0-e53b-45f5-b9d9-cf09a2fb2073 tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Creating disk-type VBD for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:781793ba-5879-7738-c338-3cc127ed641d ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:29:58.482 DEBUG nova.virt.xenapi.vm_utils [req-0d9a51c0-e53b-45f5-b9d9-cf09a2fb2073 tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Created VBD OpaqueRef:ed59eccd-5c02-a782-257a-e2d0f4e5cb9b for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:781793ba-5879-7738-c338-3cc127ed641d. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:29:58.482 DEBUG nova.virt.xenapi.vm_utils [req-0d9a51c0-e53b-45f5-b9d9-cf09a2fb2073 tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Plugging VBD OpaqueRef:ed59eccd-5c02-a782-257a-e2d0f4e5cb9b ... 
vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2191 2015-08-07 17:29:58.484 DEBUG oslo_concurrency.lockutils [req-0d9a51c0-e53b-45f5-b9d9-cf09a2fb2073 tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_plug" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:29:58.486 DEBUG oslo_concurrency.processutils [req-40b9b2e7-2859-4bc3-8922-3efdd0516513 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] CMD "genisoimage -o /tmp/tmpR9quJo/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmpmsr4S0" returned: 0 in 0.102s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:29:58.490 DEBUG oslo_concurrency.processutils [req-40b9b2e7-2859-4bc3-8922-3efdd0516513 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmpR9quJo/configdrive of=/dev/xvdc oflag=direct,sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:30:00.280 DEBUG oslo_concurrency.lockutils [req-0d9a51c0-e53b-45f5-b9d9-cf09a2fb2073 tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_plug" :: held 1.796s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:30:00.283 DEBUG nova.virt.xenapi.vm_utils [req-0d9a51c0-e53b-45f5-b9d9-cf09a2fb2073 tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Plugging VBD OpaqueRef:ed59eccd-5c02-a782-257a-e2d0f4e5cb9b done. vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2194 2015-08-07 17:30:00.287 DEBUG nova.virt.xenapi.vm_utils [req-0d9a51c0-e53b-45f5-b9d9-cf09a2fb2073 tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] VBD OpaqueRef:ed59eccd-5c02-a782-257a-e2d0f4e5cb9b plugged as xvdd vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2197 2015-08-07 17:30:00.401 WARNING nova.virt.configdrive [req-0d9a51c0-e53b-45f5-b9d9-cf09a2fb2073 tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] The setting "always" will be deprecated in the Liberty version. 
Please use "True" instead 2015-08-07 17:30:00.403 DEBUG nova.objects.instance [req-0d9a51c0-e53b-45f5-b9d9-cf09a2fb2073 tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Lazy-loading `ec2_ids' on Instance uuid 8e5ee78c-20e2-4483-ab75-b109bb2fdca6 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:30:00.446 DEBUG oslo_concurrency.processutils [req-0d9a51c0-e53b-45f5-b9d9-cf09a2fb2073 tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Running cmd (subprocess): genisoimage -o /tmp/tmpBVEH3M/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmp093hKX execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:30:00.548 DEBUG oslo_concurrency.processutils [req-0d9a51c0-e53b-45f5-b9d9-cf09a2fb2073 tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] CMD "genisoimage -o /tmp/tmpBVEH3M/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmp093hKX" returned: 0 in 0.102s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:30:00.560 DEBUG oslo_concurrency.processutils [req-0d9a51c0-e53b-45f5-b9d9-cf09a2fb2073 tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmpBVEH3M/configdrive of=/dev/xvdd oflag=direct,sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:30:02.940 DEBUG nova.virt.xenapi.vm_utils [req-1d62a709-9d20-42a8-abb5-d359d1683b09 tempest-DeleteServersAdminTestJSON-513154376 tempest-DeleteServersAdminTestJSON-120490888] Created VDI OpaqueRef:8e8706f7-0804-b867-ca8a-07b185b0cf41 (config-2, 67108864, False) on OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99. create_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:509 2015-08-07 17:30:02.946 DEBUG nova.virt.xenapi.vm_utils [req-1d62a709-9d20-42a8-abb5-d359d1683b09 tempest-DeleteServersAdminTestJSON-513154376 tempest-DeleteServersAdminTestJSON-120490888] Creating disk-type VBD for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:8e8706f7-0804-b867-ca8a-07b185b0cf41 ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:30:02.961 DEBUG nova.virt.xenapi.vm_utils [req-1d62a709-9d20-42a8-abb5-d359d1683b09 tempest-DeleteServersAdminTestJSON-513154376 tempest-DeleteServersAdminTestJSON-120490888] Created VBD OpaqueRef:35811962-53ce-c79a-fa2c-5358620bac8a for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:8e8706f7-0804-b867-ca8a-07b185b0cf41. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:30:02.963 DEBUG nova.virt.xenapi.vm_utils [req-1d62a709-9d20-42a8-abb5-d359d1683b09 tempest-DeleteServersAdminTestJSON-513154376 tempest-DeleteServersAdminTestJSON-120490888] Plugging VBD OpaqueRef:35811962-53ce-c79a-fa2c-5358620bac8a ... 
vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2191 2015-08-07 17:30:03.081 DEBUG oslo_concurrency.lockutils [req-1d62a709-9d20-42a8-abb5-d359d1683b09 tempest-DeleteServersAdminTestJSON-513154376 tempest-DeleteServersAdminTestJSON-120490888] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_plug" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:30:04.951 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.90 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:30:05.749 DEBUG oslo_concurrency.lockutils [req-1d62a709-9d20-42a8-abb5-d359d1683b09 tempest-DeleteServersAdminTestJSON-513154376 tempest-DeleteServersAdminTestJSON-120490888] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_plug" :: held 2.668s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:30:05.750 DEBUG nova.virt.xenapi.vm_utils [req-1d62a709-9d20-42a8-abb5-d359d1683b09 tempest-DeleteServersAdminTestJSON-513154376 tempest-DeleteServersAdminTestJSON-120490888] Plugging VBD OpaqueRef:35811962-53ce-c79a-fa2c-5358620bac8a done. vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2194 2015-08-07 17:30:05.754 DEBUG nova.virt.xenapi.vm_utils [req-1d62a709-9d20-42a8-abb5-d359d1683b09 tempest-DeleteServersAdminTestJSON-513154376 tempest-DeleteServersAdminTestJSON-120490888] VBD OpaqueRef:35811962-53ce-c79a-fa2c-5358620bac8a plugged as xvde vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2197 2015-08-07 17:30:05.863 WARNING nova.virt.configdrive [req-1d62a709-9d20-42a8-abb5-d359d1683b09 tempest-DeleteServersAdminTestJSON-513154376 tempest-DeleteServersAdminTestJSON-120490888] The setting "always" will be deprecated in the Liberty version. 
Please use "True" instead 2015-08-07 17:30:05.864 DEBUG nova.objects.instance [req-1d62a709-9d20-42a8-abb5-d359d1683b09 tempest-DeleteServersAdminTestJSON-513154376 tempest-DeleteServersAdminTestJSON-120490888] Lazy-loading `ec2_ids' on Instance uuid c87575af-66d8-459c-a310-79d46c0ace86 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:30:05.910 DEBUG oslo_concurrency.processutils [req-1d62a709-9d20-42a8-abb5-d359d1683b09 tempest-DeleteServersAdminTestJSON-513154376 tempest-DeleteServersAdminTestJSON-120490888] Running cmd (subprocess): genisoimage -o /tmp/tmp0Y6Xnz/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmp8bmlOX execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:30:06.041 DEBUG oslo_concurrency.processutils [req-1d62a709-9d20-42a8-abb5-d359d1683b09 tempest-DeleteServersAdminTestJSON-513154376 tempest-DeleteServersAdminTestJSON-120490888] CMD "genisoimage -o /tmp/tmp0Y6Xnz/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmp8bmlOX" returned: 0 in 0.131s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:30:06.048 DEBUG oslo_concurrency.processutils [req-1d62a709-9d20-42a8-abb5-d359d1683b09 tempest-DeleteServersAdminTestJSON-513154376 tempest-DeleteServersAdminTestJSON-120490888] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmp0Y6Xnz/configdrive of=/dev/xvde oflag=direct,sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:30:11.477 DEBUG oslo_concurrency.processutils [req-40b9b2e7-2859-4bc3-8922-3efdd0516513 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmpR9quJo/configdrive of=/dev/xvdc oflag=direct,sync" returned: 0 in 12.987s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:30:11.482 DEBUG oslo_concurrency.processutils [req-40b9b2e7-2859-4bc3-8922-3efdd0516513 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:30:12.559 DEBUG oslo_concurrency.processutils [req-40b9b2e7-2859-4bc3-8922-3efdd0516513 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf sync" returned: 0 in 1.076s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:30:12.562 DEBUG nova.virt.xenapi.vm_utils [req-40b9b2e7-2859-4bc3-8922-3efdd0516513 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] Destroying VBD for VDI OpaqueRef:6d7afc04-b931-9b6e-e232-2c83032518d1 ... 
vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2207 2015-08-07 17:30:12.563 DEBUG oslo_concurrency.lockutils [req-40b9b2e7-2859-4bc3-8922-3efdd0516513 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_unplug" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:30:14.656 DEBUG oslo_concurrency.lockutils [req-40b9b2e7-2859-4bc3-8922-3efdd0516513 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_unplug" :: held 2.092s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:30:14.668 DEBUG nova.virt.xenapi.vm_utils [req-40b9b2e7-2859-4bc3-8922-3efdd0516513 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] Destroying VBD for VDI OpaqueRef:6d7afc04-b931-9b6e-e232-2c83032518d1 done. vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2215 2015-08-07 17:30:14.669 DEBUG nova.virt.xenapi.vm_utils [req-40b9b2e7-2859-4bc3-8922-3efdd0516513 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] Creating disk-type VBD for VM OpaqueRef:2b0db404-c434-9191-1bbc-d79a19320b28, VDI OpaqueRef:6d7afc04-b931-9b6e-e232-2c83032518d1 ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:30:14.682 DEBUG nova.virt.xenapi.vm_utils [req-40b9b2e7-2859-4bc3-8922-3efdd0516513 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] Created VBD OpaqueRef:7c3c059f-1a53-0e91-fac3-6382f6e48980 for VM OpaqueRef:2b0db404-c434-9191-1bbc-d79a19320b28, VDI OpaqueRef:6d7afc04-b931-9b6e-e232-2c83032518d1. 
create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:30:14.683 DEBUG nova.objects.instance [req-40b9b2e7-2859-4bc3-8922-3efdd0516513 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] Lazy-loading `pci_devices' on Instance uuid 8f22b72a-a408-4796-8637-4dedc84a367a obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:30:14.815 DEBUG nova.virt.xenapi.vmops [req-40b9b2e7-2859-4bc3-8922-3efdd0516513 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] [instance: 8f22b72a-a408-4796-8637-4dedc84a367a] Updating progress to 50 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:30:14.939 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.91 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:30:15.193 DEBUG oslo_concurrency.lockutils [req-40b9b2e7-2859-4bc3-8922-3efdd0516513 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] Lock "xenstore-8f22b72a-a408-4796-8637-4dedc84a367a" acquired by "store_meta" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:30:15.194 DEBUG oslo_concurrency.lockutils [req-40b9b2e7-2859-4bc3-8922-3efdd0516513 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] Lock "xenstore-8f22b72a-a408-4796-8637-4dedc84a367a" released by "store_meta" :: held 0.001s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:30:15.194 DEBUG oslo_concurrency.lockutils [req-40b9b2e7-2859-4bc3-8922-3efdd0516513 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] Lock "xenstore-8f22b72a-a408-4796-8637-4dedc84a367a" acquired by "store_auto_disk_config" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:30:15.206 DEBUG oslo_concurrency.lockutils [req-40b9b2e7-2859-4bc3-8922-3efdd0516513 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] Lock "xenstore-8f22b72a-a408-4796-8637-4dedc84a367a" released by "store_auto_disk_config" :: held 0.012s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:30:15.207 DEBUG nova.virt.xenapi.vmops [req-40b9b2e7-2859-4bc3-8922-3efdd0516513 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] [instance: 8f22b72a-a408-4796-8637-4dedc84a367a] Injecting hostname (tempest.common.compute-instance-429943474) into xenstore _inject_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1940 2015-08-07 17:30:15.208 DEBUG oslo_concurrency.lockutils [req-40b9b2e7-2859-4bc3-8922-3efdd0516513 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] Lock "xenstore-8f22b72a-a408-4796-8637-4dedc84a367a" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:30:15.218 DEBUG oslo_concurrency.lockutils [req-40b9b2e7-2859-4bc3-8922-3efdd0516513 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] Lock "xenstore-8f22b72a-a408-4796-8637-4dedc84a367a" released by "update_hostname" :: held 0.010s inner 
/usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:30:15.219 DEBUG nova.virt.xenapi.vmops [req-40b9b2e7-2859-4bc3-8922-3efdd0516513 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] [instance: 8f22b72a-a408-4796-8637-4dedc84a367a] Injecting network info to xenstore inject_network_info /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1869 2015-08-07 17:30:15.220 DEBUG oslo_concurrency.lockutils [req-40b9b2e7-2859-4bc3-8922-3efdd0516513 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] Lock "xenstore-8f22b72a-a408-4796-8637-4dedc84a367a" acquired by "update_nwinfo" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:30:15.423 DEBUG oslo_concurrency.lockutils [req-40b9b2e7-2859-4bc3-8922-3efdd0516513 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] Lock "xenstore-8f22b72a-a408-4796-8637-4dedc84a367a" released by "update_nwinfo" :: held 0.203s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:30:15.424 DEBUG nova.virt.xenapi.vmops [req-40b9b2e7-2859-4bc3-8922-3efdd0516513 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] [instance: 8f22b72a-a408-4796-8637-4dedc84a367a] Updating progress to 60 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:30:15.745 DEBUG nova.virt.xenapi.vmops [req-40b9b2e7-2859-4bc3-8922-3efdd0516513 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] [instance: 8f22b72a-a408-4796-8637-4dedc84a367a] Creating vifs _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1892 2015-08-07 17:30:15.754 DEBUG nova.virt.xenapi.vmops [req-40b9b2e7-2859-4bc3-8922-3efdd0516513 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] [instance: 8f22b72a-a408-4796-8637-4dedc84a367a] Creating VIF for network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1902 2015-08-07 17:30:15.762 DEBUG nova.virt.xenapi.vmops [req-40b9b2e7-2859-4bc3-8922-3efdd0516513 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] [instance: 8f22b72a-a408-4796-8637-4dedc84a367a] Created VIF OpaqueRef:8808e896-bb17-d415-f9fe-96d554a7e73c, network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1906 2015-08-07 17:30:15.763 DEBUG nova.virt.xenapi.vmops [req-40b9b2e7-2859-4bc3-8922-3efdd0516513 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] [instance: 8f22b72a-a408-4796-8637-4dedc84a367a] Updating progress to 70 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:30:16.090 DEBUG nova.virt.xenapi.vmops [req-40b9b2e7-2859-4bc3-8922-3efdd0516513 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] [instance: 8f22b72a-a408-4796-8637-4dedc84a367a] Starting instance _start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:327 2015-08-07 17:30:17.798 DEBUG oslo_concurrency.processutils [req-0d9a51c0-e53b-45f5-b9d9-cf09a2fb2073 tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf 
dd if=/tmp/tmpBVEH3M/configdrive of=/dev/xvdd oflag=direct,sync" returned: 0 in 17.239s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:30:17.800 DEBUG oslo_concurrency.processutils [req-0d9a51c0-e53b-45f5-b9d9-cf09a2fb2073 tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:30:18.248 DEBUG oslo_concurrency.processutils [req-0d9a51c0-e53b-45f5-b9d9-cf09a2fb2073 tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf sync" returned: 0 in 0.448s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:30:18.253 DEBUG nova.virt.xenapi.vm_utils [req-0d9a51c0-e53b-45f5-b9d9-cf09a2fb2073 tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Destroying VBD for VDI OpaqueRef:781793ba-5879-7738-c338-3cc127ed641d ... vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2207 2015-08-07 17:30:18.255 DEBUG oslo_concurrency.lockutils [req-0d9a51c0-e53b-45f5-b9d9-cf09a2fb2073 tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_unplug" :: waited 0.001s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:30:20.726 DEBUG oslo_concurrency.processutils [req-1d62a709-9d20-42a8-abb5-d359d1683b09 tempest-DeleteServersAdminTestJSON-513154376 tempest-DeleteServersAdminTestJSON-120490888] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmp0Y6Xnz/configdrive of=/dev/xvde oflag=direct,sync" returned: 0 in 14.678s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:30:20.728 DEBUG oslo_concurrency.processutils [req-1d62a709-9d20-42a8-abb5-d359d1683b09 tempest-DeleteServersAdminTestJSON-513154376 tempest-DeleteServersAdminTestJSON-120490888] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:30:20.837 DEBUG oslo_concurrency.lockutils [req-0d9a51c0-e53b-45f5-b9d9-cf09a2fb2073 tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_unplug" :: held 2.582s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:30:20.849 DEBUG nova.virt.xenapi.vm_utils [req-0d9a51c0-e53b-45f5-b9d9-cf09a2fb2073 tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Destroying VBD for VDI OpaqueRef:781793ba-5879-7738-c338-3cc127ed641d done. vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2215 2015-08-07 17:30:20.850 DEBUG nova.virt.xenapi.vm_utils [req-0d9a51c0-e53b-45f5-b9d9-cf09a2fb2073 tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Creating disk-type VBD for VM OpaqueRef:5ecdef34-da4a-36cc-1f29-bf95b8447516, VDI OpaqueRef:781793ba-5879-7738-c338-3cc127ed641d ... 
create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:30:20.861 DEBUG nova.virt.xenapi.vm_utils [req-0d9a51c0-e53b-45f5-b9d9-cf09a2fb2073 tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Created VBD OpaqueRef:c7349e42-08ae-8aaf-bb31-7ed74e52fdaf for VM OpaqueRef:5ecdef34-da4a-36cc-1f29-bf95b8447516, VDI OpaqueRef:781793ba-5879-7738-c338-3cc127ed641d. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:30:20.863 DEBUG nova.objects.instance [req-0d9a51c0-e53b-45f5-b9d9-cf09a2fb2073 tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Lazy-loading `pci_devices' on Instance uuid 8e5ee78c-20e2-4483-ab75-b109bb2fdca6 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:30:20.991 DEBUG nova.virt.xenapi.vmops [req-0d9a51c0-e53b-45f5-b9d9-cf09a2fb2073 tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] Updating progress to 50 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:30:21.157 DEBUG oslo_concurrency.processutils [req-1d62a709-9d20-42a8-abb5-d359d1683b09 tempest-DeleteServersAdminTestJSON-513154376 tempest-DeleteServersAdminTestJSON-120490888] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf sync" returned: 0 in 0.429s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:30:21.158 DEBUG nova.virt.xenapi.vm_utils [req-1d62a709-9d20-42a8-abb5-d359d1683b09 tempest-DeleteServersAdminTestJSON-513154376 tempest-DeleteServersAdminTestJSON-120490888] Destroying VBD for VDI OpaqueRef:8e8706f7-0804-b867-ca8a-07b185b0cf41 ... vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2207 2015-08-07 17:30:21.164 DEBUG oslo_concurrency.lockutils [req-1d62a709-9d20-42a8-abb5-d359d1683b09 tempest-DeleteServersAdminTestJSON-513154376 tempest-DeleteServersAdminTestJSON-120490888] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_unplug" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:30:21.207 DEBUG oslo_concurrency.lockutils [req-0d9a51c0-e53b-45f5-b9d9-cf09a2fb2073 tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Lock "xenstore-8e5ee78c-20e2-4483-ab75-b109bb2fdca6" acquired by "store_meta" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:30:21.207 DEBUG oslo_concurrency.lockutils [req-0d9a51c0-e53b-45f5-b9d9-cf09a2fb2073 tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Lock "xenstore-8e5ee78c-20e2-4483-ab75-b109bb2fdca6" released by "store_meta" :: held 0.001s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:30:21.208 DEBUG oslo_concurrency.lockutils [req-0d9a51c0-e53b-45f5-b9d9-cf09a2fb2073 tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Lock "xenstore-8e5ee78c-20e2-4483-ab75-b109bb2fdca6" acquired by "store_auto_disk_config" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:30:21.221 DEBUG oslo_concurrency.lockutils [req-0d9a51c0-e53b-45f5-b9d9-cf09a2fb2073 tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Lock "xenstore-8e5ee78c-20e2-4483-ab75-b109bb2fdca6" 
released by "store_auto_disk_config" :: held 0.013s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:30:21.222 DEBUG nova.virt.xenapi.vmops [req-0d9a51c0-e53b-45f5-b9d9-cf09a2fb2073 tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] Injecting hostname (tempest.common.compute-instance-738647160) into xenstore _inject_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1940 2015-08-07 17:30:21.223 DEBUG oslo_concurrency.lockutils [req-0d9a51c0-e53b-45f5-b9d9-cf09a2fb2073 tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Lock "xenstore-8e5ee78c-20e2-4483-ab75-b109bb2fdca6" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:30:21.242 DEBUG oslo_concurrency.lockutils [req-0d9a51c0-e53b-45f5-b9d9-cf09a2fb2073 tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Lock "xenstore-8e5ee78c-20e2-4483-ab75-b109bb2fdca6" released by "update_hostname" :: held 0.019s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:30:21.243 DEBUG nova.virt.xenapi.vmops [req-0d9a51c0-e53b-45f5-b9d9-cf09a2fb2073 tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] Injecting network info to xenstore inject_network_info /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1869 2015-08-07 17:30:21.244 DEBUG oslo_concurrency.lockutils [req-0d9a51c0-e53b-45f5-b9d9-cf09a2fb2073 tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Lock "xenstore-8e5ee78c-20e2-4483-ab75-b109bb2fdca6" acquired by "update_nwinfo" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:30:21.570 DEBUG oslo_concurrency.lockutils [req-0d9a51c0-e53b-45f5-b9d9-cf09a2fb2073 tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Lock "xenstore-8e5ee78c-20e2-4483-ab75-b109bb2fdca6" released by "update_nwinfo" :: held 0.326s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:30:21.571 DEBUG nova.virt.xenapi.vmops [req-0d9a51c0-e53b-45f5-b9d9-cf09a2fb2073 tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] Updating progress to 60 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:30:21.779 DEBUG nova.virt.xenapi.vmops [req-0d9a51c0-e53b-45f5-b9d9-cf09a2fb2073 tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] Creating vifs _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1892 2015-08-07 17:30:21.790 DEBUG nova.virt.xenapi.vmops [req-0d9a51c0-e53b-45f5-b9d9-cf09a2fb2073 tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] Creating VIF for network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1902 2015-08-07 17:30:21.803 DEBUG nova.virt.xenapi.vmops [req-0d9a51c0-e53b-45f5-b9d9-cf09a2fb2073 tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] 
Created VIF OpaqueRef:774f43e6-7ae1-ed00-fd1c-43965db0bc8b, network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1906 2015-08-07 17:30:21.804 DEBUG nova.virt.xenapi.vmops [req-0d9a51c0-e53b-45f5-b9d9-cf09a2fb2073 tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] Updating progress to 70 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:30:22.064 DEBUG nova.virt.xenapi.vmops [req-0d9a51c0-e53b-45f5-b9d9-cf09a2fb2073 tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] Starting instance _start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:327 2015-08-07 17:30:22.270 DEBUG oslo_concurrency.lockutils [req-1d62a709-9d20-42a8-abb5-d359d1683b09 tempest-DeleteServersAdminTestJSON-513154376 tempest-DeleteServersAdminTestJSON-120490888] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_unplug" :: held 1.105s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:30:22.279 DEBUG nova.virt.xenapi.vm_utils [req-1d62a709-9d20-42a8-abb5-d359d1683b09 tempest-DeleteServersAdminTestJSON-513154376 tempest-DeleteServersAdminTestJSON-120490888] Destroying VBD for VDI OpaqueRef:8e8706f7-0804-b867-ca8a-07b185b0cf41 done. vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2215 2015-08-07 17:30:22.279 DEBUG nova.virt.xenapi.vm_utils [req-1d62a709-9d20-42a8-abb5-d359d1683b09 tempest-DeleteServersAdminTestJSON-513154376 tempest-DeleteServersAdminTestJSON-120490888] Creating disk-type VBD for VM OpaqueRef:00ee5929-3890-2beb-178d-1cafac99967f, VDI OpaqueRef:8e8706f7-0804-b867-ca8a-07b185b0cf41 ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:30:22.290 DEBUG nova.virt.xenapi.vm_utils [req-1d62a709-9d20-42a8-abb5-d359d1683b09 tempest-DeleteServersAdminTestJSON-513154376 tempest-DeleteServersAdminTestJSON-120490888] Created VBD OpaqueRef:c8fc7aa6-4b52-e85e-0c1a-e162063926bd for VM OpaqueRef:00ee5929-3890-2beb-178d-1cafac99967f, VDI OpaqueRef:8e8706f7-0804-b867-ca8a-07b185b0cf41. 
create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:30:22.291 DEBUG nova.objects.instance [req-1d62a709-9d20-42a8-abb5-d359d1683b09 tempest-DeleteServersAdminTestJSON-513154376 tempest-DeleteServersAdminTestJSON-120490888] Lazy-loading `pci_devices' on Instance uuid c87575af-66d8-459c-a310-79d46c0ace86 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:30:22.612 DEBUG nova.virt.xenapi.vmops [req-1d62a709-9d20-42a8-abb5-d359d1683b09 tempest-DeleteServersAdminTestJSON-513154376 tempest-DeleteServersAdminTestJSON-120490888] [instance: c87575af-66d8-459c-a310-79d46c0ace86] Updating progress to 50 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:30:22.864 DEBUG oslo_concurrency.lockutils [req-1d62a709-9d20-42a8-abb5-d359d1683b09 tempest-DeleteServersAdminTestJSON-513154376 tempest-DeleteServersAdminTestJSON-120490888] Lock "xenstore-c87575af-66d8-459c-a310-79d46c0ace86" acquired by "store_meta" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:30:22.865 DEBUG oslo_concurrency.lockutils [req-1d62a709-9d20-42a8-abb5-d359d1683b09 tempest-DeleteServersAdminTestJSON-513154376 tempest-DeleteServersAdminTestJSON-120490888] Lock "xenstore-c87575af-66d8-459c-a310-79d46c0ace86" released by "store_meta" :: held 0.001s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:30:22.866 DEBUG oslo_concurrency.lockutils [req-1d62a709-9d20-42a8-abb5-d359d1683b09 tempest-DeleteServersAdminTestJSON-513154376 tempest-DeleteServersAdminTestJSON-120490888] Lock "xenstore-c87575af-66d8-459c-a310-79d46c0ace86" acquired by "store_auto_disk_config" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:30:22.874 DEBUG oslo_concurrency.lockutils [req-1d62a709-9d20-42a8-abb5-d359d1683b09 tempest-DeleteServersAdminTestJSON-513154376 tempest-DeleteServersAdminTestJSON-120490888] Lock "xenstore-c87575af-66d8-459c-a310-79d46c0ace86" released by "store_auto_disk_config" :: held 0.009s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:30:22.875 DEBUG nova.virt.xenapi.vmops [req-1d62a709-9d20-42a8-abb5-d359d1683b09 tempest-DeleteServersAdminTestJSON-513154376 tempest-DeleteServersAdminTestJSON-120490888] [instance: c87575af-66d8-459c-a310-79d46c0ace86] Injecting hostname (tempest.common.compute-instance-374643929) into xenstore _inject_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1940 2015-08-07 17:30:22.875 DEBUG oslo_concurrency.lockutils [req-1d62a709-9d20-42a8-abb5-d359d1683b09 tempest-DeleteServersAdminTestJSON-513154376 tempest-DeleteServersAdminTestJSON-120490888] Lock "xenstore-c87575af-66d8-459c-a310-79d46c0ace86" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:30:22.889 DEBUG oslo_concurrency.lockutils [req-1d62a709-9d20-42a8-abb5-d359d1683b09 tempest-DeleteServersAdminTestJSON-513154376 tempest-DeleteServersAdminTestJSON-120490888] Lock "xenstore-c87575af-66d8-459c-a310-79d46c0ace86" released by "update_hostname" :: held 0.013s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:30:22.890 DEBUG nova.virt.xenapi.vmops [req-1d62a709-9d20-42a8-abb5-d359d1683b09 tempest-DeleteServersAdminTestJSON-513154376 tempest-DeleteServersAdminTestJSON-120490888] [instance: c87575af-66d8-459c-a310-79d46c0ace86] 
Injecting network info to xenstore inject_network_info /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1869 2015-08-07 17:30:22.891 DEBUG oslo_concurrency.lockutils [req-1d62a709-9d20-42a8-abb5-d359d1683b09 tempest-DeleteServersAdminTestJSON-513154376 tempest-DeleteServersAdminTestJSON-120490888] Lock "xenstore-c87575af-66d8-459c-a310-79d46c0ace86" acquired by "update_nwinfo" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:30:23.203 DEBUG oslo_concurrency.lockutils [req-1d62a709-9d20-42a8-abb5-d359d1683b09 tempest-DeleteServersAdminTestJSON-513154376 tempest-DeleteServersAdminTestJSON-120490888] Lock "xenstore-c87575af-66d8-459c-a310-79d46c0ace86" released by "update_nwinfo" :: held 0.312s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:30:23.204 DEBUG nova.virt.xenapi.vmops [req-1d62a709-9d20-42a8-abb5-d359d1683b09 tempest-DeleteServersAdminTestJSON-513154376 tempest-DeleteServersAdminTestJSON-120490888] [instance: c87575af-66d8-459c-a310-79d46c0ace86] Updating progress to 60 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:30:23.465 DEBUG nova.virt.xenapi.vmops [req-1d62a709-9d20-42a8-abb5-d359d1683b09 tempest-DeleteServersAdminTestJSON-513154376 tempest-DeleteServersAdminTestJSON-120490888] [instance: c87575af-66d8-459c-a310-79d46c0ace86] Creating vifs _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1892 2015-08-07 17:30:23.473 DEBUG nova.virt.xenapi.vmops [req-1d62a709-9d20-42a8-abb5-d359d1683b09 tempest-DeleteServersAdminTestJSON-513154376 tempest-DeleteServersAdminTestJSON-120490888] [instance: c87575af-66d8-459c-a310-79d46c0ace86] Creating VIF for network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1902 2015-08-07 17:30:23.490 DEBUG nova.virt.xenapi.vmops [req-1d62a709-9d20-42a8-abb5-d359d1683b09 tempest-DeleteServersAdminTestJSON-513154376 tempest-DeleteServersAdminTestJSON-120490888] [instance: c87575af-66d8-459c-a310-79d46c0ace86] Created VIF OpaqueRef:e5443212-350e-02c2-9a74-e6f17fdda485, network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1906 2015-08-07 17:30:23.491 DEBUG nova.virt.xenapi.vmops [req-1d62a709-9d20-42a8-abb5-d359d1683b09 tempest-DeleteServersAdminTestJSON-513154376 tempest-DeleteServersAdminTestJSON-120490888] [instance: c87575af-66d8-459c-a310-79d46c0ace86] Updating progress to 70 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:30:23.756 DEBUG nova.virt.xenapi.vmops [req-1d62a709-9d20-42a8-abb5-d359d1683b09 tempest-DeleteServersAdminTestJSON-513154376 tempest-DeleteServersAdminTestJSON-120490888] [instance: c87575af-66d8-459c-a310-79d46c0ace86] Starting instance _start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:327 2015-08-07 17:30:24.929 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.92 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:30:27.240 DEBUG nova.virt.xenapi.vmops [req-40b9b2e7-2859-4bc3-8922-3efdd0516513 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] [instance: 8f22b72a-a408-4796-8637-4dedc84a367a] Waiting for instance state to become running _wait_for_instance_to_start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:777 2015-08-07 17:30:27.281 DEBUG nova.virt.xenapi.vmops 
[req-40b9b2e7-2859-4bc3-8922-3efdd0516513 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] [instance: 8f22b72a-a408-4796-8637-4dedc84a367a] Updating progress to 80 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:30:27.661 DEBUG nova.virt.xenapi.vmops [req-40b9b2e7-2859-4bc3-8922-3efdd0516513 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] [instance: 8f22b72a-a408-4796-8637-4dedc84a367a] Skip agent setup, not enabled. _configure_new_instance_with_agent /opt/stack/new/nova/nova/virt/xenapi/vmops.py:788 2015-08-07 17:30:27.662 DEBUG nova.virt.xenapi.vmops [req-40b9b2e7-2859-4bc3-8922-3efdd0516513 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] [instance: 8f22b72a-a408-4796-8637-4dedc84a367a] Removing hostname from xenstore _remove_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1949 2015-08-07 17:30:27.663 DEBUG oslo_concurrency.lockutils [req-40b9b2e7-2859-4bc3-8922-3efdd0516513 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] Lock "xenstore-8f22b72a-a408-4796-8637-4dedc84a367a" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:30:27.669 DEBUG oslo_concurrency.lockutils [req-40b9b2e7-2859-4bc3-8922-3efdd0516513 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] Lock "xenstore-8f22b72a-a408-4796-8637-4dedc84a367a" released by "update_hostname" :: held 0.006s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:30:27.670 DEBUG nova.virt.xenapi.vmops [req-40b9b2e7-2859-4bc3-8922-3efdd0516513 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] [instance: 8f22b72a-a408-4796-8637-4dedc84a367a] Updating progress to 90 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:30:27.931 DEBUG nova.virt.xenapi.vmops [req-40b9b2e7-2859-4bc3-8922-3efdd0516513 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] [instance: 8f22b72a-a408-4796-8637-4dedc84a367a] Updating progress to 100 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:30:28.182 DEBUG nova.compute.manager [req-40b9b2e7-2859-4bc3-8922-3efdd0516513 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] [instance: 8f22b72a-a408-4796-8637-4dedc84a367a] Checking state _get_power_state /opt/stack/new/nova/nova/compute/manager.py:1296 2015-08-07 17:30:28.520 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:30:28.521 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Starting heal instance info cache _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5315 2015-08-07 17:30:28.522 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Rebuilding the list of instances to heal _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5319 2015-08-07 17:30:28.582 DEBUG oslo_concurrency.lockutils 
[req-40b9b2e7-2859-4bc3-8922-3efdd0516513 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] Lock "8f22b72a-a408-4796-8637-4dedc84a367a" released by "_locked_do_build_and_run_instance" :: held 51.915s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:30:28.598 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: c87575af-66d8-459c-a310-79d46c0ace86] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5329 2015-08-07 17:30:28.607 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Acquired semaphore "refresh_cache-8e5ee78c-20e2-4483-ab75-b109bb2fdca6" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:197 2015-08-07 17:30:28.608 DEBUG nova.objects.instance [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lazy-loading `flavor' on Instance uuid 8e5ee78c-20e2-4483-ab75-b109bb2fdca6 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:30:28.936 DEBUG nova.network.base_api [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] Updating instance_info_cache with network_info: [VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.12'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:87:f6:bc', 'active': False, 'type': u'bridge', 'id': u'0082deb6-303e-4fa7-969e-93e52d4e94f6', 'qbg_params': None})] update_instance_cache_with_nw_info /opt/stack/new/nova/nova/network/base_api.py:43 2015-08-07 17:30:28.969 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Releasing semaphore "refresh_cache-8e5ee78c-20e2-4483-ab75-b109bb2fdca6" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:210 2015-08-07 17:30:28.969 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] Updated the network info_cache for instance _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5376 2015-08-07 17:30:28.970 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 1.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:30:29.970 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks 
/usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:30:29.971 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 0.55 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:30:30.521 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:30:30.522 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 1.99 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:30:31.075 DEBUG nova.virt.xenapi.vmops [req-0d9a51c0-e53b-45f5-b9d9-cf09a2fb2073 tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] Waiting for instance state to become running _wait_for_instance_to_start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:777 2015-08-07 17:30:31.110 DEBUG nova.virt.xenapi.vmops [req-0d9a51c0-e53b-45f5-b9d9-cf09a2fb2073 tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] Updating progress to 80 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:30:31.435 DEBUG nova.virt.xenapi.vmops [req-0d9a51c0-e53b-45f5-b9d9-cf09a2fb2073 tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] Skip agent setup, not enabled. 
_configure_new_instance_with_agent /opt/stack/new/nova/nova/virt/xenapi/vmops.py:788 2015-08-07 17:30:31.436 DEBUG nova.virt.xenapi.vmops [req-0d9a51c0-e53b-45f5-b9d9-cf09a2fb2073 tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] Removing hostname from xenstore _remove_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1949 2015-08-07 17:30:31.437 DEBUG oslo_concurrency.lockutils [req-0d9a51c0-e53b-45f5-b9d9-cf09a2fb2073 tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Lock "xenstore-8e5ee78c-20e2-4483-ab75-b109bb2fdca6" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:30:31.442 DEBUG oslo_concurrency.lockutils [req-0d9a51c0-e53b-45f5-b9d9-cf09a2fb2073 tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Lock "xenstore-8e5ee78c-20e2-4483-ab75-b109bb2fdca6" released by "update_hostname" :: held 0.005s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:30:31.442 DEBUG nova.virt.xenapi.vmops [req-0d9a51c0-e53b-45f5-b9d9-cf09a2fb2073 tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] Updating progress to 90 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:30:31.690 DEBUG nova.virt.xenapi.vmops [req-0d9a51c0-e53b-45f5-b9d9-cf09a2fb2073 tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] Updating progress to 100 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:30:32.012 DEBUG nova.compute.manager [req-0d9a51c0-e53b-45f5-b9d9-cf09a2fb2073 tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] Checking state _get_power_state /opt/stack/new/nova/nova/compute/manager.py:1296 2015-08-07 17:30:32.513 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:30:32.514 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 5.01 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:30:32.646 DEBUG nova.compute.manager [req-b1ec8e61-03e2-4663-8b50-f7ab2f665297 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] [instance: 8f22b72a-a408-4796-8637-4dedc84a367a] Checking state _get_power_state /opt/stack/new/nova/nova/compute/manager.py:1296 2015-08-07 17:30:32.897 INFO nova.compute.manager [req-b1ec8e61-03e2-4663-8b50-f7ab2f665297 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] [instance: 8f22b72a-a408-4796-8637-4dedc84a367a] instance snapshotting 2015-08-07 17:30:32.904 DEBUG nova.virt.xenapi.vm_utils [req-b1ec8e61-03e2-4663-8b50-f7ab2f665297 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] [instance: 8f22b72a-a408-4796-8637-4dedc84a367a] Starting snapshot for VM _snapshot_attached_here_impl 
/opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:714 2015-08-07 17:30:32.988 DEBUG oslo_concurrency.lockutils [req-b1ec8e61-03e2-4663-8b50-f7ab2f665297 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" acquired by "do_scan" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:30:32.988 DEBUG nova.virt.xenapi.vm_utils [req-b1ec8e61-03e2-4663-8b50-f7ab2f665297 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] Scanning SR OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99 do_scan /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1839 2015-08-07 17:30:33.039 DEBUG nova.virt.xenapi.vmops [req-1d62a709-9d20-42a8-abb5-d359d1683b09 tempest-DeleteServersAdminTestJSON-513154376 tempest-DeleteServersAdminTestJSON-120490888] [instance: c87575af-66d8-459c-a310-79d46c0ace86] Waiting for instance state to become running _wait_for_instance_to_start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:777 2015-08-07 17:30:33.156 DEBUG nova.virt.xenapi.vmops [req-1d62a709-9d20-42a8-abb5-d359d1683b09 tempest-DeleteServersAdminTestJSON-513154376 tempest-DeleteServersAdminTestJSON-120490888] [instance: c87575af-66d8-459c-a310-79d46c0ace86] Updating progress to 80 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:30:33.697 DEBUG nova.virt.xenapi.vmops [req-1d62a709-9d20-42a8-abb5-d359d1683b09 tempest-DeleteServersAdminTestJSON-513154376 tempest-DeleteServersAdminTestJSON-120490888] [instance: c87575af-66d8-459c-a310-79d46c0ace86] Skip agent setup, not enabled. _configure_new_instance_with_agent /opt/stack/new/nova/nova/virt/xenapi/vmops.py:788 2015-08-07 17:30:33.699 DEBUG nova.virt.xenapi.vmops [req-1d62a709-9d20-42a8-abb5-d359d1683b09 tempest-DeleteServersAdminTestJSON-513154376 tempest-DeleteServersAdminTestJSON-120490888] [instance: c87575af-66d8-459c-a310-79d46c0ace86] Removing hostname from xenstore _remove_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1949 2015-08-07 17:30:33.699 DEBUG oslo_concurrency.lockutils [req-1d62a709-9d20-42a8-abb5-d359d1683b09 tempest-DeleteServersAdminTestJSON-513154376 tempest-DeleteServersAdminTestJSON-120490888] Lock "xenstore-c87575af-66d8-459c-a310-79d46c0ace86" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:30:33.707 DEBUG oslo_concurrency.lockutils [req-1d62a709-9d20-42a8-abb5-d359d1683b09 tempest-DeleteServersAdminTestJSON-513154376 tempest-DeleteServersAdminTestJSON-120490888] Lock "xenstore-c87575af-66d8-459c-a310-79d46c0ace86" released by "update_hostname" :: held 0.007s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:30:33.707 DEBUG nova.virt.xenapi.vmops [req-1d62a709-9d20-42a8-abb5-d359d1683b09 tempest-DeleteServersAdminTestJSON-513154376 tempest-DeleteServersAdminTestJSON-120490888] [instance: c87575af-66d8-459c-a310-79d46c0ace86] Updating progress to 90 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:30:34.289 DEBUG nova.virt.xenapi.vmops [req-1d62a709-9d20-42a8-abb5-d359d1683b09 tempest-DeleteServersAdminTestJSON-513154376 tempest-DeleteServersAdminTestJSON-120490888] [instance: c87575af-66d8-459c-a310-79d46c0ace86] Updating progress to 100 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 
17:30:34.495 DEBUG oslo_concurrency.lockutils [req-b1ec8e61-03e2-4663-8b50-f7ab2f665297 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" released by "do_scan" :: held 1.508s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:30:34.507 DEBUG nova.virt.xenapi.vm_utils [req-b1ec8e61-03e2-4663-8b50-f7ab2f665297 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] VHD cb7e93a6-6ca3-4d68-9430-7a44dfe1896a has parent 4027f457-a9bb-499a-8844-79fc67f11377 _get_vhd_parent_uuid /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2006 2015-08-07 17:30:34.564 DEBUG nova.virt.xenapi.vm_utils [req-b1ec8e61-03e2-4663-8b50-f7ab2f665297 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] VHD 416fc4db-2f34-4e64-aded-e00eaaf8f6e4 has parent 4027f457-a9bb-499a-8844-79fc67f11377 _get_vhd_parent_uuid /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2006 2015-08-07 17:30:34.570 DEBUG nova.virt.xenapi.vm_utils [req-b1ec8e61-03e2-4663-8b50-f7ab2f665297 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] VHD cb7e93a6-6ca3-4d68-9430-7a44dfe1896a has parent 4027f457-a9bb-499a-8844-79fc67f11377 _get_vhd_parent_uuid /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2006 2015-08-07 17:30:34.583 DEBUG nova.virt.xenapi.vm_utils [req-b1ec8e61-03e2-4663-8b50-f7ab2f665297 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] VHD d5067377-6fa5-411f-906d-a932f89ffdf9 has parent 4027f457-a9bb-499a-8844-79fc67f11377 _get_vhd_parent_uuid /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2006 2015-08-07 17:30:34.596 DEBUG nova.virt.xenapi.vm_utils [req-b1ec8e61-03e2-4663-8b50-f7ab2f665297 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] VHD adc68f5c-f9ed-4dfe-862f-8738fb519f84 has parent 4027f457-a9bb-499a-8844-79fc67f11377 _get_vhd_parent_uuid /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2006 2015-08-07 17:30:34.620 DEBUG nova.compute.manager [req-1d62a709-9d20-42a8-abb5-d359d1683b09 tempest-DeleteServersAdminTestJSON-513154376 tempest-DeleteServersAdminTestJSON-120490888] [instance: c87575af-66d8-459c-a310-79d46c0ace86] Checking state _get_power_state /opt/stack/new/nova/nova/compute/manager.py:1296 2015-08-07 17:30:34.625 DEBUG nova.virt.xenapi.vm_utils [req-b1ec8e61-03e2-4663-8b50-f7ab2f665297 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] [instance: 8f22b72a-a408-4796-8637-4dedc84a367a] No snapshots to remove. 
_delete_snapshots_in_vdi_chain /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:677 2015-08-07 17:30:34.941 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.91 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:30:35.432 DEBUG nova.compute.manager [req-9c553d3d-492d-47a0-966f-8fac0ea40f0a tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] Stashing vm_state: active _prep_resize /opt/stack/new/nova/nova/compute/manager.py:3514 2015-08-07 17:30:35.494 DEBUG oslo_concurrency.lockutils [req-1d62a709-9d20-42a8-abb5-d359d1683b09 tempest-DeleteServersAdminTestJSON-513154376 tempest-DeleteServersAdminTestJSON-120490888] Lock "c87575af-66d8-459c-a310-79d46c0ace86" released by "_locked_do_build_and_run_instance" :: held 62.177s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:30:35.708 DEBUG oslo_concurrency.lockutils [req-9c553d3d-492d-47a0-966f-8fac0ea40f0a tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Lock "compute_resources" acquired by "resize_claim" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:30:35.709 DEBUG nova.compute.resource_tracker [req-9c553d3d-492d-47a0-966f-8fac0ea40f0a tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Memory overhead for 128 MB instance; 6 MB resize_claim /opt/stack/new/nova/nova/compute/resource_tracker.py:174 2015-08-07 17:30:35.720 INFO nova.compute.claims [req-9c553d3d-492d-47a0-966f-8fac0ea40f0a tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] Attempting claim: memory 134 MB, disk 0 GB 2015-08-07 17:30:35.722 INFO nova.compute.claims [req-9c553d3d-492d-47a0-966f-8fac0ea40f0a tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] Total memory: 8187 MB, used: 719.00 MB 2015-08-07 17:30:35.722 INFO nova.compute.claims [req-9c553d3d-492d-47a0-966f-8fac0ea40f0a tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] memory limit: 12280.50 MB, free: 11561.50 MB 2015-08-07 17:30:35.722 INFO nova.compute.claims [req-9c553d3d-492d-47a0-966f-8fac0ea40f0a tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] Total disk: 27 GB, used: 0.00 GB 2015-08-07 17:30:35.723 INFO nova.compute.claims [req-9c553d3d-492d-47a0-966f-8fac0ea40f0a tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] disk limit not specified, defaulting to unlimited 2015-08-07 17:30:35.764 DEBUG nova.compute.resources.vcpu [req-9c553d3d-492d-47a0-966f-8fac0ea40f0a tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Total CPUs: 8 VCPUs, used: 3.00 VCPUs test /opt/stack/new/nova/nova/compute/resources/vcpu.py:52 2015-08-07 17:30:35.765 DEBUG nova.compute.resources.vcpu [req-9c553d3d-492d-47a0-966f-8fac0ea40f0a tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] CPUs limit not specified, defaulting to unlimited test 
/opt/stack/new/nova/nova/compute/resources/vcpu.py:56 2015-08-07 17:30:35.765 INFO nova.compute.claims [req-9c553d3d-492d-47a0-966f-8fac0ea40f0a tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] Claim successful 2015-08-07 17:30:35.845 INFO nova.compute.resource_tracker [req-9c553d3d-492d-47a0-966f-8fac0ea40f0a tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Updating from migration 8e5ee78c-20e2-4483-ab75-b109bb2fdca6 2015-08-07 17:30:36.059 DEBUG oslo_concurrency.lockutils [req-9c553d3d-492d-47a0-966f-8fac0ea40f0a tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Lock "compute_resources" released by "resize_claim" :: held 0.351s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:30:36.060 INFO nova.compute.manager [req-9c553d3d-492d-47a0-966f-8fac0ea40f0a tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] Migrating 2015-08-07 17:30:36.175 DEBUG oslo_concurrency.lockutils [req-9c553d3d-492d-47a0-966f-8fac0ea40f0a tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Acquired semaphore "refresh_cache-8e5ee78c-20e2-4483-ab75-b109bb2fdca6" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:197 2015-08-07 17:30:36.390 DEBUG nova.network.base_api [req-9c553d3d-492d-47a0-966f-8fac0ea40f0a tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] Updating instance_info_cache with network_info: [VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.12'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:87:f6:bc', 'active': False, 'type': u'bridge', 'id': u'0082deb6-303e-4fa7-969e-93e52d4e94f6', 'qbg_params': None})] update_instance_cache_with_nw_info /opt/stack/new/nova/nova/network/base_api.py:43 2015-08-07 17:30:36.422 DEBUG oslo_concurrency.lockutils [req-9c553d3d-492d-47a0-966f-8fac0ea40f0a tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Releasing semaphore "refresh_cache-8e5ee78c-20e2-4483-ab75-b109bb2fdca6" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:210 2015-08-07 17:30:36.766 DEBUG nova.virt.xenapi.vmops [req-9c553d3d-492d-47a0-966f-8fac0ea40f0a tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] Updating progress to 0 
_update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:30:37.049 DEBUG oslo_concurrency.lockutils [req-46597f2b-e95c-41e3-b255-e14b8c0d4716 tempest-DeleteServersAdminTestJSON-513154376 tempest-DeleteServersAdminTestJSON-120490888] Lock "c87575af-66d8-459c-a310-79d46c0ace86" acquired by "do_terminate_instance" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:30:37.050 DEBUG oslo_concurrency.lockutils [req-46597f2b-e95c-41e3-b255-e14b8c0d4716 tempest-DeleteServersAdminTestJSON-513154376 tempest-DeleteServersAdminTestJSON-120490888] Lock "c87575af-66d8-459c-a310-79d46c0ace86-events" acquired by "_clear_events" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:30:37.050 DEBUG oslo_concurrency.lockutils [req-46597f2b-e95c-41e3-b255-e14b8c0d4716 tempest-DeleteServersAdminTestJSON-513154376 tempest-DeleteServersAdminTestJSON-120490888] Lock "c87575af-66d8-459c-a310-79d46c0ace86-events" released by "_clear_events" :: held 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:30:37.052 INFO nova.compute.manager [req-46597f2b-e95c-41e3-b255-e14b8c0d4716 tempest-DeleteServersAdminTestJSON-513154376 tempest-DeleteServersAdminTestJSON-120490888] [instance: c87575af-66d8-459c-a310-79d46c0ace86] Terminating instance 2015-08-07 17:30:37.054 INFO nova.virt.xenapi.vmops [req-46597f2b-e95c-41e3-b255-e14b8c0d4716 tempest-DeleteServersAdminTestJSON-513154376 tempest-DeleteServersAdminTestJSON-120490888] [instance: c87575af-66d8-459c-a310-79d46c0ace86] Destroying VM 2015-08-07 17:30:37.069 DEBUG nova.virt.xenapi.vm_utils [req-46597f2b-e95c-41e3-b255-e14b8c0d4716 tempest-DeleteServersAdminTestJSON-513154376 tempest-DeleteServersAdminTestJSON-120490888] [instance: c87575af-66d8-459c-a310-79d46c0ace86] Shutting down VM (hard) hard_shutdown_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:348 2015-08-07 17:30:37.094 DEBUG nova.virt.xenapi.vm_utils [req-9c553d3d-492d-47a0-966f-8fac0ea40f0a tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] Starting snapshot for VM _snapshot_attached_here_impl /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:714 2015-08-07 17:30:37.141 DEBUG oslo_concurrency.lockutils [req-9c553d3d-492d-47a0-966f-8fac0ea40f0a tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" acquired by "do_scan" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:30:37.142 DEBUG nova.virt.xenapi.vm_utils [req-9c553d3d-492d-47a0-966f-8fac0ea40f0a tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Scanning SR OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99 do_scan /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1839 2015-08-07 17:30:37.408 DEBUG nova.virt.xenapi.vm_utils [req-b1ec8e61-03e2-4663-8b50-f7ab2f665297 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] [instance: 8f22b72a-a408-4796-8637-4dedc84a367a] Parent has other children, coalesce is unlikely. 
_wait_for_vhd_coalesce /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2092 2015-08-07 17:30:37.521 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:30:37.522 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /opt/stack/new/nova/nova/compute/manager.py:5976 2015-08-07 17:30:37.523 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 1.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:30:38.035 DEBUG oslo_concurrency.lockutils [req-9c553d3d-492d-47a0-966f-8fac0ea40f0a tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" released by "do_scan" :: held 0.894s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:30:38.036 DEBUG oslo_concurrency.lockutils [req-b1ec8e61-03e2-4663-8b50-f7ab2f665297 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" acquired by "do_scan" :: waited 0.614s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:30:38.036 DEBUG nova.virt.xenapi.vm_utils [req-b1ec8e61-03e2-4663-8b50-f7ab2f665297 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] Scanning SR OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99 do_scan /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1839 2015-08-07 17:30:38.058 DEBUG nova.virt.xenapi.vm_utils [req-9c553d3d-492d-47a0-966f-8fac0ea40f0a tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] VHD 416fc4db-2f34-4e64-aded-e00eaaf8f6e4 has parent 4027f457-a9bb-499a-8844-79fc67f11377 _get_vhd_parent_uuid /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2006 2015-08-07 17:30:38.093 DEBUG nova.virt.xenapi.vm_utils [req-9c553d3d-492d-47a0-966f-8fac0ea40f0a tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] VHD e5b0de3c-26ec-4373-89d7-57ac32cfb5f4 has parent ad9b424d-f2de-4d85-be91-89d5cecdf115 _get_vhd_parent_uuid /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2006 2015-08-07 17:30:38.099 DEBUG nova.virt.xenapi.vm_utils [req-9c553d3d-492d-47a0-966f-8fac0ea40f0a tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] VHD ad9b424d-f2de-4d85-be91-89d5cecdf115 has parent 4027f457-a9bb-499a-8844-79fc67f11377 _get_vhd_parent_uuid /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2006 2015-08-07 17:30:38.121 DEBUG nova.virt.xenapi.vm_utils [req-9c553d3d-492d-47a0-966f-8fac0ea40f0a tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] VHD 416fc4db-2f34-4e64-aded-e00eaaf8f6e4 has parent 4027f457-a9bb-499a-8844-79fc67f11377 _get_vhd_parent_uuid /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2006 2015-08-07 17:30:38.128 DEBUG nova.virt.xenapi.vm_utils [req-9c553d3d-492d-47a0-966f-8fac0ea40f0a tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] VHD cb7e93a6-6ca3-4d68-9430-7a44dfe1896a has parent 
ad9b424d-f2de-4d85-be91-89d5cecdf115 _get_vhd_parent_uuid /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2006 2015-08-07 17:30:38.133 DEBUG nova.virt.xenapi.vm_utils [req-9c553d3d-492d-47a0-966f-8fac0ea40f0a tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] VHD d5067377-6fa5-411f-906d-a932f89ffdf9 has parent 4027f457-a9bb-499a-8844-79fc67f11377 _get_vhd_parent_uuid /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2006 2015-08-07 17:30:38.147 DEBUG nova.virt.xenapi.vm_utils [req-9c553d3d-492d-47a0-966f-8fac0ea40f0a tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] VHD adc68f5c-f9ed-4dfe-862f-8738fb519f84 has parent 4027f457-a9bb-499a-8844-79fc67f11377 _get_vhd_parent_uuid /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2006 2015-08-07 17:30:38.192 DEBUG nova.virt.xenapi.vm_utils [req-9c553d3d-492d-47a0-966f-8fac0ea40f0a tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] No snapshots to remove. _delete_snapshots_in_vdi_chain /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:677 2015-08-07 17:30:38.520 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:30:38.521 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 1.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:30:38.702 DEBUG oslo_concurrency.lockutils [req-b1ec8e61-03e2-4663-8b50-f7ab2f665297 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" released by "do_scan" :: held 0.666s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:30:38.726 DEBUG nova.virt.xenapi.vm_utils [req-b1ec8e61-03e2-4663-8b50-f7ab2f665297 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] VHD e5b0de3c-26ec-4373-89d7-57ac32cfb5f4 has parent ad9b424d-f2de-4d85-be91-89d5cecdf115 _get_vhd_parent_uuid /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2006 2015-08-07 17:30:38.756 DEBUG nova.virt.xenapi.vm_utils [req-b1ec8e61-03e2-4663-8b50-f7ab2f665297 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] VHD ad9b424d-f2de-4d85-be91-89d5cecdf115 has parent 4027f457-a9bb-499a-8844-79fc67f11377 _get_vhd_parent_uuid /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2006 2015-08-07 17:30:39.100 DEBUG nova.virt.xenapi.client.session [req-b1ec8e61-03e2-4663-8b50-f7ab2f665297 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] glance.upload_vhd attempt 1/1, callback_result: 192.168.33.1 call_plugin_serialized_with_retry /opt/stack/new/nova/nova/virt/xenapi/client/session.py:245 2015-08-07 17:30:39.520 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:30:39.522 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 4.00 
seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:30:39.911 DEBUG nova.virt.xenapi.client.session [req-b1ec8e61-03e2-4663-8b50-f7ab2f665297 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] Got exception: ['XENAPI_PLUGIN_FAILURE', 'upload_vhd', 'PluginError', 'Got Permanent Error response [404] while uploading image [7bc054ef-a379-4259-8369-e9a82d47d1ed] to glance host [192.168.33.1:9292]'] _unwrap_plugin_exceptions /opt/stack/new/nova/nova/virt/xenapi/client/session.py:293 2015-08-07 17:30:40.509 DEBUG nova.compute.manager [req-b1ec8e61-03e2-4663-8b50-f7ab2f665297 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] [instance: 8f22b72a-a408-4796-8637-4dedc84a367a] Cleaning up image 7bc054ef-a379-4259-8369-e9a82d47d1ed decorated_function /opt/stack/new/nova/nova/compute/manager.py:406 2015-08-07 17:30:40.509 13318 ERROR nova.compute.manager [instance: 8f22b72a-a408-4796-8637-4dedc84a367a] Traceback (most recent call last): 2015-08-07 17:30:40.509 13318 ERROR nova.compute.manager [instance: 8f22b72a-a408-4796-8637-4dedc84a367a] File "/opt/stack/new/nova/nova/compute/manager.py", line 402, in decorated_function 2015-08-07 17:30:40.509 13318 ERROR nova.compute.manager [instance: 8f22b72a-a408-4796-8637-4dedc84a367a] *args, **kwargs) 2015-08-07 17:30:40.509 13318 ERROR nova.compute.manager [instance: 8f22b72a-a408-4796-8637-4dedc84a367a] File "/opt/stack/new/nova/nova/compute/manager.py", line 2904, in snapshot_instance 2015-08-07 17:30:40.509 13318 ERROR nova.compute.manager [instance: 8f22b72a-a408-4796-8637-4dedc84a367a] task_states.IMAGE_SNAPSHOT) 2015-08-07 17:30:40.509 13318 ERROR nova.compute.manager [instance: 8f22b72a-a408-4796-8637-4dedc84a367a] File "/opt/stack/new/nova/nova/compute/manager.py", line 2934, in _snapshot_instance 2015-08-07 17:30:40.509 13318 ERROR nova.compute.manager [instance: 8f22b72a-a408-4796-8637-4dedc84a367a] update_task_state) 2015-08-07 17:30:40.509 13318 ERROR nova.compute.manager [instance: 8f22b72a-a408-4796-8637-4dedc84a367a] File "/opt/stack/new/nova/nova/virt/xenapi/driver.py", line 218, in snapshot 2015-08-07 17:30:40.509 13318 ERROR nova.compute.manager [instance: 8f22b72a-a408-4796-8637-4dedc84a367a] self._vmops.snapshot(context, instance, image_id, update_task_state) 2015-08-07 17:30:40.509 13318 ERROR nova.compute.manager [instance: 8f22b72a-a408-4796-8637-4dedc84a367a] File "/opt/stack/new/nova/nova/virt/xenapi/vmops.py", line 887, in snapshot 2015-08-07 17:30:40.509 13318 ERROR nova.compute.manager [instance: 8f22b72a-a408-4796-8637-4dedc84a367a] vdi_uuids, 2015-08-07 17:30:40.509 13318 ERROR nova.compute.manager [instance: 8f22b72a-a408-4796-8637-4dedc84a367a] File "/opt/stack/new/nova/nova/virt/xenapi/image/glance.py", line 92, in upload_image 2015-08-07 17:30:40.509 13318 ERROR nova.compute.manager [instance: 8f22b72a-a408-4796-8637-4dedc84a367a] 'upload_vhd', params) 2015-08-07 17:30:40.509 13318 ERROR nova.compute.manager [instance: 8f22b72a-a408-4796-8637-4dedc84a367a] File "/opt/stack/new/nova/nova/virt/xenapi/image/glance.py", line 54, in _call_glance_plugin 2015-08-07 17:30:40.509 13318 ERROR nova.compute.manager [instance: 8f22b72a-a408-4796-8637-4dedc84a367a] 'glance', fn, CONF.glance.num_retries, pick_glance, cb, **params) 2015-08-07 17:30:40.509 13318 ERROR nova.compute.manager [instance: 8f22b72a-a408-4796-8637-4dedc84a367a] File 
"/opt/stack/new/nova/nova/virt/xenapi/client/session.py", line 246, in call_plugin_serialized_with_retry 2015-08-07 17:30:40.509 13318 ERROR nova.compute.manager [instance: 8f22b72a-a408-4796-8637-4dedc84a367a] return self.call_plugin_serialized(plugin, fn, *args, **kwargs) 2015-08-07 17:30:40.509 13318 ERROR nova.compute.manager [instance: 8f22b72a-a408-4796-8637-4dedc84a367a] File "/opt/stack/new/nova/nova/virt/xenapi/client/session.py", line 221, in call_plugin_serialized 2015-08-07 17:30:40.509 13318 ERROR nova.compute.manager [instance: 8f22b72a-a408-4796-8637-4dedc84a367a] rv = self.call_plugin(plugin, fn, params) 2015-08-07 17:30:40.509 13318 ERROR nova.compute.manager [instance: 8f22b72a-a408-4796-8637-4dedc84a367a] File "/opt/stack/new/nova/nova/virt/xenapi/client/session.py", line 217, in call_plugin 2015-08-07 17:30:40.509 13318 ERROR nova.compute.manager [instance: 8f22b72a-a408-4796-8637-4dedc84a367a] self.host_ref, plugin, fn, args) 2015-08-07 17:30:40.509 13318 ERROR nova.compute.manager [instance: 8f22b72a-a408-4796-8637-4dedc84a367a] File "/opt/stack/new/nova/nova/virt/xenapi/client/session.py", line 291, in _unwrap_plugin_exceptions 2015-08-07 17:30:40.509 13318 ERROR nova.compute.manager [instance: 8f22b72a-a408-4796-8637-4dedc84a367a] return func(*args, **kwargs) 2015-08-07 17:30:40.509 13318 ERROR nova.compute.manager [instance: 8f22b72a-a408-4796-8637-4dedc84a367a] File "/usr/local/lib/python2.7/dist-packages/XenAPI.py", line 229, in __call__ 2015-08-07 17:30:40.509 13318 ERROR nova.compute.manager [instance: 8f22b72a-a408-4796-8637-4dedc84a367a] return self.__send(self.__name, args) 2015-08-07 17:30:40.509 13318 ERROR nova.compute.manager [instance: 8f22b72a-a408-4796-8637-4dedc84a367a] File "/usr/local/lib/python2.7/dist-packages/XenAPI.py", line 133, in xenapi_request 2015-08-07 17:30:40.509 13318 ERROR nova.compute.manager [instance: 8f22b72a-a408-4796-8637-4dedc84a367a] result = _parse_result(getattr(self, methodname)(*full_params)) 2015-08-07 17:30:40.509 13318 ERROR nova.compute.manager [instance: 8f22b72a-a408-4796-8637-4dedc84a367a] File "/usr/local/lib/python2.7/dist-packages/XenAPI.py", line 203, in _parse_result 2015-08-07 17:30:40.509 13318 ERROR nova.compute.manager [instance: 8f22b72a-a408-4796-8637-4dedc84a367a] raise Failure(result['ErrorDescription']) 2015-08-07 17:30:40.509 13318 ERROR nova.compute.manager [instance: 8f22b72a-a408-4796-8637-4dedc84a367a] Failure: ['XENAPI_PLUGIN_FAILURE', 'upload_vhd', 'PluginError', 'Got Permanent Error response [404] while uploading image [7bc054ef-a379-4259-8369-e9a82d47d1ed] to glance host [192.168.33.1:9292]'] 2015-08-07 17:30:40.509 13318 ERROR nova.compute.manager [instance: 8f22b72a-a408-4796-8637-4dedc84a367a] 2015-08-07 17:30:40.580 ERROR nova.compute.manager [req-b1ec8e61-03e2-4663-8b50-f7ab2f665297 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] [instance: 8f22b72a-a408-4796-8637-4dedc84a367a] Error while trying to clean up image 7bc054ef-a379-4259-8369-e9a82d47d1ed 2015-08-07 17:30:40.580 13318 ERROR nova.compute.manager [instance: 8f22b72a-a408-4796-8637-4dedc84a367a] Traceback (most recent call last): 2015-08-07 17:30:40.580 13318 ERROR nova.compute.manager [instance: 8f22b72a-a408-4796-8637-4dedc84a367a] File "/opt/stack/new/nova/nova/compute/manager.py", line 408, in decorated_function 2015-08-07 17:30:40.580 13318 ERROR nova.compute.manager [instance: 8f22b72a-a408-4796-8637-4dedc84a367a] self.image_api.delete(context, image_id) 2015-08-07 
17:30:40.580 13318 ERROR nova.compute.manager [instance: 8f22b72a-a408-4796-8637-4dedc84a367a] File "/opt/stack/new/nova/nova/image/api.py", line 141, in delete 2015-08-07 17:30:40.580 13318 ERROR nova.compute.manager [instance: 8f22b72a-a408-4796-8637-4dedc84a367a] return session.delete(context, image_id) 2015-08-07 17:30:40.580 13318 ERROR nova.compute.manager [instance: 8f22b72a-a408-4796-8637-4dedc84a367a] File "/opt/stack/new/nova/nova/image/glance.py", line 424, in delete 2015-08-07 17:30:40.580 13318 ERROR nova.compute.manager [instance: 8f22b72a-a408-4796-8637-4dedc84a367a] raise exception.ImageNotFound(image_id=image_id) 2015-08-07 17:30:40.580 13318 ERROR nova.compute.manager [instance: 8f22b72a-a408-4796-8637-4dedc84a367a] ImageNotFound: Image 7bc054ef-a379-4259-8369-e9a82d47d1ed could not be found. 2015-08-07 17:30:40.580 13318 ERROR nova.compute.manager [instance: 8f22b72a-a408-4796-8637-4dedc84a367a] 2015-08-07 17:30:40.680 DEBUG nova.virt.xenapi.vmops [req-46597f2b-e95c-41e3-b255-e14b8c0d4716 tempest-DeleteServersAdminTestJSON-513154376 tempest-DeleteServersAdminTestJSON-120490888] [instance: c87575af-66d8-459c-a310-79d46c0ace86] Destroying VDIs _destroy_vdis /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1428 2015-08-07 17:30:40.690 DEBUG nova.virt.xenapi.vm_utils [req-46597f2b-e95c-41e3-b255-e14b8c0d4716 tempest-DeleteServersAdminTestJSON-513154376 tempest-DeleteServersAdminTestJSON-120490888] VDI d5067377-6fa5-411f-906d-a932f89ffdf9 is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:30:40.704 DEBUG nova.virt.xenapi.vm_utils [req-46597f2b-e95c-41e3-b255-e14b8c0d4716 tempest-DeleteServersAdminTestJSON-513154376 tempest-DeleteServersAdminTestJSON-120490888] VDI 5eb4430f-416a-4643-821a-4f59365a4466 is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:30:40.852 DEBUG oslo_concurrency.lockutils [req-b1ec8e61-03e2-4663-8b50-f7ab2f665297 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] Lock "compute_resources" acquired by "update_usage" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:30:40.950 DEBUG oslo_concurrency.lockutils [req-b1ec8e61-03e2-4663-8b50-f7ab2f665297 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] Lock "compute_resources" released by "update_usage" :: held 0.099s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:30:40.956 ERROR oslo_messaging.rpc.dispatcher [req-b1ec8e61-03e2-4663-8b50-f7ab2f665297 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] Exception during message handling: ['XENAPI_PLUGIN_FAILURE', 'upload_vhd', 'PluginError', 'Got Permanent Error response [404] while uploading image [7bc054ef-a379-4259-8369-e9a82d47d1ed] to glance host [192.168.33.1:9292]'] 2015-08-07 17:30:40.956 13318 ERROR oslo_messaging.rpc.dispatcher Traceback (most recent call last): 2015-08-07 17:30:40.956 13318 ERROR oslo_messaging.rpc.dispatcher File "/usr/local/lib/python2.7/dist-packages/oslo_messaging/rpc/dispatcher.py", line 142, in _dispatch_and_reply 2015-08-07 17:30:40.956 13318 ERROR oslo_messaging.rpc.dispatcher executor_callback)) 2015-08-07 17:30:40.956 13318 ERROR oslo_messaging.rpc.dispatcher File "/usr/local/lib/python2.7/dist-packages/oslo_messaging/rpc/dispatcher.py", line 186, in _dispatch 2015-08-07 
17:30:40.956 13318 ERROR oslo_messaging.rpc.dispatcher executor_callback) 2015-08-07 17:30:40.956 13318 ERROR oslo_messaging.rpc.dispatcher File "/usr/local/lib/python2.7/dist-packages/oslo_messaging/rpc/dispatcher.py", line 129, in _do_dispatch 2015-08-07 17:30:40.956 13318 ERROR oslo_messaging.rpc.dispatcher result = func(ctxt, **new_args) 2015-08-07 17:30:40.956 13318 ERROR oslo_messaging.rpc.dispatcher File "/opt/stack/new/nova/nova/exception.py", line 89, in wrapped 2015-08-07 17:30:40.956 13318 ERROR oslo_messaging.rpc.dispatcher payload) 2015-08-07 17:30:40.956 13318 ERROR oslo_messaging.rpc.dispatcher File "/usr/local/lib/python2.7/dist-packages/oslo_utils/excutils.py", line 119, in __exit__ 2015-08-07 17:30:40.956 13318 ERROR oslo_messaging.rpc.dispatcher six.reraise(self.type_, self.value, self.tb) 2015-08-07 17:30:40.956 13318 ERROR oslo_messaging.rpc.dispatcher File "/opt/stack/new/nova/nova/exception.py", line 72, in wrapped 2015-08-07 17:30:40.956 13318 ERROR oslo_messaging.rpc.dispatcher return f(self, context, *args, **kw) 2015-08-07 17:30:40.956 13318 ERROR oslo_messaging.rpc.dispatcher File "/opt/stack/new/nova/nova/compute/manager.py", line 336, in decorated_function 2015-08-07 17:30:40.956 13318 ERROR oslo_messaging.rpc.dispatcher LOG.warning(msg, e, instance_uuid=instance_uuid) 2015-08-07 17:30:40.956 13318 ERROR oslo_messaging.rpc.dispatcher File "/usr/local/lib/python2.7/dist-packages/oslo_utils/excutils.py", line 119, in __exit__ 2015-08-07 17:30:40.956 13318 ERROR oslo_messaging.rpc.dispatcher six.reraise(self.type_, self.value, self.tb) 2015-08-07 17:30:40.956 13318 ERROR oslo_messaging.rpc.dispatcher File "/opt/stack/new/nova/nova/compute/manager.py", line 307, in decorated_function 2015-08-07 17:30:40.956 13318 ERROR oslo_messaging.rpc.dispatcher return function(self, context, *args, **kwargs) 2015-08-07 17:30:40.956 13318 ERROR oslo_messaging.rpc.dispatcher File "/opt/stack/new/nova/nova/compute/manager.py", line 364, in decorated_function 2015-08-07 17:30:40.956 13318 ERROR oslo_messaging.rpc.dispatcher kwargs['instance'], e, sys.exc_info()) 2015-08-07 17:30:40.956 13318 ERROR oslo_messaging.rpc.dispatcher File "/usr/local/lib/python2.7/dist-packages/oslo_utils/excutils.py", line 119, in __exit__ 2015-08-07 17:30:40.956 13318 ERROR oslo_messaging.rpc.dispatcher six.reraise(self.type_, self.value, self.tb) 2015-08-07 17:30:40.956 13318 ERROR oslo_messaging.rpc.dispatcher File "/opt/stack/new/nova/nova/compute/manager.py", line 352, in decorated_function 2015-08-07 17:30:40.956 13318 ERROR oslo_messaging.rpc.dispatcher return function(self, context, *args, **kwargs) 2015-08-07 17:30:40.956 13318 ERROR oslo_messaging.rpc.dispatcher File "/opt/stack/new/nova/nova/compute/manager.py", line 412, in decorated_function 2015-08-07 17:30:40.956 13318 ERROR oslo_messaging.rpc.dispatcher instance=instance) 2015-08-07 17:30:40.956 13318 ERROR oslo_messaging.rpc.dispatcher File "/usr/local/lib/python2.7/dist-packages/oslo_utils/excutils.py", line 119, in __exit__ 2015-08-07 17:30:40.956 13318 ERROR oslo_messaging.rpc.dispatcher six.reraise(self.type_, self.value, self.tb) 2015-08-07 17:30:40.956 13318 ERROR oslo_messaging.rpc.dispatcher File "/opt/stack/new/nova/nova/compute/manager.py", line 402, in decorated_function 2015-08-07 17:30:40.956 13318 ERROR oslo_messaging.rpc.dispatcher *args, **kwargs) 2015-08-07 17:30:40.956 13318 ERROR oslo_messaging.rpc.dispatcher File "/opt/stack/new/nova/nova/compute/manager.py", line 2904, in snapshot_instance 2015-08-07 17:30:40.956 
13318 ERROR oslo_messaging.rpc.dispatcher task_states.IMAGE_SNAPSHOT) 2015-08-07 17:30:40.956 13318 ERROR oslo_messaging.rpc.dispatcher File "/opt/stack/new/nova/nova/compute/manager.py", line 2934, in _snapshot_instance 2015-08-07 17:30:40.956 13318 ERROR oslo_messaging.rpc.dispatcher update_task_state) 2015-08-07 17:30:40.956 13318 ERROR oslo_messaging.rpc.dispatcher File "/opt/stack/new/nova/nova/virt/xenapi/driver.py", line 218, in snapshot 2015-08-07 17:30:40.956 13318 ERROR oslo_messaging.rpc.dispatcher self._vmops.snapshot(context, instance, image_id, update_task_state) 2015-08-07 17:30:40.956 13318 ERROR oslo_messaging.rpc.dispatcher File "/opt/stack/new/nova/nova/virt/xenapi/vmops.py", line 887, in snapshot 2015-08-07 17:30:40.956 13318 ERROR oslo_messaging.rpc.dispatcher vdi_uuids, 2015-08-07 17:30:40.956 13318 ERROR oslo_messaging.rpc.dispatcher File "/opt/stack/new/nova/nova/virt/xenapi/image/glance.py", line 92, in upload_image 2015-08-07 17:30:40.956 13318 ERROR oslo_messaging.rpc.dispatcher 'upload_vhd', params) 2015-08-07 17:30:40.956 13318 ERROR oslo_messaging.rpc.dispatcher File "/opt/stack/new/nova/nova/virt/xenapi/image/glance.py", line 54, in _call_glance_plugin 2015-08-07 17:30:40.956 13318 ERROR oslo_messaging.rpc.dispatcher 'glance', fn, CONF.glance.num_retries, pick_glance, cb, **params) 2015-08-07 17:30:40.956 13318 ERROR oslo_messaging.rpc.dispatcher File "/opt/stack/new/nova/nova/virt/xenapi/client/session.py", line 246, in call_plugin_serialized_with_retry 2015-08-07 17:30:40.956 13318 ERROR oslo_messaging.rpc.dispatcher return self.call_plugin_serialized(plugin, fn, *args, **kwargs) 2015-08-07 17:30:40.956 13318 ERROR oslo_messaging.rpc.dispatcher File "/opt/stack/new/nova/nova/virt/xenapi/client/session.py", line 221, in call_plugin_serialized 2015-08-07 17:30:40.956 13318 ERROR oslo_messaging.rpc.dispatcher rv = self.call_plugin(plugin, fn, params) 2015-08-07 17:30:40.956 13318 ERROR oslo_messaging.rpc.dispatcher File "/opt/stack/new/nova/nova/virt/xenapi/client/session.py", line 217, in call_plugin 2015-08-07 17:30:40.956 13318 ERROR oslo_messaging.rpc.dispatcher self.host_ref, plugin, fn, args) 2015-08-07 17:30:40.956 13318 ERROR oslo_messaging.rpc.dispatcher File "/opt/stack/new/nova/nova/virt/xenapi/client/session.py", line 291, in _unwrap_plugin_exceptions 2015-08-07 17:30:40.956 13318 ERROR oslo_messaging.rpc.dispatcher return func(*args, **kwargs) 2015-08-07 17:30:40.956 13318 ERROR oslo_messaging.rpc.dispatcher File "/usr/local/lib/python2.7/dist-packages/XenAPI.py", line 229, in __call__ 2015-08-07 17:30:40.956 13318 ERROR oslo_messaging.rpc.dispatcher return self.__send(self.__name, args) 2015-08-07 17:30:40.956 13318 ERROR oslo_messaging.rpc.dispatcher File "/usr/local/lib/python2.7/dist-packages/XenAPI.py", line 133, in xenapi_request 2015-08-07 17:30:40.956 13318 ERROR oslo_messaging.rpc.dispatcher result = _parse_result(getattr(self, methodname)(*full_params)) 2015-08-07 17:30:40.956 13318 ERROR oslo_messaging.rpc.dispatcher File "/usr/local/lib/python2.7/dist-packages/XenAPI.py", line 203, in _parse_result 2015-08-07 17:30:40.956 13318 ERROR oslo_messaging.rpc.dispatcher raise Failure(result['ErrorDescription']) 2015-08-07 17:30:40.956 13318 ERROR oslo_messaging.rpc.dispatcher Failure: ['XENAPI_PLUGIN_FAILURE', 'upload_vhd', 'PluginError', 'Got Permanent Error response [404] while uploading image [7bc054ef-a379-4259-8369-e9a82d47d1ed] to glance host [192.168.33.1:9292]'] 2015-08-07 17:30:40.956 13318 ERROR oslo_messaging.rpc.dispatcher 
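The traceback above is the notable failure in this window: the XenAPI glance plugin's upload_vhd call got a permanent HTTP 404 back from the glance host at 192.168.33.1:9292, the retry wrapper made only a single attempt ("glance.upload_vhd attempt 1/1" implies CONF.glance.num_retries is 0 here), and the plugin error was re-raised as a XenAPI Failure that propagated up through snapshot() to the RPC dispatcher. The second traceback is the cleanup path hitting ImageNotFound, because the image record 7bc054ef-a379-4259-8369-e9a82d47d1ed was already gone when image_api.delete ran. Below is a minimal, self-contained sketch of that retry/unwrap flow; it is a simplified model under assumptions, not Nova's code, and the helper names (call_plugin_with_retry, fake_call_plugin) are stand-ins.

    # Simplified model of the "attempt N/M" retry wrapper seen in the log.
    # The plugin call is tried num_retries + 1 times; a plugin-side failure
    # is wrapped in a Failure and, once the attempts run out, propagates to
    # the caller exactly like the XENAPI_PLUGIN_FAILURE above.

    class Failure(Exception):
        """Stand-in for XenAPI.Failure: carries the ErrorDescription list."""
        def __init__(self, details):
            super(Failure, self).__init__(details)
            self.details = details


    def call_plugin_with_retry(call_plugin, plugin, fn, num_retries,
                               callback=None, **params):
        attempts = num_retries + 1
        last_exc = None
        for attempt in range(1, attempts + 1):
            cb_result = callback() if callback else None
            # Mirrors: "glance.upload_vhd attempt 1/1, callback_result: ..."
            print('%s.%s attempt %d/%d, callback_result: %s'
                  % (plugin, fn, attempt, attempts, cb_result))
            try:
                return call_plugin(plugin, fn, params)
            except Failure as exc:
                last_exc = exc
        raise last_exc


    def fake_call_plugin(plugin, fn, params):
        # Reproduces the permanent 404 reported by the real plugin.
        raise Failure(['XENAPI_PLUGIN_FAILURE', fn, 'PluginError',
                       'Got Permanent Error response [404] while uploading'])


    # With num_retries=0 this prints one "attempt 1/1" line and then raises,
    # matching the behaviour recorded above.
    # call_plugin_with_retry(fake_call_plugin, 'glance', 'upload_vhd',
    #                        num_retries=0, callback=lambda: '192.168.33.1')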
2015-08-07 17:30:41.587 DEBUG nova.virt.xenapi.vmops [req-46597f2b-e95c-41e3-b255-e14b8c0d4716 tempest-DeleteServersAdminTestJSON-513154376 tempest-DeleteServersAdminTestJSON-120490888] [instance: c87575af-66d8-459c-a310-79d46c0ace86] Using RAW or VHD, skipping kernel and ramdisk deletion _destroy_kernel_ramdisk /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1456 2015-08-07 17:30:41.604 DEBUG nova.virt.xenapi.vm_utils [req-46597f2b-e95c-41e3-b255-e14b8c0d4716 tempest-DeleteServersAdminTestJSON-513154376 tempest-DeleteServersAdminTestJSON-120490888] [instance: c87575af-66d8-459c-a310-79d46c0ace86] VM destroyed destroy_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:324 2015-08-07 17:30:41.605 DEBUG nova.compute.manager [req-46597f2b-e95c-41e3-b255-e14b8c0d4716 tempest-DeleteServersAdminTestJSON-513154376 tempest-DeleteServersAdminTestJSON-120490888] [instance: c87575af-66d8-459c-a310-79d46c0ace86] Deallocating network for instance _deallocate_network /opt/stack/new/nova/nova/compute/manager.py:1792 2015-08-07 17:30:41.938 DEBUG oslo_concurrency.lockutils [req-1275d071-a187-465d-8e31-0e89d853ecb9 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] Lock "8f22b72a-a408-4796-8637-4dedc84a367a" acquired by "do_terminate_instance" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:30:41.939 DEBUG oslo_concurrency.lockutils [req-1275d071-a187-465d-8e31-0e89d853ecb9 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] Lock "8f22b72a-a408-4796-8637-4dedc84a367a-events" acquired by "_clear_events" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:30:41.940 DEBUG oslo_concurrency.lockutils [req-1275d071-a187-465d-8e31-0e89d853ecb9 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] Lock "8f22b72a-a408-4796-8637-4dedc84a367a-events" released by "_clear_events" :: held 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:30:41.942 INFO nova.compute.manager [req-1275d071-a187-465d-8e31-0e89d853ecb9 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] [instance: 8f22b72a-a408-4796-8637-4dedc84a367a] Terminating instance 2015-08-07 17:30:41.944 INFO nova.virt.xenapi.vmops [req-1275d071-a187-465d-8e31-0e89d853ecb9 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] [instance: 8f22b72a-a408-4796-8637-4dedc84a367a] Destroying VM 2015-08-07 17:30:41.965 DEBUG nova.virt.xenapi.vm_utils [req-1275d071-a187-465d-8e31-0e89d853ecb9 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] [instance: 8f22b72a-a408-4796-8637-4dedc84a367a] Shutting down VM (hard) hard_shutdown_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:348 2015-08-07 17:30:43.522 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:30:43.552 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Auditing locally available compute resources for node localhost.localdomain 2015-08-07 17:30:43.552 DEBUG nova.virt.xenapi.host [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None 
None] Updating host stats update_status /opt/stack/new/nova/nova/virt/xenapi/host.py:234 2015-08-07 17:30:43.702 DEBUG nova.compute.manager [req-46597f2b-e95c-41e3-b255-e14b8c0d4716 tempest-DeleteServersAdminTestJSON-513154376 tempest-DeleteServersAdminTestJSON-120490888] [instance: c87575af-66d8-459c-a310-79d46c0ace86] terminating bdm BlockDeviceMapping(boot_index=0,connection_info=None,created_at=2015-08-07T17:29:32Z,delete_on_termination=True,deleted=False,deleted_at=None,destination_type='local',device_name='/dev/xvda',device_type='disk',disk_bus=None,guest_format=None,id=29,image_id='a69d8d55-1745-492f-be26-cc75d64fc94d',instance=,instance_uuid=c87575af-66d8-459c-a310-79d46c0ace86,no_device=False,snapshot_id=None,source_type='image',updated_at=2015-08-07T17:29:35Z,volume_id=None,volume_size=None) _cleanup_volumes /opt/stack/new/nova/nova/compute/manager.py:2272 2015-08-07 17:30:43.834 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" acquired by "do_scan" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:30:43.834 DEBUG nova.virt.xenapi.vm_utils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Scanning SR OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99 do_scan /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1839 2015-08-07 17:30:43.910 DEBUG oslo_concurrency.lockutils [req-46597f2b-e95c-41e3-b255-e14b8c0d4716 tempest-DeleteServersAdminTestJSON-513154376 tempest-DeleteServersAdminTestJSON-120490888] Lock "compute_resources" acquired by "update_usage" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:30:43.911 DEBUG nova.objects.instance [req-46597f2b-e95c-41e3-b255-e14b8c0d4716 tempest-DeleteServersAdminTestJSON-513154376 tempest-DeleteServersAdminTestJSON-120490888] Lazy-loading `numa_topology' on Instance uuid c87575af-66d8-459c-a310-79d46c0ace86 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:30:44.015 DEBUG oslo_concurrency.lockutils [req-46597f2b-e95c-41e3-b255-e14b8c0d4716 tempest-DeleteServersAdminTestJSON-513154376 tempest-DeleteServersAdminTestJSON-120490888] Lock "compute_resources" released by "update_usage" :: held 0.105s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:30:44.460 DEBUG oslo_concurrency.lockutils [req-46597f2b-e95c-41e3-b255-e14b8c0d4716 tempest-DeleteServersAdminTestJSON-513154376 tempest-DeleteServersAdminTestJSON-120490888] Lock "c87575af-66d8-459c-a310-79d46c0ace86" released by "do_terminate_instance" :: held 7.412s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:30:44.528 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" released by "do_scan" :: held 0.695s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:30:44.862 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor: free VCPUs: 0 _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:524 2015-08-07 17:30:44.862 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor: assignable PCI devices: [] _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:532 2015-08-07 17:30:44.863 
DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor/Node resource view: name=localhost.localdomain free_ram=855MB free_disk=16GB free_vcpus=0 pci_devices=[] _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:546 2015-08-07 17:30:44.864 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "compute_resources" acquired by "_update_available_resource" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:30:44.980 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.87 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:30:45.272 DEBUG nova.virt.xenapi.vmops [req-1275d071-a187-465d-8e31-0e89d853ecb9 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] [instance: 8f22b72a-a408-4796-8637-4dedc84a367a] Destroying VDIs _destroy_vdis /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1428 2015-08-07 17:30:45.289 DEBUG nova.virt.xenapi.vm_utils [req-1275d071-a187-465d-8e31-0e89d853ecb9 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] VDI 2ca886d4-79ce-42b8-81ac-87e6f29b62a0 is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:30:45.308 DEBUG nova.virt.xenapi.vm_utils [req-1275d071-a187-465d-8e31-0e89d853ecb9 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] VDI cb7e93a6-6ca3-4d68-9430-7a44dfe1896a is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:30:45.368 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Updating from migration 8e5ee78c-20e2-4483-ab75-b109bb2fdca6 2015-08-07 17:30:45.371 DEBUG nova.objects.instance [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lazy-loading `new_flavor' on Instance uuid 8e5ee78c-20e2-4483-ab75-b109bb2fdca6 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:30:45.609 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Total usable vcpus: 8, total allocated vcpus: 8 2015-08-07 17:30:45.610 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Final resource view: name=localhost.localdomain phys_ram=8187MB used_ram=784MB phys_disk=27GB used_disk=0GB total_vcpus=8 used_vcpus=8 pci_stats=None 2015-08-07 17:30:45.691 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Compute_service record updated for devstack:localhost.localdomain 2015-08-07 17:30:45.692 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "compute_resources" released by "_update_available_resource" :: held 0.828s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:30:45.692 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:30:45.693 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Cleaning up deleted instances _run_pending_deletes /opt/stack/new/nova/nova/compute/manager.py:6276 2015-08-07 17:30:45.937 DEBUG 
nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] There are 10 instances to clean _run_pending_deletes /opt/stack/new/nova/nova/compute/manager.py:6285 2015-08-07 17:30:45.939 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: c87575af-66d8-459c-a310-79d46c0ace86] Instance has had 0 of 5 cleanup attempts _run_pending_deletes /opt/stack/new/nova/nova/compute/manager.py:6293 2015-08-07 17:30:46.182 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: 7c6651a6-8cc4-4ac9-851c-637fafaf8705] Instance has had 0 of 5 cleanup attempts _run_pending_deletes /opt/stack/new/nova/nova/compute/manager.py:6293 2015-08-07 17:30:46.383 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: a820417c-8cc3-4f15-b639-5d258d9e6d1b] Instance has had 0 of 5 cleanup attempts _run_pending_deletes /opt/stack/new/nova/nova/compute/manager.py:6293 2015-08-07 17:30:46.805 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: 1cff7b75-3705-4f14-9e5a-546d1797c17f] Instance has had 0 of 5 cleanup attempts _run_pending_deletes /opt/stack/new/nova/nova/compute/manager.py:6293 2015-08-07 17:30:46.815 DEBUG nova.virt.xenapi.vmops [req-1275d071-a187-465d-8e31-0e89d853ecb9 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] [instance: 8f22b72a-a408-4796-8637-4dedc84a367a] Using RAW or VHD, skipping kernel and ramdisk deletion _destroy_kernel_ramdisk /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1456 2015-08-07 17:30:46.837 DEBUG nova.virt.xenapi.vm_utils [req-1275d071-a187-465d-8e31-0e89d853ecb9 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] [instance: 8f22b72a-a408-4796-8637-4dedc84a367a] VM destroyed destroy_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:324 2015-08-07 17:30:46.851 DEBUG nova.compute.manager [req-1275d071-a187-465d-8e31-0e89d853ecb9 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] [instance: 8f22b72a-a408-4796-8637-4dedc84a367a] Deallocating network for instance _deallocate_network /opt/stack/new/nova/nova/compute/manager.py:1792 2015-08-07 17:30:47.033 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: c5d13a83-4e8d-4d99-9630-b5219ac62190] Instance has had 0 of 5 cleanup attempts _run_pending_deletes /opt/stack/new/nova/nova/compute/manager.py:6293 2015-08-07 17:30:47.233 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: 790f2093-5329-49fd-a0c7-ab1fe4c523c9] Instance has had 0 of 5 cleanup attempts _run_pending_deletes /opt/stack/new/nova/nova/compute/manager.py:6293 2015-08-07 17:30:47.457 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: 18c3ad3b-8cd4-4e41-b278-93a63e82aac4] Instance has had 0 of 5 cleanup attempts _run_pending_deletes /opt/stack/new/nova/nova/compute/manager.py:6293 2015-08-07 17:30:47.704 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: 1da4a346-da42-46e7-81c1-b0085c1ca90a] Instance has had 0 of 5 cleanup attempts _run_pending_deletes /opt/stack/new/nova/nova/compute/manager.py:6293 2015-08-07 17:30:47.966 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: 448f07ac-11c1-4844-84f7-c887efb5826a] Instance has had 0 of 5 cleanup attempts _run_pending_deletes 
/opt/stack/new/nova/nova/compute/manager.py:6293 2015-08-07 17:30:48.228 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: 2d94b230-ee5f-44bb-9ce8-17e52b082de7] Instance has had 0 of 5 cleanup attempts _run_pending_deletes /opt/stack/new/nova/nova/compute/manager.py:6293 2015-08-07 17:30:48.489 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 5.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:30:48.998 DEBUG nova.compute.manager [req-1275d071-a187-465d-8e31-0e89d853ecb9 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] [instance: 8f22b72a-a408-4796-8637-4dedc84a367a] terminating bdm BlockDeviceMapping(boot_index=0,connection_info=None,created_at=2015-08-07T17:29:36Z,delete_on_termination=True,deleted=False,deleted_at=None,destination_type='local',device_name='/dev/xvda',device_type='disk',disk_bus=None,guest_format=None,id=30,image_id='a69d8d55-1745-492f-be26-cc75d64fc94d',instance=,instance_uuid=8f22b72a-a408-4796-8637-4dedc84a367a,no_device=False,snapshot_id=None,source_type='image',updated_at=2015-08-07T17:29:39Z,volume_id=None,volume_size=None) _cleanup_volumes /opt/stack/new/nova/nova/compute/manager.py:2272 2015-08-07 17:30:49.231 DEBUG oslo_concurrency.lockutils [req-1275d071-a187-465d-8e31-0e89d853ecb9 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] Lock "compute_resources" acquired by "update_usage" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:30:49.232 DEBUG nova.objects.instance [req-1275d071-a187-465d-8e31-0e89d853ecb9 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] Lazy-loading `numa_topology' on Instance uuid 8f22b72a-a408-4796-8637-4dedc84a367a obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:30:49.321 DEBUG oslo_concurrency.lockutils [req-1275d071-a187-465d-8e31-0e89d853ecb9 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] Lock "compute_resources" released by "update_usage" :: held 0.090s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:30:49.659 DEBUG oslo_concurrency.lockutils [req-1275d071-a187-465d-8e31-0e89d853ecb9 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] Lock "8f22b72a-a408-4796-8637-4dedc84a367a" released by "do_terminate_instance" :: held 7.721s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:30:51.827 DEBUG oslo_concurrency.lockutils [req-ed0439b7-b975-469f-bb97-29d3052a393a tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] Lock "e4c599da-3cac-4bfb-a1b5-2b8828fdb0b9" acquired by "_locked_do_build_and_run_instance" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:30:51.910 INFO nova.compute.manager [req-ed0439b7-b975-469f-bb97-29d3052a393a tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] [instance: e4c599da-3cac-4bfb-a1b5-2b8828fdb0b9] Starting instance... 
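Almost every operation in this window is bracketed by oslo.concurrency lock messages: "Lock "..." acquired by "..." :: waited ..." and "released ... :: held ..." around do_terminate_instance and _clear_events (keyed by instance UUID), update_usage and instance_claim (keyed by "compute_resources"), and do_scan (keyed per SR as "sr-scan-OpaqueRef:..."). Those DEBUG lines come from the lockutils wrapper around the guarded function. The snippet below is only an illustration of the two usual forms of that API; the bodies are placeholders, and the decorator Nova actually uses may be a thin wrapper over this.

    from oslo_concurrency import lockutils

    # Decorator form: all callers sharing the lock name run serially,
    # producing the acquired/released "waited"/"held" DEBUG lines.
    @lockutils.synchronized('compute_resources')
    def update_usage(context, instance):
        pass  # placeholder body, held under the "compute_resources" lock

    # Context-manager form, as used for per-SR scan serialization.
    def scan_sr(session, sr_ref):
        with lockutils.lock('sr-scan-%s' % sr_ref):
            pass  # placeholder for the actual SR.scan call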
2015-08-07 17:30:52.145 DEBUG oslo_concurrency.lockutils [req-d0ce73c5-ea8b-4f23-acc9-1ee556eb08e9 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "d28988ae-6848-475f-b43e-bc63166c3c1e" acquired by "_locked_do_build_and_run_instance" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:30:52.181 DEBUG oslo_concurrency.lockutils [req-ed0439b7-b975-469f-bb97-29d3052a393a tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] Lock "compute_resources" acquired by "instance_claim" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:30:52.182 DEBUG nova.compute.resource_tracker [req-ed0439b7-b975-469f-bb97-29d3052a393a tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] Memory overhead for 64 MB instance; 5 MB instance_claim /opt/stack/new/nova/nova/compute/resource_tracker.py:126 2015-08-07 17:30:52.190 INFO nova.compute.claims [req-ed0439b7-b975-469f-bb97-29d3052a393a tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] [instance: e4c599da-3cac-4bfb-a1b5-2b8828fdb0b9] Attempting claim: memory 69 MB, disk 0 GB 2015-08-07 17:30:52.191 INFO nova.compute.claims [req-ed0439b7-b975-469f-bb97-29d3052a393a tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] [instance: e4c599da-3cac-4bfb-a1b5-2b8828fdb0b9] Total memory: 8187 MB, used: 715.00 MB 2015-08-07 17:30:52.191 INFO nova.compute.claims [req-ed0439b7-b975-469f-bb97-29d3052a393a tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] [instance: e4c599da-3cac-4bfb-a1b5-2b8828fdb0b9] memory limit: 12280.50 MB, free: 11565.50 MB 2015-08-07 17:30:52.192 INFO nova.compute.claims [req-ed0439b7-b975-469f-bb97-29d3052a393a tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] [instance: e4c599da-3cac-4bfb-a1b5-2b8828fdb0b9] Total disk: 27 GB, used: 0.00 GB 2015-08-07 17:30:52.192 INFO nova.compute.claims [req-ed0439b7-b975-469f-bb97-29d3052a393a tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] [instance: e4c599da-3cac-4bfb-a1b5-2b8828fdb0b9] disk limit not specified, defaulting to unlimited 2015-08-07 17:30:52.212 INFO nova.compute.manager [req-d0ce73c5-ea8b-4f23-acc9-1ee556eb08e9 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: d28988ae-6848-475f-b43e-bc63166c3c1e] Starting instance... 
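The claim lines for instance e4c599da above are plain arithmetic and can be checked directly: the 64 MB flavor plus the reported 5 MB overhead gives the 69 MB claim; the 12280.50 MB memory limit is consistent with the 8187 MB total multiplied by a ram_allocation_ratio of 1.5 (an assumption, the ratio is not printed in this log, but the numbers match that value); free memory is the limit minus current usage; and the d28988ae claim that follows sees usage bumped by the first claim (715 MB to 784 MB). A short worked check:

    # Reproducing the claim arithmetic from the log.  The 1.5
    # ram_allocation_ratio is an assumption; it is not printed here,
    # but 8187 * 1.5 == 12280.5 matches the reported limit.
    total_ram_mb = 8187.0
    used_ram_mb = 715.0            # before the e4c599da claim
    flavor_ram_mb = 64
    overhead_mb = 5                # "Memory overhead for 64 MB instance; 5 MB"
    ram_allocation_ratio = 1.5     # assumed

    claim_mb = flavor_ram_mb + overhead_mb            # 69 MB claimed
    limit_mb = total_ram_mb * ram_allocation_ratio    # 12280.5 MB limit
    free_mb = limit_mb - used_ram_mb                  # 11565.5 MB free

    assert claim_mb == 69
    assert limit_mb == 12280.5
    assert free_mb == 11565.5

    # The next claim (d28988ae) starts from the updated usage:
    used_after_mb = used_ram_mb + claim_mb            # 784.0 MB used
    assert limit_mb - used_after_mb == 11496.5        # free seen by d28988ae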
2015-08-07 17:30:52.234 DEBUG nova.compute.resources.vcpu [req-ed0439b7-b975-469f-bb97-29d3052a393a tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] Total CPUs: 8 VCPUs, used: 2.00 VCPUs test /opt/stack/new/nova/nova/compute/resources/vcpu.py:52 2015-08-07 17:30:52.235 DEBUG nova.compute.resources.vcpu [req-ed0439b7-b975-469f-bb97-29d3052a393a tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] CPUs limit not specified, defaulting to unlimited test /opt/stack/new/nova/nova/compute/resources/vcpu.py:56 2015-08-07 17:30:52.235 INFO nova.compute.claims [req-ed0439b7-b975-469f-bb97-29d3052a393a tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] [instance: e4c599da-3cac-4bfb-a1b5-2b8828fdb0b9] Claim successful 2015-08-07 17:30:52.591 DEBUG oslo_concurrency.lockutils [req-ed0439b7-b975-469f-bb97-29d3052a393a tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] Lock "compute_resources" released by "instance_claim" :: held 0.410s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:30:52.599 DEBUG oslo_concurrency.lockutils [req-d0ce73c5-ea8b-4f23-acc9-1ee556eb08e9 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "compute_resources" acquired by "instance_claim" :: waited 0.138s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:30:52.599 DEBUG nova.compute.resource_tracker [req-d0ce73c5-ea8b-4f23-acc9-1ee556eb08e9 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Memory overhead for 64 MB instance; 5 MB instance_claim /opt/stack/new/nova/nova/compute/resource_tracker.py:126 2015-08-07 17:30:52.608 INFO nova.compute.claims [req-d0ce73c5-ea8b-4f23-acc9-1ee556eb08e9 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: d28988ae-6848-475f-b43e-bc63166c3c1e] Attempting claim: memory 69 MB, disk 0 GB 2015-08-07 17:30:52.609 INFO nova.compute.claims [req-d0ce73c5-ea8b-4f23-acc9-1ee556eb08e9 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: d28988ae-6848-475f-b43e-bc63166c3c1e] Total memory: 8187 MB, used: 784.00 MB 2015-08-07 17:30:52.609 INFO nova.compute.claims [req-d0ce73c5-ea8b-4f23-acc9-1ee556eb08e9 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: d28988ae-6848-475f-b43e-bc63166c3c1e] memory limit: 12280.50 MB, free: 11496.50 MB 2015-08-07 17:30:52.609 INFO nova.compute.claims [req-d0ce73c5-ea8b-4f23-acc9-1ee556eb08e9 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: d28988ae-6848-475f-b43e-bc63166c3c1e] Total disk: 27 GB, used: 0.00 GB 2015-08-07 17:30:52.611 INFO nova.compute.claims [req-d0ce73c5-ea8b-4f23-acc9-1ee556eb08e9 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: d28988ae-6848-475f-b43e-bc63166c3c1e] disk limit not specified, defaulting to unlimited 2015-08-07 17:30:52.634 DEBUG nova.compute.resources.vcpu [req-d0ce73c5-ea8b-4f23-acc9-1ee556eb08e9 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Total CPUs: 8 VCPUs, used: 3.00 VCPUs test /opt/stack/new/nova/nova/compute/resources/vcpu.py:52 2015-08-07 17:30:52.635 DEBUG nova.compute.resources.vcpu [req-d0ce73c5-ea8b-4f23-acc9-1ee556eb08e9 
tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] CPUs limit not specified, defaulting to unlimited test /opt/stack/new/nova/nova/compute/resources/vcpu.py:56 2015-08-07 17:30:52.638 INFO nova.compute.claims [req-d0ce73c5-ea8b-4f23-acc9-1ee556eb08e9 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: d28988ae-6848-475f-b43e-bc63166c3c1e] Claim successful 2015-08-07 17:30:52.986 DEBUG oslo_concurrency.lockutils [req-d0ce73c5-ea8b-4f23-acc9-1ee556eb08e9 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "compute_resources" released by "instance_claim" :: held 0.387s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:30:52.993 DEBUG oslo_concurrency.lockutils [req-ed0439b7-b975-469f-bb97-29d3052a393a tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] Lock "compute_resources" acquired by "update_usage" :: waited 0.163s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:30:53.086 DEBUG oslo_concurrency.lockutils [req-ed0439b7-b975-469f-bb97-29d3052a393a tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] Lock "compute_resources" released by "update_usage" :: held 0.093s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:30:53.087 DEBUG nova.compute.utils [req-ed0439b7-b975-469f-bb97-29d3052a393a tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] Using /dev/xvd instead of None get_next_device_name /opt/stack/new/nova/nova/compute/utils.py:163 2015-08-07 17:30:53.092 13318 DEBUG nova.compute.manager [-] [instance: e4c599da-3cac-4bfb-a1b5-2b8828fdb0b9] Allocating IP information in the background. _allocate_network_async /opt/stack/new/nova/nova/compute/manager.py:1513 2015-08-07 17:30:53.093 13318 DEBUG oslo_concurrency.lockutils [-] Acquired semaphore "refresh_cache-e4c599da-3cac-4bfb-a1b5-2b8828fdb0b9" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:197 2015-08-07 17:30:53.257 DEBUG oslo_concurrency.lockutils [req-d0ce73c5-ea8b-4f23-acc9-1ee556eb08e9 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "compute_resources" acquired by "update_usage" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:30:53.364 DEBUG oslo_concurrency.lockutils [req-d0ce73c5-ea8b-4f23-acc9-1ee556eb08e9 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "compute_resources" released by "update_usage" :: held 0.107s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:30:53.366 DEBUG nova.compute.utils [req-d0ce73c5-ea8b-4f23-acc9-1ee556eb08e9 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Using /dev/xvd instead of None get_next_device_name /opt/stack/new/nova/nova/compute/utils.py:163 2015-08-07 17:30:53.370 13318 DEBUG nova.compute.manager [-] [instance: d28988ae-6848-475f-b43e-bc63166c3c1e] Allocating IP information in the background. 
_allocate_network_async /opt/stack/new/nova/nova/compute/manager.py:1513 2015-08-07 17:30:53.371 13318 DEBUG oslo_concurrency.lockutils [-] Acquired semaphore "refresh_cache-d28988ae-6848-475f-b43e-bc63166c3c1e" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:197 2015-08-07 17:30:53.488 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:30:53.489 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 35.03 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:30:53.793 DEBUG nova.virt.xenapi.vmops [req-ed0439b7-b975-469f-bb97-29d3052a393a tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] [instance: e4c599da-3cac-4bfb-a1b5-2b8828fdb0b9] Block device information present: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} spawn /opt/stack/new/nova/nova/virt/xenapi/vmops.py:408 2015-08-07 17:30:53.823 DEBUG nova.virt.xenapi.vm_utils [req-ed0439b7-b975-469f-bb97-29d3052a393a tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] Detected vhd format for image a69d8d55-1745-492f-be26-cc75d64fc94d determine_disk_image_type /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1633 2015-08-07 17:30:53.824 DEBUG nova.virt.xenapi.vmops [req-ed0439b7-b975-469f-bb97-29d3052a393a tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] [instance: e4c599da-3cac-4bfb-a1b5-2b8828fdb0b9] Updating progress to 10 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:30:54.076 DEBUG nova.virt.xenapi.vmops [req-d0ce73c5-ea8b-4f23-acc9-1ee556eb08e9 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: d28988ae-6848-475f-b43e-bc63166c3c1e] Block device information present: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} spawn /opt/stack/new/nova/nova/virt/xenapi/vmops.py:408 2015-08-07 17:30:54.116 DEBUG nova.virt.xenapi.vm_utils [req-d0ce73c5-ea8b-4f23-acc9-1ee556eb08e9 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Detected vhd format for image a69d8d55-1745-492f-be26-cc75d64fc94d determine_disk_image_type /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1633 2015-08-07 17:30:54.117 DEBUG nova.virt.xenapi.vmops [req-d0ce73c5-ea8b-4f23-acc9-1ee556eb08e9 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: d28988ae-6848-475f-b43e-bc63166c3c1e] Updating progress to 10 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:30:54.131 DEBUG nova.virt.xenapi.vmops [req-ed0439b7-b975-469f-bb97-29d3052a393a tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] [instance: e4c599da-3cac-4bfb-a1b5-2b8828fdb0b9] block device info: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} _connect_cinder_volumes /opt/stack/new/nova/nova/virt/xenapi/vmops.py:376 2015-08-07 17:30:54.145 DEBUG oslo_concurrency.lockutils [req-ed0439b7-b975-469f-bb97-29d3052a393a 
tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" acquired by "_create_cached_image_impl" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:30:54.390 DEBUG nova.virt.xenapi.vmops [req-d0ce73c5-ea8b-4f23-acc9-1ee556eb08e9 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: d28988ae-6848-475f-b43e-bc63166c3c1e] block device info: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} _connect_cinder_volumes /opt/stack/new/nova/nova/virt/xenapi/vmops.py:376 2015-08-07 17:30:54.927 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.92 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:30:55.416 DEBUG nova.virt.xenapi.vm_utils [req-ed0439b7-b975-469f-bb97-29d3052a393a tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] Cloned VDI OpaqueRef:ae25a0fa-39a4-f75a-2701-9bb196574031 from VDI OpaqueRef:f392c64e-7886-a0cc-d3a0-780674167307 _clone_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:571 2015-08-07 17:30:56.204 DEBUG oslo_concurrency.lockutils [req-ed0439b7-b975-469f-bb97-29d3052a393a tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" released by "_create_cached_image_impl" :: held 2.058s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:30:56.205 INFO nova.virt.xenapi.vm_utils [req-ed0439b7-b975-469f-bb97-29d3052a393a tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] Image creation data, cacheable: True, downloaded: False duration: 2.07 secs for image a69d8d55-1745-492f-be26-cc75d64fc94d 2015-08-07 17:30:56.206 DEBUG oslo_concurrency.lockutils [req-d0ce73c5-ea8b-4f23-acc9-1ee556eb08e9 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" acquired by "_create_cached_image_impl" :: waited 1.800s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:30:56.319 13318 DEBUG nova.network.base_api [-] [instance: e4c599da-3cac-4bfb-a1b5-2b8828fdb0b9] Updating instance_info_cache with network_info: [VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.2'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:7c:73:1d', 
'active': False, 'type': u'bridge', 'id': u'e05951b7-8681-4381-9c7e-59c4555d5ef5', 'qbg_params': None})] update_instance_cache_with_nw_info /opt/stack/new/nova/nova/network/base_api.py:43 2015-08-07 17:30:56.351 13318 DEBUG oslo_concurrency.lockutils [-] Releasing semaphore "refresh_cache-e4c599da-3cac-4bfb-a1b5-2b8828fdb0b9" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:210 2015-08-07 17:30:56.352 13318 DEBUG nova.compute.manager [-] [instance: e4c599da-3cac-4bfb-a1b5-2b8828fdb0b9] Instance network_info: |[VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.2'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:7c:73:1d', 'active': False, 'type': u'bridge', 'id': u'e05951b7-8681-4381-9c7e-59c4555d5ef5', 'qbg_params': None})]| _allocate_network_async /opt/stack/new/nova/nova/compute/manager.py:1531 2015-08-07 17:30:57.009 13318 DEBUG nova.network.base_api [-] [instance: d28988ae-6848-475f-b43e-bc63166c3c1e] Updating instance_info_cache with network_info: [VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.4'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:bd:73:0c', 'active': False, 'type': u'bridge', 'id': u'ffc5efc9-f0e3-42ef-8a41-b8489447d370', 'qbg_params': None})] update_instance_cache_with_nw_info /opt/stack/new/nova/nova/network/base_api.py:43 2015-08-07 17:30:57.038 13318 DEBUG oslo_concurrency.lockutils [-] Releasing semaphore "refresh_cache-d28988ae-6848-475f-b43e-bc63166c3c1e" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:210 2015-08-07 17:30:57.038 13318 DEBUG nova.compute.manager [-] [instance: d28988ae-6848-475f-b43e-bc63166c3c1e] Instance network_info: |[VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 
'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.4'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:bd:73:0c', 'active': False, 'type': u'bridge', 'id': u'ffc5efc9-f0e3-42ef-8a41-b8489447d370', 'qbg_params': None})]| _allocate_network_async /opt/stack/new/nova/nova/compute/manager.py:1531 2015-08-07 17:30:57.816 DEBUG nova.virt.xenapi.vm_utils [req-d0ce73c5-ea8b-4f23-acc9-1ee556eb08e9 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Cloned VDI OpaqueRef:292ab048-f6d3-f670-7868-3490d78d3cd0 from VDI OpaqueRef:f392c64e-7886-a0cc-d3a0-780674167307 _clone_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:571 2015-08-07 17:30:57.999 DEBUG nova.virt.xenapi.vmops [req-ed0439b7-b975-469f-bb97-29d3052a393a tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] [instance: e4c599da-3cac-4bfb-a1b5-2b8828fdb0b9] Updating progress to 20 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:30:58.270 DEBUG nova.virt.xenapi.vmops [req-ed0439b7-b975-469f-bb97-29d3052a393a tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] [instance: e4c599da-3cac-4bfb-a1b5-2b8828fdb0b9] Updating progress to 30 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:30:58.494 DEBUG nova.virt.xenapi.vmops [req-ed0439b7-b975-469f-bb97-29d3052a393a tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] [instance: e4c599da-3cac-4bfb-a1b5-2b8828fdb0b9] Using PV kernel: True _create_vm_record /opt/stack/new/nova/nova/virt/xenapi/vmops.py:679 2015-08-07 17:30:58.508 DEBUG nova.virt.xenapi.vm_utils [req-ed0439b7-b975-469f-bb97-29d3052a393a tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] [instance: e4c599da-3cac-4bfb-a1b5-2b8828fdb0b9] Created VM create_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:312 2015-08-07 17:30:58.509 DEBUG nova.virt.xenapi.vmops [req-ed0439b7-b975-469f-bb97-29d3052a393a tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] [instance: e4c599da-3cac-4bfb-a1b5-2b8828fdb0b9] Updating progress to 40 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:30:58.544 DEBUG oslo_concurrency.lockutils [req-d0ce73c5-ea8b-4f23-acc9-1ee556eb08e9 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" released by "_create_cached_image_impl" :: held 2.338s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:30:58.545 INFO nova.virt.xenapi.vm_utils 
[req-d0ce73c5-ea8b-4f23-acc9-1ee556eb08e9 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Image creation data, cacheable: True, downloaded: False duration: 4.15 secs for image a69d8d55-1745-492f-be26-cc75d64fc94d 2015-08-07 17:30:58.744 DEBUG nova.virt.xenapi.vm_utils [req-ed0439b7-b975-469f-bb97-29d3052a393a tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] Creating disk-type VBD for VM OpaqueRef:102d80fb-e858-e7cb-15ef-8140311f8cd2, VDI OpaqueRef:ae25a0fa-39a4-f75a-2701-9bb196574031 ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:30:58.755 DEBUG nova.virt.xenapi.vm_utils [req-ed0439b7-b975-469f-bb97-29d3052a393a tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] Created VBD OpaqueRef:f4a73e33-c691-1e96-5cd7-cef006363b58 for VM OpaqueRef:102d80fb-e858-e7cb-15ef-8140311f8cd2, VDI OpaqueRef:ae25a0fa-39a4-f75a-2701-9bb196574031. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:30:59.157 DEBUG nova.virt.xenapi.vm_utils [req-ed0439b7-b975-469f-bb97-29d3052a393a tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] Created VDI OpaqueRef:4cb5ead0-c168-08ed-fa8f-fd7877e1f25b (config-2, 67108864, False) on OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99. create_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:509 2015-08-07 17:30:59.161 DEBUG nova.virt.xenapi.vm_utils [req-ed0439b7-b975-469f-bb97-29d3052a393a tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] Creating disk-type VBD for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:4cb5ead0-c168-08ed-fa8f-fd7877e1f25b ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:30:59.176 DEBUG nova.virt.xenapi.vm_utils [req-ed0439b7-b975-469f-bb97-29d3052a393a tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] Created VBD OpaqueRef:323db0a2-7fea-799e-094d-0a239ed963dc for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:4cb5ead0-c168-08ed-fa8f-fd7877e1f25b. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:30:59.176 DEBUG nova.virt.xenapi.vm_utils [req-ed0439b7-b975-469f-bb97-29d3052a393a tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] Plugging VBD OpaqueRef:323db0a2-7fea-799e-094d-0a239ed963dc ... 
vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2191 2015-08-07 17:30:59.177 DEBUG oslo_concurrency.lockutils [req-ed0439b7-b975-469f-bb97-29d3052a393a tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_plug" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:30:59.335 DEBUG nova.virt.xenapi.vmops [req-d0ce73c5-ea8b-4f23-acc9-1ee556eb08e9 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: d28988ae-6848-475f-b43e-bc63166c3c1e] Updating progress to 20 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:30:59.627 DEBUG nova.virt.xenapi.vmops [req-d0ce73c5-ea8b-4f23-acc9-1ee556eb08e9 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: d28988ae-6848-475f-b43e-bc63166c3c1e] Updating progress to 30 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:30:59.841 DEBUG nova.virt.xenapi.vmops [req-d0ce73c5-ea8b-4f23-acc9-1ee556eb08e9 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: d28988ae-6848-475f-b43e-bc63166c3c1e] Using PV kernel: True _create_vm_record /opt/stack/new/nova/nova/virt/xenapi/vmops.py:679 2015-08-07 17:30:59.858 DEBUG nova.virt.xenapi.vm_utils [req-d0ce73c5-ea8b-4f23-acc9-1ee556eb08e9 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: d28988ae-6848-475f-b43e-bc63166c3c1e] Created VM create_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:312 2015-08-07 17:30:59.859 DEBUG nova.virt.xenapi.vmops [req-d0ce73c5-ea8b-4f23-acc9-1ee556eb08e9 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: d28988ae-6848-475f-b43e-bc63166c3c1e] Updating progress to 40 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:31:00.120 DEBUG nova.virt.xenapi.vm_utils [req-d0ce73c5-ea8b-4f23-acc9-1ee556eb08e9 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Creating disk-type VBD for VM OpaqueRef:7a631735-f2b8-e994-ba05-c7a9d0e48afc, VDI OpaqueRef:292ab048-f6d3-f670-7868-3490d78d3cd0 ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:31:00.130 DEBUG nova.virt.xenapi.vm_utils [req-d0ce73c5-ea8b-4f23-acc9-1ee556eb08e9 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Created VBD OpaqueRef:b78fd9d0-026f-3fec-55e8-a8a6bcd25ebc for VM OpaqueRef:7a631735-f2b8-e994-ba05-c7a9d0e48afc, VDI OpaqueRef:292ab048-f6d3-f670-7868-3490d78d3cd0. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:31:00.540 DEBUG nova.virt.xenapi.vm_utils [req-d0ce73c5-ea8b-4f23-acc9-1ee556eb08e9 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Created VDI OpaqueRef:eeb3ca08-94f3-f5d7-aec1-7b6d89183d4d (config-2, 67108864, False) on OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99. create_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:509 2015-08-07 17:31:00.544 DEBUG nova.virt.xenapi.vm_utils [req-d0ce73c5-ea8b-4f23-acc9-1ee556eb08e9 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Creating disk-type VBD for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:eeb3ca08-94f3-f5d7-aec1-7b6d89183d4d ... 
create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:31:00.556 DEBUG nova.virt.xenapi.vm_utils [req-d0ce73c5-ea8b-4f23-acc9-1ee556eb08e9 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Created VBD OpaqueRef:664fceb7-063f-7b78-5d24-4c7c5f6634b1 for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:eeb3ca08-94f3-f5d7-aec1-7b6d89183d4d. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:31:00.557 DEBUG nova.virt.xenapi.vm_utils [req-d0ce73c5-ea8b-4f23-acc9-1ee556eb08e9 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Plugging VBD OpaqueRef:664fceb7-063f-7b78-5d24-4c7c5f6634b1 ... vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2191 2015-08-07 17:31:00.579 DEBUG oslo_concurrency.lockutils [req-ed0439b7-b975-469f-bb97-29d3052a393a tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_plug" :: held 1.402s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:31:00.580 DEBUG nova.virt.xenapi.vm_utils [req-ed0439b7-b975-469f-bb97-29d3052a393a tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] Plugging VBD OpaqueRef:323db0a2-7fea-799e-094d-0a239ed963dc done. vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2194 2015-08-07 17:31:00.581 DEBUG oslo_concurrency.lockutils [req-d0ce73c5-ea8b-4f23-acc9-1ee556eb08e9 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_plug" :: waited 0.023s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:31:00.584 DEBUG nova.virt.xenapi.vm_utils [req-ed0439b7-b975-469f-bb97-29d3052a393a tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] VBD OpaqueRef:323db0a2-7fea-799e-094d-0a239ed963dc plugged as xvdc vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2197 2015-08-07 17:31:00.677 WARNING nova.virt.configdrive [req-ed0439b7-b975-469f-bb97-29d3052a393a tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] The setting "always" will be deprecated in the Liberty version. 
Please use "True" instead 2015-08-07 17:31:00.678 DEBUG nova.objects.instance [req-ed0439b7-b975-469f-bb97-29d3052a393a tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] Lazy-loading `ec2_ids' on Instance uuid e4c599da-3cac-4bfb-a1b5-2b8828fdb0b9 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:31:00.722 DEBUG oslo_concurrency.processutils [req-ed0439b7-b975-469f-bb97-29d3052a393a tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] Running cmd (subprocess): genisoimage -o /tmp/tmpRnPcuN/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmpdraFff execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:31:00.833 DEBUG oslo_concurrency.processutils [req-ed0439b7-b975-469f-bb97-29d3052a393a tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] CMD "genisoimage -o /tmp/tmpRnPcuN/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmpdraFff" returned: 0 in 0.111s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:31:00.839 DEBUG oslo_concurrency.processutils [req-ed0439b7-b975-469f-bb97-29d3052a393a tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmpRnPcuN/configdrive of=/dev/xvdc oflag=direct,sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:31:03.048 DEBUG oslo_concurrency.lockutils [req-d0ce73c5-ea8b-4f23-acc9-1ee556eb08e9 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_plug" :: held 2.468s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:31:03.051 DEBUG nova.virt.xenapi.vm_utils [req-d0ce73c5-ea8b-4f23-acc9-1ee556eb08e9 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Plugging VBD OpaqueRef:664fceb7-063f-7b78-5d24-4c7c5f6634b1 done. vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2194 2015-08-07 17:31:03.056 DEBUG nova.virt.xenapi.vm_utils [req-d0ce73c5-ea8b-4f23-acc9-1ee556eb08e9 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] VBD OpaqueRef:664fceb7-063f-7b78-5d24-4c7c5f6634b1 plugged as xvdd vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2197 2015-08-07 17:31:03.160 WARNING nova.virt.configdrive [req-d0ce73c5-ea8b-4f23-acc9-1ee556eb08e9 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] The setting "always" will be deprecated in the Liberty version. 
Please use "True" instead 2015-08-07 17:31:03.161 DEBUG nova.objects.instance [req-d0ce73c5-ea8b-4f23-acc9-1ee556eb08e9 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lazy-loading `ec2_ids' on Instance uuid d28988ae-6848-475f-b43e-bc63166c3c1e obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:31:03.203 DEBUG oslo_concurrency.processutils [req-d0ce73c5-ea8b-4f23-acc9-1ee556eb08e9 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Running cmd (subprocess): genisoimage -o /tmp/tmpG7_bmK/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmpUuURb4 execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:31:03.332 DEBUG oslo_concurrency.processutils [req-d0ce73c5-ea8b-4f23-acc9-1ee556eb08e9 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] CMD "genisoimage -o /tmp/tmpG7_bmK/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmpUuURb4" returned: 0 in 0.128s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:31:03.338 DEBUG oslo_concurrency.processutils [req-d0ce73c5-ea8b-4f23-acc9-1ee556eb08e9 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmpG7_bmK/configdrive of=/dev/xvdd oflag=direct,sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:31:05.184 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.67 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:31:09.463 DEBUG nova.virt.xenapi.vm_utils [req-9c553d3d-492d-47a0-966f-8fac0ea40f0a tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] Parent has other children, coalesce is unlikely. 
_wait_for_vhd_coalesce /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2092 2015-08-07 17:31:09.474 DEBUG oslo_concurrency.lockutils [req-9c553d3d-492d-47a0-966f-8fac0ea40f0a tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" acquired by "do_scan" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:31:09.475 DEBUG nova.virt.xenapi.vm_utils [req-9c553d3d-492d-47a0-966f-8fac0ea40f0a tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Scanning SR OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99 do_scan /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1839 2015-08-07 17:31:10.281 DEBUG oslo_concurrency.lockutils [req-9c553d3d-492d-47a0-966f-8fac0ea40f0a tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" released by "do_scan" :: held 0.807s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:31:10.291 DEBUG nova.virt.xenapi.vm_utils [req-9c553d3d-492d-47a0-966f-8fac0ea40f0a tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] VHD 29486dfb-6f0b-49e0-a82e-b9237fae4c51 has parent 1b633f24-6022-4170-b119-2c4526160faa _get_vhd_parent_uuid /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2006 2015-08-07 17:31:10.302 DEBUG nova.virt.xenapi.vm_utils [req-9c553d3d-492d-47a0-966f-8fac0ea40f0a tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] VHD 1b633f24-6022-4170-b119-2c4526160faa has parent 4027f457-a9bb-499a-8844-79fc67f11377 _get_vhd_parent_uuid /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2006 2015-08-07 17:31:10.312 DEBUG nova.virt.xenapi.vmops [req-9c553d3d-492d-47a0-966f-8fac0ea40f0a tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] Updating progress to 20 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:31:10.650 DEBUG nova.virt.xenapi.vm_utils [req-9c553d3d-492d-47a0-966f-8fac0ea40f0a tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] Migrating VHD '1b633f24-6022-4170-b119-2c4526160faa' with seq_num 1 migrate_vhd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2559 2015-08-07 17:31:13.335 DEBUG oslo_concurrency.processutils [req-ed0439b7-b975-469f-bb97-29d3052a393a tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmpRnPcuN/configdrive of=/dev/xvdc oflag=direct,sync" returned: 0 in 12.496s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:31:13.337 DEBUG oslo_concurrency.processutils [req-ed0439b7-b975-469f-bb97-29d3052a393a tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:31:14.567 DEBUG oslo_concurrency.processutils [req-ed0439b7-b975-469f-bb97-29d3052a393a tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] CMD "sudo nova-rootwrap 
/etc/nova/rootwrap.conf sync" returned: 0 in 1.230s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:31:14.573 DEBUG nova.virt.xenapi.vm_utils [req-ed0439b7-b975-469f-bb97-29d3052a393a tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] Destroying VBD for VDI OpaqueRef:4cb5ead0-c168-08ed-fa8f-fd7877e1f25b ... vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2207 2015-08-07 17:31:14.575 DEBUG oslo_concurrency.lockutils [req-ed0439b7-b975-469f-bb97-29d3052a393a tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_unplug" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:31:14.832 DEBUG nova.virt.xenapi.vm_utils [req-9c553d3d-492d-47a0-966f-8fac0ea40f0a tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] Migrating VHD '4027f457-a9bb-499a-8844-79fc67f11377' with seq_num 2 migrate_vhd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2559 2015-08-07 17:31:15.175 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.68 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:31:17.359 DEBUG oslo_concurrency.processutils [req-d0ce73c5-ea8b-4f23-acc9-1ee556eb08e9 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmpG7_bmK/configdrive of=/dev/xvdd oflag=direct,sync" returned: 0 in 14.022s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:31:17.361 DEBUG oslo_concurrency.processutils [req-d0ce73c5-ea8b-4f23-acc9-1ee556eb08e9 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:31:17.835 DEBUG oslo_concurrency.processutils [req-d0ce73c5-ea8b-4f23-acc9-1ee556eb08e9 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf sync" returned: 0 in 0.474s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:31:17.840 DEBUG nova.virt.xenapi.vm_utils [req-d0ce73c5-ea8b-4f23-acc9-1ee556eb08e9 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Destroying VBD for VDI OpaqueRef:eeb3ca08-94f3-f5d7-aec1-7b6d89183d4d ... 
vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2207 2015-08-07 17:31:17.921 DEBUG nova.virt.xenapi.vmops [req-9c553d3d-492d-47a0-966f-8fac0ea40f0a tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] Migrated root base vhds transfer_immutable_vhds /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1036 2015-08-07 17:31:17.922 DEBUG nova.virt.xenapi.vmops [req-9c553d3d-492d-47a0-966f-8fac0ea40f0a tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] Updating progress to 40 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:31:18.226 DEBUG nova.virt.xenapi.vmops [req-9c553d3d-492d-47a0-966f-8fac0ea40f0a tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] Migrated all base vhds. _process_ephemeral_chain_recursive /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1051 2015-08-07 17:31:18.243 DEBUG nova.virt.xenapi.vm_utils [req-9c553d3d-492d-47a0-966f-8fac0ea40f0a tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] Shutting down VM (cleanly) clean_shutdown_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:333 2015-08-07 17:31:18.924 DEBUG oslo_concurrency.lockutils [req-ed0439b7-b975-469f-bb97-29d3052a393a tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_unplug" :: held 4.349s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:31:18.926 DEBUG oslo_concurrency.lockutils [req-d0ce73c5-ea8b-4f23-acc9-1ee556eb08e9 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_unplug" :: waited 1.085s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:31:18.950 DEBUG nova.virt.xenapi.vm_utils [req-ed0439b7-b975-469f-bb97-29d3052a393a tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] Destroying VBD for VDI OpaqueRef:4cb5ead0-c168-08ed-fa8f-fd7877e1f25b done. vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2215 2015-08-07 17:31:18.953 DEBUG nova.virt.xenapi.vm_utils [req-ed0439b7-b975-469f-bb97-29d3052a393a tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] Creating disk-type VBD for VM OpaqueRef:102d80fb-e858-e7cb-15ef-8140311f8cd2, VDI OpaqueRef:4cb5ead0-c168-08ed-fa8f-fd7877e1f25b ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:31:18.964 DEBUG nova.virt.xenapi.vm_utils [req-ed0439b7-b975-469f-bb97-29d3052a393a tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] Created VBD OpaqueRef:e94be5e3-1c56-f05f-dabf-7fd474a39650 for VM OpaqueRef:102d80fb-e858-e7cb-15ef-8140311f8cd2, VDI OpaqueRef:4cb5ead0-c168-08ed-fa8f-fd7877e1f25b. 
create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:31:18.965 DEBUG nova.objects.instance [req-ed0439b7-b975-469f-bb97-29d3052a393a tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] Lazy-loading `pci_devices' on Instance uuid e4c599da-3cac-4bfb-a1b5-2b8828fdb0b9 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:31:19.094 DEBUG nova.virt.xenapi.vmops [req-ed0439b7-b975-469f-bb97-29d3052a393a tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] [instance: e4c599da-3cac-4bfb-a1b5-2b8828fdb0b9] Updating progress to 50 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:31:19.392 DEBUG oslo_concurrency.lockutils [req-ed0439b7-b975-469f-bb97-29d3052a393a tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] Lock "xenstore-e4c599da-3cac-4bfb-a1b5-2b8828fdb0b9" acquired by "store_meta" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:31:19.393 DEBUG oslo_concurrency.lockutils [req-ed0439b7-b975-469f-bb97-29d3052a393a tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] Lock "xenstore-e4c599da-3cac-4bfb-a1b5-2b8828fdb0b9" released by "store_meta" :: held 0.001s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:31:19.395 DEBUG oslo_concurrency.lockutils [req-ed0439b7-b975-469f-bb97-29d3052a393a tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] Lock "xenstore-e4c599da-3cac-4bfb-a1b5-2b8828fdb0b9" acquired by "store_auto_disk_config" :: waited 0.001s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:31:19.405 DEBUG oslo_concurrency.lockutils [req-ed0439b7-b975-469f-bb97-29d3052a393a tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] Lock "xenstore-e4c599da-3cac-4bfb-a1b5-2b8828fdb0b9" released by "store_auto_disk_config" :: held 0.010s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:31:19.406 DEBUG nova.virt.xenapi.vmops [req-ed0439b7-b975-469f-bb97-29d3052a393a tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] [instance: e4c599da-3cac-4bfb-a1b5-2b8828fdb0b9] Injecting hostname (tempest.common.compute-instance-233266819) into xenstore _inject_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1940 2015-08-07 17:31:19.406 DEBUG oslo_concurrency.lockutils [req-ed0439b7-b975-469f-bb97-29d3052a393a tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] Lock "xenstore-e4c599da-3cac-4bfb-a1b5-2b8828fdb0b9" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:31:19.414 DEBUG oslo_concurrency.lockutils [req-ed0439b7-b975-469f-bb97-29d3052a393a tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] Lock "xenstore-e4c599da-3cac-4bfb-a1b5-2b8828fdb0b9" released by "update_hostname" :: held 0.008s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:31:19.415 DEBUG nova.virt.xenapi.vmops [req-ed0439b7-b975-469f-bb97-29d3052a393a tempest-ImagesOneServerNegativeTestJSON-221560100 
tempest-ImagesOneServerNegativeTestJSON-346190170] [instance: e4c599da-3cac-4bfb-a1b5-2b8828fdb0b9] Injecting network info to xenstore inject_network_info /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1869 2015-08-07 17:31:19.416 DEBUG oslo_concurrency.lockutils [req-ed0439b7-b975-469f-bb97-29d3052a393a tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] Lock "xenstore-e4c599da-3cac-4bfb-a1b5-2b8828fdb0b9" acquired by "update_nwinfo" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:31:19.658 DEBUG oslo_concurrency.lockutils [req-ed0439b7-b975-469f-bb97-29d3052a393a tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] Lock "xenstore-e4c599da-3cac-4bfb-a1b5-2b8828fdb0b9" released by "update_nwinfo" :: held 0.242s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:31:19.659 DEBUG nova.virt.xenapi.vmops [req-ed0439b7-b975-469f-bb97-29d3052a393a tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] [instance: e4c599da-3cac-4bfb-a1b5-2b8828fdb0b9] Updating progress to 60 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:31:19.901 DEBUG oslo_concurrency.lockutils [req-d0ce73c5-ea8b-4f23-acc9-1ee556eb08e9 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_unplug" :: held 0.975s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:31:19.912 DEBUG nova.virt.xenapi.vm_utils [req-d0ce73c5-ea8b-4f23-acc9-1ee556eb08e9 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Destroying VBD for VDI OpaqueRef:eeb3ca08-94f3-f5d7-aec1-7b6d89183d4d done. vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2215 2015-08-07 17:31:19.913 DEBUG nova.virt.xenapi.vm_utils [req-d0ce73c5-ea8b-4f23-acc9-1ee556eb08e9 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Creating disk-type VBD for VM OpaqueRef:7a631735-f2b8-e994-ba05-c7a9d0e48afc, VDI OpaqueRef:eeb3ca08-94f3-f5d7-aec1-7b6d89183d4d ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:31:19.924 DEBUG nova.virt.xenapi.vm_utils [req-d0ce73c5-ea8b-4f23-acc9-1ee556eb08e9 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Created VBD OpaqueRef:11f495ed-f9b2-aebf-0dba-5ecf8c1181d9 for VM OpaqueRef:7a631735-f2b8-e994-ba05-c7a9d0e48afc, VDI OpaqueRef:eeb3ca08-94f3-f5d7-aec1-7b6d89183d4d. 
create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:31:19.924 DEBUG nova.objects.instance [req-d0ce73c5-ea8b-4f23-acc9-1ee556eb08e9 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lazy-loading `pci_devices' on Instance uuid d28988ae-6848-475f-b43e-bc63166c3c1e obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:31:20.060 DEBUG nova.virt.xenapi.vmops [req-d0ce73c5-ea8b-4f23-acc9-1ee556eb08e9 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: d28988ae-6848-475f-b43e-bc63166c3c1e] Updating progress to 50 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:31:20.142 DEBUG nova.virt.xenapi.vmops [req-ed0439b7-b975-469f-bb97-29d3052a393a tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] [instance: e4c599da-3cac-4bfb-a1b5-2b8828fdb0b9] Creating vifs _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1892 2015-08-07 17:31:20.150 DEBUG nova.virt.xenapi.vmops [req-ed0439b7-b975-469f-bb97-29d3052a393a tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] [instance: e4c599da-3cac-4bfb-a1b5-2b8828fdb0b9] Creating VIF for network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1902 2015-08-07 17:31:20.160 DEBUG nova.virt.xenapi.vmops [req-ed0439b7-b975-469f-bb97-29d3052a393a tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] [instance: e4c599da-3cac-4bfb-a1b5-2b8828fdb0b9] Created VIF OpaqueRef:3cb997fd-9c1c-d3e3-efd0-0e7591a14c38, network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1906 2015-08-07 17:31:20.161 DEBUG nova.virt.xenapi.vmops [req-ed0439b7-b975-469f-bb97-29d3052a393a tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] [instance: e4c599da-3cac-4bfb-a1b5-2b8828fdb0b9] Updating progress to 70 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:31:20.316 DEBUG oslo_concurrency.lockutils [req-d0ce73c5-ea8b-4f23-acc9-1ee556eb08e9 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "xenstore-d28988ae-6848-475f-b43e-bc63166c3c1e" acquired by "store_meta" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:31:20.317 DEBUG oslo_concurrency.lockutils [req-d0ce73c5-ea8b-4f23-acc9-1ee556eb08e9 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "xenstore-d28988ae-6848-475f-b43e-bc63166c3c1e" released by "store_meta" :: held 0.001s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:31:20.317 DEBUG oslo_concurrency.lockutils [req-d0ce73c5-ea8b-4f23-acc9-1ee556eb08e9 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "xenstore-d28988ae-6848-475f-b43e-bc63166c3c1e" acquired by "store_auto_disk_config" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:31:20.332 DEBUG oslo_concurrency.lockutils [req-d0ce73c5-ea8b-4f23-acc9-1ee556eb08e9 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "xenstore-d28988ae-6848-475f-b43e-bc63166c3c1e" released by "store_auto_disk_config" :: held 0.014s inner 
/usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:31:20.333 DEBUG nova.virt.xenapi.vmops [req-d0ce73c5-ea8b-4f23-acc9-1ee556eb08e9 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: d28988ae-6848-475f-b43e-bc63166c3c1e] Injecting hostname (tempest.common.compute-instance-1666382429) into xenstore _inject_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1940 2015-08-07 17:31:20.333 DEBUG oslo_concurrency.lockutils [req-d0ce73c5-ea8b-4f23-acc9-1ee556eb08e9 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "xenstore-d28988ae-6848-475f-b43e-bc63166c3c1e" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:31:20.344 DEBUG oslo_concurrency.lockutils [req-d0ce73c5-ea8b-4f23-acc9-1ee556eb08e9 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "xenstore-d28988ae-6848-475f-b43e-bc63166c3c1e" released by "update_hostname" :: held 0.011s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:31:20.345 DEBUG nova.virt.xenapi.vmops [req-d0ce73c5-ea8b-4f23-acc9-1ee556eb08e9 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: d28988ae-6848-475f-b43e-bc63166c3c1e] Injecting network info to xenstore inject_network_info /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1869 2015-08-07 17:31:20.346 DEBUG oslo_concurrency.lockutils [req-d0ce73c5-ea8b-4f23-acc9-1ee556eb08e9 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "xenstore-d28988ae-6848-475f-b43e-bc63166c3c1e" acquired by "update_nwinfo" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:31:20.416 DEBUG nova.virt.xenapi.vmops [req-ed0439b7-b975-469f-bb97-29d3052a393a tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] [instance: e4c599da-3cac-4bfb-a1b5-2b8828fdb0b9] Starting instance _start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:327 2015-08-07 17:31:20.569 DEBUG oslo_concurrency.lockutils [req-d0ce73c5-ea8b-4f23-acc9-1ee556eb08e9 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "xenstore-d28988ae-6848-475f-b43e-bc63166c3c1e" released by "update_nwinfo" :: held 0.223s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:31:20.570 DEBUG nova.virt.xenapi.vmops [req-d0ce73c5-ea8b-4f23-acc9-1ee556eb08e9 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: d28988ae-6848-475f-b43e-bc63166c3c1e] Updating progress to 60 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:31:20.857 DEBUG nova.virt.xenapi.vmops [req-d0ce73c5-ea8b-4f23-acc9-1ee556eb08e9 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: d28988ae-6848-475f-b43e-bc63166c3c1e] Creating vifs _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1892 2015-08-07 17:31:20.873 DEBUG nova.virt.xenapi.vmops [req-d0ce73c5-ea8b-4f23-acc9-1ee556eb08e9 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: d28988ae-6848-475f-b43e-bc63166c3c1e] Creating VIF for network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1902 2015-08-07 17:31:20.890 DEBUG 
nova.virt.xenapi.vmops [req-d0ce73c5-ea8b-4f23-acc9-1ee556eb08e9 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: d28988ae-6848-475f-b43e-bc63166c3c1e] Created VIF OpaqueRef:6b82f03d-5e3f-a7d1-5b93-8a1dff867ee9, network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1906 2015-08-07 17:31:20.891 DEBUG nova.virt.xenapi.vmops [req-d0ce73c5-ea8b-4f23-acc9-1ee556eb08e9 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: d28988ae-6848-475f-b43e-bc63166c3c1e] Updating progress to 70 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:31:21.172 DEBUG nova.virt.xenapi.vmops [req-d0ce73c5-ea8b-4f23-acc9-1ee556eb08e9 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: d28988ae-6848-475f-b43e-bc63166c3c1e] Starting instance _start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:327 2015-08-07 17:31:24.419 DEBUG nova.virt.xenapi.vm_utils [req-9c553d3d-492d-47a0-966f-8fac0ea40f0a tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] Migrating VHD '416fc4db-2f34-4e64-aded-e00eaaf8f6e4' with seq_num 0 migrate_vhd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2559 2015-08-07 17:31:24.999 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.85 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:31:25.626 DEBUG nova.virt.xenapi.vmops [req-9c553d3d-492d-47a0-966f-8fac0ea40f0a tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] Updating progress to 60 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:31:25.866 DEBUG nova.virt.xenapi.vmops [req-9c553d3d-492d-47a0-966f-8fac0ea40f0a tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] Updating progress to 80 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:31:27.103 DEBUG oslo_concurrency.lockutils [req-9c553d3d-492d-47a0-966f-8fac0ea40f0a tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Lock "8e5ee78c-20e2-4483-ab75-b109bb2fdca6-events" acquired by "_clear_events" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:31:27.104 DEBUG oslo_concurrency.lockutils [req-9c553d3d-492d-47a0-966f-8fac0ea40f0a tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Lock "8e5ee78c-20e2-4483-ab75-b109bb2fdca6-events" released by "_clear_events" :: held 0.001s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:31:28.525 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:31:28.526 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Starting heal instance info cache _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5315 2015-08-07 17:31:28.639 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Didn't 
find any instances for network info cache update. _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5386 2015-08-07 17:31:28.640 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 2.87 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:31:29.320 DEBUG oslo_concurrency.lockutils [req-9c553d3d-492d-47a0-966f-8fac0ea40f0a tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Acquired semaphore "refresh_cache-8e5ee78c-20e2-4483-ab75-b109bb2fdca6" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:197 2015-08-07 17:31:29.498 DEBUG nova.network.base_api [req-9c553d3d-492d-47a0-966f-8fac0ea40f0a tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] Updating instance_info_cache with network_info: [VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.12'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:87:f6:bc', 'active': False, 'type': u'bridge', 'id': u'0082deb6-303e-4fa7-969e-93e52d4e94f6', 'qbg_params': None})] update_instance_cache_with_nw_info /opt/stack/new/nova/nova/network/base_api.py:43 2015-08-07 17:31:29.533 DEBUG oslo_concurrency.lockutils [req-9c553d3d-492d-47a0-966f-8fac0ea40f0a tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Releasing semaphore "refresh_cache-8e5ee78c-20e2-4483-ab75-b109bb2fdca6" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:210 2015-08-07 17:31:29.754 DEBUG nova.virt.xenapi.vmops [req-ed0439b7-b975-469f-bb97-29d3052a393a tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] [instance: e4c599da-3cac-4bfb-a1b5-2b8828fdb0b9] Waiting for instance state to become running _wait_for_instance_to_start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:777 2015-08-07 17:31:29.774 DEBUG nova.virt.xenapi.vmops [req-ed0439b7-b975-469f-bb97-29d3052a393a tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] [instance: e4c599da-3cac-4bfb-a1b5-2b8828fdb0b9] Updating progress to 80 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:31:29.860 DEBUG nova.virt.xenapi.vm_utils [req-9c553d3d-492d-47a0-966f-8fac0ea40f0a tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Detected vhd format for image None determine_disk_image_type /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1633 2015-08-07 
17:31:29.861 DEBUG nova.virt.xenapi.vmops [req-9c553d3d-492d-47a0-966f-8fac0ea40f0a tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] Block device information present: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} create_disks_step /opt/stack/new/nova/nova/virt/xenapi/vmops.py:278 2015-08-07 17:31:30.075 DEBUG nova.virt.xenapi.vmops [req-ed0439b7-b975-469f-bb97-29d3052a393a tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] [instance: e4c599da-3cac-4bfb-a1b5-2b8828fdb0b9] Skip agent setup, not enabled. _configure_new_instance_with_agent /opt/stack/new/nova/nova/virt/xenapi/vmops.py:788 2015-08-07 17:31:30.076 DEBUG nova.virt.xenapi.vmops [req-ed0439b7-b975-469f-bb97-29d3052a393a tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] [instance: e4c599da-3cac-4bfb-a1b5-2b8828fdb0b9] Removing hostname from xenstore _remove_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1949 2015-08-07 17:31:30.076 DEBUG oslo_concurrency.lockutils [req-ed0439b7-b975-469f-bb97-29d3052a393a tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] Lock "xenstore-e4c599da-3cac-4bfb-a1b5-2b8828fdb0b9" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:31:30.085 DEBUG oslo_concurrency.lockutils [req-ed0439b7-b975-469f-bb97-29d3052a393a tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] Lock "xenstore-e4c599da-3cac-4bfb-a1b5-2b8828fdb0b9" released by "update_hostname" :: held 0.008s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:31:30.085 DEBUG nova.virt.xenapi.vmops [req-ed0439b7-b975-469f-bb97-29d3052a393a tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] [instance: e4c599da-3cac-4bfb-a1b5-2b8828fdb0b9] Updating progress to 90 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:31:30.554 DEBUG nova.virt.xenapi.vmops [req-ed0439b7-b975-469f-bb97-29d3052a393a tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] [instance: e4c599da-3cac-4bfb-a1b5-2b8828fdb0b9] Updating progress to 100 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:31:30.967 DEBUG nova.compute.manager [req-ed0439b7-b975-469f-bb97-29d3052a393a tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] [instance: e4c599da-3cac-4bfb-a1b5-2b8828fdb0b9] Checking state _get_power_state /opt/stack/new/nova/nova/compute/manager.py:1296 2015-08-07 17:31:31.309 DEBUG nova.virt.xenapi.vmops [req-d0ce73c5-ea8b-4f23-acc9-1ee556eb08e9 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: d28988ae-6848-475f-b43e-bc63166c3c1e] Waiting for instance state to become running _wait_for_instance_to_start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:777 2015-08-07 17:31:31.351 DEBUG nova.virt.xenapi.vmops [req-d0ce73c5-ea8b-4f23-acc9-1ee556eb08e9 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: d28988ae-6848-475f-b43e-bc63166c3c1e] Updating progress to 80 _update_instance_progress 
/opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:31:31.446 DEBUG oslo_concurrency.lockutils [req-9c553d3d-492d-47a0-966f-8fac0ea40f0a tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" acquired by "do_scan" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:31:31.456 DEBUG nova.virt.xenapi.vm_utils [req-9c553d3d-492d-47a0-966f-8fac0ea40f0a tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Scanning SR OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99 do_scan /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1839 2015-08-07 17:31:31.514 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:31:31.611 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 0.01 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:31:31.620 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:31:31.621 DEBUG oslo_concurrency.lockutils [req-ed0439b7-b975-469f-bb97-29d3052a393a tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] Lock "e4c599da-3cac-4bfb-a1b5-2b8828fdb0b9" released by "_locked_do_build_and_run_instance" :: held 39.794s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:31:31.622 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:31:31.622 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 0.89 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:31:32.038 DEBUG nova.virt.xenapi.vmops [req-d0ce73c5-ea8b-4f23-acc9-1ee556eb08e9 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: d28988ae-6848-475f-b43e-bc63166c3c1e] Skip agent setup, not enabled. 
_configure_new_instance_with_agent /opt/stack/new/nova/nova/virt/xenapi/vmops.py:788 2015-08-07 17:31:32.039 DEBUG nova.virt.xenapi.vmops [req-d0ce73c5-ea8b-4f23-acc9-1ee556eb08e9 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: d28988ae-6848-475f-b43e-bc63166c3c1e] Removing hostname from xenstore _remove_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1949 2015-08-07 17:31:32.039 DEBUG oslo_concurrency.lockutils [req-d0ce73c5-ea8b-4f23-acc9-1ee556eb08e9 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "xenstore-d28988ae-6848-475f-b43e-bc63166c3c1e" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:31:32.052 DEBUG oslo_concurrency.lockutils [req-d0ce73c5-ea8b-4f23-acc9-1ee556eb08e9 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "xenstore-d28988ae-6848-475f-b43e-bc63166c3c1e" released by "update_hostname" :: held 0.012s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:31:32.052 DEBUG nova.virt.xenapi.vmops [req-d0ce73c5-ea8b-4f23-acc9-1ee556eb08e9 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: d28988ae-6848-475f-b43e-bc63166c3c1e] Updating progress to 90 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:31:32.328 DEBUG nova.virt.xenapi.vmops [req-d0ce73c5-ea8b-4f23-acc9-1ee556eb08e9 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: d28988ae-6848-475f-b43e-bc63166c3c1e] Updating progress to 100 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:31:32.379 DEBUG oslo_concurrency.lockutils [req-9c553d3d-492d-47a0-966f-8fac0ea40f0a tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" released by "do_scan" :: held 0.933s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:31:32.512 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:31:32.513 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 6.01 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:31:32.570 DEBUG nova.compute.manager [req-6a70c634-1a5c-4d15-b129-fa4ed5b87366 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] [instance: e4c599da-3cac-4bfb-a1b5-2b8828fdb0b9] Checking state _get_power_state /opt/stack/new/nova/nova/compute/manager.py:1296 2015-08-07 17:31:32.831 INFO nova.compute.manager [req-6a70c634-1a5c-4d15-b129-fa4ed5b87366 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] [instance: e4c599da-3cac-4bfb-a1b5-2b8828fdb0b9] instance snapshotting 2015-08-07 17:31:32.837 DEBUG nova.virt.xenapi.vm_utils [req-6a70c634-1a5c-4d15-b129-fa4ed5b87366 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] [instance: e4c599da-3cac-4bfb-a1b5-2b8828fdb0b9] Starting snapshot for VM _snapshot_attached_here_impl 
/opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:714 2015-08-07 17:31:32.884 DEBUG oslo_concurrency.lockutils [req-6a70c634-1a5c-4d15-b129-fa4ed5b87366 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" acquired by "do_scan" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:31:32.955 DEBUG nova.virt.xenapi.vm_utils [req-6a70c634-1a5c-4d15-b129-fa4ed5b87366 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] Scanning SR OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99 do_scan /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1839 2015-08-07 17:31:33.012 DEBUG nova.compute.manager [req-d0ce73c5-ea8b-4f23-acc9-1ee556eb08e9 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: d28988ae-6848-475f-b43e-bc63166c3c1e] Checking state _get_power_state /opt/stack/new/nova/nova/compute/manager.py:1296 2015-08-07 17:31:33.380 DEBUG oslo_concurrency.lockutils [req-d0ce73c5-ea8b-4f23-acc9-1ee556eb08e9 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "d28988ae-6848-475f-b43e-bc63166c3c1e" released by "_locked_do_build_and_run_instance" :: held 41.235s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:31:33.536 DEBUG oslo_concurrency.lockutils [req-6a70c634-1a5c-4d15-b129-fa4ed5b87366 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" released by "do_scan" :: held 0.652s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:31:33.546 DEBUG nova.virt.xenapi.vm_utils [req-6a70c634-1a5c-4d15-b129-fa4ed5b87366 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] VHD b5dbc5ef-22f1-4273-a6ba-b40b27592f15 has parent 4027f457-a9bb-499a-8844-79fc67f11377 _get_vhd_parent_uuid /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2006 2015-08-07 17:31:33.568 DEBUG nova.virt.xenapi.vm_utils [req-6a70c634-1a5c-4d15-b129-fa4ed5b87366 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] VHD 5a725e72-1e78-4f9f-8449-fc9b98aa420e has parent 4fb4a485-cee7-4d98-9e81-17b6eb1fb638 _get_vhd_parent_uuid /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2006 2015-08-07 17:31:33.579 DEBUG nova.virt.xenapi.vm_utils [req-6a70c634-1a5c-4d15-b129-fa4ed5b87366 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] VHD a293d2b3-e1c6-4a76-a23e-da29cc4fd913 has parent 5a725e72-1e78-4f9f-8449-fc9b98aa420e _get_vhd_parent_uuid /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2006 2015-08-07 17:31:33.586 DEBUG nova.virt.xenapi.vm_utils [req-6a70c634-1a5c-4d15-b129-fa4ed5b87366 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] VHD 1b633f24-6022-4170-b119-2c4526160faa has parent 4027f457-a9bb-499a-8844-79fc67f11377 _get_vhd_parent_uuid /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2006 2015-08-07 17:31:33.605 DEBUG nova.virt.xenapi.vm_utils [req-6a70c634-1a5c-4d15-b129-fa4ed5b87366 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] VHD aae6b95d-a39d-43ec-bf4e-df797b00f53a has parent 4027f457-a9bb-499a-8844-79fc67f11377 
_get_vhd_parent_uuid /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2006 2015-08-07 17:31:33.615 DEBUG nova.virt.xenapi.vm_utils [req-6a70c634-1a5c-4d15-b129-fa4ed5b87366 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] VHD b5dbc5ef-22f1-4273-a6ba-b40b27592f15 has parent 4027f457-a9bb-499a-8844-79fc67f11377 _get_vhd_parent_uuid /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2006 2015-08-07 17:31:33.628 DEBUG nova.virt.xenapi.vm_utils [req-6a70c634-1a5c-4d15-b129-fa4ed5b87366 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] VHD 416fc4db-2f34-4e64-aded-e00eaaf8f6e4 has parent 1b633f24-6022-4170-b119-2c4526160faa _get_vhd_parent_uuid /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2006 2015-08-07 17:31:33.639 DEBUG nova.virt.xenapi.vm_utils [req-6a70c634-1a5c-4d15-b129-fa4ed5b87366 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] VHD adc68f5c-f9ed-4dfe-862f-8738fb519f84 has parent 4027f457-a9bb-499a-8844-79fc67f11377 _get_vhd_parent_uuid /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2006 2015-08-07 17:31:33.657 DEBUG nova.virt.xenapi.vm_utils [req-6a70c634-1a5c-4d15-b129-fa4ed5b87366 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] [instance: e4c599da-3cac-4bfb-a1b5-2b8828fdb0b9] No snapshots to remove. _delete_snapshots_in_vdi_chain /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:677 2015-08-07 17:31:33.680 DEBUG nova.virt.xenapi.vmops [req-9c553d3d-492d-47a0-966f-8fac0ea40f0a tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] Using PV kernel: True _create_vm_record /opt/stack/new/nova/nova/virt/xenapi/vmops.py:679 2015-08-07 17:31:33.695 DEBUG nova.virt.xenapi.vm_utils [req-9c553d3d-492d-47a0-966f-8fac0ea40f0a tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] Created VM create_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:312 2015-08-07 17:31:33.696 DEBUG nova.virt.xenapi.vm_utils [req-9c553d3d-492d-47a0-966f-8fac0ea40f0a tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Creating disk-type VBD for VM OpaqueRef:a2afb08c-045c-81f0-c51a-5b5dca50f33e, VDI OpaqueRef:be9b5fd5-741e-39b6-b2fc-713a6430527f ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:31:33.707 DEBUG nova.virt.xenapi.vm_utils [req-9c553d3d-492d-47a0-966f-8fac0ea40f0a tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Created VBD OpaqueRef:4f5b4009-0f6e-bd39-a4d4-d36e3079cc1d for VM OpaqueRef:a2afb08c-045c-81f0-c51a-5b5dca50f33e, VDI OpaqueRef:be9b5fd5-741e-39b6-b2fc-713a6430527f. 
create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:31:34.836 DEBUG oslo_concurrency.lockutils [req-a42964a6-58d3-4315-809c-571826dbe28f tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "d28988ae-6848-475f-b43e-bc63166c3c1e" acquired by "do_terminate_instance" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:31:34.837 DEBUG oslo_concurrency.lockutils [req-a42964a6-58d3-4315-809c-571826dbe28f tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "d28988ae-6848-475f-b43e-bc63166c3c1e-events" acquired by "_clear_events" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:31:34.837 DEBUG oslo_concurrency.lockutils [req-a42964a6-58d3-4315-809c-571826dbe28f tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "d28988ae-6848-475f-b43e-bc63166c3c1e-events" released by "_clear_events" :: held 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:31:34.839 INFO nova.compute.manager [req-a42964a6-58d3-4315-809c-571826dbe28f tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: d28988ae-6848-475f-b43e-bc63166c3c1e] Terminating instance 2015-08-07 17:31:34.841 INFO nova.virt.xenapi.vmops [req-a42964a6-58d3-4315-809c-571826dbe28f tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: d28988ae-6848-475f-b43e-bc63166c3c1e] Destroying VM 2015-08-07 17:31:34.856 DEBUG nova.virt.xenapi.vm_utils [req-a42964a6-58d3-4315-809c-571826dbe28f tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: d28988ae-6848-475f-b43e-bc63166c3c1e] Shutting down VM (hard) hard_shutdown_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:348 2015-08-07 17:31:34.928 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.93 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:31:35.709 DEBUG nova.virt.xenapi.vm_utils [req-6a70c634-1a5c-4d15-b129-fa4ed5b87366 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] [instance: e4c599da-3cac-4bfb-a1b5-2b8828fdb0b9] Parent has other children, coalesce is unlikely. 
_wait_for_vhd_coalesce /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2092 2015-08-07 17:31:35.725 DEBUG oslo_concurrency.lockutils [req-6a70c634-1a5c-4d15-b129-fa4ed5b87366 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" acquired by "do_scan" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:31:35.725 DEBUG nova.virt.xenapi.vm_utils [req-6a70c634-1a5c-4d15-b129-fa4ed5b87366 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] Scanning SR OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99 do_scan /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1839 2015-08-07 17:31:36.877 DEBUG oslo_concurrency.lockutils [req-6a70c634-1a5c-4d15-b129-fa4ed5b87366 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" released by "do_scan" :: held 1.153s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:31:36.912 DEBUG nova.virt.xenapi.vm_utils [req-6a70c634-1a5c-4d15-b129-fa4ed5b87366 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] VHD a0ee8733-1ae0-44a8-9bf2-4b14426b4d1a has parent bfcbbeef-16f0-4401-b58e-4bf82d458fe1 _get_vhd_parent_uuid /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2006 2015-08-07 17:31:36.934 DEBUG nova.virt.xenapi.vm_utils [req-6a70c634-1a5c-4d15-b129-fa4ed5b87366 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] VHD bfcbbeef-16f0-4401-b58e-4bf82d458fe1 has parent 4027f457-a9bb-499a-8844-79fc67f11377 _get_vhd_parent_uuid /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2006 2015-08-07 17:31:37.238 DEBUG nova.virt.xenapi.client.session [req-6a70c634-1a5c-4d15-b129-fa4ed5b87366 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] glance.upload_vhd attempt 1/1, callback_result: 192.168.33.1 call_plugin_serialized_with_retry /opt/stack/new/nova/nova/virt/xenapi/client/session.py:245 2015-08-07 17:31:37.562 DEBUG nova.virt.xenapi.client.session [req-6a70c634-1a5c-4d15-b129-fa4ed5b87366 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] Got exception: ['XENAPI_PLUGIN_FAILURE', 'upload_vhd', 'PluginError', 'Got Permanent Error response [404] while uploading image [a19c924d-50f7-4649-a78b-ad80543ba8d9] to glance host [192.168.33.1:9292]'] _unwrap_plugin_exceptions /opt/stack/new/nova/nova/virt/xenapi/client/session.py:293 2015-08-07 17:31:38.184 DEBUG nova.compute.manager [req-6a70c634-1a5c-4d15-b129-fa4ed5b87366 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] [instance: e4c599da-3cac-4bfb-a1b5-2b8828fdb0b9] Cleaning up image a19c924d-50f7-4649-a78b-ad80543ba8d9 decorated_function /opt/stack/new/nova/nova/compute/manager.py:406 2015-08-07 17:31:38.184 13318 ERROR nova.compute.manager [instance: e4c599da-3cac-4bfb-a1b5-2b8828fdb0b9] Traceback (most recent call last): 2015-08-07 17:31:38.184 13318 ERROR nova.compute.manager [instance: e4c599da-3cac-4bfb-a1b5-2b8828fdb0b9] File "/opt/stack/new/nova/nova/compute/manager.py", line 402, in decorated_function 2015-08-07 17:31:38.184 13318 ERROR nova.compute.manager [instance: e4c599da-3cac-4bfb-a1b5-2b8828fdb0b9] *args, **kwargs) 
2015-08-07 17:31:38.184 13318 ERROR nova.compute.manager [instance: e4c599da-3cac-4bfb-a1b5-2b8828fdb0b9] File "/opt/stack/new/nova/nova/compute/manager.py", line 2904, in snapshot_instance 2015-08-07 17:31:38.184 13318 ERROR nova.compute.manager [instance: e4c599da-3cac-4bfb-a1b5-2b8828fdb0b9] task_states.IMAGE_SNAPSHOT) 2015-08-07 17:31:38.184 13318 ERROR nova.compute.manager [instance: e4c599da-3cac-4bfb-a1b5-2b8828fdb0b9] File "/opt/stack/new/nova/nova/compute/manager.py", line 2934, in _snapshot_instance 2015-08-07 17:31:38.184 13318 ERROR nova.compute.manager [instance: e4c599da-3cac-4bfb-a1b5-2b8828fdb0b9] update_task_state) 2015-08-07 17:31:38.184 13318 ERROR nova.compute.manager [instance: e4c599da-3cac-4bfb-a1b5-2b8828fdb0b9] File "/opt/stack/new/nova/nova/virt/xenapi/driver.py", line 218, in snapshot 2015-08-07 17:31:38.184 13318 ERROR nova.compute.manager [instance: e4c599da-3cac-4bfb-a1b5-2b8828fdb0b9] self._vmops.snapshot(context, instance, image_id, update_task_state) 2015-08-07 17:31:38.184 13318 ERROR nova.compute.manager [instance: e4c599da-3cac-4bfb-a1b5-2b8828fdb0b9] File "/opt/stack/new/nova/nova/virt/xenapi/vmops.py", line 887, in snapshot 2015-08-07 17:31:38.184 13318 ERROR nova.compute.manager [instance: e4c599da-3cac-4bfb-a1b5-2b8828fdb0b9] vdi_uuids, 2015-08-07 17:31:38.184 13318 ERROR nova.compute.manager [instance: e4c599da-3cac-4bfb-a1b5-2b8828fdb0b9] File "/opt/stack/new/nova/nova/virt/xenapi/image/glance.py", line 92, in upload_image 2015-08-07 17:31:38.184 13318 ERROR nova.compute.manager [instance: e4c599da-3cac-4bfb-a1b5-2b8828fdb0b9] 'upload_vhd', params) 2015-08-07 17:31:38.184 13318 ERROR nova.compute.manager [instance: e4c599da-3cac-4bfb-a1b5-2b8828fdb0b9] File "/opt/stack/new/nova/nova/virt/xenapi/image/glance.py", line 54, in _call_glance_plugin 2015-08-07 17:31:38.184 13318 ERROR nova.compute.manager [instance: e4c599da-3cac-4bfb-a1b5-2b8828fdb0b9] 'glance', fn, CONF.glance.num_retries, pick_glance, cb, **params) 2015-08-07 17:31:38.184 13318 ERROR nova.compute.manager [instance: e4c599da-3cac-4bfb-a1b5-2b8828fdb0b9] File "/opt/stack/new/nova/nova/virt/xenapi/client/session.py", line 246, in call_plugin_serialized_with_retry 2015-08-07 17:31:38.184 13318 ERROR nova.compute.manager [instance: e4c599da-3cac-4bfb-a1b5-2b8828fdb0b9] return self.call_plugin_serialized(plugin, fn, *args, **kwargs) 2015-08-07 17:31:38.184 13318 ERROR nova.compute.manager [instance: e4c599da-3cac-4bfb-a1b5-2b8828fdb0b9] File "/opt/stack/new/nova/nova/virt/xenapi/client/session.py", line 221, in call_plugin_serialized 2015-08-07 17:31:38.184 13318 ERROR nova.compute.manager [instance: e4c599da-3cac-4bfb-a1b5-2b8828fdb0b9] rv = self.call_plugin(plugin, fn, params) 2015-08-07 17:31:38.184 13318 ERROR nova.compute.manager [instance: e4c599da-3cac-4bfb-a1b5-2b8828fdb0b9] File "/opt/stack/new/nova/nova/virt/xenapi/client/session.py", line 217, in call_plugin 2015-08-07 17:31:38.184 13318 ERROR nova.compute.manager [instance: e4c599da-3cac-4bfb-a1b5-2b8828fdb0b9] self.host_ref, plugin, fn, args) 2015-08-07 17:31:38.184 13318 ERROR nova.compute.manager [instance: e4c599da-3cac-4bfb-a1b5-2b8828fdb0b9] File "/opt/stack/new/nova/nova/virt/xenapi/client/session.py", line 291, in _unwrap_plugin_exceptions 2015-08-07 17:31:38.184 13318 ERROR nova.compute.manager [instance: e4c599da-3cac-4bfb-a1b5-2b8828fdb0b9] return func(*args, **kwargs) 2015-08-07 17:31:38.184 13318 ERROR nova.compute.manager [instance: e4c599da-3cac-4bfb-a1b5-2b8828fdb0b9] File 
"/usr/local/lib/python2.7/dist-packages/XenAPI.py", line 229, in __call__ 2015-08-07 17:31:38.184 13318 ERROR nova.compute.manager [instance: e4c599da-3cac-4bfb-a1b5-2b8828fdb0b9] return self.__send(self.__name, args) 2015-08-07 17:31:38.184 13318 ERROR nova.compute.manager [instance: e4c599da-3cac-4bfb-a1b5-2b8828fdb0b9] File "/usr/local/lib/python2.7/dist-packages/XenAPI.py", line 133, in xenapi_request 2015-08-07 17:31:38.184 13318 ERROR nova.compute.manager [instance: e4c599da-3cac-4bfb-a1b5-2b8828fdb0b9] result = _parse_result(getattr(self, methodname)(*full_params)) 2015-08-07 17:31:38.184 13318 ERROR nova.compute.manager [instance: e4c599da-3cac-4bfb-a1b5-2b8828fdb0b9] File "/usr/local/lib/python2.7/dist-packages/XenAPI.py", line 203, in _parse_result 2015-08-07 17:31:38.184 13318 ERROR nova.compute.manager [instance: e4c599da-3cac-4bfb-a1b5-2b8828fdb0b9] raise Failure(result['ErrorDescription']) 2015-08-07 17:31:38.184 13318 ERROR nova.compute.manager [instance: e4c599da-3cac-4bfb-a1b5-2b8828fdb0b9] Failure: ['XENAPI_PLUGIN_FAILURE', 'upload_vhd', 'PluginError', 'Got Permanent Error response [404] while uploading image [a19c924d-50f7-4649-a78b-ad80543ba8d9] to glance host [192.168.33.1:9292]'] 2015-08-07 17:31:38.184 13318 ERROR nova.compute.manager [instance: e4c599da-3cac-4bfb-a1b5-2b8828fdb0b9] 2015-08-07 17:31:38.247 ERROR nova.compute.manager [req-6a70c634-1a5c-4d15-b129-fa4ed5b87366 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] [instance: e4c599da-3cac-4bfb-a1b5-2b8828fdb0b9] Error while trying to clean up image a19c924d-50f7-4649-a78b-ad80543ba8d9 2015-08-07 17:31:38.247 13318 ERROR nova.compute.manager [instance: e4c599da-3cac-4bfb-a1b5-2b8828fdb0b9] Traceback (most recent call last): 2015-08-07 17:31:38.247 13318 ERROR nova.compute.manager [instance: e4c599da-3cac-4bfb-a1b5-2b8828fdb0b9] File "/opt/stack/new/nova/nova/compute/manager.py", line 408, in decorated_function 2015-08-07 17:31:38.247 13318 ERROR nova.compute.manager [instance: e4c599da-3cac-4bfb-a1b5-2b8828fdb0b9] self.image_api.delete(context, image_id) 2015-08-07 17:31:38.247 13318 ERROR nova.compute.manager [instance: e4c599da-3cac-4bfb-a1b5-2b8828fdb0b9] File "/opt/stack/new/nova/nova/image/api.py", line 141, in delete 2015-08-07 17:31:38.247 13318 ERROR nova.compute.manager [instance: e4c599da-3cac-4bfb-a1b5-2b8828fdb0b9] return session.delete(context, image_id) 2015-08-07 17:31:38.247 13318 ERROR nova.compute.manager [instance: e4c599da-3cac-4bfb-a1b5-2b8828fdb0b9] File "/opt/stack/new/nova/nova/image/glance.py", line 424, in delete 2015-08-07 17:31:38.247 13318 ERROR nova.compute.manager [instance: e4c599da-3cac-4bfb-a1b5-2b8828fdb0b9] raise exception.ImageNotFound(image_id=image_id) 2015-08-07 17:31:38.247 13318 ERROR nova.compute.manager [instance: e4c599da-3cac-4bfb-a1b5-2b8828fdb0b9] ImageNotFound: Image a19c924d-50f7-4649-a78b-ad80543ba8d9 could not be found. 2015-08-07 17:31:38.247 13318 ERROR nova.compute.manager [instance: e4c599da-3cac-4bfb-a1b5-2b8828fdb0b9] 2015-08-07 17:31:38.521 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:31:38.522 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /opt/stack/new/nova/nova/compute/manager.py:5976 2015-08-07 17:31:38.522 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:31:38.523 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 2.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:31:38.608 DEBUG oslo_concurrency.lockutils [req-6a70c634-1a5c-4d15-b129-fa4ed5b87366 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] Lock "compute_resources" acquired by "update_usage" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:31:38.726 DEBUG oslo_concurrency.lockutils [req-6a70c634-1a5c-4d15-b129-fa4ed5b87366 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] Lock "compute_resources" released by "update_usage" :: held 0.118s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:31:38.731 ERROR oslo_messaging.rpc.dispatcher [req-6a70c634-1a5c-4d15-b129-fa4ed5b87366 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] Exception during message handling: ['XENAPI_PLUGIN_FAILURE', 'upload_vhd', 'PluginError', 'Got Permanent Error response [404] while uploading image [a19c924d-50f7-4649-a78b-ad80543ba8d9] to glance host [192.168.33.1:9292]'] 2015-08-07 17:31:38.731 13318 ERROR oslo_messaging.rpc.dispatcher Traceback (most recent call last): 2015-08-07 17:31:38.731 13318 ERROR oslo_messaging.rpc.dispatcher File "/usr/local/lib/python2.7/dist-packages/oslo_messaging/rpc/dispatcher.py", line 142, in _dispatch_and_reply 2015-08-07 17:31:38.731 13318 ERROR oslo_messaging.rpc.dispatcher executor_callback)) 2015-08-07 17:31:38.731 13318 ERROR oslo_messaging.rpc.dispatcher File "/usr/local/lib/python2.7/dist-packages/oslo_messaging/rpc/dispatcher.py", line 186, in _dispatch 2015-08-07 17:31:38.731 13318 ERROR oslo_messaging.rpc.dispatcher executor_callback) 2015-08-07 17:31:38.731 13318 ERROR oslo_messaging.rpc.dispatcher File "/usr/local/lib/python2.7/dist-packages/oslo_messaging/rpc/dispatcher.py", line 129, in _do_dispatch 2015-08-07 17:31:38.731 13318 ERROR oslo_messaging.rpc.dispatcher result = func(ctxt, **new_args) 2015-08-07 17:31:38.731 13318 ERROR oslo_messaging.rpc.dispatcher File "/opt/stack/new/nova/nova/exception.py", line 89, in wrapped 2015-08-07 17:31:38.731 13318 ERROR oslo_messaging.rpc.dispatcher payload) 2015-08-07 17:31:38.731 13318 ERROR oslo_messaging.rpc.dispatcher File "/usr/local/lib/python2.7/dist-packages/oslo_utils/excutils.py", line 119, in __exit__ 2015-08-07 17:31:38.731 13318 ERROR oslo_messaging.rpc.dispatcher six.reraise(self.type_, self.value, self.tb) 2015-08-07 17:31:38.731 13318 ERROR oslo_messaging.rpc.dispatcher File "/opt/stack/new/nova/nova/exception.py", line 72, in wrapped 2015-08-07 17:31:38.731 13318 ERROR oslo_messaging.rpc.dispatcher return f(self, context, *args, **kw) 2015-08-07 17:31:38.731 13318 ERROR oslo_messaging.rpc.dispatcher File "/opt/stack/new/nova/nova/compute/manager.py", line 336, in decorated_function 2015-08-07 17:31:38.731 13318 ERROR oslo_messaging.rpc.dispatcher LOG.warning(msg, e, 
instance_uuid=instance_uuid) 2015-08-07 17:31:38.731 13318 ERROR oslo_messaging.rpc.dispatcher File "/usr/local/lib/python2.7/dist-packages/oslo_utils/excutils.py", line 119, in __exit__ 2015-08-07 17:31:38.731 13318 ERROR oslo_messaging.rpc.dispatcher six.reraise(self.type_, self.value, self.tb) 2015-08-07 17:31:38.731 13318 ERROR oslo_messaging.rpc.dispatcher File "/opt/stack/new/nova/nova/compute/manager.py", line 307, in decorated_function 2015-08-07 17:31:38.731 13318 ERROR oslo_messaging.rpc.dispatcher return function(self, context, *args, **kwargs) 2015-08-07 17:31:38.731 13318 ERROR oslo_messaging.rpc.dispatcher File "/opt/stack/new/nova/nova/compute/manager.py", line 364, in decorated_function 2015-08-07 17:31:38.731 13318 ERROR oslo_messaging.rpc.dispatcher kwargs['instance'], e, sys.exc_info()) 2015-08-07 17:31:38.731 13318 ERROR oslo_messaging.rpc.dispatcher File "/usr/local/lib/python2.7/dist-packages/oslo_utils/excutils.py", line 119, in __exit__ 2015-08-07 17:31:38.731 13318 ERROR oslo_messaging.rpc.dispatcher six.reraise(self.type_, self.value, self.tb) 2015-08-07 17:31:38.731 13318 ERROR oslo_messaging.rpc.dispatcher File "/opt/stack/new/nova/nova/compute/manager.py", line 352, in decorated_function 2015-08-07 17:31:38.731 13318 ERROR oslo_messaging.rpc.dispatcher return function(self, context, *args, **kwargs) 2015-08-07 17:31:38.731 13318 ERROR oslo_messaging.rpc.dispatcher File "/opt/stack/new/nova/nova/compute/manager.py", line 412, in decorated_function 2015-08-07 17:31:38.731 13318 ERROR oslo_messaging.rpc.dispatcher instance=instance) 2015-08-07 17:31:38.731 13318 ERROR oslo_messaging.rpc.dispatcher File "/usr/local/lib/python2.7/dist-packages/oslo_utils/excutils.py", line 119, in __exit__ 2015-08-07 17:31:38.731 13318 ERROR oslo_messaging.rpc.dispatcher six.reraise(self.type_, self.value, self.tb) 2015-08-07 17:31:38.731 13318 ERROR oslo_messaging.rpc.dispatcher File "/opt/stack/new/nova/nova/compute/manager.py", line 402, in decorated_function 2015-08-07 17:31:38.731 13318 ERROR oslo_messaging.rpc.dispatcher *args, **kwargs) 2015-08-07 17:31:38.731 13318 ERROR oslo_messaging.rpc.dispatcher File "/opt/stack/new/nova/nova/compute/manager.py", line 2904, in snapshot_instance 2015-08-07 17:31:38.731 13318 ERROR oslo_messaging.rpc.dispatcher task_states.IMAGE_SNAPSHOT) 2015-08-07 17:31:38.731 13318 ERROR oslo_messaging.rpc.dispatcher File "/opt/stack/new/nova/nova/compute/manager.py", line 2934, in _snapshot_instance 2015-08-07 17:31:38.731 13318 ERROR oslo_messaging.rpc.dispatcher update_task_state) 2015-08-07 17:31:38.731 13318 ERROR oslo_messaging.rpc.dispatcher File "/opt/stack/new/nova/nova/virt/xenapi/driver.py", line 218, in snapshot 2015-08-07 17:31:38.731 13318 ERROR oslo_messaging.rpc.dispatcher self._vmops.snapshot(context, instance, image_id, update_task_state) 2015-08-07 17:31:38.731 13318 ERROR oslo_messaging.rpc.dispatcher File "/opt/stack/new/nova/nova/virt/xenapi/vmops.py", line 887, in snapshot 2015-08-07 17:31:38.731 13318 ERROR oslo_messaging.rpc.dispatcher vdi_uuids, 2015-08-07 17:31:38.731 13318 ERROR oslo_messaging.rpc.dispatcher File "/opt/stack/new/nova/nova/virt/xenapi/image/glance.py", line 92, in upload_image 2015-08-07 17:31:38.731 13318 ERROR oslo_messaging.rpc.dispatcher 'upload_vhd', params) 2015-08-07 17:31:38.731 13318 ERROR oslo_messaging.rpc.dispatcher File "/opt/stack/new/nova/nova/virt/xenapi/image/glance.py", line 54, in _call_glance_plugin 2015-08-07 17:31:38.731 13318 ERROR oslo_messaging.rpc.dispatcher 'glance', fn, 
CONF.glance.num_retries, pick_glance, cb, **params) 2015-08-07 17:31:38.731 13318 ERROR oslo_messaging.rpc.dispatcher File "/opt/stack/new/nova/nova/virt/xenapi/client/session.py", line 246, in call_plugin_serialized_with_retry 2015-08-07 17:31:38.731 13318 ERROR oslo_messaging.rpc.dispatcher return self.call_plugin_serialized(plugin, fn, *args, **kwargs) 2015-08-07 17:31:38.731 13318 ERROR oslo_messaging.rpc.dispatcher File "/opt/stack/new/nova/nova/virt/xenapi/client/session.py", line 221, in call_plugin_serialized 2015-08-07 17:31:38.731 13318 ERROR oslo_messaging.rpc.dispatcher rv = self.call_plugin(plugin, fn, params) 2015-08-07 17:31:38.731 13318 ERROR oslo_messaging.rpc.dispatcher File "/opt/stack/new/nova/nova/virt/xenapi/client/session.py", line 217, in call_plugin 2015-08-07 17:31:38.731 13318 ERROR oslo_messaging.rpc.dispatcher self.host_ref, plugin, fn, args) 2015-08-07 17:31:38.731 13318 ERROR oslo_messaging.rpc.dispatcher File "/opt/stack/new/nova/nova/virt/xenapi/client/session.py", line 291, in _unwrap_plugin_exceptions 2015-08-07 17:31:38.731 13318 ERROR oslo_messaging.rpc.dispatcher return func(*args, **kwargs) 2015-08-07 17:31:38.731 13318 ERROR oslo_messaging.rpc.dispatcher File "/usr/local/lib/python2.7/dist-packages/XenAPI.py", line 229, in __call__ 2015-08-07 17:31:38.731 13318 ERROR oslo_messaging.rpc.dispatcher return self.__send(self.__name, args) 2015-08-07 17:31:38.731 13318 ERROR oslo_messaging.rpc.dispatcher File "/usr/local/lib/python2.7/dist-packages/XenAPI.py", line 133, in xenapi_request 2015-08-07 17:31:38.731 13318 ERROR oslo_messaging.rpc.dispatcher result = _parse_result(getattr(self, methodname)(*full_params)) 2015-08-07 17:31:38.731 13318 ERROR oslo_messaging.rpc.dispatcher File "/usr/local/lib/python2.7/dist-packages/XenAPI.py", line 203, in _parse_result 2015-08-07 17:31:38.731 13318 ERROR oslo_messaging.rpc.dispatcher raise Failure(result['ErrorDescription']) 2015-08-07 17:31:38.731 13318 ERROR oslo_messaging.rpc.dispatcher Failure: ['XENAPI_PLUGIN_FAILURE', 'upload_vhd', 'PluginError', 'Got Permanent Error response [404] while uploading image [a19c924d-50f7-4649-a78b-ad80543ba8d9] to glance host [192.168.33.1:9292]'] 2015-08-07 17:31:38.731 13318 ERROR oslo_messaging.rpc.dispatcher 2015-08-07 17:31:39.335 DEBUG nova.virt.xenapi.vmops [req-a42964a6-58d3-4315-809c-571826dbe28f tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: d28988ae-6848-475f-b43e-bc63166c3c1e] Destroying VDIs _destroy_vdis /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1428 2015-08-07 17:31:39.351 DEBUG nova.virt.xenapi.vm_utils [req-a42964a6-58d3-4315-809c-571826dbe28f tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] VDI aae6b95d-a39d-43ec-bf4e-df797b00f53a is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:31:39.369 DEBUG nova.virt.xenapi.vm_utils [req-a42964a6-58d3-4315-809c-571826dbe28f tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] VDI 9cfcd902-1e94-4ed6-87f0-2a226fa3ca48 is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:31:39.604 DEBUG oslo_concurrency.lockutils [req-29dc718d-1ea9-4075-87a4-6716219c5f92 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] Lock "e4c599da-3cac-4bfb-a1b5-2b8828fdb0b9" acquired by "do_terminate_instance" :: waited 0.000s inner 
/usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:31:39.604 DEBUG oslo_concurrency.lockutils [req-29dc718d-1ea9-4075-87a4-6716219c5f92 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] Lock "e4c599da-3cac-4bfb-a1b5-2b8828fdb0b9-events" acquired by "_clear_events" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:31:39.605 DEBUG oslo_concurrency.lockutils [req-29dc718d-1ea9-4075-87a4-6716219c5f92 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] Lock "e4c599da-3cac-4bfb-a1b5-2b8828fdb0b9-events" released by "_clear_events" :: held 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:31:39.607 INFO nova.compute.manager [req-29dc718d-1ea9-4075-87a4-6716219c5f92 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] [instance: e4c599da-3cac-4bfb-a1b5-2b8828fdb0b9] Terminating instance 2015-08-07 17:31:39.609 INFO nova.virt.xenapi.vmops [req-29dc718d-1ea9-4075-87a4-6716219c5f92 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] [instance: e4c599da-3cac-4bfb-a1b5-2b8828fdb0b9] Destroying VM 2015-08-07 17:31:39.617 DEBUG nova.virt.xenapi.vm_utils [req-29dc718d-1ea9-4075-87a4-6716219c5f92 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] [instance: e4c599da-3cac-4bfb-a1b5-2b8828fdb0b9] Shutting down VM (hard) hard_shutdown_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:348 2015-08-07 17:31:40.520 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:31:40.521 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 3.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:31:41.343 DEBUG nova.virt.xenapi.vmops [req-a42964a6-58d3-4315-809c-571826dbe28f tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: d28988ae-6848-475f-b43e-bc63166c3c1e] Using RAW or VHD, skipping kernel and ramdisk deletion _destroy_kernel_ramdisk /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1456 2015-08-07 17:31:41.361 DEBUG nova.virt.xenapi.vm_utils [req-a42964a6-58d3-4315-809c-571826dbe28f tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: d28988ae-6848-475f-b43e-bc63166c3c1e] VM destroyed destroy_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:324 2015-08-07 17:31:41.362 DEBUG nova.compute.manager [req-a42964a6-58d3-4315-809c-571826dbe28f tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: d28988ae-6848-475f-b43e-bc63166c3c1e] Deallocating network for instance _deallocate_network /opt/stack/new/nova/nova/compute/manager.py:1792 2015-08-07 17:31:43.231 DEBUG nova.virt.xenapi.vmops [req-29dc718d-1ea9-4075-87a4-6716219c5f92 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] [instance: e4c599da-3cac-4bfb-a1b5-2b8828fdb0b9] Destroying VDIs _destroy_vdis /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1428 2015-08-07 17:31:43.244 DEBUG 
nova.virt.xenapi.vm_utils [req-29dc718d-1ea9-4075-87a4-6716219c5f92 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] VDI b5dbc5ef-22f1-4273-a6ba-b40b27592f15 is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:31:43.259 DEBUG nova.virt.xenapi.vm_utils [req-29dc718d-1ea9-4075-87a4-6716219c5f92 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] VDI c2c80742-13a5-43d7-9dc6-400ff3d024e6 is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:31:43.299 DEBUG nova.compute.manager [req-a42964a6-58d3-4315-809c-571826dbe28f tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: d28988ae-6848-475f-b43e-bc63166c3c1e] terminating bdm BlockDeviceMapping(boot_index=0,connection_info=None,created_at=2015-08-07T17:30:51Z,delete_on_termination=True,deleted=False,deleted_at=None,destination_type='local',device_name='/dev/xvda',device_type='disk',disk_bus=None,guest_format=None,id=32,image_id='a69d8d55-1745-492f-be26-cc75d64fc94d',instance=,instance_uuid=d28988ae-6848-475f-b43e-bc63166c3c1e,no_device=False,snapshot_id=None,source_type='image',updated_at=2015-08-07T17:30:53Z,volume_id=None,volume_size=None) _cleanup_volumes /opt/stack/new/nova/nova/compute/manager.py:2272 2015-08-07 17:31:43.508 DEBUG oslo_concurrency.lockutils [req-a42964a6-58d3-4315-809c-571826dbe28f tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "compute_resources" acquired by "update_usage" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:31:43.509 DEBUG nova.objects.instance [req-a42964a6-58d3-4315-809c-571826dbe28f tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lazy-loading `numa_topology' on Instance uuid d28988ae-6848-475f-b43e-bc63166c3c1e obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:31:43.521 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:31:43.561 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Auditing locally available compute resources for node localhost.localdomain 2015-08-07 17:31:43.562 DEBUG nova.virt.xenapi.host [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Updating host stats update_status /opt/stack/new/nova/nova/virt/xenapi/host.py:234 2015-08-07 17:31:43.625 DEBUG oslo_concurrency.lockutils [req-a42964a6-58d3-4315-809c-571826dbe28f tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "compute_resources" released by "update_usage" :: held 0.116s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:31:43.863 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" acquired by "do_scan" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:31:43.864 DEBUG nova.virt.xenapi.vm_utils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Scanning SR OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99 do_scan /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1839 
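Note: the paired 'Lock "sr-scan-OpaqueRef:..." acquired by "do_scan" / released by "do_scan"' lines throughout this log come from oslo.concurrency's synchronized() decorator (the "inner" wrapper named in the lockutils.py:251/262 messages): XenAPI SR rescans are serialized per storage repository so concurrent spawn, snapshot and audit requests queue on one named semaphore instead of scanning the same SR in parallel. A minimal sketch of that pattern, not nova's actual code; the session object and its call_xenapi helper are assumed here:

    from oslo_concurrency import lockutils

    def scan_sr(session, sr_ref):
        # One scan per storage repository at a time; every caller using the
        # same "sr-scan-<sr_ref>" name waits on the same in-process semaphore,
        # which is what produces the acquired/waited and released/held lines.
        @lockutils.synchronized('sr-scan-%s' % sr_ref)
        def do_scan():
            session.call_xenapi('SR.scan', sr_ref)  # assumed session helper
        do_scan()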
2015-08-07 17:31:43.973 DEBUG oslo_concurrency.lockutils [req-a42964a6-58d3-4315-809c-571826dbe28f tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "d28988ae-6848-475f-b43e-bc63166c3c1e" released by "do_terminate_instance" :: held 9.137s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:31:44.115 DEBUG nova.virt.xenapi.vmops [req-29dc718d-1ea9-4075-87a4-6716219c5f92 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] [instance: e4c599da-3cac-4bfb-a1b5-2b8828fdb0b9] Using RAW or VHD, skipping kernel and ramdisk deletion _destroy_kernel_ramdisk /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1456 2015-08-07 17:31:44.131 DEBUG nova.virt.xenapi.vm_utils [req-29dc718d-1ea9-4075-87a4-6716219c5f92 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] [instance: e4c599da-3cac-4bfb-a1b5-2b8828fdb0b9] VM destroyed destroy_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:324 2015-08-07 17:31:44.131 DEBUG nova.compute.manager [req-29dc718d-1ea9-4075-87a4-6716219c5f92 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] [instance: e4c599da-3cac-4bfb-a1b5-2b8828fdb0b9] Deallocating network for instance _deallocate_network /opt/stack/new/nova/nova/compute/manager.py:1792 2015-08-07 17:31:45.019 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.84 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:31:46.030 DEBUG nova.compute.manager [req-29dc718d-1ea9-4075-87a4-6716219c5f92 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] [instance: e4c599da-3cac-4bfb-a1b5-2b8828fdb0b9] terminating bdm BlockDeviceMapping(boot_index=0,connection_info=None,created_at=2015-08-07T17:30:51Z,delete_on_termination=True,deleted=False,deleted_at=None,destination_type='local',device_name='/dev/xvda',device_type='disk',disk_bus=None,guest_format=None,id=31,image_id='a69d8d55-1745-492f-be26-cc75d64fc94d',instance=,instance_uuid=e4c599da-3cac-4bfb-a1b5-2b8828fdb0b9,no_device=False,snapshot_id=None,source_type='image',updated_at=2015-08-07T17:30:53Z,volume_id=None,volume_size=None) _cleanup_volumes /opt/stack/new/nova/nova/compute/manager.py:2272 2015-08-07 17:31:46.258 DEBUG oslo_concurrency.lockutils [req-29dc718d-1ea9-4075-87a4-6716219c5f92 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] Lock "compute_resources" acquired by "update_usage" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:31:46.259 DEBUG nova.objects.instance [req-29dc718d-1ea9-4075-87a4-6716219c5f92 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] Lazy-loading `numa_topology' on Instance uuid e4c599da-3cac-4bfb-a1b5-2b8828fdb0b9 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:31:46.352 DEBUG oslo_concurrency.lockutils [req-29dc718d-1ea9-4075-87a4-6716219c5f92 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] Lock "compute_resources" released by "update_usage" :: held 0.093s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:31:46.695 DEBUG oslo_concurrency.lockutils [req-29dc718d-1ea9-4075-87a4-6716219c5f92 
tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] Lock "e4c599da-3cac-4bfb-a1b5-2b8828fdb0b9" released by "do_terminate_instance" :: held 7.092s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:31:46.723 DEBUG oslo_concurrency.lockutils [req-7ececb7c-0a84-4490-b38a-c5bc5d411398 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "47eca748-3e73-4f2b-80e9-a3e058bbd8a4" acquired by "_locked_do_build_and_run_instance" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:31:46.764 INFO nova.compute.manager [req-7ececb7c-0a84-4490-b38a-c5bc5d411398 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 47eca748-3e73-4f2b-80e9-a3e058bbd8a4] Starting instance... 2015-08-07 17:31:47.238 DEBUG oslo_concurrency.lockutils [req-7ececb7c-0a84-4490-b38a-c5bc5d411398 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "compute_resources" acquired by "instance_claim" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:31:47.239 DEBUG nova.compute.resource_tracker [req-7ececb7c-0a84-4490-b38a-c5bc5d411398 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Memory overhead for 64 MB instance; 5 MB instance_claim /opt/stack/new/nova/nova/compute/resource_tracker.py:126 2015-08-07 17:31:47.246 INFO nova.compute.claims [req-7ececb7c-0a84-4490-b38a-c5bc5d411398 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 47eca748-3e73-4f2b-80e9-a3e058bbd8a4] Attempting claim: memory 69 MB, disk 0 GB 2015-08-07 17:31:47.247 INFO nova.compute.claims [req-7ececb7c-0a84-4490-b38a-c5bc5d411398 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 47eca748-3e73-4f2b-80e9-a3e058bbd8a4] Total memory: 8187 MB, used: 715.00 MB 2015-08-07 17:31:47.247 INFO nova.compute.claims [req-7ececb7c-0a84-4490-b38a-c5bc5d411398 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 47eca748-3e73-4f2b-80e9-a3e058bbd8a4] memory limit: 12280.50 MB, free: 11565.50 MB 2015-08-07 17:31:47.248 INFO nova.compute.claims [req-7ececb7c-0a84-4490-b38a-c5bc5d411398 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 47eca748-3e73-4f2b-80e9-a3e058bbd8a4] Total disk: 27 GB, used: 0.00 GB 2015-08-07 17:31:47.248 INFO nova.compute.claims [req-7ececb7c-0a84-4490-b38a-c5bc5d411398 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 47eca748-3e73-4f2b-80e9-a3e058bbd8a4] disk limit not specified, defaulting to unlimited 2015-08-07 17:31:47.274 DEBUG nova.compute.resources.vcpu [req-7ececb7c-0a84-4490-b38a-c5bc5d411398 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Total CPUs: 8 VCPUs, used: 2.00 VCPUs test /opt/stack/new/nova/nova/compute/resources/vcpu.py:52 2015-08-07 17:31:47.275 DEBUG nova.compute.resources.vcpu [req-7ececb7c-0a84-4490-b38a-c5bc5d411398 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] CPUs limit not specified, defaulting to unlimited test /opt/stack/new/nova/nova/compute/resources/vcpu.py:56 2015-08-07 17:31:47.275 INFO nova.compute.claims [req-7ececb7c-0a84-4490-b38a-c5bc5d411398 tempest-DeleteServersTestJSON-1440721679 
tempest-DeleteServersTestJSON-285208709] [instance: 47eca748-3e73-4f2b-80e9-a3e058bbd8a4] Claim successful 2015-08-07 17:31:47.641 DEBUG oslo_concurrency.lockutils [req-7ececb7c-0a84-4490-b38a-c5bc5d411398 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "compute_resources" released by "instance_claim" :: held 0.403s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:31:47.927 DEBUG oslo_concurrency.lockutils [req-7ececb7c-0a84-4490-b38a-c5bc5d411398 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "compute_resources" acquired by "update_usage" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:31:48.048 DEBUG oslo_concurrency.lockutils [req-7ececb7c-0a84-4490-b38a-c5bc5d411398 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "compute_resources" released by "update_usage" :: held 0.121s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:31:48.050 DEBUG nova.compute.utils [req-7ececb7c-0a84-4490-b38a-c5bc5d411398 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Using /dev/xvd instead of None get_next_device_name /opt/stack/new/nova/nova/compute/utils.py:163 2015-08-07 17:31:48.054 13318 DEBUG nova.compute.manager [-] [instance: 47eca748-3e73-4f2b-80e9-a3e058bbd8a4] Allocating IP information in the background. _allocate_network_async /opt/stack/new/nova/nova/compute/manager.py:1513 2015-08-07 17:31:48.056 13318 DEBUG oslo_concurrency.lockutils [-] Acquired semaphore "refresh_cache-47eca748-3e73-4f2b-80e9-a3e058bbd8a4" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:197 2015-08-07 17:31:48.647 DEBUG nova.virt.xenapi.vmops [req-7ececb7c-0a84-4490-b38a-c5bc5d411398 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 47eca748-3e73-4f2b-80e9-a3e058bbd8a4] Block device information present: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} spawn /opt/stack/new/nova/nova/virt/xenapi/vmops.py:408 2015-08-07 17:31:48.671 DEBUG nova.virt.xenapi.vm_utils [req-7ececb7c-0a84-4490-b38a-c5bc5d411398 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Detected vhd format for image a69d8d55-1745-492f-be26-cc75d64fc94d determine_disk_image_type /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1633 2015-08-07 17:31:48.672 DEBUG nova.virt.xenapi.vmops [req-7ececb7c-0a84-4490-b38a-c5bc5d411398 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 47eca748-3e73-4f2b-80e9-a3e058bbd8a4] Updating progress to 10 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:31:48.768 DEBUG oslo_concurrency.lockutils [req-5a6e39cb-ee52-4b85-9df9-d6327a039d33 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] Lock "1b9e1564-c8e1-4966-8922-0b8bbf38f0f8" acquired by "_locked_do_build_and_run_instance" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:31:48.842 INFO nova.compute.manager [req-5a6e39cb-ee52-4b85-9df9-d6327a039d33 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] [instance: 1b9e1564-c8e1-4966-8922-0b8bbf38f0f8] Starting instance... 
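Note: the claim sequence for instance 47eca748-3e73-4f2b-80e9-a3e058bbd8a4 above is internally consistent: the 69 MB being claimed is the 64 MB flavor RAM plus the 5 MB driver-reported overhead, the 12280.50 MB memory limit is the 8187 MB of host RAM scaled by an allocation ratio of 1.5 (assumed here to be nova's default ram_allocation_ratio), and the 11565.50 MB free figure is that limit minus the 715 MB already in use. A small worked check with the values copied from the log:

    # Values copied from the claim log lines above; the 1.5 allocation ratio
    # is an assumption based on nova's long-standing default.
    flavor_ram_mb = 64
    overhead_mb = 5
    host_ram_mb = 8187
    used_ram_mb = 715.0
    ram_allocation_ratio = 1.5

    claim_mb = flavor_ram_mb + overhead_mb         # 69      -> "Attempting claim: memory 69 MB"
    limit_mb = host_ram_mb * ram_allocation_ratio  # 12280.5 -> "memory limit: 12280.50 MB"
    free_mb = limit_mb - used_ram_mb               # 11565.5 -> "free: 11565.50 MB"

    assert claim_mb <= free_mb                     # hence "Claim successful"

The second claim, for instance 1b9e1564-c8e1-4966-8922-0b8bbf38f0f8 a couple of seconds later, follows the same arithmetic with used memory bumped to 784 MB (715 + 69).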
2015-08-07 17:31:48.967 DEBUG nova.virt.xenapi.vmops [req-7ececb7c-0a84-4490-b38a-c5bc5d411398 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 47eca748-3e73-4f2b-80e9-a3e058bbd8a4] block device info: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} _connect_cinder_volumes /opt/stack/new/nova/nova/virt/xenapi/vmops.py:376 2015-08-07 17:31:48.976 DEBUG oslo_concurrency.lockutils [req-7ececb7c-0a84-4490-b38a-c5bc5d411398 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" acquired by "_create_cached_image_impl" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:31:49.131 DEBUG oslo_concurrency.lockutils [req-5a6e39cb-ee52-4b85-9df9-d6327a039d33 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] Lock "compute_resources" acquired by "instance_claim" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:31:49.132 DEBUG nova.compute.resource_tracker [req-5a6e39cb-ee52-4b85-9df9-d6327a039d33 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] Memory overhead for 64 MB instance; 5 MB instance_claim /opt/stack/new/nova/nova/compute/resource_tracker.py:126 2015-08-07 17:31:49.141 INFO nova.compute.claims [req-5a6e39cb-ee52-4b85-9df9-d6327a039d33 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] [instance: 1b9e1564-c8e1-4966-8922-0b8bbf38f0f8] Attempting claim: memory 69 MB, disk 0 GB 2015-08-07 17:31:49.142 INFO nova.compute.claims [req-5a6e39cb-ee52-4b85-9df9-d6327a039d33 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] [instance: 1b9e1564-c8e1-4966-8922-0b8bbf38f0f8] Total memory: 8187 MB, used: 784.00 MB 2015-08-07 17:31:49.142 INFO nova.compute.claims [req-5a6e39cb-ee52-4b85-9df9-d6327a039d33 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] [instance: 1b9e1564-c8e1-4966-8922-0b8bbf38f0f8] memory limit: 12280.50 MB, free: 11496.50 MB 2015-08-07 17:31:49.143 INFO nova.compute.claims [req-5a6e39cb-ee52-4b85-9df9-d6327a039d33 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] [instance: 1b9e1564-c8e1-4966-8922-0b8bbf38f0f8] Total disk: 27 GB, used: 0.00 GB 2015-08-07 17:31:49.143 INFO nova.compute.claims [req-5a6e39cb-ee52-4b85-9df9-d6327a039d33 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] [instance: 1b9e1564-c8e1-4966-8922-0b8bbf38f0f8] disk limit not specified, defaulting to unlimited 2015-08-07 17:31:49.168 DEBUG nova.compute.resources.vcpu [req-5a6e39cb-ee52-4b85-9df9-d6327a039d33 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] Total CPUs: 8 VCPUs, used: 3.00 VCPUs test /opt/stack/new/nova/nova/compute/resources/vcpu.py:52 2015-08-07 17:31:49.168 DEBUG nova.compute.resources.vcpu [req-5a6e39cb-ee52-4b85-9df9-d6327a039d33 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] CPUs limit not specified, defaulting to unlimited test /opt/stack/new/nova/nova/compute/resources/vcpu.py:56 2015-08-07 17:31:49.169 INFO nova.compute.claims 
[req-5a6e39cb-ee52-4b85-9df9-d6327a039d33 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] [instance: 1b9e1564-c8e1-4966-8922-0b8bbf38f0f8] Claim successful 2015-08-07 17:31:49.458 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" released by "do_scan" :: held 5.595s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:31:49.540 DEBUG oslo_concurrency.lockutils [req-5a6e39cb-ee52-4b85-9df9-d6327a039d33 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] Lock "compute_resources" released by "instance_claim" :: held 0.409s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:31:49.705 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor: free VCPUs: 2 _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:524 2015-08-07 17:31:49.705 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor: assignable PCI devices: [] _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:532 2015-08-07 17:31:49.706 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor/Node resource view: name=localhost.localdomain free_ram=989MB free_disk=16GB free_vcpus=2 pci_devices=[] _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:546 2015-08-07 17:31:49.706 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "compute_resources" acquired by "_update_available_resource" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:31:50.208 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Updating from migration 8e5ee78c-20e2-4483-ab75-b109bb2fdca6 2015-08-07 17:31:50.209 DEBUG nova.objects.instance [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lazy-loading `old_flavor' on Instance uuid 8e5ee78c-20e2-4483-ab75-b109bb2fdca6 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:31:50.391 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Total usable vcpus: 8, total allocated vcpus: 6 2015-08-07 17:31:50.391 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Final resource view: name=localhost.localdomain phys_ram=8187MB used_ram=853MB phys_disk=27GB used_disk=0GB total_vcpus=8 used_vcpus=6 pci_stats=None 2015-08-07 17:31:50.462 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Compute_service record updated for devstack:localhost.localdomain 2015-08-07 17:31:50.462 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "compute_resources" released by "_update_available_resource" :: held 0.756s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:31:50.463 DEBUG oslo_concurrency.lockutils [req-5a6e39cb-ee52-4b85-9df9-d6327a039d33 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] Lock "compute_resources" acquired by "update_usage" :: waited 0.680s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 
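Note: the periodic-task chatter interleaved through this log ("Running periodic task ComputeManager._..." from periodic_task.py:213, followed by a dynamic interval looping call sleeping for a computed number of seconds, plus the fixed interval looping call entries) is oslo.service machinery: a PeriodicTasks manager runs whichever tasks are due, returns how long it may idle, and a DynamicLoopingCall sleeps for exactly that long before the next pass. A minimal sketch of that wiring, not nova's actual service plumbing; the Manager, _poll_something and run_once names are illustrative:

    from oslo_config import cfg
    from oslo_service import loopingcall, periodic_task

    CONF = cfg.CONF

    class Manager(periodic_task.PeriodicTasks):
        def __init__(self):
            super(Manager, self).__init__(CONF)

        @periodic_task.periodic_task(spacing=10)
        def _poll_something(self, context):
            # Each decorated method appears as a "Running periodic task ..." line.
            pass

    manager = Manager()

    def run_once():
        # Returns the idle time until the next task is due; DynamicLoopingCall
        # logs that value as "sleeping for N seconds" before calling again.
        return manager.run_periodic_tasks(context=None)

    timer = loopingcall.DynamicLoopingCall(run_once)
    timer.start(initial_delay=None, periodic_interval_max=60)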
2015-08-07 17:31:50.465 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 6.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:31:50.560 DEBUG oslo_concurrency.lockutils [req-5a6e39cb-ee52-4b85-9df9-d6327a039d33 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] Lock "compute_resources" released by "update_usage" :: held 0.097s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:31:50.561 DEBUG nova.compute.utils [req-5a6e39cb-ee52-4b85-9df9-d6327a039d33 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] Using /dev/xvd instead of None get_next_device_name /opt/stack/new/nova/nova/compute/utils.py:163 2015-08-07 17:31:50.568 13318 DEBUG nova.compute.manager [-] [instance: 1b9e1564-c8e1-4966-8922-0b8bbf38f0f8] Allocating IP information in the background. _allocate_network_async /opt/stack/new/nova/nova/compute/manager.py:1513 2015-08-07 17:31:50.570 13318 DEBUG oslo_concurrency.lockutils [-] Acquired semaphore "refresh_cache-1b9e1564-c8e1-4966-8922-0b8bbf38f0f8" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:197 2015-08-07 17:31:50.629 13318 DEBUG nova.network.base_api [-] [instance: 47eca748-3e73-4f2b-80e9-a3e058bbd8a4] Updating instance_info_cache with network_info: [VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.2'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:47:30:9f', 'active': False, 'type': u'bridge', 'id': u'f2216b92-5386-455e-898f-3d47073ad4b2', 'qbg_params': None})] update_instance_cache_with_nw_info /opt/stack/new/nova/nova/network/base_api.py:43 2015-08-07 17:31:50.667 13318 DEBUG oslo_concurrency.lockutils [-] Releasing semaphore "refresh_cache-47eca748-3e73-4f2b-80e9-a3e058bbd8a4" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:210 2015-08-07 17:31:50.667 13318 DEBUG nova.compute.manager [-] [instance: 47eca748-3e73-4f2b-80e9-a3e058bbd8a4] Instance network_info: |[VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.2'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': 
[], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:47:30:9f', 'active': False, 'type': u'bridge', 'id': u'f2216b92-5386-455e-898f-3d47073ad4b2', 'qbg_params': None})]| _allocate_network_async /opt/stack/new/nova/nova/compute/manager.py:1531 2015-08-07 17:31:51.725 DEBUG nova.virt.xenapi.vmops [req-5a6e39cb-ee52-4b85-9df9-d6327a039d33 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] [instance: 1b9e1564-c8e1-4966-8922-0b8bbf38f0f8] Block device information present: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} spawn /opt/stack/new/nova/nova/virt/xenapi/vmops.py:408 2015-08-07 17:31:51.739 DEBUG nova.virt.xenapi.vm_utils [req-5a6e39cb-ee52-4b85-9df9-d6327a039d33 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] Detected vhd format for image a69d8d55-1745-492f-be26-cc75d64fc94d determine_disk_image_type /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1633 2015-08-07 17:31:51.740 DEBUG nova.virt.xenapi.vmops [req-5a6e39cb-ee52-4b85-9df9-d6327a039d33 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] [instance: 1b9e1564-c8e1-4966-8922-0b8bbf38f0f8] Updating progress to 10 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:31:52.049 DEBUG nova.virt.xenapi.vm_utils [req-9c553d3d-492d-47a0-966f-8fac0ea40f0a tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Created VDI OpaqueRef:d58b6ba3-161c-b833-0230-d4971d64d49b (config-2, 67108864, False) on OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99. create_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:509 2015-08-07 17:31:52.054 DEBUG nova.virt.xenapi.vm_utils [req-9c553d3d-492d-47a0-966f-8fac0ea40f0a tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Creating disk-type VBD for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:d58b6ba3-161c-b833-0230-d4971d64d49b ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:31:52.072 DEBUG nova.virt.xenapi.vm_utils [req-9c553d3d-492d-47a0-966f-8fac0ea40f0a tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Created VBD OpaqueRef:62ffd5fe-6646-dc4a-14d6-9dc834b6adde for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:d58b6ba3-161c-b833-0230-d4971d64d49b. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:31:52.073 DEBUG nova.virt.xenapi.vm_utils [req-9c553d3d-492d-47a0-966f-8fac0ea40f0a tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Plugging VBD OpaqueRef:62ffd5fe-6646-dc4a-14d6-9dc834b6adde ... 
vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2191 2015-08-07 17:31:52.074 DEBUG oslo_concurrency.lockutils [req-9c553d3d-492d-47a0-966f-8fac0ea40f0a tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_plug" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:31:52.120 DEBUG nova.virt.xenapi.vmops [req-5a6e39cb-ee52-4b85-9df9-d6327a039d33 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] [instance: 1b9e1564-c8e1-4966-8922-0b8bbf38f0f8] block device info: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} _connect_cinder_volumes /opt/stack/new/nova/nova/virt/xenapi/vmops.py:376 2015-08-07 17:31:53.620 13318 DEBUG nova.network.base_api [-] [instance: 1b9e1564-c8e1-4966-8922-0b8bbf38f0f8] Updating instance_info_cache with network_info: [VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.4'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:ad:3a:bb', 'active': False, 'type': u'bridge', 'id': u'fcb0663a-6865-4ad1-a7d6-df79402b8d3f', 'qbg_params': None})] update_instance_cache_with_nw_info /opt/stack/new/nova/nova/network/base_api.py:43 2015-08-07 17:31:53.653 13318 DEBUG oslo_concurrency.lockutils [-] Releasing semaphore "refresh_cache-1b9e1564-c8e1-4966-8922-0b8bbf38f0f8" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:210 2015-08-07 17:31:53.653 13318 DEBUG nova.compute.manager [-] [instance: 1b9e1564-c8e1-4966-8922-0b8bbf38f0f8] Instance network_info: |[VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.4'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 
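The recurring 'acquired by "synchronized_plug" :: waited' / 'released ... :: held' pairs come from oslo.concurrency's named-lock wrapper, which serializes callers on a lock key (here "xenapi-vbd-<vm ref>") and logs how long each caller waited and held it. A minimal sketch of the same pattern; the lock name and function below are illustrative, not the ones Nova uses:

    from oslo_concurrency import lockutils

    # All callers of plug_vbd() for the same VM ref are serialized on one
    # named lock, the way synchronized_plug guards "xenapi-vbd-..." above.
    @lockutils.synchronized('xenapi-vbd-example-vm-ref')
    def plug_vbd():
        # ... the XenAPI VBD.plug call would go here ...
        pass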
'details': {}, 'address': u'fa:16:3e:ad:3a:bb', 'active': False, 'type': u'bridge', 'id': u'fcb0663a-6865-4ad1-a7d6-df79402b8d3f', 'qbg_params': None})]| _allocate_network_async /opt/stack/new/nova/nova/compute/manager.py:1531 2015-08-07 17:31:53.695 DEBUG oslo_concurrency.lockutils [req-9c553d3d-492d-47a0-966f-8fac0ea40f0a tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_plug" :: held 1.621s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:31:53.696 DEBUG nova.virt.xenapi.vm_utils [req-9c553d3d-492d-47a0-966f-8fac0ea40f0a tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Plugging VBD OpaqueRef:62ffd5fe-6646-dc4a-14d6-9dc834b6adde done. vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2194 2015-08-07 17:31:53.700 DEBUG nova.virt.xenapi.vm_utils [req-9c553d3d-492d-47a0-966f-8fac0ea40f0a tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] VBD OpaqueRef:62ffd5fe-6646-dc4a-14d6-9dc834b6adde plugged as xvdc vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2197 2015-08-07 17:31:53.788 WARNING nova.virt.configdrive [req-9c553d3d-492d-47a0-966f-8fac0ea40f0a tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] The setting "always" will be deprecated in the Liberty version. Please use "True" instead 2015-08-07 17:31:53.789 DEBUG nova.objects.instance [req-9c553d3d-492d-47a0-966f-8fac0ea40f0a tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Lazy-loading `ec2_ids' on Instance uuid 8e5ee78c-20e2-4483-ab75-b109bb2fdca6 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:31:53.824 DEBUG oslo_concurrency.processutils [req-9c553d3d-492d-47a0-966f-8fac0ea40f0a tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Running cmd (subprocess): genisoimage -o /tmp/tmpUxNgeb/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmp7dyAnp execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:31:53.925 DEBUG oslo_concurrency.processutils [req-9c553d3d-492d-47a0-966f-8fac0ea40f0a tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] CMD "genisoimage -o /tmp/tmpUxNgeb/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmp7dyAnp" returned: 0 in 0.101s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:31:53.931 DEBUG oslo_concurrency.processutils [req-9c553d3d-492d-47a0-966f-8fac0ea40f0a tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmpUxNgeb/configdrive of=/dev/xvdc oflag=direct,sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:31:54.924 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.94 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:31:56.464 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._instance_usage_audit 
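The config drive sequence above is a two-step build: genisoimage packs a staging directory into an ISO9660 image labelled config-2, and the image is then written onto the plugged VBD's block device through rootwrap dd with oflag=direct,sync. A rough equivalent using oslo.concurrency's process helper; the temp paths and /dev/xvdc are just the values from this log, not fixed names:

    from oslo_concurrency import processutils

    # Build the config-2 ISO from a staging directory (same flags as the log)...
    processutils.execute(
        'genisoimage', '-o', '/tmp/tmpUxNgeb/configdrive',
        '-ldots', '-allow-lowercase', '-allow-multidot', '-l',
        '-publisher', 'OpenStack Nova 12.0.0', '-quiet', '-J', '-r',
        '-V', 'config-2', '/tmp/tmp7dyAnp')

    # ...then copy it onto the attached device, bypassing the page cache.
    processutils.execute(
        'dd', 'if=/tmp/tmpUxNgeb/configdrive', 'of=/dev/xvdc',
        'oflag=direct,sync',
        run_as_root=True,
        root_helper='sudo nova-rootwrap /etc/nova/rootwrap.conf')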
run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:31:56.466 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 34.05 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:31:56.566 DEBUG nova.virt.xenapi.vm_utils [req-7ececb7c-0a84-4490-b38a-c5bc5d411398 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Cloned VDI OpaqueRef:a695ca5b-be52-a283-7b90-f9adf3ac3237 from VDI OpaqueRef:f392c64e-7886-a0cc-d3a0-780674167307 _clone_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:571 2015-08-07 17:31:57.569 DEBUG oslo_concurrency.lockutils [req-7ececb7c-0a84-4490-b38a-c5bc5d411398 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" released by "_create_cached_image_impl" :: held 8.593s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:31:57.570 INFO nova.virt.xenapi.vm_utils [req-7ececb7c-0a84-4490-b38a-c5bc5d411398 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Image creation data, cacheable: True, downloaded: False duration: 8.60 secs for image a69d8d55-1745-492f-be26-cc75d64fc94d 2015-08-07 17:31:57.571 DEBUG oslo_concurrency.lockutils [req-5a6e39cb-ee52-4b85-9df9-d6327a039d33 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" acquired by "_create_cached_image_impl" :: waited 5.431s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:32:00.486 DEBUG nova.virt.xenapi.vmops [req-7ececb7c-0a84-4490-b38a-c5bc5d411398 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 47eca748-3e73-4f2b-80e9-a3e058bbd8a4] Updating progress to 20 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:32:00.534 DEBUG nova.virt.xenapi.vm_utils [req-5a6e39cb-ee52-4b85-9df9-d6327a039d33 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] Cloned VDI OpaqueRef:7d2af9b6-899e-5137-5a26-3a8d84d202ca from VDI OpaqueRef:f392c64e-7886-a0cc-d3a0-780674167307 _clone_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:571 2015-08-07 17:32:00.804 DEBUG nova.virt.xenapi.vmops [req-7ececb7c-0a84-4490-b38a-c5bc5d411398 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 47eca748-3e73-4f2b-80e9-a3e058bbd8a4] Updating progress to 30 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:32:00.921 DEBUG oslo_concurrency.processutils [req-9c553d3d-492d-47a0-966f-8fac0ea40f0a tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmpUxNgeb/configdrive of=/dev/xvdc oflag=direct,sync" returned: 0 in 6.990s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:32:00.923 DEBUG oslo_concurrency.processutils [req-9c553d3d-492d-47a0-966f-8fac0ea40f0a tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf sync execute 
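The "Fixed interval looping call ... sleeping for N seconds" and "Dynamic interval looping call" entries are oslo.service timers: a fixed-interval call re-runs a function on a constant period, while the dynamic variant lets the callee decide the next sleep, which is why the periodic-task runner above sleeps for irregular spans such as 34.05 s. A minimal, self-contained sketch with an illustrative callback (not the callback Nova actually registers):

    from oslo_service import loopingcall

    def report_state():
        # Stands in for whatever the fixed-interval timer drives
        # (in nova-compute, e.g. the service heartbeat).
        pass

    timer = loopingcall.FixedIntervalLoopingCall(report_state)
    timer.start(interval=10)   # produces "...sleeping for ~10 seconds" between runs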
/usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:32:01.088 DEBUG nova.virt.xenapi.vmops [req-7ececb7c-0a84-4490-b38a-c5bc5d411398 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 47eca748-3e73-4f2b-80e9-a3e058bbd8a4] Using PV kernel: True _create_vm_record /opt/stack/new/nova/nova/virt/xenapi/vmops.py:679 2015-08-07 17:32:01.106 DEBUG nova.virt.xenapi.vm_utils [req-7ececb7c-0a84-4490-b38a-c5bc5d411398 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 47eca748-3e73-4f2b-80e9-a3e058bbd8a4] Created VM create_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:312 2015-08-07 17:32:01.107 DEBUG nova.virt.xenapi.vmops [req-7ececb7c-0a84-4490-b38a-c5bc5d411398 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 47eca748-3e73-4f2b-80e9-a3e058bbd8a4] Updating progress to 40 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:32:01.353 DEBUG nova.virt.xenapi.vm_utils [req-7ececb7c-0a84-4490-b38a-c5bc5d411398 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Creating disk-type VBD for VM OpaqueRef:1078428a-a383-5386-4368-32608dafc487, VDI OpaqueRef:a695ca5b-be52-a283-7b90-f9adf3ac3237 ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:32:01.355 DEBUG oslo_concurrency.processutils [req-9c553d3d-492d-47a0-966f-8fac0ea40f0a tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf sync" returned: 0 in 0.432s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:32:01.355 DEBUG nova.virt.xenapi.vm_utils [req-9c553d3d-492d-47a0-966f-8fac0ea40f0a tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Destroying VBD for VDI OpaqueRef:d58b6ba3-161c-b833-0230-d4971d64d49b ... vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2207 2015-08-07 17:32:01.356 DEBUG oslo_concurrency.lockutils [req-9c553d3d-492d-47a0-966f-8fac0ea40f0a tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_unplug" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:32:01.381 DEBUG nova.virt.xenapi.vm_utils [req-7ececb7c-0a84-4490-b38a-c5bc5d411398 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Created VBD OpaqueRef:4e9dfe5e-f206-5942-fb03-3d9a77bf5ca6 for VM OpaqueRef:1078428a-a383-5386-4368-32608dafc487, VDI OpaqueRef:a695ca5b-be52-a283-7b90-f9adf3ac3237. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:32:02.470 DEBUG nova.virt.xenapi.vm_utils [req-7ececb7c-0a84-4490-b38a-c5bc5d411398 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Created VDI OpaqueRef:2a40b85c-9806-5c38-49ba-6a895ed80ff0 (config-2, 67108864, False) on OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99. 
create_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:509 2015-08-07 17:32:02.474 DEBUG nova.virt.xenapi.vm_utils [req-7ececb7c-0a84-4490-b38a-c5bc5d411398 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Creating disk-type VBD for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:2a40b85c-9806-5c38-49ba-6a895ed80ff0 ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:32:02.489 DEBUG nova.virt.xenapi.vm_utils [req-7ececb7c-0a84-4490-b38a-c5bc5d411398 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Created VBD OpaqueRef:cb299fee-043c-2db6-5e5c-9da965d1bc3e for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:2a40b85c-9806-5c38-49ba-6a895ed80ff0. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:32:02.490 DEBUG nova.virt.xenapi.vm_utils [req-7ececb7c-0a84-4490-b38a-c5bc5d411398 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Plugging VBD OpaqueRef:cb299fee-043c-2db6-5e5c-9da965d1bc3e ... vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2191 2015-08-07 17:32:02.643 DEBUG oslo_concurrency.lockutils [req-5a6e39cb-ee52-4b85-9df9-d6327a039d33 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" released by "_create_cached_image_impl" :: held 5.072s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:32:02.644 INFO nova.virt.xenapi.vm_utils [req-5a6e39cb-ee52-4b85-9df9-d6327a039d33 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] Image creation data, cacheable: True, downloaded: False duration: 10.52 secs for image a69d8d55-1745-492f-be26-cc75d64fc94d 2015-08-07 17:32:03.067 DEBUG oslo_concurrency.lockutils [req-9c553d3d-492d-47a0-966f-8fac0ea40f0a tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_unplug" :: held 1.711s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:32:03.068 DEBUG oslo_concurrency.lockutils [req-7ececb7c-0a84-4490-b38a-c5bc5d411398 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_plug" :: waited 0.578s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:32:03.091 DEBUG nova.virt.xenapi.vm_utils [req-9c553d3d-492d-47a0-966f-8fac0ea40f0a tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Destroying VBD for VDI OpaqueRef:d58b6ba3-161c-b833-0230-d4971d64d49b done. vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2215 2015-08-07 17:32:03.092 DEBUG nova.virt.xenapi.vm_utils [req-9c553d3d-492d-47a0-966f-8fac0ea40f0a tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Creating disk-type VBD for VM OpaqueRef:a2afb08c-045c-81f0-c51a-5b5dca50f33e, VDI OpaqueRef:d58b6ba3-161c-b833-0230-d4971d64d49b ... 
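The "xenapi-image-cachea69d8d55-..." lock brackets show the XenAPI image cache at work: the first spawn holds the per-image lock for 8.6 s while it populates the cached VDI, the second spawn waits 5.4 s on the same lock and then only clones the cached copy, hence "downloaded: False". A sketch of the idea with hypothetical callables (download/clone) and a plain dict as the cache:

    from oslo_concurrency import lockutils

    def get_cached_vdi(image_id, cache, download, clone):
        # One lock per Glance image id, mirroring "xenapi-image-cache<image_id>".
        with lockutils.lock('xenapi-image-cache' + image_id):
            if image_id not in cache:
                cache[image_id] = download(image_id)   # first caller pays the download
            return clone(cache[image_id])              # later callers only clone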
create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:32:03.110 DEBUG nova.virt.xenapi.vm_utils [req-9c553d3d-492d-47a0-966f-8fac0ea40f0a tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Created VBD OpaqueRef:91ce3599-d873-0c4a-e0e4-339cf41e427b for VM OpaqueRef:a2afb08c-045c-81f0-c51a-5b5dca50f33e, VDI OpaqueRef:d58b6ba3-161c-b833-0230-d4971d64d49b. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:32:03.111 DEBUG nova.objects.instance [req-9c553d3d-492d-47a0-966f-8fac0ea40f0a tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Lazy-loading `pci_devices' on Instance uuid 8e5ee78c-20e2-4483-ab75-b109bb2fdca6 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:32:03.235 DEBUG oslo_concurrency.lockutils [req-9c553d3d-492d-47a0-966f-8fac0ea40f0a tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Lock "xenstore-8e5ee78c-20e2-4483-ab75-b109bb2fdca6" acquired by "store_meta" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:32:03.236 DEBUG oslo_concurrency.lockutils [req-9c553d3d-492d-47a0-966f-8fac0ea40f0a tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Lock "xenstore-8e5ee78c-20e2-4483-ab75-b109bb2fdca6" released by "store_meta" :: held 0.001s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:32:03.236 DEBUG oslo_concurrency.lockutils [req-9c553d3d-492d-47a0-966f-8fac0ea40f0a tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Lock "xenstore-8e5ee78c-20e2-4483-ab75-b109bb2fdca6" acquired by "store_auto_disk_config" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:32:03.245 DEBUG oslo_concurrency.lockutils [req-9c553d3d-492d-47a0-966f-8fac0ea40f0a tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Lock "xenstore-8e5ee78c-20e2-4483-ab75-b109bb2fdca6" released by "store_auto_disk_config" :: held 0.008s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:32:03.246 DEBUG nova.virt.xenapi.vmops [req-9c553d3d-492d-47a0-966f-8fac0ea40f0a tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] Injecting network info to xenstore inject_network_info /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1869 2015-08-07 17:32:03.247 DEBUG oslo_concurrency.lockutils [req-9c553d3d-492d-47a0-966f-8fac0ea40f0a tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Lock "xenstore-8e5ee78c-20e2-4483-ab75-b109bb2fdca6" acquired by "update_nwinfo" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:32:03.483 DEBUG oslo_concurrency.lockutils [req-9c553d3d-492d-47a0-966f-8fac0ea40f0a tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Lock "xenstore-8e5ee78c-20e2-4483-ab75-b109bb2fdca6" released by "update_nwinfo" :: held 0.236s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:32:03.483 DEBUG nova.virt.xenapi.vmops [req-9c553d3d-492d-47a0-966f-8fac0ea40f0a tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] [instance: 
8e5ee78c-20e2-4483-ab75-b109bb2fdca6] Creating vifs _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1892 2015-08-07 17:32:03.492 DEBUG nova.virt.xenapi.vmops [req-9c553d3d-492d-47a0-966f-8fac0ea40f0a tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] Creating VIF for network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1902 2015-08-07 17:32:03.503 DEBUG nova.virt.xenapi.vmops [req-9c553d3d-492d-47a0-966f-8fac0ea40f0a tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] Created VIF OpaqueRef:17bd4b8b-7454-c42a-0f80-6d7aa8be0fdc, network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1906 2015-08-07 17:32:03.504 DEBUG nova.virt.xenapi.vmops [req-9c553d3d-492d-47a0-966f-8fac0ea40f0a tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] Starting instance _start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:327 2015-08-07 17:32:03.615 DEBUG nova.virt.xenapi.vmops [req-5a6e39cb-ee52-4b85-9df9-d6327a039d33 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] [instance: 1b9e1564-c8e1-4966-8922-0b8bbf38f0f8] Updating progress to 20 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:32:03.900 DEBUG nova.virt.xenapi.vmops [req-5a6e39cb-ee52-4b85-9df9-d6327a039d33 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] [instance: 1b9e1564-c8e1-4966-8922-0b8bbf38f0f8] Updating progress to 30 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:32:04.152 DEBUG nova.virt.xenapi.vmops [req-5a6e39cb-ee52-4b85-9df9-d6327a039d33 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] [instance: 1b9e1564-c8e1-4966-8922-0b8bbf38f0f8] Using PV kernel: True _create_vm_record /opt/stack/new/nova/nova/virt/xenapi/vmops.py:679 2015-08-07 17:32:04.168 DEBUG nova.virt.xenapi.vm_utils [req-5a6e39cb-ee52-4b85-9df9-d6327a039d33 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] [instance: 1b9e1564-c8e1-4966-8922-0b8bbf38f0f8] Created VM create_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:312 2015-08-07 17:32:04.168 DEBUG nova.virt.xenapi.vmops [req-5a6e39cb-ee52-4b85-9df9-d6327a039d33 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] [instance: 1b9e1564-c8e1-4966-8922-0b8bbf38f0f8] Updating progress to 40 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:32:04.479 DEBUG nova.virt.xenapi.vm_utils [req-5a6e39cb-ee52-4b85-9df9-d6327a039d33 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] Creating disk-type VBD for VM OpaqueRef:5b994296-1189-8075-68e1-ee979bd9e1e8, VDI OpaqueRef:7d2af9b6-899e-5137-5a26-3a8d84d202ca ... 
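The "Updating progress to N" entries report the spawn as evenly spaced percentages: vmops treats the spawn as a fixed sequence of steps and publishes step/total after each one, which is why the values climb 10, 20, 30 ... 100. Roughly, as a sketch (the step count and the commented save call are illustrative):

    # Illustrative: progress reported after each of 10 spawn steps, matching
    # the 10/20/.../100 values logged by _update_instance_progress above.
    total_steps = 10
    for step in range(1, total_steps + 1):
        progress = step * 100 // total_steps
        # instance.progress = progress; instance.save()   # what nova does, roughly
        print("Updating progress to %d" % progress)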
create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:32:04.489 DEBUG nova.virt.xenapi.vm_utils [req-5a6e39cb-ee52-4b85-9df9-d6327a039d33 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] Created VBD OpaqueRef:4a1de369-afea-faa0-cf7f-d9193edc7be4 for VM OpaqueRef:5b994296-1189-8075-68e1-ee979bd9e1e8, VDI OpaqueRef:7d2af9b6-899e-5137-5a26-3a8d84d202ca. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:32:04.940 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.92 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:32:05.074 DEBUG oslo_concurrency.lockutils [req-7ececb7c-0a84-4490-b38a-c5bc5d411398 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_plug" :: held 2.005s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:32:05.074 DEBUG nova.virt.xenapi.vm_utils [req-7ececb7c-0a84-4490-b38a-c5bc5d411398 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Plugging VBD OpaqueRef:cb299fee-043c-2db6-5e5c-9da965d1bc3e done. vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2194 2015-08-07 17:32:05.078 DEBUG nova.virt.xenapi.vm_utils [req-7ececb7c-0a84-4490-b38a-c5bc5d411398 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] VBD OpaqueRef:cb299fee-043c-2db6-5e5c-9da965d1bc3e plugged as xvdd vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2197 2015-08-07 17:32:05.148 DEBUG nova.virt.xenapi.vm_utils [req-5a6e39cb-ee52-4b85-9df9-d6327a039d33 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] Created VDI OpaqueRef:626f6f43-e2ee-83b8-5ffd-037055e4531a (config-2, 67108864, False) on OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99. create_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:509 2015-08-07 17:32:05.152 DEBUG nova.virt.xenapi.vm_utils [req-5a6e39cb-ee52-4b85-9df9-d6327a039d33 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] Creating disk-type VBD for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:626f6f43-e2ee-83b8-5ffd-037055e4531a ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:32:05.163 WARNING nova.virt.configdrive [req-7ececb7c-0a84-4490-b38a-c5bc5d411398 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] The setting "always" will be deprecated in the Liberty version. Please use "True" instead 2015-08-07 17:32:05.164 DEBUG nova.objects.instance [req-7ececb7c-0a84-4490-b38a-c5bc5d411398 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lazy-loading `ec2_ids' on Instance uuid 47eca748-3e73-4f2b-80e9-a3e058bbd8a4 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:32:05.176 DEBUG nova.virt.xenapi.vm_utils [req-5a6e39cb-ee52-4b85-9df9-d6327a039d33 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] Created VBD OpaqueRef:2ef64df2-8523-c02f-9ace-2f68fa041c8f for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:626f6f43-e2ee-83b8-5ffd-037055e4531a. 
create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:32:05.177 DEBUG nova.virt.xenapi.vm_utils [req-5a6e39cb-ee52-4b85-9df9-d6327a039d33 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] Plugging VBD OpaqueRef:2ef64df2-8523-c02f-9ace-2f68fa041c8f ... vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2191 2015-08-07 17:32:05.177 DEBUG oslo_concurrency.lockutils [req-5a6e39cb-ee52-4b85-9df9-d6327a039d33 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_plug" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:32:05.218 DEBUG oslo_concurrency.processutils [req-7ececb7c-0a84-4490-b38a-c5bc5d411398 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Running cmd (subprocess): genisoimage -o /tmp/tmp2WRI8A/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmpiNPbiE execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:32:05.350 DEBUG oslo_concurrency.processutils [req-7ececb7c-0a84-4490-b38a-c5bc5d411398 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] CMD "genisoimage -o /tmp/tmp2WRI8A/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmpiNPbiE" returned: 0 in 0.131s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:32:05.355 DEBUG oslo_concurrency.processutils [req-7ececb7c-0a84-4490-b38a-c5bc5d411398 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmp2WRI8A/configdrive of=/dev/xvdd oflag=direct,sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:32:08.461 DEBUG oslo_concurrency.lockutils [req-5a6e39cb-ee52-4b85-9df9-d6327a039d33 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_plug" :: held 3.283s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:32:08.464 DEBUG nova.virt.xenapi.vm_utils [req-5a6e39cb-ee52-4b85-9df9-d6327a039d33 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] Plugging VBD OpaqueRef:2ef64df2-8523-c02f-9ace-2f68fa041c8f done. vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2194 2015-08-07 17:32:08.486 DEBUG nova.virt.xenapi.vm_utils [req-5a6e39cb-ee52-4b85-9df9-d6327a039d33 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] VBD OpaqueRef:2ef64df2-8523-c02f-9ace-2f68fa041c8f plugged as xvdc vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2197 2015-08-07 17:32:08.646 WARNING nova.virt.configdrive [req-5a6e39cb-ee52-4b85-9df9-d6327a039d33 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] The setting "always" will be deprecated in the Liberty version. 
Please use "True" instead 2015-08-07 17:32:08.647 DEBUG nova.objects.instance [req-5a6e39cb-ee52-4b85-9df9-d6327a039d33 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] Lazy-loading `ec2_ids' on Instance uuid 1b9e1564-c8e1-4966-8922-0b8bbf38f0f8 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:32:08.699 DEBUG oslo_concurrency.processutils [req-5a6e39cb-ee52-4b85-9df9-d6327a039d33 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] Running cmd (subprocess): genisoimage -o /tmp/tmpVFTGjq/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmp7xs7iJ execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:32:08.826 DEBUG oslo_concurrency.processutils [req-5a6e39cb-ee52-4b85-9df9-d6327a039d33 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] CMD "genisoimage -o /tmp/tmpVFTGjq/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmp7xs7iJ" returned: 0 in 0.127s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:32:08.834 DEBUG oslo_concurrency.processutils [req-5a6e39cb-ee52-4b85-9df9-d6327a039d33 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmpVFTGjq/configdrive of=/dev/xvdc oflag=direct,sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:32:14.495 DEBUG nova.virt.xenapi.vmops [req-9c553d3d-492d-47a0-966f-8fac0ea40f0a tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] Waiting for instance state to become running _wait_for_instance_to_start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:777 2015-08-07 17:32:14.514 DEBUG nova.virt.xenapi.vmops [req-9c553d3d-492d-47a0-966f-8fac0ea40f0a tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] Updating progress to 100 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:32:15.306 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.56 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:32:16.298 DEBUG oslo_concurrency.lockutils [req-56e9e1e4-b94b-4b6e-b2ea-d5b1d84543ec tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Lock "8e5ee78c-20e2-4483-ab75-b109bb2fdca6" acquired by "do_confirm_resize" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:32:16.299 DEBUG nova.compute.manager [req-56e9e1e4-b94b-4b6e-b2ea-d5b1d84543ec tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] Going to confirm migration 2 do_confirm_resize /opt/stack/new/nova/nova/compute/manager.py:3243 2015-08-07 17:32:18.923 DEBUG oslo_concurrency.processutils [req-7ececb7c-0a84-4490-b38a-c5bc5d411398 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf dd 
if=/tmp/tmp2WRI8A/configdrive of=/dev/xvdd oflag=direct,sync" returned: 0 in 13.568s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:32:18.925 DEBUG oslo_concurrency.processutils [req-7ececb7c-0a84-4490-b38a-c5bc5d411398 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:32:19.040 DEBUG oslo_concurrency.lockutils [req-56e9e1e4-b94b-4b6e-b2ea-d5b1d84543ec tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Acquired semaphore "refresh_cache-8e5ee78c-20e2-4483-ab75-b109bb2fdca6" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:197 2015-08-07 17:32:19.193 DEBUG nova.network.base_api [req-56e9e1e4-b94b-4b6e-b2ea-d5b1d84543ec tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] Updating instance_info_cache with network_info: [VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.12'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:87:f6:bc', 'active': False, 'type': u'bridge', 'id': u'0082deb6-303e-4fa7-969e-93e52d4e94f6', 'qbg_params': None})] update_instance_cache_with_nw_info /opt/stack/new/nova/nova/network/base_api.py:43 2015-08-07 17:32:19.236 DEBUG oslo_concurrency.lockutils [req-56e9e1e4-b94b-4b6e-b2ea-d5b1d84543ec tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Releasing semaphore "refresh_cache-8e5ee78c-20e2-4483-ab75-b109bb2fdca6" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:210 2015-08-07 17:32:19.249 WARNING nova.virt.xenapi.vm_utils [req-56e9e1e4-b94b-4b6e-b2ea-d5b1d84543ec tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] VM already halted, skipping shutdown... 
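The network_info dumps above are lists of VIF objects, each nesting a Network with Subnets, fixed IPs, DNS entries, routes and a gateway; consumers walk that structure rather than the flattened log form. A small sketch over a plain-dict stand-in for one of the VIFs above (real Nova uses its network model classes, not raw dicts):

    # Stand-in for one VIF entry from the instance_info_cache update above.
    vif = {
        'address': 'fa:16:3e:87:f6:bc',
        'network': {
            'bridge': 'vmnet',
            'label': 'private',
            'subnets': [
                {'cidr': '10.1.0.0/20',
                 'ips': [{'address': '10.1.0.12', 'type': 'fixed'}],
                 'gateway': {'address': '10.1.0.1'}},
            ],
        },
    }

    # Collect every fixed IP the way consumers of network_info typically do.
    fixed_ips = [ip['address']
                 for subnet in vif['network']['subnets']
                 for ip in subnet['ips']]
    print(fixed_ips)   # ['10.1.0.12']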
2015-08-07 17:32:19.276 DEBUG nova.virt.xenapi.vmops [req-56e9e1e4-b94b-4b6e-b2ea-d5b1d84543ec tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] Destroying VDIs _destroy_vdis /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1428 2015-08-07 17:32:19.287 DEBUG nova.virt.xenapi.vm_utils [req-56e9e1e4-b94b-4b6e-b2ea-d5b1d84543ec tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] VDI 416fc4db-2f34-4e64-aded-e00eaaf8f6e4 is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:32:19.296 DEBUG nova.virt.xenapi.vm_utils [req-56e9e1e4-b94b-4b6e-b2ea-d5b1d84543ec tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] VDI 22ce8ce3-ea0c-4055-af37-a7cd4bfbefc1 is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:32:19.968 DEBUG oslo_concurrency.processutils [req-7ececb7c-0a84-4490-b38a-c5bc5d411398 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf sync" returned: 0 in 1.043s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:32:19.970 DEBUG nova.virt.xenapi.vm_utils [req-7ececb7c-0a84-4490-b38a-c5bc5d411398 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Destroying VBD for VDI OpaqueRef:2a40b85c-9806-5c38-49ba-6a895ed80ff0 ... vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2207 2015-08-07 17:32:19.971 DEBUG oslo_concurrency.lockutils [req-7ececb7c-0a84-4490-b38a-c5bc5d411398 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_unplug" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:32:21.053 DEBUG nova.virt.xenapi.vmops [req-56e9e1e4-b94b-4b6e-b2ea-d5b1d84543ec tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] Using RAW or VHD, skipping kernel and ramdisk deletion _destroy_kernel_ramdisk /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1456 2015-08-07 17:32:21.068 DEBUG nova.virt.xenapi.vm_utils [req-56e9e1e4-b94b-4b6e-b2ea-d5b1d84543ec tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] VM destroyed destroy_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:324 2015-08-07 17:32:21.304 DEBUG oslo_concurrency.lockutils [req-56e9e1e4-b94b-4b6e-b2ea-d5b1d84543ec tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Lock "compute_resources" acquired by "drop_move_claim" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:32:21.526 DEBUG oslo_concurrency.lockutils [req-56e9e1e4-b94b-4b6e-b2ea-d5b1d84543ec tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Lock "compute_resources" released by "drop_move_claim" :: held 0.222s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:32:21.754 DEBUG oslo_concurrency.lockutils [req-56e9e1e4-b94b-4b6e-b2ea-d5b1d84543ec tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Lock 
"8e5ee78c-20e2-4483-ab75-b109bb2fdca6" released by "do_confirm_resize" :: held 5.455s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:32:22.143 DEBUG oslo_concurrency.processutils [req-5a6e39cb-ee52-4b85-9df9-d6327a039d33 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmpVFTGjq/configdrive of=/dev/xvdc oflag=direct,sync" returned: 0 in 13.310s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:32:22.145 DEBUG oslo_concurrency.processutils [req-5a6e39cb-ee52-4b85-9df9-d6327a039d33 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:32:22.254 DEBUG oslo_concurrency.lockutils [req-7ececb7c-0a84-4490-b38a-c5bc5d411398 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_unplug" :: held 2.284s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:32:22.266 DEBUG nova.virt.xenapi.vm_utils [req-7ececb7c-0a84-4490-b38a-c5bc5d411398 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Destroying VBD for VDI OpaqueRef:2a40b85c-9806-5c38-49ba-6a895ed80ff0 done. vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2215 2015-08-07 17:32:22.267 DEBUG nova.virt.xenapi.vm_utils [req-7ececb7c-0a84-4490-b38a-c5bc5d411398 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Creating disk-type VBD for VM OpaqueRef:1078428a-a383-5386-4368-32608dafc487, VDI OpaqueRef:2a40b85c-9806-5c38-49ba-6a895ed80ff0 ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:32:22.278 DEBUG nova.virt.xenapi.vm_utils [req-7ececb7c-0a84-4490-b38a-c5bc5d411398 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Created VBD OpaqueRef:bd1f59d1-76d5-967c-79d6-f727378a45da for VM OpaqueRef:1078428a-a383-5386-4368-32608dafc487, VDI OpaqueRef:2a40b85c-9806-5c38-49ba-6a895ed80ff0. 
create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:32:22.279 DEBUG nova.objects.instance [req-7ececb7c-0a84-4490-b38a-c5bc5d411398 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lazy-loading `pci_devices' on Instance uuid 47eca748-3e73-4f2b-80e9-a3e058bbd8a4 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:32:22.409 DEBUG nova.virt.xenapi.vmops [req-7ececb7c-0a84-4490-b38a-c5bc5d411398 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 47eca748-3e73-4f2b-80e9-a3e058bbd8a4] Updating progress to 50 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:32:22.607 DEBUG oslo_concurrency.processutils [req-5a6e39cb-ee52-4b85-9df9-d6327a039d33 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf sync" returned: 0 in 0.462s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:32:22.608 DEBUG nova.virt.xenapi.vm_utils [req-5a6e39cb-ee52-4b85-9df9-d6327a039d33 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] Destroying VBD for VDI OpaqueRef:626f6f43-e2ee-83b8-5ffd-037055e4531a ... vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2207 2015-08-07 17:32:22.609 DEBUG oslo_concurrency.lockutils [req-5a6e39cb-ee52-4b85-9df9-d6327a039d33 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_unplug" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:32:22.695 DEBUG oslo_concurrency.lockutils [req-7ececb7c-0a84-4490-b38a-c5bc5d411398 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "xenstore-47eca748-3e73-4f2b-80e9-a3e058bbd8a4" acquired by "store_meta" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:32:22.696 DEBUG oslo_concurrency.lockutils [req-7ececb7c-0a84-4490-b38a-c5bc5d411398 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "xenstore-47eca748-3e73-4f2b-80e9-a3e058bbd8a4" released by "store_meta" :: held 0.001s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:32:22.696 DEBUG oslo_concurrency.lockutils [req-7ececb7c-0a84-4490-b38a-c5bc5d411398 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "xenstore-47eca748-3e73-4f2b-80e9-a3e058bbd8a4" acquired by "store_auto_disk_config" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:32:22.715 DEBUG oslo_concurrency.lockutils [req-7ececb7c-0a84-4490-b38a-c5bc5d411398 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "xenstore-47eca748-3e73-4f2b-80e9-a3e058bbd8a4" released by "store_auto_disk_config" :: held 0.018s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:32:22.716 DEBUG nova.virt.xenapi.vmops [req-7ececb7c-0a84-4490-b38a-c5bc5d411398 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 47eca748-3e73-4f2b-80e9-a3e058bbd8a4] Injecting hostname (tempest.common.compute-instance-1189338620) into 
xenstore _inject_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1940 2015-08-07 17:32:22.716 DEBUG oslo_concurrency.lockutils [req-7ececb7c-0a84-4490-b38a-c5bc5d411398 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "xenstore-47eca748-3e73-4f2b-80e9-a3e058bbd8a4" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:32:22.725 DEBUG oslo_concurrency.lockutils [req-7ececb7c-0a84-4490-b38a-c5bc5d411398 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "xenstore-47eca748-3e73-4f2b-80e9-a3e058bbd8a4" released by "update_hostname" :: held 0.009s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:32:22.726 DEBUG nova.virt.xenapi.vmops [req-7ececb7c-0a84-4490-b38a-c5bc5d411398 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 47eca748-3e73-4f2b-80e9-a3e058bbd8a4] Injecting network info to xenstore inject_network_info /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1869 2015-08-07 17:32:22.727 DEBUG oslo_concurrency.lockutils [req-7ececb7c-0a84-4490-b38a-c5bc5d411398 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "xenstore-47eca748-3e73-4f2b-80e9-a3e058bbd8a4" acquired by "update_nwinfo" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:32:23.075 DEBUG oslo_concurrency.lockutils [req-7ececb7c-0a84-4490-b38a-c5bc5d411398 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "xenstore-47eca748-3e73-4f2b-80e9-a3e058bbd8a4" released by "update_nwinfo" :: held 0.348s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:32:23.076 DEBUG nova.virt.xenapi.vmops [req-7ececb7c-0a84-4490-b38a-c5bc5d411398 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 47eca748-3e73-4f2b-80e9-a3e058bbd8a4] Updating progress to 60 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:32:23.401 DEBUG nova.virt.xenapi.vmops [req-7ececb7c-0a84-4490-b38a-c5bc5d411398 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 47eca748-3e73-4f2b-80e9-a3e058bbd8a4] Creating vifs _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1892 2015-08-07 17:32:23.423 DEBUG nova.virt.xenapi.vmops [req-7ececb7c-0a84-4490-b38a-c5bc5d411398 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 47eca748-3e73-4f2b-80e9-a3e058bbd8a4] Creating VIF for network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1902 2015-08-07 17:32:23.436 DEBUG nova.virt.xenapi.vmops [req-7ececb7c-0a84-4490-b38a-c5bc5d411398 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 47eca748-3e73-4f2b-80e9-a3e058bbd8a4] Created VIF OpaqueRef:8153096d-ffd4-6fcc-39e0-1eade787dc5f, network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1906 2015-08-07 17:32:23.437 DEBUG nova.virt.xenapi.vmops [req-7ececb7c-0a84-4490-b38a-c5bc5d411398 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 47eca748-3e73-4f2b-80e9-a3e058bbd8a4] Updating progress to 70 _update_instance_progress 
/opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:32:23.898 DEBUG nova.virt.xenapi.vmops [req-7ececb7c-0a84-4490-b38a-c5bc5d411398 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 47eca748-3e73-4f2b-80e9-a3e058bbd8a4] Starting instance _start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:327 2015-08-07 17:32:24.160 DEBUG oslo_concurrency.lockutils [req-5a6e39cb-ee52-4b85-9df9-d6327a039d33 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_unplug" :: held 1.551s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:32:24.171 DEBUG nova.virt.xenapi.vm_utils [req-5a6e39cb-ee52-4b85-9df9-d6327a039d33 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] Destroying VBD for VDI OpaqueRef:626f6f43-e2ee-83b8-5ffd-037055e4531a done. vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2215 2015-08-07 17:32:24.171 DEBUG nova.virt.xenapi.vm_utils [req-5a6e39cb-ee52-4b85-9df9-d6327a039d33 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] Creating disk-type VBD for VM OpaqueRef:5b994296-1189-8075-68e1-ee979bd9e1e8, VDI OpaqueRef:626f6f43-e2ee-83b8-5ffd-037055e4531a ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:32:24.181 DEBUG nova.virt.xenapi.vm_utils [req-5a6e39cb-ee52-4b85-9df9-d6327a039d33 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] Created VBD OpaqueRef:86684b1e-a60c-d189-4927-d24f8b873b0f for VM OpaqueRef:5b994296-1189-8075-68e1-ee979bd9e1e8, VDI OpaqueRef:626f6f43-e2ee-83b8-5ffd-037055e4531a. 
create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:32:24.182 DEBUG nova.objects.instance [req-5a6e39cb-ee52-4b85-9df9-d6327a039d33 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] Lazy-loading `pci_devices' on Instance uuid 1b9e1564-c8e1-4966-8922-0b8bbf38f0f8 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:32:24.346 DEBUG nova.virt.xenapi.vmops [req-5a6e39cb-ee52-4b85-9df9-d6327a039d33 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] [instance: 1b9e1564-c8e1-4966-8922-0b8bbf38f0f8] Updating progress to 50 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:32:24.386 DEBUG nova.compute.manager [req-36fbe864-5a79-4a47-90e5-e42dc3cd9bec tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] Stashing vm_state: active _prep_resize /opt/stack/new/nova/nova/compute/manager.py:3514 2015-08-07 17:32:24.633 DEBUG oslo_concurrency.lockutils [req-5a6e39cb-ee52-4b85-9df9-d6327a039d33 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] Lock "xenstore-1b9e1564-c8e1-4966-8922-0b8bbf38f0f8" acquired by "store_meta" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:32:24.634 DEBUG oslo_concurrency.lockutils [req-5a6e39cb-ee52-4b85-9df9-d6327a039d33 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] Lock "xenstore-1b9e1564-c8e1-4966-8922-0b8bbf38f0f8" released by "store_meta" :: held 0.001s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:32:24.634 DEBUG oslo_concurrency.lockutils [req-5a6e39cb-ee52-4b85-9df9-d6327a039d33 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] Lock "xenstore-1b9e1564-c8e1-4966-8922-0b8bbf38f0f8" acquired by "store_auto_disk_config" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:32:24.647 DEBUG oslo_concurrency.lockutils [req-5a6e39cb-ee52-4b85-9df9-d6327a039d33 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] Lock "xenstore-1b9e1564-c8e1-4966-8922-0b8bbf38f0f8" released by "store_auto_disk_config" :: held 0.013s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:32:24.648 DEBUG nova.virt.xenapi.vmops [req-5a6e39cb-ee52-4b85-9df9-d6327a039d33 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] [instance: 1b9e1564-c8e1-4966-8922-0b8bbf38f0f8] Injecting hostname (tempest.common.compute-instance-1290402329) into xenstore _inject_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1940 2015-08-07 17:32:24.649 DEBUG oslo_concurrency.lockutils [req-5a6e39cb-ee52-4b85-9df9-d6327a039d33 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] Lock "xenstore-1b9e1564-c8e1-4966-8922-0b8bbf38f0f8" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:32:24.660 DEBUG oslo_concurrency.lockutils [req-5a6e39cb-ee52-4b85-9df9-d6327a039d33 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] 
Lock "xenstore-1b9e1564-c8e1-4966-8922-0b8bbf38f0f8" released by "update_hostname" :: held 0.011s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:32:24.661 DEBUG nova.virt.xenapi.vmops [req-5a6e39cb-ee52-4b85-9df9-d6327a039d33 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] [instance: 1b9e1564-c8e1-4966-8922-0b8bbf38f0f8] Injecting network info to xenstore inject_network_info /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1869 2015-08-07 17:32:24.661 DEBUG oslo_concurrency.lockutils [req-5a6e39cb-ee52-4b85-9df9-d6327a039d33 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] Lock "xenstore-1b9e1564-c8e1-4966-8922-0b8bbf38f0f8" acquired by "update_nwinfo" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:32:24.677 DEBUG oslo_concurrency.lockutils [req-36fbe864-5a79-4a47-90e5-e42dc3cd9bec tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Lock "compute_resources" acquired by "resize_claim" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:32:24.678 DEBUG nova.compute.resource_tracker [req-36fbe864-5a79-4a47-90e5-e42dc3cd9bec tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Memory overhead for 64 MB instance; 5 MB resize_claim /opt/stack/new/nova/nova/compute/resource_tracker.py:174 2015-08-07 17:32:24.690 INFO nova.compute.claims [req-36fbe864-5a79-4a47-90e5-e42dc3cd9bec tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] Attempting claim: memory 69 MB, disk 0 GB 2015-08-07 17:32:24.690 INFO nova.compute.claims [req-36fbe864-5a79-4a47-90e5-e42dc3cd9bec tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] Total memory: 8187 MB, used: 784.00 MB 2015-08-07 17:32:24.691 INFO nova.compute.claims [req-36fbe864-5a79-4a47-90e5-e42dc3cd9bec tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] memory limit: 12280.50 MB, free: 11496.50 MB 2015-08-07 17:32:24.691 INFO nova.compute.claims [req-36fbe864-5a79-4a47-90e5-e42dc3cd9bec tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] Total disk: 27 GB, used: 0.00 GB 2015-08-07 17:32:24.692 INFO nova.compute.claims [req-36fbe864-5a79-4a47-90e5-e42dc3cd9bec tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] disk limit not specified, defaulting to unlimited 2015-08-07 17:32:24.717 DEBUG nova.compute.resources.vcpu [req-36fbe864-5a79-4a47-90e5-e42dc3cd9bec tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Total CPUs: 8 VCPUs, used: 3.00 VCPUs test /opt/stack/new/nova/nova/compute/resources/vcpu.py:52 2015-08-07 17:32:24.718 DEBUG nova.compute.resources.vcpu [req-36fbe864-5a79-4a47-90e5-e42dc3cd9bec tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] CPUs limit not specified, defaulting to unlimited test /opt/stack/new/nova/nova/compute/resources/vcpu.py:56 2015-08-07 17:32:24.718 INFO nova.compute.claims 
[req-36fbe864-5a79-4a47-90e5-e42dc3cd9bec tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] Claim successful 2015-08-07 17:32:24.762 INFO nova.compute.resource_tracker [req-36fbe864-5a79-4a47-90e5-e42dc3cd9bec tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Updating from migration 8e5ee78c-20e2-4483-ab75-b109bb2fdca6 2015-08-07 17:32:24.866 DEBUG oslo_concurrency.lockutils [req-36fbe864-5a79-4a47-90e5-e42dc3cd9bec tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Lock "compute_resources" released by "resize_claim" :: held 0.189s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:32:24.867 INFO nova.compute.manager [req-36fbe864-5a79-4a47-90e5-e42dc3cd9bec tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] Migrating 2015-08-07 17:32:24.974 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.90 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:32:24.993 DEBUG oslo_concurrency.lockutils [req-36fbe864-5a79-4a47-90e5-e42dc3cd9bec tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Acquired semaphore "refresh_cache-8e5ee78c-20e2-4483-ab75-b109bb2fdca6" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:197 2015-08-07 17:32:25.002 DEBUG oslo_concurrency.lockutils [req-5a6e39cb-ee52-4b85-9df9-d6327a039d33 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] Lock "xenstore-1b9e1564-c8e1-4966-8922-0b8bbf38f0f8" released by "update_nwinfo" :: held 0.340s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:32:25.002 DEBUG nova.virt.xenapi.vmops [req-5a6e39cb-ee52-4b85-9df9-d6327a039d33 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] [instance: 1b9e1564-c8e1-4966-8922-0b8bbf38f0f8] Updating progress to 60 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:32:25.170 DEBUG nova.network.base_api [req-36fbe864-5a79-4a47-90e5-e42dc3cd9bec tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] Updating instance_info_cache with network_info: [VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.12'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': 
u'fa:16:3e:87:f6:bc', 'active': False, 'type': u'bridge', 'id': u'0082deb6-303e-4fa7-969e-93e52d4e94f6', 'qbg_params': None})] update_instance_cache_with_nw_info /opt/stack/new/nova/nova/network/base_api.py:43 2015-08-07 17:32:25.202 DEBUG oslo_concurrency.lockutils [req-36fbe864-5a79-4a47-90e5-e42dc3cd9bec tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Releasing semaphore "refresh_cache-8e5ee78c-20e2-4483-ab75-b109bb2fdca6" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:210 2015-08-07 17:32:25.245 DEBUG nova.virt.xenapi.vmops [req-5a6e39cb-ee52-4b85-9df9-d6327a039d33 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] [instance: 1b9e1564-c8e1-4966-8922-0b8bbf38f0f8] Creating vifs _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1892 2015-08-07 17:32:25.261 DEBUG nova.virt.xenapi.vmops [req-5a6e39cb-ee52-4b85-9df9-d6327a039d33 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] [instance: 1b9e1564-c8e1-4966-8922-0b8bbf38f0f8] Creating VIF for network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1902 2015-08-07 17:32:25.270 DEBUG nova.virt.xenapi.vmops [req-5a6e39cb-ee52-4b85-9df9-d6327a039d33 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] [instance: 1b9e1564-c8e1-4966-8922-0b8bbf38f0f8] Created VIF OpaqueRef:6f306be8-57a1-f8c8-1a5d-568d6cfe712e, network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1906 2015-08-07 17:32:25.271 DEBUG nova.virt.xenapi.vmops [req-5a6e39cb-ee52-4b85-9df9-d6327a039d33 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] [instance: 1b9e1564-c8e1-4966-8922-0b8bbf38f0f8] Updating progress to 70 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:32:25.543 DEBUG nova.virt.xenapi.vmops [req-36fbe864-5a79-4a47-90e5-e42dc3cd9bec tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] Updating progress to 0 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:32:25.563 DEBUG nova.virt.xenapi.vmops [req-5a6e39cb-ee52-4b85-9df9-d6327a039d33 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] [instance: 1b9e1564-c8e1-4966-8922-0b8bbf38f0f8] Starting instance _start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:327 2015-08-07 17:32:25.810 DEBUG nova.virt.xenapi.vm_utils [req-36fbe864-5a79-4a47-90e5-e42dc3cd9bec tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] Starting snapshot for VM _snapshot_attached_here_impl /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:714 2015-08-07 17:32:25.830 DEBUG oslo_concurrency.lockutils [req-36fbe864-5a79-4a47-90e5-e42dc3cd9bec tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" acquired by "do_scan" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:32:25.831 DEBUG nova.virt.xenapi.vm_utils [req-36fbe864-5a79-4a47-90e5-e42dc3cd9bec tempest-ServerDiskConfigTestJSON-1169161174 
tempest-ServerDiskConfigTestJSON-1129284864] Scanning SR OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99 do_scan /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1839 2015-08-07 17:32:26.451 DEBUG oslo_concurrency.lockutils [req-36fbe864-5a79-4a47-90e5-e42dc3cd9bec tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" released by "do_scan" :: held 0.621s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:32:26.460 DEBUG nova.virt.xenapi.vm_utils [req-36fbe864-5a79-4a47-90e5-e42dc3cd9bec tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] VHD a293d2b3-e1c6-4a76-a23e-da29cc4fd913 has parent 4fb4a485-cee7-4d98-9e81-17b6eb1fb638 _get_vhd_parent_uuid /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2006 2015-08-07 17:32:26.491 DEBUG nova.virt.xenapi.vm_utils [req-36fbe864-5a79-4a47-90e5-e42dc3cd9bec tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] VHD 0c4abba1-b631-431f-9406-b94b9e28c5c5 has parent 4027f457-a9bb-499a-8844-79fc67f11377 _get_vhd_parent_uuid /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2006 2015-08-07 17:32:26.498 DEBUG nova.virt.xenapi.vm_utils [req-36fbe864-5a79-4a47-90e5-e42dc3cd9bec tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] VHD 91c368d5-79f1-450d-b288-5f9f9f72badd has parent 4027f457-a9bb-499a-8844-79fc67f11377 _get_vhd_parent_uuid /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2006 2015-08-07 17:32:26.519 DEBUG nova.virt.xenapi.vm_utils [req-36fbe864-5a79-4a47-90e5-e42dc3cd9bec tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] VHD a293d2b3-e1c6-4a76-a23e-da29cc4fd913 has parent 4fb4a485-cee7-4d98-9e81-17b6eb1fb638 _get_vhd_parent_uuid /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2006 2015-08-07 17:32:26.532 DEBUG nova.virt.xenapi.vm_utils [req-36fbe864-5a79-4a47-90e5-e42dc3cd9bec tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] VHD adc68f5c-f9ed-4dfe-862f-8738fb519f84 has parent 4027f457-a9bb-499a-8844-79fc67f11377 _get_vhd_parent_uuid /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2006 2015-08-07 17:32:26.546 DEBUG nova.virt.xenapi.vm_utils [req-36fbe864-5a79-4a47-90e5-e42dc3cd9bec tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] No snapshots to remove. 
_delete_snapshots_in_vdi_chain /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:677 2015-08-07 17:32:28.579 DEBUG oslo_concurrency.lockutils [req-36fbe864-5a79-4a47-90e5-e42dc3cd9bec tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" acquired by "do_scan" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:32:28.580 DEBUG nova.virt.xenapi.vm_utils [req-36fbe864-5a79-4a47-90e5-e42dc3cd9bec tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Scanning SR OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99 do_scan /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1839 2015-08-07 17:32:29.090 DEBUG oslo_concurrency.lockutils [req-36fbe864-5a79-4a47-90e5-e42dc3cd9bec tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" released by "do_scan" :: held 0.510s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:32:29.099 DEBUG nova.virt.xenapi.vm_utils [req-36fbe864-5a79-4a47-90e5-e42dc3cd9bec tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] VHD a293d2b3-e1c6-4a76-a23e-da29cc4fd913 has parent 51628a0f-01b9-4795-a73a-cae7371b8e87 _get_vhd_parent_uuid /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2006 2015-08-07 17:32:29.099 DEBUG nova.virt.xenapi.vm_utils [req-36fbe864-5a79-4a47-90e5-e42dc3cd9bec tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] Parent 51628a0f-01b9-4795-a73a-cae7371b8e87 not yet in parent list ['4fb4a485-cee7-4d98-9e81-17b6eb1fb638'], waiting for coalesce... _wait_for_vhd_coalesce /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2115 2015-08-07 17:32:30.520 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:32:30.521 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Starting heal instance info cache _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5315 2015-08-07 17:32:30.522 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Rebuilding the list of instances to heal _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5319 2015-08-07 17:32:30.598 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: 47eca748-3e73-4f2b-80e9-a3e058bbd8a4] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5329 2015-08-07 17:32:30.599 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: 1b9e1564-c8e1-4966-8922-0b8bbf38f0f8] Skipping network cache update for instance because it is Building. 
_heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5329 2015-08-07 17:32:30.600 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Acquired semaphore "refresh_cache-8e5ee78c-20e2-4483-ab75-b109bb2fdca6" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:197 2015-08-07 17:32:30.600 DEBUG nova.objects.instance [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lazy-loading `flavor' on Instance uuid 8e5ee78c-20e2-4483-ab75-b109bb2fdca6 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:32:30.896 DEBUG nova.network.base_api [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] Updating instance_info_cache with network_info: [VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.12'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:87:f6:bc', 'active': False, 'type': u'bridge', 'id': u'0082deb6-303e-4fa7-969e-93e52d4e94f6', 'qbg_params': None})] update_instance_cache_with_nw_info /opt/stack/new/nova/nova/network/base_api.py:43 2015-08-07 17:32:30.929 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Releasing semaphore "refresh_cache-8e5ee78c-20e2-4483-ab75-b109bb2fdca6" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:210 2015-08-07 17:32:30.929 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] Updated the network info_cache for instance _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5376 2015-08-07 17:32:30.930 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 1.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:32:31.930 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:32:31.931 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 0.58 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:32:32.471 DEBUG nova.virt.xenapi.vmops [req-7ececb7c-0a84-4490-b38a-c5bc5d411398 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 47eca748-3e73-4f2b-80e9-a3e058bbd8a4] Waiting for instance state to become running 
_wait_for_instance_to_start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:777 2015-08-07 17:32:32.498 DEBUG nova.virt.xenapi.vmops [req-7ececb7c-0a84-4490-b38a-c5bc5d411398 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 47eca748-3e73-4f2b-80e9-a3e058bbd8a4] Updating progress to 80 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:32:32.573 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:32:32.576 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 0.95 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:32:32.805 DEBUG nova.virt.xenapi.vmops [req-7ececb7c-0a84-4490-b38a-c5bc5d411398 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 47eca748-3e73-4f2b-80e9-a3e058bbd8a4] Skip agent setup, not enabled. _configure_new_instance_with_agent /opt/stack/new/nova/nova/virt/xenapi/vmops.py:788 2015-08-07 17:32:32.806 DEBUG nova.virt.xenapi.vmops [req-7ececb7c-0a84-4490-b38a-c5bc5d411398 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 47eca748-3e73-4f2b-80e9-a3e058bbd8a4] Removing hostname from xenstore _remove_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1949 2015-08-07 17:32:32.806 DEBUG oslo_concurrency.lockutils [req-7ececb7c-0a84-4490-b38a-c5bc5d411398 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "xenstore-47eca748-3e73-4f2b-80e9-a3e058bbd8a4" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:32:32.812 DEBUG oslo_concurrency.lockutils [req-7ececb7c-0a84-4490-b38a-c5bc5d411398 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "xenstore-47eca748-3e73-4f2b-80e9-a3e058bbd8a4" released by "update_hostname" :: held 0.006s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:32:32.814 DEBUG nova.virt.xenapi.vmops [req-7ececb7c-0a84-4490-b38a-c5bc5d411398 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 47eca748-3e73-4f2b-80e9-a3e058bbd8a4] Updating progress to 90 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:32:33.098 DEBUG nova.virt.xenapi.vmops [req-7ececb7c-0a84-4490-b38a-c5bc5d411398 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 47eca748-3e73-4f2b-80e9-a3e058bbd8a4] Updating progress to 100 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:32:33.313 DEBUG nova.compute.manager [req-7ececb7c-0a84-4490-b38a-c5bc5d411398 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 47eca748-3e73-4f2b-80e9-a3e058bbd8a4] Checking state _get_power_state /opt/stack/new/nova/nova/compute/manager.py:1296 2015-08-07 17:32:33.522 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:32:33.523 
DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 6.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:32:33.730 DEBUG oslo_concurrency.lockutils [req-7ececb7c-0a84-4490-b38a-c5bc5d411398 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "47eca748-3e73-4f2b-80e9-a3e058bbd8a4" released by "_locked_do_build_and_run_instance" :: held 47.007s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:32:34.100 DEBUG oslo_concurrency.lockutils [req-36fbe864-5a79-4a47-90e5-e42dc3cd9bec tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" acquired by "do_scan" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:32:34.101 DEBUG nova.virt.xenapi.vm_utils [req-36fbe864-5a79-4a47-90e5-e42dc3cd9bec tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Scanning SR OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99 do_scan /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1839 2015-08-07 17:32:34.783 DEBUG nova.virt.xenapi.vmops [req-5a6e39cb-ee52-4b85-9df9-d6327a039d33 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] [instance: 1b9e1564-c8e1-4966-8922-0b8bbf38f0f8] Waiting for instance state to become running _wait_for_instance_to_start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:777 2015-08-07 17:32:34.865 DEBUG nova.virt.xenapi.vmops [req-5a6e39cb-ee52-4b85-9df9-d6327a039d33 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] [instance: 1b9e1564-c8e1-4966-8922-0b8bbf38f0f8] Updating progress to 80 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:32:35.023 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.86 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:32:35.304 DEBUG nova.virt.xenapi.vmops [req-5a6e39cb-ee52-4b85-9df9-d6327a039d33 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] [instance: 1b9e1564-c8e1-4966-8922-0b8bbf38f0f8] Skip agent setup, not enabled. 
_configure_new_instance_with_agent /opt/stack/new/nova/nova/virt/xenapi/vmops.py:788 2015-08-07 17:32:35.305 DEBUG nova.virt.xenapi.vmops [req-5a6e39cb-ee52-4b85-9df9-d6327a039d33 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] [instance: 1b9e1564-c8e1-4966-8922-0b8bbf38f0f8] Removing hostname from xenstore _remove_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1949 2015-08-07 17:32:35.305 DEBUG oslo_concurrency.lockutils [req-5a6e39cb-ee52-4b85-9df9-d6327a039d33 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] Lock "xenstore-1b9e1564-c8e1-4966-8922-0b8bbf38f0f8" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:32:35.311 DEBUG oslo_concurrency.lockutils [req-5a6e39cb-ee52-4b85-9df9-d6327a039d33 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] Lock "xenstore-1b9e1564-c8e1-4966-8922-0b8bbf38f0f8" released by "update_hostname" :: held 0.006s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:32:35.312 DEBUG nova.virt.xenapi.vmops [req-5a6e39cb-ee52-4b85-9df9-d6327a039d33 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] [instance: 1b9e1564-c8e1-4966-8922-0b8bbf38f0f8] Updating progress to 90 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:32:35.447 DEBUG oslo_concurrency.lockutils [req-36fbe864-5a79-4a47-90e5-e42dc3cd9bec tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" released by "do_scan" :: held 1.347s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:32:35.454 DEBUG nova.virt.xenapi.vm_utils [req-36fbe864-5a79-4a47-90e5-e42dc3cd9bec tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] VHD a293d2b3-e1c6-4a76-a23e-da29cc4fd913 has parent 4fb4a485-cee7-4d98-9e81-17b6eb1fb638 _get_vhd_parent_uuid /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2006 2015-08-07 17:32:35.454 DEBUG nova.virt.xenapi.vm_utils [req-36fbe864-5a79-4a47-90e5-e42dc3cd9bec tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] Coalesce detected, because parent is: 4fb4a485-cee7-4d98-9e81-17b6eb1fb638 _wait_for_vhd_coalesce /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2118 2015-08-07 17:32:35.463 DEBUG oslo_concurrency.lockutils [req-36fbe864-5a79-4a47-90e5-e42dc3cd9bec tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" acquired by "do_scan" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:32:35.464 DEBUG nova.virt.xenapi.vm_utils [req-36fbe864-5a79-4a47-90e5-e42dc3cd9bec tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Scanning SR OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99 do_scan /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1839 2015-08-07 17:32:35.548 DEBUG nova.virt.xenapi.vmops [req-5a6e39cb-ee52-4b85-9df9-d6327a039d33 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] [instance: 
1b9e1564-c8e1-4966-8922-0b8bbf38f0f8] Updating progress to 100 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:32:35.774 DEBUG nova.compute.manager [req-5a6e39cb-ee52-4b85-9df9-d6327a039d33 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] [instance: 1b9e1564-c8e1-4966-8922-0b8bbf38f0f8] Checking state _get_power_state /opt/stack/new/nova/nova/compute/manager.py:1296 2015-08-07 17:32:35.870 DEBUG oslo_concurrency.lockutils [req-36fbe864-5a79-4a47-90e5-e42dc3cd9bec tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" released by "do_scan" :: held 0.406s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:32:35.886 DEBUG nova.virt.xenapi.vm_utils [req-36fbe864-5a79-4a47-90e5-e42dc3cd9bec tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] VHD 2284a4aa-d739-4214-85ab-93a7199a3d45 has parent 4fb4a485-cee7-4d98-9e81-17b6eb1fb638 _get_vhd_parent_uuid /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2006 2015-08-07 17:32:35.911 DEBUG nova.virt.xenapi.vmops [req-36fbe864-5a79-4a47-90e5-e42dc3cd9bec tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] Updating progress to 20 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:32:36.165 DEBUG nova.virt.xenapi.vm_utils [req-36fbe864-5a79-4a47-90e5-e42dc3cd9bec tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] Migrating VHD '4fb4a485-cee7-4d98-9e81-17b6eb1fb638' with seq_num 1 migrate_vhd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2559 2015-08-07 17:32:36.274 DEBUG oslo_concurrency.lockutils [req-5a6e39cb-ee52-4b85-9df9-d6327a039d33 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] Lock "1b9e1564-c8e1-4966-8922-0b8bbf38f0f8" released by "_locked_do_build_and_run_instance" :: held 47.506s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:32:37.112 DEBUG oslo_concurrency.lockutils [req-bd19b30d-f630-46f3-a10c-f0b8683b3dc3 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] Lock "1b9e1564-c8e1-4966-8922-0b8bbf38f0f8" acquired by "do_terminate_instance" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:32:37.114 DEBUG oslo_concurrency.lockutils [req-bd19b30d-f630-46f3-a10c-f0b8683b3dc3 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] Lock "1b9e1564-c8e1-4966-8922-0b8bbf38f0f8-events" acquired by "_clear_events" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:32:37.114 DEBUG oslo_concurrency.lockutils [req-bd19b30d-f630-46f3-a10c-f0b8683b3dc3 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] Lock "1b9e1564-c8e1-4966-8922-0b8bbf38f0f8-events" released by "_clear_events" :: held 0.001s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:32:37.116 INFO nova.compute.manager [req-bd19b30d-f630-46f3-a10c-f0b8683b3dc3 tempest-ImagesOneServerNegativeTestJSON-221560100 
tempest-ImagesOneServerNegativeTestJSON-346190170] [instance: 1b9e1564-c8e1-4966-8922-0b8bbf38f0f8] Terminating instance 2015-08-07 17:32:37.118 INFO nova.virt.xenapi.vmops [req-bd19b30d-f630-46f3-a10c-f0b8683b3dc3 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] [instance: 1b9e1564-c8e1-4966-8922-0b8bbf38f0f8] Destroying VM 2015-08-07 17:32:37.131 DEBUG nova.virt.xenapi.vm_utils [req-bd19b30d-f630-46f3-a10c-f0b8683b3dc3 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] [instance: 1b9e1564-c8e1-4966-8922-0b8bbf38f0f8] Shutting down VM (hard) hard_shutdown_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:348 2015-08-07 17:32:38.393 DEBUG nova.virt.xenapi.vmops [req-36fbe864-5a79-4a47-90e5-e42dc3cd9bec tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] Migrated root base vhds transfer_immutable_vhds /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1036 2015-08-07 17:32:38.394 DEBUG nova.virt.xenapi.vmops [req-36fbe864-5a79-4a47-90e5-e42dc3cd9bec tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] Updating progress to 40 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:32:38.658 DEBUG nova.virt.xenapi.vmops [req-36fbe864-5a79-4a47-90e5-e42dc3cd9bec tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] Migrated all base vhds. _process_ephemeral_chain_recursive /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1051 2015-08-07 17:32:38.671 DEBUG nova.virt.xenapi.vm_utils [req-36fbe864-5a79-4a47-90e5-e42dc3cd9bec tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] Shutting down VM (cleanly) clean_shutdown_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:333 2015-08-07 17:32:39.520 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:32:39.521 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /opt/stack/new/nova/nova/compute/manager.py:5976 2015-08-07 17:32:39.521 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 1.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:32:40.520 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:32:40.521 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 1.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:32:41.520 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:32:41.521 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 3.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:32:41.891 DEBUG nova.virt.xenapi.vmops [req-bd19b30d-f630-46f3-a10c-f0b8683b3dc3 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] [instance: 1b9e1564-c8e1-4966-8922-0b8bbf38f0f8] Destroying VDIs _destroy_vdis /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1428 2015-08-07 17:32:41.902 DEBUG nova.virt.xenapi.vm_utils [req-bd19b30d-f630-46f3-a10c-f0b8683b3dc3 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] VDI 0c4abba1-b631-431f-9406-b94b9e28c5c5 is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:32:41.918 DEBUG nova.virt.xenapi.vm_utils [req-bd19b30d-f630-46f3-a10c-f0b8683b3dc3 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] VDI b28a6f02-93e3-4e1a-a874-f015792674de is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:32:43.227 DEBUG nova.virt.xenapi.vmops [req-bd19b30d-f630-46f3-a10c-f0b8683b3dc3 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] [instance: 1b9e1564-c8e1-4966-8922-0b8bbf38f0f8] Using RAW or VHD, skipping kernel and ramdisk deletion _destroy_kernel_ramdisk /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1456 2015-08-07 17:32:43.240 DEBUG nova.virt.xenapi.vm_utils [req-bd19b30d-f630-46f3-a10c-f0b8683b3dc3 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] [instance: 1b9e1564-c8e1-4966-8922-0b8bbf38f0f8] VM destroyed destroy_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:324 2015-08-07 17:32:43.241 DEBUG nova.compute.manager [req-bd19b30d-f630-46f3-a10c-f0b8683b3dc3 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] [instance: 1b9e1564-c8e1-4966-8922-0b8bbf38f0f8] Deallocating network for instance _deallocate_network /opt/stack/new/nova/nova/compute/manager.py:1792 2015-08-07 17:32:44.521 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager.update_available_resource 
run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:32:44.603 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Auditing locally available compute resources for node localhost.localdomain 2015-08-07 17:32:44.656 DEBUG nova.virt.xenapi.host [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Updating host stats update_status /opt/stack/new/nova/nova/virt/xenapi/host.py:234 2015-08-07 17:32:44.993 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" acquired by "do_scan" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:32:44.993 DEBUG nova.virt.xenapi.vm_utils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Scanning SR OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99 do_scan /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1839 2015-08-07 17:32:45.225 DEBUG nova.virt.xenapi.vm_utils [req-36fbe864-5a79-4a47-90e5-e42dc3cd9bec tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] Migrating VHD 'a293d2b3-e1c6-4a76-a23e-da29cc4fd913' with seq_num 0 migrate_vhd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2559 2015-08-07 17:32:45.372 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.51 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:32:45.640 DEBUG nova.compute.manager [req-bd19b30d-f630-46f3-a10c-f0b8683b3dc3 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] [instance: 1b9e1564-c8e1-4966-8922-0b8bbf38f0f8] terminating bdm BlockDeviceMapping(boot_index=0,connection_info=None,created_at=2015-08-07T17:31:48Z,delete_on_termination=True,deleted=False,deleted_at=None,destination_type='local',device_name='/dev/xvda',device_type='disk',disk_bus=None,guest_format=None,id=34,image_id='a69d8d55-1745-492f-be26-cc75d64fc94d',instance=,instance_uuid=1b9e1564-c8e1-4966-8922-0b8bbf38f0f8,no_device=False,snapshot_id=None,source_type='image',updated_at=2015-08-07T17:31:50Z,volume_id=None,volume_size=None) _cleanup_volumes /opt/stack/new/nova/nova/compute/manager.py:2272 2015-08-07 17:32:45.870 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" released by "do_scan" :: held 0.877s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:32:45.915 DEBUG oslo_concurrency.lockutils [req-bd19b30d-f630-46f3-a10c-f0b8683b3dc3 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] Lock "compute_resources" acquired by "update_usage" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:32:45.916 DEBUG nova.objects.instance [req-bd19b30d-f630-46f3-a10c-f0b8683b3dc3 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] Lazy-loading `numa_topology' on Instance uuid 1b9e1564-c8e1-4966-8922-0b8bbf38f0f8 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:32:46.046 DEBUG oslo_concurrency.lockutils [req-bd19b30d-f630-46f3-a10c-f0b8683b3dc3 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] Lock 
"compute_resources" released by "update_usage" :: held 0.131s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:32:46.054 DEBUG nova.virt.xenapi.vmops [req-36fbe864-5a79-4a47-90e5-e42dc3cd9bec tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] Updating progress to 60 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:32:46.147 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor: free VCPUs: 1 _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:524 2015-08-07 17:32:46.148 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor: assignable PCI devices: [] _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:532 2015-08-07 17:32:46.148 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor/Node resource view: name=localhost.localdomain free_ram=791MB free_disk=16GB free_vcpus=1 pci_devices=[] _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:546 2015-08-07 17:32:46.149 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "compute_resources" acquired by "_update_available_resource" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:32:46.203 DEBUG oslo_concurrency.lockutils [req-fdf4a6bd-68a3-44e5-bcf6-f39306671617 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "47eca748-3e73-4f2b-80e9-a3e058bbd8a4" acquired by "do_reserve" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:32:46.268 DEBUG nova.compute.utils [req-fdf4a6bd-68a3-44e5-bcf6-f39306671617 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Using /dev/xvd instead of /dev/vd get_next_device_name /opt/stack/new/nova/nova/compute/utils.py:163 2015-08-07 17:32:46.311 DEBUG nova.virt.xenapi.vmops [req-36fbe864-5a79-4a47-90e5-e42dc3cd9bec tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] Updating progress to 80 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:32:46.345 DEBUG oslo_concurrency.lockutils [req-fdf4a6bd-68a3-44e5-bcf6-f39306671617 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "47eca748-3e73-4f2b-80e9-a3e058bbd8a4" released by "do_reserve" :: held 0.142s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:32:46.479 DEBUG oslo_concurrency.lockutils [req-bd19b30d-f630-46f3-a10c-f0b8683b3dc3 tempest-ImagesOneServerNegativeTestJSON-221560100 tempest-ImagesOneServerNegativeTestJSON-346190170] Lock "1b9e1564-c8e1-4966-8922-0b8bbf38f0f8" released by "do_terminate_instance" :: held 9.366s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:32:46.752 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Updating from migration 8e5ee78c-20e2-4483-ab75-b109bb2fdca6 2015-08-07 17:32:46.753 DEBUG nova.objects.instance [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lazy-loading `new_flavor' on Instance uuid 
8e5ee78c-20e2-4483-ab75-b109bb2fdca6 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:32:46.945 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Total usable vcpus: 8, total allocated vcpus: 7 2015-08-07 17:32:46.946 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Final resource view: name=localhost.localdomain phys_ram=8187MB used_ram=784MB phys_disk=27GB used_disk=0GB total_vcpus=8 used_vcpus=7 pci_stats=None 2015-08-07 17:32:47.037 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Compute_service record updated for devstack:localhost.localdomain 2015-08-07 17:32:47.039 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "compute_resources" released by "_update_available_resource" :: held 0.890s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:32:47.040 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 5.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:32:47.070 DEBUG oslo_concurrency.lockutils [req-fdf4a6bd-68a3-44e5-bcf6-f39306671617 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "47eca748-3e73-4f2b-80e9-a3e058bbd8a4" acquired by "do_attach_volume" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:32:47.071 INFO nova.compute.manager [req-fdf4a6bd-68a3-44e5-bcf6-f39306671617 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 47eca748-3e73-4f2b-80e9-a3e058bbd8a4] Attaching volume f8f91a12-4f64-4f81-bfd9-5a81095b7f55 to /dev/xvdb 2015-08-07 17:32:47.087 DEBUG keystoneclient.session [req-fdf4a6bd-68a3-44e5-bcf6-f39306671617 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] REQ: curl -g -i -X GET http://192.168.33.1:8776/v2/b00cf207b0ba43d7b6e5bdf182aee463/volumes/f8f91a12-4f64-4f81-bfd9-5a81095b7f55 -H "User-Agent: python-cinderclient" -H "Accept: application/json" -H "X-Auth-Token: {SHA1}80fd61cf1aa6ed976e048bd49618d258669ebd4f" _http_log_request /usr/local/lib/python2.7/dist-packages/keystoneclient/session.py:195 2015-08-07 17:32:47.457 DEBUG keystoneclient.session [req-fdf4a6bd-68a3-44e5-bcf6-f39306671617 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] RESP: [200] content-length: 919 x-compute-request-id: req-0c232011-0a68-498b-a120-9376150e409b connection: keep-alive date: Fri, 07 Aug 2015 17:32:47 GMT content-type: application/json x-openstack-request-id: req-0c232011-0a68-498b-a120-9376150e409b RESP BODY: {"volume": {"attachments": [], "links": [{"href": "http://192.168.33.1:8776/v2/b00cf207b0ba43d7b6e5bdf182aee463/volumes/f8f91a12-4f64-4f81-bfd9-5a81095b7f55", "rel": "self"}, {"href": "http://192.168.33.1:8776/b00cf207b0ba43d7b6e5bdf182aee463/volumes/f8f91a12-4f64-4f81-bfd9-5a81095b7f55", "rel": "bookmark"}], "availability_zone": "nova", "encrypted": false, "os-volume-replication:extended_status": null, "volume_type": "lvmdriver-1", "snapshot_id": null, "id": "f8f91a12-4f64-4f81-bfd9-5a81095b7f55", "size": 1, "user_id": "4fd3959e5815444c8306831dd23748d4", "os-vol-tenant-attr:tenant_id": "b00cf207b0ba43d7b6e5bdf182aee463", "metadata": {}, "status": "attaching", "description": null, "multiattach": false, "source_volid": 
null, "consistencygroup_id": null, "name": null, "bootable": "false", "created_at": "2015-08-07T17:32:38.000000", "os-volume-replication:driver_data": null, "replication_status": "disabled"}} _http_log_response /usr/local/lib/python2.7/dist-packages/keystoneclient/session.py:224 2015-08-07 17:32:47.459 DEBUG keystoneclient.session [req-fdf4a6bd-68a3-44e5-bcf6-f39306671617 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] REQ: curl -g -i -X POST http://192.168.33.1:8776/v2/b00cf207b0ba43d7b6e5bdf182aee463/volumes/f8f91a12-4f64-4f81-bfd9-5a81095b7f55/action -H "User-Agent: python-cinderclient" -H "Content-Type: application/json" -H "Accept: application/json" -H "X-Auth-Token: {SHA1}80fd61cf1aa6ed976e048bd49618d258669ebd4f" -d '{"os-initialize_connection": {"connector": {"ip": "192.168.33.2", "initiator": "iqn.2015-08.com.example:3b035c79", "host": "localhost.localdomain"}}}' _http_log_request /usr/local/lib/python2.7/dist-packages/keystoneclient/session.py:195 2015-08-07 17:32:47.635 DEBUG oslo_concurrency.lockutils [req-36fbe864-5a79-4a47-90e5-e42dc3cd9bec tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Lock "8e5ee78c-20e2-4483-ab75-b109bb2fdca6-events" acquired by "_clear_events" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:32:47.635 DEBUG oslo_concurrency.lockutils [req-36fbe864-5a79-4a47-90e5-e42dc3cd9bec tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Lock "8e5ee78c-20e2-4483-ab75-b109bb2fdca6-events" released by "_clear_events" :: held 0.001s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:32:50.382 DEBUG oslo_concurrency.lockutils [req-36fbe864-5a79-4a47-90e5-e42dc3cd9bec tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Acquired semaphore "refresh_cache-8e5ee78c-20e2-4483-ab75-b109bb2fdca6" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:197 2015-08-07 17:32:50.596 DEBUG nova.network.base_api [req-36fbe864-5a79-4a47-90e5-e42dc3cd9bec tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] Updating instance_info_cache with network_info: [VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.12'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:87:f6:bc', 'active': False, 'type': u'bridge', 'id': u'0082deb6-303e-4fa7-969e-93e52d4e94f6', 'qbg_params': None})] update_instance_cache_with_nw_info /opt/stack/new/nova/nova/network/base_api.py:43 2015-08-07 
17:32:50.627 DEBUG oslo_concurrency.lockutils [req-36fbe864-5a79-4a47-90e5-e42dc3cd9bec tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Releasing semaphore "refresh_cache-8e5ee78c-20e2-4483-ab75-b109bb2fdca6" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:210 2015-08-07 17:32:50.993 DEBUG nova.virt.xenapi.vm_utils [req-36fbe864-5a79-4a47-90e5-e42dc3cd9bec tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Detected vhd format for image None determine_disk_image_type /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1633 2015-08-07 17:32:50.994 DEBUG nova.virt.xenapi.vmops [req-36fbe864-5a79-4a47-90e5-e42dc3cd9bec tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] Block device information present: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} create_disks_step /opt/stack/new/nova/nova/virt/xenapi/vmops.py:278 2015-08-07 17:32:51.651 DEBUG oslo_concurrency.lockutils [req-36fbe864-5a79-4a47-90e5-e42dc3cd9bec tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" acquired by "do_scan" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:32:51.651 DEBUG nova.virt.xenapi.vm_utils [req-36fbe864-5a79-4a47-90e5-e42dc3cd9bec tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Scanning SR OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99 do_scan /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1839 2015-08-07 17:32:51.664 DEBUG keystoneclient.session [req-fdf4a6bd-68a3-44e5-bcf6-f39306671617 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] RESP: [200] content-length: 449 x-compute-request-id: req-508dbe0e-f3f2-4184-92cf-e364759553c6 connection: keep-alive date: Fri, 07 Aug 2015 17:32:51 GMT content-type: application/json x-openstack-request-id: req-508dbe0e-f3f2-4184-92cf-e364759553c6 RESP BODY: {"connection_info": {"driver_volume_type": "iscsi", "data": {"auth_password": "ep8rF6YVfqoUbDCP", "target_discovered": false, "encrypted": false, "qos_specs": null, "target_iqn": "iqn.2010-10.org.openstack:volume-f8f91a12-4f64-4f81-bfd9-5a81095b7f55", "target_portal": "104.130.119.114:3260", "volume_id": "f8f91a12-4f64-4f81-bfd9-5a81095b7f55", "target_lun": 1, "access_mode": "rw", "auth_username": "MN8x5Dp7tLMXAi3CcEEN", "auth_method": "CHAP"}}} _http_log_response /usr/local/lib/python2.7/dist-packages/keystoneclient/session.py:224 2015-08-07 17:32:51.673 DEBUG nova.virt.xenapi.volume_utils [req-fdf4a6bd-68a3-44e5-bcf6-f39306671617 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] (vol_id,host,port,iqn): (f8f91a12-4f64-4f81-bfd9-5a81095b7f55,104.130.119.114,3260,iqn.2010-10.org.openstack:volume-f8f91a12-4f64-4f81-bfd9-5a81095b7f55) _parse_volume_info /opt/stack/new/nova/nova/virt/xenapi/volume_utils.py:80 2015-08-07 17:32:51.713 DEBUG nova.virt.xenapi.volume_utils [req-fdf4a6bd-68a3-44e5-bcf6-f39306671617 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Introducing SR tempSR-f8f91a12-4f64-4f81-bfd9-5a81095b7f55 introduce_sr /opt/stack/new/nova/nova/virt/xenapi/volume_utils.py:119 2015-08-07 17:32:51.721 DEBUG nova.virt.xenapi.volume_utils [req-fdf4a6bd-68a3-44e5-bcf6-f39306671617 
tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Creating PBD for SR introduce_sr /opt/stack/new/nova/nova/virt/xenapi/volume_utils.py:126 2015-08-07 17:32:51.736 DEBUG nova.virt.xenapi.volume_utils [req-fdf4a6bd-68a3-44e5-bcf6-f39306671617 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Plugging SR introduce_sr /opt/stack/new/nova/nova/virt/xenapi/volume_utils.py:129 2015-08-07 17:32:52.039 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:32:52.040 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 39.47 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:32:52.269 DEBUG oslo_concurrency.lockutils [req-36fbe864-5a79-4a47-90e5-e42dc3cd9bec tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" released by "do_scan" :: held 0.618s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:32:53.491 DEBUG nova.virt.xenapi.vmops [req-36fbe864-5a79-4a47-90e5-e42dc3cd9bec tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] Using PV kernel: True _create_vm_record /opt/stack/new/nova/nova/virt/xenapi/vmops.py:679 2015-08-07 17:32:53.509 DEBUG nova.virt.xenapi.vm_utils [req-36fbe864-5a79-4a47-90e5-e42dc3cd9bec tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] Created VM create_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:312 2015-08-07 17:32:53.510 DEBUG nova.virt.xenapi.vmops [req-36fbe864-5a79-4a47-90e5-e42dc3cd9bec tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] Auto configuring disk, attempting to resize root disk... _attach_disks /opt/stack/new/nova/nova/virt/xenapi/vmops.py:721 2015-08-07 17:32:53.511 DEBUG nova.virt.xenapi.vm_utils [req-36fbe864-5a79-4a47-90e5-e42dc3cd9bec tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Skipping auto_config_disk as destination size is 0GB _auto_configure_disk /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:960 2015-08-07 17:32:53.511 DEBUG nova.virt.xenapi.vm_utils [req-36fbe864-5a79-4a47-90e5-e42dc3cd9bec tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Creating disk-type VBD for VM OpaqueRef:ee6b2384-0aeb-abc8-e4f5-0605e4f94e09, VDI OpaqueRef:f3e093f6-3e51-75c5-079d-c6abf1afb2af ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:32:53.522 DEBUG nova.virt.xenapi.vm_utils [req-36fbe864-5a79-4a47-90e5-e42dc3cd9bec tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Created VBD OpaqueRef:31e39c25-fe65-8b3b-fef9-0855da51afef for VM OpaqueRef:ee6b2384-0aeb-abc8-e4f5-0605e4f94e09, VDI OpaqueRef:f3e093f6-3e51-75c5-079d-c6abf1afb2af. 
create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:32:54.035 DEBUG nova.virt.xenapi.vm_utils [req-36fbe864-5a79-4a47-90e5-e42dc3cd9bec tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Created VDI OpaqueRef:aed803a9-3732-20c7-4dbb-466995d3aa4a (config-2, 67108864, False) on OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99. create_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:509 2015-08-07 17:32:54.040 DEBUG nova.virt.xenapi.vm_utils [req-36fbe864-5a79-4a47-90e5-e42dc3cd9bec tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Creating disk-type VBD for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:aed803a9-3732-20c7-4dbb-466995d3aa4a ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:32:54.057 DEBUG nova.virt.xenapi.vm_utils [req-36fbe864-5a79-4a47-90e5-e42dc3cd9bec tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Created VBD OpaqueRef:0843a3c9-f99e-1da1-b4d9-87f4d241e299 for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:aed803a9-3732-20c7-4dbb-466995d3aa4a. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:32:54.058 DEBUG nova.virt.xenapi.vm_utils [req-36fbe864-5a79-4a47-90e5-e42dc3cd9bec tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Plugging VBD OpaqueRef:0843a3c9-f99e-1da1-b4d9-87f4d241e299 ... vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2191 2015-08-07 17:32:54.058 DEBUG oslo_concurrency.lockutils [req-36fbe864-5a79-4a47-90e5-e42dc3cd9bec tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_plug" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:32:54.935 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.95 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:32:56.220 DEBUG oslo_concurrency.lockutils [req-36fbe864-5a79-4a47-90e5-e42dc3cd9bec tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_plug" :: held 2.162s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:32:56.221 DEBUG nova.virt.xenapi.vm_utils [req-36fbe864-5a79-4a47-90e5-e42dc3cd9bec tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Plugging VBD OpaqueRef:0843a3c9-f99e-1da1-b4d9-87f4d241e299 done. vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2194 2015-08-07 17:32:56.226 DEBUG nova.virt.xenapi.vm_utils [req-36fbe864-5a79-4a47-90e5-e42dc3cd9bec tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] VBD OpaqueRef:0843a3c9-f99e-1da1-b4d9-87f4d241e299 plugged as xvdc vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2197 2015-08-07 17:32:56.324 WARNING nova.virt.configdrive [req-36fbe864-5a79-4a47-90e5-e42dc3cd9bec tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] The setting "always" will be deprecated in the Liberty version. 
Please use "True" instead 2015-08-07 17:32:56.325 DEBUG nova.objects.instance [req-36fbe864-5a79-4a47-90e5-e42dc3cd9bec tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Lazy-loading `ec2_ids' on Instance uuid 8e5ee78c-20e2-4483-ab75-b109bb2fdca6 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:32:56.358 DEBUG oslo_concurrency.processutils [req-36fbe864-5a79-4a47-90e5-e42dc3cd9bec tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Running cmd (subprocess): genisoimage -o /tmp/tmpGrQakG/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmp9n6sWI execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:32:56.733 DEBUG oslo_concurrency.processutils [req-36fbe864-5a79-4a47-90e5-e42dc3cd9bec tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] CMD "genisoimage -o /tmp/tmpGrQakG/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmp9n6sWI" returned: 0 in 0.374s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:32:56.738 DEBUG oslo_concurrency.processutils [req-36fbe864-5a79-4a47-90e5-e42dc3cd9bec tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmpGrQakG/configdrive of=/dev/xvdc oflag=direct,sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:33:05.038 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.85 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:33:06.385 DEBUG oslo_concurrency.processutils [req-36fbe864-5a79-4a47-90e5-e42dc3cd9bec tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmpGrQakG/configdrive of=/dev/xvdc oflag=direct,sync" returned: 0 in 9.646s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:33:06.390 DEBUG oslo_concurrency.processutils [req-36fbe864-5a79-4a47-90e5-e42dc3cd9bec tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:33:06.982 DEBUG oslo_concurrency.processutils [req-36fbe864-5a79-4a47-90e5-e42dc3cd9bec tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf sync" returned: 0 in 0.591s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:33:06.985 DEBUG nova.virt.xenapi.vm_utils [req-36fbe864-5a79-4a47-90e5-e42dc3cd9bec tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Destroying VBD for VDI OpaqueRef:aed803a9-3732-20c7-4dbb-466995d3aa4a ... 
vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2207 2015-08-07 17:33:06.987 DEBUG oslo_concurrency.lockutils [req-36fbe864-5a79-4a47-90e5-e42dc3cd9bec tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_unplug" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:33:08.469 DEBUG oslo_concurrency.lockutils [req-36fbe864-5a79-4a47-90e5-e42dc3cd9bec tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_unplug" :: held 1.482s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:33:08.479 DEBUG nova.virt.xenapi.vm_utils [req-36fbe864-5a79-4a47-90e5-e42dc3cd9bec tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Destroying VBD for VDI OpaqueRef:aed803a9-3732-20c7-4dbb-466995d3aa4a done. vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2215 2015-08-07 17:33:08.479 DEBUG nova.virt.xenapi.vm_utils [req-36fbe864-5a79-4a47-90e5-e42dc3cd9bec tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Creating disk-type VBD for VM OpaqueRef:ee6b2384-0aeb-abc8-e4f5-0605e4f94e09, VDI OpaqueRef:aed803a9-3732-20c7-4dbb-466995d3aa4a ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:33:08.489 DEBUG nova.virt.xenapi.vm_utils [req-36fbe864-5a79-4a47-90e5-e42dc3cd9bec tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Created VBD OpaqueRef:85b36212-2241-c341-1ce5-e31277671319 for VM OpaqueRef:ee6b2384-0aeb-abc8-e4f5-0605e4f94e09, VDI OpaqueRef:aed803a9-3732-20c7-4dbb-466995d3aa4a. 
create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:33:08.491 DEBUG nova.objects.instance [req-36fbe864-5a79-4a47-90e5-e42dc3cd9bec tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Lazy-loading `pci_devices' on Instance uuid 8e5ee78c-20e2-4483-ab75-b109bb2fdca6 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:33:08.678 DEBUG oslo_concurrency.lockutils [req-36fbe864-5a79-4a47-90e5-e42dc3cd9bec tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Lock "xenstore-8e5ee78c-20e2-4483-ab75-b109bb2fdca6" acquired by "store_meta" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:33:08.679 DEBUG oslo_concurrency.lockutils [req-36fbe864-5a79-4a47-90e5-e42dc3cd9bec tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Lock "xenstore-8e5ee78c-20e2-4483-ab75-b109bb2fdca6" released by "store_meta" :: held 0.001s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:33:08.680 DEBUG oslo_concurrency.lockutils [req-36fbe864-5a79-4a47-90e5-e42dc3cd9bec tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Lock "xenstore-8e5ee78c-20e2-4483-ab75-b109bb2fdca6" acquired by "store_auto_disk_config" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:33:08.698 DEBUG oslo_concurrency.lockutils [req-36fbe864-5a79-4a47-90e5-e42dc3cd9bec tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Lock "xenstore-8e5ee78c-20e2-4483-ab75-b109bb2fdca6" released by "store_auto_disk_config" :: held 0.018s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:33:08.698 DEBUG nova.virt.xenapi.vmops [req-36fbe864-5a79-4a47-90e5-e42dc3cd9bec tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] Injecting network info to xenstore inject_network_info /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1869 2015-08-07 17:33:08.699 DEBUG oslo_concurrency.lockutils [req-36fbe864-5a79-4a47-90e5-e42dc3cd9bec tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Lock "xenstore-8e5ee78c-20e2-4483-ab75-b109bb2fdca6" acquired by "update_nwinfo" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:33:09.010 DEBUG oslo_concurrency.lockutils [req-36fbe864-5a79-4a47-90e5-e42dc3cd9bec tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Lock "xenstore-8e5ee78c-20e2-4483-ab75-b109bb2fdca6" released by "update_nwinfo" :: held 0.311s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:33:09.011 DEBUG nova.virt.xenapi.vmops [req-36fbe864-5a79-4a47-90e5-e42dc3cd9bec tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] Creating vifs _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1892 2015-08-07 17:33:09.019 DEBUG nova.virt.xenapi.vmops [req-36fbe864-5a79-4a47-90e5-e42dc3cd9bec tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] Creating VIF for network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs 
/opt/stack/new/nova/nova/virt/xenapi/vmops.py:1902 2015-08-07 17:33:09.029 DEBUG nova.virt.xenapi.vmops [req-36fbe864-5a79-4a47-90e5-e42dc3cd9bec tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] Created VIF OpaqueRef:3da40f29-b285-9a99-7974-8f14d44904f3, network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1906 2015-08-07 17:33:09.030 DEBUG nova.virt.xenapi.vmops [req-36fbe864-5a79-4a47-90e5-e42dc3cd9bec tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] Starting instance _start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:327 2015-08-07 17:33:12.749 DEBUG nova.virt.xenapi.volumeops [req-fdf4a6bd-68a3-44e5-bcf6-f39306671617 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Connect volume to hypervisor: {u'access_mode': u'rw', u'target_discovered': False, u'encrypted': False, u'qos_specs': None, u'target_iqn': u'iqn.2010-10.org.openstack:volume-f8f91a12-4f64-4f81-bfd9-5a81095b7f55', u'target_portal': u'104.130.119.114:3260', u'volume_id': u'f8f91a12-4f64-4f81-bfd9-5a81095b7f55', u'target_lun': 1, u'auth_password': u'ep8rF6YVfqoUbDCP', u'auth_username': u'MN8x5Dp7tLMXAi3CcEEN', u'auth_method': u'CHAP'} _connect_hypervisor_to_volume /opt/stack/new/nova/nova/virt/xenapi/volumeops.py:94 2015-08-07 17:33:12.781 DEBUG nova.virt.xenapi.volume_utils [req-fdf4a6bd-68a3-44e5-bcf6-f39306671617 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] {'sm_config': {'LUNid': '1', 'SCSIid': '33000000100000001'}, 'managed': False, 'snapshots': [], 'allowed_operations': ['forget', 'destroy', 'copy', 'snapshot'], 'on_boot': 'persist', 'name_description': '', 'read_only': False, 'uuid': 'e55110f1-56b1-d008-537d-c3081511f123', 'storage_lock': False, 'name_label': '', 'tags': [], 'location': 'e55110f1-56b1-d008-537d-c3081511f123', 'metadata_of_pool': 'OpaqueRef:NULL', 'type': 'user', 'sharable': False, 'snapshot_time': , 'parent': 'OpaqueRef:NULL', 'missing': False, 'xenstore_data': {}, 'crash_dumps': [], 'virtual_size': '1073741824', 'is_a_snapshot': False, 'current_operations': {}, 'snapshot_of': 'OpaqueRef:NULL', 'SR': 'OpaqueRef:ddfc6981-88ff-56fa-66cc-fe8da7a25127', 'other_config': {}, 'physical_utilisation': '0', 'allow_caching': False, 'metadata_latest': False, 'VBDs': []} introduce_vdi /opt/stack/new/nova/nova/virt/xenapi/volume_utils.py:176 2015-08-07 17:33:13.934 INFO nova.virt.xenapi.volumeops [req-fdf4a6bd-68a3-44e5-bcf6-f39306671617 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Connected volume (vdi_uuid): e55110f1-56b1-d008-537d-c3081511f123 2015-08-07 17:33:13.935 DEBUG nova.virt.xenapi.volumeops [req-fdf4a6bd-68a3-44e5-bcf6-f39306671617 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Attach_volume vdi: OpaqueRef:52e44c54-cbaf-1cf4-1d4b-e60bd31ee9e8 vm: OpaqueRef:1078428a-a383-5386-4368-32608dafc487 _attach_volume_to_vm /opt/stack/new/nova/nova/virt/xenapi/volumeops.py:114 2015-08-07 17:33:13.936 DEBUG nova.virt.xenapi.vm_utils [req-fdf4a6bd-68a3-44e5-bcf6-f39306671617 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Creating disk-type VBD for VM OpaqueRef:1078428a-a383-5386-4368-32608dafc487, VDI OpaqueRef:52e44c54-cbaf-1cf4-1d4b-e60bd31ee9e8 ... 
create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:33:13.956 DEBUG nova.virt.xenapi.vm_utils [req-fdf4a6bd-68a3-44e5-bcf6-f39306671617 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Created VBD OpaqueRef:de956798-597e-332a-496c-8082a1a937fa for VM OpaqueRef:1078428a-a383-5386-4368-32608dafc487, VDI OpaqueRef:52e44c54-cbaf-1cf4-1d4b-e60bd31ee9e8. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:33:13.974 DEBUG nova.virt.xenapi.volumeops [req-fdf4a6bd-68a3-44e5-bcf6-f39306671617 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Plugging VBD: OpaqueRef:de956798-597e-332a-496c-8082a1a937fa _attach_volume_to_vm /opt/stack/new/nova/nova/virt/xenapi/volumeops.py:124 2015-08-07 17:33:13.975 DEBUG oslo_concurrency.lockutils [req-fdf4a6bd-68a3-44e5-bcf6-f39306671617 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "xenapi-vbd-OpaqueRef:1078428a-a383-5386-4368-32608dafc487" acquired by "synchronized_plug" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:33:14.960 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.93 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:33:16.249 DEBUG oslo_concurrency.lockutils [req-fdf4a6bd-68a3-44e5-bcf6-f39306671617 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "xenapi-vbd-OpaqueRef:1078428a-a383-5386-4368-32608dafc487" released by "synchronized_plug" :: held 2.274s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:33:16.250 INFO nova.virt.xenapi.volumeops [req-fdf4a6bd-68a3-44e5-bcf6-f39306671617 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Dev 1 attached to instance instance-00000021 2015-08-07 17:33:16.332 DEBUG keystoneclient.session [req-fdf4a6bd-68a3-44e5-bcf6-f39306671617 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] REQ: curl -g -i -X POST http://192.168.33.1:8776/v2/b00cf207b0ba43d7b6e5bdf182aee463/volumes/f8f91a12-4f64-4f81-bfd9-5a81095b7f55/action -H "User-Agent: python-cinderclient" -H "Content-Type: application/json" -H "Accept: application/json" -H "X-Auth-Token: {SHA1}80fd61cf1aa6ed976e048bd49618d258669ebd4f" -d '{"os-attach": {"instance_uuid": "47eca748-3e73-4f2b-80e9-a3e058bbd8a4", "mountpoint": "/dev/xvdb", "mode": "rw"}}' _http_log_request /usr/local/lib/python2.7/dist-packages/keystoneclient/session.py:195 2015-08-07 17:33:18.874 DEBUG nova.virt.xenapi.vmops [req-36fbe864-5a79-4a47-90e5-e42dc3cd9bec tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] Waiting for instance state to become running _wait_for_instance_to_start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:777 2015-08-07 17:33:18.890 DEBUG nova.virt.xenapi.vmops [req-36fbe864-5a79-4a47-90e5-e42dc3cd9bec tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] Updating progress to 100 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:33:19.968 DEBUG keystoneclient.session [req-fdf4a6bd-68a3-44e5-bcf6-f39306671617 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] RESP: [202] date: Fri, 07 Aug 2015 
17:33:19 GMT connection: keep-alive content-type: text/html; charset=UTF-8 content-length: 0 x-openstack-request-id: req-708f9dd5-6cb9-45a9-b48e-ecf7d9c6bcb2 _http_log_response /usr/local/lib/python2.7/dist-packages/keystoneclient/session.py:224 2015-08-07 17:33:20.073 DEBUG oslo_concurrency.lockutils [req-fdf4a6bd-68a3-44e5-bcf6-f39306671617 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "47eca748-3e73-4f2b-80e9-a3e058bbd8a4" released by "do_attach_volume" :: held 33.003s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:33:21.224 DEBUG oslo_concurrency.lockutils [req-e2ae098d-1fde-425a-8fc3-b3b64e741cca tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Lock "8e5ee78c-20e2-4483-ab75-b109bb2fdca6" acquired by "do_confirm_resize" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:33:21.224 DEBUG nova.compute.manager [req-e2ae098d-1fde-425a-8fc3-b3b64e741cca tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] Going to confirm migration 3 do_confirm_resize /opt/stack/new/nova/nova/compute/manager.py:3243 2015-08-07 17:33:21.543 DEBUG oslo_concurrency.lockutils [req-ea9fabbe-5f61-499b-b027-3ba726d36249 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "47eca748-3e73-4f2b-80e9-a3e058bbd8a4" acquired by "do_terminate_instance" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:33:21.544 DEBUG oslo_concurrency.lockutils [req-ea9fabbe-5f61-499b-b027-3ba726d36249 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "47eca748-3e73-4f2b-80e9-a3e058bbd8a4-events" acquired by "_clear_events" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:33:21.545 DEBUG oslo_concurrency.lockutils [req-ea9fabbe-5f61-499b-b027-3ba726d36249 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "47eca748-3e73-4f2b-80e9-a3e058bbd8a4-events" released by "_clear_events" :: held 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:33:21.547 INFO nova.compute.manager [req-ea9fabbe-5f61-499b-b027-3ba726d36249 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 47eca748-3e73-4f2b-80e9-a3e058bbd8a4] Terminating instance 2015-08-07 17:33:21.549 INFO nova.virt.xenapi.vmops [req-ea9fabbe-5f61-499b-b027-3ba726d36249 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 47eca748-3e73-4f2b-80e9-a3e058bbd8a4] Destroying VM 2015-08-07 17:33:21.563 DEBUG nova.virt.xenapi.vm_utils [req-ea9fabbe-5f61-499b-b027-3ba726d36249 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 47eca748-3e73-4f2b-80e9-a3e058bbd8a4] Shutting down VM (cleanly) clean_shutdown_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:333 2015-08-07 17:33:22.847 DEBUG oslo_concurrency.lockutils [req-e2ae098d-1fde-425a-8fc3-b3b64e741cca tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Acquired semaphore "refresh_cache-8e5ee78c-20e2-4483-ab75-b109bb2fdca6" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:197 2015-08-07 17:33:23.035 DEBUG nova.network.base_api 
[req-e2ae098d-1fde-425a-8fc3-b3b64e741cca tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] Updating instance_info_cache with network_info: [VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.12'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:87:f6:bc', 'active': False, 'type': u'bridge', 'id': u'0082deb6-303e-4fa7-969e-93e52d4e94f6', 'qbg_params': None})] update_instance_cache_with_nw_info /opt/stack/new/nova/nova/network/base_api.py:43 2015-08-07 17:33:23.069 DEBUG oslo_concurrency.lockutils [req-e2ae098d-1fde-425a-8fc3-b3b64e741cca tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Releasing semaphore "refresh_cache-8e5ee78c-20e2-4483-ab75-b109bb2fdca6" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:210 2015-08-07 17:33:23.084 WARNING nova.virt.xenapi.vm_utils [req-e2ae098d-1fde-425a-8fc3-b3b64e741cca tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] VM already halted, skipping shutdown... 
2015-08-07 17:33:23.114 DEBUG nova.virt.xenapi.vmops [req-e2ae098d-1fde-425a-8fc3-b3b64e741cca tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] Destroying VDIs _destroy_vdis /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1428 2015-08-07 17:33:23.133 DEBUG nova.virt.xenapi.vm_utils [req-e2ae098d-1fde-425a-8fc3-b3b64e741cca tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] VDI a293d2b3-e1c6-4a76-a23e-da29cc4fd913 is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:33:23.146 DEBUG nova.virt.xenapi.vm_utils [req-e2ae098d-1fde-425a-8fc3-b3b64e741cca tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] VDI ee911dde-6f05-48a7-a11a-ff0be43a4f85 is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:33:24.373 DEBUG nova.virt.xenapi.vmops [req-e2ae098d-1fde-425a-8fc3-b3b64e741cca tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] Using RAW or VHD, skipping kernel and ramdisk deletion _destroy_kernel_ramdisk /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1456 2015-08-07 17:33:24.388 DEBUG nova.virt.xenapi.vm_utils [req-e2ae098d-1fde-425a-8fc3-b3b64e741cca tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] VM destroyed destroy_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:324 2015-08-07 17:33:24.451 DEBUG oslo_concurrency.lockutils [req-e2ae098d-1fde-425a-8fc3-b3b64e741cca tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Lock "compute_resources" acquired by "drop_move_claim" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:33:24.453 DEBUG oslo_concurrency.lockutils [req-e2ae098d-1fde-425a-8fc3-b3b64e741cca tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Lock "compute_resources" released by "drop_move_claim" :: held 0.002s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:33:24.770 DEBUG oslo_concurrency.lockutils [req-e2ae098d-1fde-425a-8fc3-b3b64e741cca tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Lock "8e5ee78c-20e2-4483-ab75-b109bb2fdca6" released by "do_confirm_resize" :: held 3.546s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:33:24.958 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.93 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:33:26.779 DEBUG oslo_concurrency.lockutils [req-ec855eb3-60ae-4414-9880-c3e316898cc5 tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Lock "8e5ee78c-20e2-4483-ab75-b109bb2fdca6" acquired by "do_terminate_instance" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:33:26.781 DEBUG oslo_concurrency.lockutils [req-ec855eb3-60ae-4414-9880-c3e316898cc5 tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Lock "8e5ee78c-20e2-4483-ab75-b109bb2fdca6-events" acquired by "_clear_events" :: waited 0.000s inner 
/usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:33:26.781 DEBUG oslo_concurrency.lockutils [req-ec855eb3-60ae-4414-9880-c3e316898cc5 tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Lock "8e5ee78c-20e2-4483-ab75-b109bb2fdca6-events" released by "_clear_events" :: held 0.001s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:33:26.783 INFO nova.compute.manager [req-ec855eb3-60ae-4414-9880-c3e316898cc5 tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] Terminating instance 2015-08-07 17:33:26.785 INFO nova.virt.xenapi.vmops [req-ec855eb3-60ae-4414-9880-c3e316898cc5 tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] Destroying VM 2015-08-07 17:33:26.820 DEBUG nova.virt.xenapi.vm_utils [req-ec855eb3-60ae-4414-9880-c3e316898cc5 tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] Shutting down VM (hard) hard_shutdown_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:348 2015-08-07 17:33:29.823 DEBUG nova.virt.xenapi.volume_utils [req-ea9fabbe-5f61-499b-b027-3ba726d36249 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Forgetting SR... forget_sr /opt/stack/new/nova/nova/virt/xenapi/volume_utils.py:237 2015-08-07 17:33:31.512 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:33:31.590 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 1.01 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:33:31.812 DEBUG nova.virt.xenapi.vmops [req-ea9fabbe-5f61-499b-b027-3ba726d36249 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 47eca748-3e73-4f2b-80e9-a3e058bbd8a4] Destroying VDIs _destroy_vdis /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1428 2015-08-07 17:33:31.908 DEBUG nova.virt.xenapi.vm_utils [req-ea9fabbe-5f61-499b-b027-3ba726d36249 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] VDI 91c368d5-79f1-450d-b288-5f9f9f72badd is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:33:31.928 DEBUG nova.virt.xenapi.vm_utils [req-ea9fabbe-5f61-499b-b027-3ba726d36249 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] VDI 79e807cb-1202-4aed-8a50-4564f0ff48a9 is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:33:32.599 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:33:32.600 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:33:32.600 DEBUG nova.compute.manager 
[req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Starting heal instance info cache _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5315 2015-08-07 17:33:32.601 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Rebuilding the list of instances to heal _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5319 2015-08-07 17:33:32.711 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] Skipping network cache update for instance because it is being deleted. _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5333 2015-08-07 17:33:32.711 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: 47eca748-3e73-4f2b-80e9-a3e058bbd8a4] Skipping network cache update for instance because it is being deleted. _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5333 2015-08-07 17:33:32.712 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Didn't find any instances for network info cache update. _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5386 2015-08-07 17:33:32.713 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 0.91 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:33:32.750 DEBUG nova.virt.xenapi.vmops [req-ec855eb3-60ae-4414-9880-c3e316898cc5 tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] Destroying VDIs _destroy_vdis /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1428 2015-08-07 17:33:32.788 DEBUG nova.virt.xenapi.vm_utils [req-ec855eb3-60ae-4414-9880-c3e316898cc5 tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] VDI a191b714-4370-4f7a-8a62-d4acd945ce37 is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:33:32.808 DEBUG nova.virt.xenapi.vm_utils [req-ec855eb3-60ae-4414-9880-c3e316898cc5 tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] VDI cbc60231-2739-49a8-83f5-5d78b6d5cbd5 is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:33:33.303 DEBUG nova.virt.xenapi.vmops [req-ea9fabbe-5f61-499b-b027-3ba726d36249 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 47eca748-3e73-4f2b-80e9-a3e058bbd8a4] Using RAW or VHD, skipping kernel and ramdisk deletion _destroy_kernel_ramdisk /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1456 2015-08-07 17:33:33.331 DEBUG nova.virt.xenapi.vm_utils [req-ea9fabbe-5f61-499b-b027-3ba726d36249 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 47eca748-3e73-4f2b-80e9-a3e058bbd8a4] VM destroyed destroy_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:324 2015-08-07 17:33:33.332 DEBUG nova.compute.manager [req-ea9fabbe-5f61-499b-b027-3ba726d36249 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 47eca748-3e73-4f2b-80e9-a3e058bbd8a4] Deallocating network for instance _deallocate_network /opt/stack/new/nova/nova/compute/manager.py:1792 2015-08-07 17:33:33.626 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task 
ComputeManager._check_instance_build_time run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:33:33.627 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 0.89 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:33:34.523 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:33:34.524 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 5.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:33:34.975 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.91 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:33:35.392 DEBUG keystoneclient.session [req-ea9fabbe-5f61-499b-b027-3ba726d36249 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] REQ: curl -g -i -X POST http://192.168.33.1:8776/v2/b00cf207b0ba43d7b6e5bdf182aee463/volumes/f8f91a12-4f64-4f81-bfd9-5a81095b7f55/action -H "User-Agent: python-cinderclient" -H "Content-Type: application/json" -H "Accept: application/json" -H "X-Auth-Token: {SHA1}80fd61cf1aa6ed976e048bd49618d258669ebd4f" -d '{"os-terminate_connection": {"connector": {"ip": "192.168.33.2", "initiator": "iqn.2015-08.com.example:3b035c79", "host": "localhost.localdomain"}}}' _http_log_request /usr/local/lib/python2.7/dist-packages/keystoneclient/session.py:195 2015-08-07 17:33:35.706 DEBUG keystoneclient.session [req-ea9fabbe-5f61-499b-b027-3ba726d36249 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] RESP: [202] date: Fri, 07 Aug 2015 17:33:35 GMT connection: keep-alive content-type: text/html; charset=UTF-8 content-length: 0 x-openstack-request-id: req-9d3d3441-704e-4823-85d6-3ef29749fd3c _http_log_response /usr/local/lib/python2.7/dist-packages/keystoneclient/session.py:224 2015-08-07 17:33:35.707 DEBUG keystoneclient.session [req-ea9fabbe-5f61-499b-b027-3ba726d36249 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] REQ: curl -g -i -X POST http://192.168.33.1:8776/v2/b00cf207b0ba43d7b6e5bdf182aee463/volumes/f8f91a12-4f64-4f81-bfd9-5a81095b7f55/action -H "User-Agent: python-cinderclient" -H "Content-Type: application/json" -H "Accept: application/json" -H "X-Auth-Token: {SHA1}80fd61cf1aa6ed976e048bd49618d258669ebd4f" -d '{"os-detach": {"attachment_id": null}}' _http_log_request /usr/local/lib/python2.7/dist-packages/keystoneclient/session.py:195 2015-08-07 17:33:37.972 DEBUG oslo_concurrency.lockutils [req-3cd3f9fa-4c34-4e5f-a2c3-0159d8fae4b5 tempest-InstanceActionsNegativeTestJSON-244825089 tempest-InstanceActionsNegativeTestJSON-519695908] Lock "1ed8c9a4-5157-4b0b-af78-9a5d74576f17" acquired by "_locked_do_build_and_run_instance" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:33:38.132 INFO nova.compute.manager [req-3cd3f9fa-4c34-4e5f-a2c3-0159d8fae4b5 tempest-InstanceActionsNegativeTestJSON-244825089 tempest-InstanceActionsNegativeTestJSON-519695908] [instance: 1ed8c9a4-5157-4b0b-af78-9a5d74576f17] Starting instance... 
2015-08-07 17:33:38.609 DEBUG oslo_concurrency.lockutils [req-3cd3f9fa-4c34-4e5f-a2c3-0159d8fae4b5 tempest-InstanceActionsNegativeTestJSON-244825089 tempest-InstanceActionsNegativeTestJSON-519695908] Lock "compute_resources" acquired by "instance_claim" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:33:38.609 DEBUG nova.compute.resource_tracker [req-3cd3f9fa-4c34-4e5f-a2c3-0159d8fae4b5 tempest-InstanceActionsNegativeTestJSON-244825089 tempest-InstanceActionsNegativeTestJSON-519695908] Memory overhead for 64 MB instance; 5 MB instance_claim /opt/stack/new/nova/nova/compute/resource_tracker.py:126 2015-08-07 17:33:38.617 INFO nova.compute.claims [req-3cd3f9fa-4c34-4e5f-a2c3-0159d8fae4b5 tempest-InstanceActionsNegativeTestJSON-244825089 tempest-InstanceActionsNegativeTestJSON-519695908] [instance: 1ed8c9a4-5157-4b0b-af78-9a5d74576f17] Attempting claim: memory 69 MB, disk 0 GB 2015-08-07 17:33:38.618 INFO nova.compute.claims [req-3cd3f9fa-4c34-4e5f-a2c3-0159d8fae4b5 tempest-InstanceActionsNegativeTestJSON-244825089 tempest-InstanceActionsNegativeTestJSON-519695908] [instance: 1ed8c9a4-5157-4b0b-af78-9a5d74576f17] Total memory: 8187 MB, used: 784.00 MB 2015-08-07 17:33:38.618 INFO nova.compute.claims [req-3cd3f9fa-4c34-4e5f-a2c3-0159d8fae4b5 tempest-InstanceActionsNegativeTestJSON-244825089 tempest-InstanceActionsNegativeTestJSON-519695908] [instance: 1ed8c9a4-5157-4b0b-af78-9a5d74576f17] memory limit: 12280.50 MB, free: 11496.50 MB 2015-08-07 17:33:38.619 INFO nova.compute.claims [req-3cd3f9fa-4c34-4e5f-a2c3-0159d8fae4b5 tempest-InstanceActionsNegativeTestJSON-244825089 tempest-InstanceActionsNegativeTestJSON-519695908] [instance: 1ed8c9a4-5157-4b0b-af78-9a5d74576f17] Total disk: 27 GB, used: 0.00 GB 2015-08-07 17:33:38.619 INFO nova.compute.claims [req-3cd3f9fa-4c34-4e5f-a2c3-0159d8fae4b5 tempest-InstanceActionsNegativeTestJSON-244825089 tempest-InstanceActionsNegativeTestJSON-519695908] [instance: 1ed8c9a4-5157-4b0b-af78-9a5d74576f17] disk limit not specified, defaulting to unlimited 2015-08-07 17:33:38.659 DEBUG nova.compute.resources.vcpu [req-3cd3f9fa-4c34-4e5f-a2c3-0159d8fae4b5 tempest-InstanceActionsNegativeTestJSON-244825089 tempest-InstanceActionsNegativeTestJSON-519695908] Total CPUs: 8 VCPUs, used: 3.00 VCPUs test /opt/stack/new/nova/nova/compute/resources/vcpu.py:52 2015-08-07 17:33:38.660 DEBUG nova.compute.resources.vcpu [req-3cd3f9fa-4c34-4e5f-a2c3-0159d8fae4b5 tempest-InstanceActionsNegativeTestJSON-244825089 tempest-InstanceActionsNegativeTestJSON-519695908] CPUs limit not specified, defaulting to unlimited test /opt/stack/new/nova/nova/compute/resources/vcpu.py:56 2015-08-07 17:33:38.661 INFO nova.compute.claims [req-3cd3f9fa-4c34-4e5f-a2c3-0159d8fae4b5 tempest-InstanceActionsNegativeTestJSON-244825089 tempest-InstanceActionsNegativeTestJSON-519695908] [instance: 1ed8c9a4-5157-4b0b-af78-9a5d74576f17] Claim successful 2015-08-07 17:33:39.152 DEBUG oslo_concurrency.lockutils [req-3cd3f9fa-4c34-4e5f-a2c3-0159d8fae4b5 tempest-InstanceActionsNegativeTestJSON-244825089 tempest-InstanceActionsNegativeTestJSON-519695908] Lock "compute_resources" released by "instance_claim" :: held 0.543s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:33:39.501 DEBUG oslo_concurrency.lockutils [req-3cd3f9fa-4c34-4e5f-a2c3-0159d8fae4b5 tempest-InstanceActionsNegativeTestJSON-244825089 tempest-InstanceActionsNegativeTestJSON-519695908] Lock "compute_resources" acquired by "update_usage" :: 
waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:33:39.520 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:33:39.522 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /opt/stack/new/nova/nova/compute/manager.py:5976 2015-08-07 17:33:39.522 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 3.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:33:39.692 DEBUG oslo_concurrency.lockutils [req-3cd3f9fa-4c34-4e5f-a2c3-0159d8fae4b5 tempest-InstanceActionsNegativeTestJSON-244825089 tempest-InstanceActionsNegativeTestJSON-519695908] Lock "compute_resources" released by "update_usage" :: held 0.191s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:33:39.694 DEBUG nova.compute.utils [req-3cd3f9fa-4c34-4e5f-a2c3-0159d8fae4b5 tempest-InstanceActionsNegativeTestJSON-244825089 tempest-InstanceActionsNegativeTestJSON-519695908] Using /dev/xvd instead of None get_next_device_name /opt/stack/new/nova/nova/compute/utils.py:163 2015-08-07 17:33:39.703 DEBUG keystoneclient.session [req-ea9fabbe-5f61-499b-b027-3ba726d36249 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] RESP: [202] date: Fri, 07 Aug 2015 17:33:39 GMT connection: keep-alive content-type: text/html; charset=UTF-8 content-length: 0 x-openstack-request-id: req-29ae055d-8524-4bf4-9601-8bfae50f9aed _http_log_response /usr/local/lib/python2.7/dist-packages/keystoneclient/session.py:224 2015-08-07 17:33:39.708 13318 DEBUG nova.compute.manager [-] [instance: 1ed8c9a4-5157-4b0b-af78-9a5d74576f17] Allocating IP information in the background. 
_allocate_network_async /opt/stack/new/nova/nova/compute/manager.py:1513 2015-08-07 17:33:39.709 13318 DEBUG oslo_concurrency.lockutils [-] Acquired semaphore "refresh_cache-1ed8c9a4-5157-4b0b-af78-9a5d74576f17" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:197 2015-08-07 17:33:39.827 DEBUG nova.compute.manager [req-ea9fabbe-5f61-499b-b027-3ba726d36249 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 47eca748-3e73-4f2b-80e9-a3e058bbd8a4] terminating bdm BlockDeviceMapping(boot_index=0,connection_info=None,created_at=2015-08-07T17:31:46Z,delete_on_termination=True,deleted=False,deleted_at=None,destination_type='local',device_name='/dev/xvda',device_type='disk',disk_bus=None,guest_format=None,id=33,image_id='a69d8d55-1745-492f-be26-cc75d64fc94d',instance=,instance_uuid=47eca748-3e73-4f2b-80e9-a3e058bbd8a4,no_device=False,snapshot_id=None,source_type='image',updated_at=2015-08-07T17:31:48Z,volume_id=None,volume_size=None) _cleanup_volumes /opt/stack/new/nova/nova/compute/manager.py:2272 2015-08-07 17:33:39.832 DEBUG nova.compute.manager [req-ea9fabbe-5f61-499b-b027-3ba726d36249 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 47eca748-3e73-4f2b-80e9-a3e058bbd8a4] terminating bdm BlockDeviceMapping(boot_index=None,connection_info='{"driver_volume_type": "iscsi", "serial": "f8f91a12-4f64-4f81-bfd9-5a81095b7f55", "data": {"access_mode": "rw", "target_discovered": false, "encrypted": false, "qos_specs": null, "target_iqn": "iqn.2010-10.org.openstack:volume-f8f91a12-4f64-4f81-bfd9-5a81095b7f55", "target_portal": "104.130.119.114:3260", "volume_id": "f8f91a12-4f64-4f81-bfd9-5a81095b7f55", "target_lun": 1, "auth_password": "ep8rF6YVfqoUbDCP", "auth_username": "MN8x5Dp7tLMXAi3CcEEN", "auth_method": "CHAP"}}',created_at=2015-08-07T17:32:46Z,delete_on_termination=False,deleted=False,deleted_at=None,destination_type='volume',device_name='/dev/xvdb',device_type=None,disk_bus=None,guest_format=None,id=35,image_id=None,instance=,instance_uuid=47eca748-3e73-4f2b-80e9-a3e058bbd8a4,no_device=False,snapshot_id=None,source_type='volume',updated_at=2015-08-07T17:33:19Z,volume_id='f8f91a12-4f64-4f81-bfd9-5a81095b7f55',volume_size=1) _cleanup_volumes /opt/stack/new/nova/nova/compute/manager.py:2272 2015-08-07 17:33:40.172 DEBUG oslo_concurrency.lockutils [req-ea9fabbe-5f61-499b-b027-3ba726d36249 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "compute_resources" acquired by "update_usage" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:33:40.174 DEBUG nova.objects.instance [req-ea9fabbe-5f61-499b-b027-3ba726d36249 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lazy-loading `numa_topology' on Instance uuid 47eca748-3e73-4f2b-80e9-a3e058bbd8a4 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:33:40.340 DEBUG oslo_concurrency.lockutils [req-ea9fabbe-5f61-499b-b027-3ba726d36249 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "compute_resources" released by "update_usage" :: held 0.169s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:33:40.393 DEBUG nova.virt.xenapi.vmops [req-3cd3f9fa-4c34-4e5f-a2c3-0159d8fae4b5 tempest-InstanceActionsNegativeTestJSON-244825089 tempest-InstanceActionsNegativeTestJSON-519695908] [instance: 1ed8c9a4-5157-4b0b-af78-9a5d74576f17] 
Block device information present: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} spawn /opt/stack/new/nova/nova/virt/xenapi/vmops.py:408 2015-08-07 17:33:40.459 DEBUG nova.virt.xenapi.vm_utils [req-3cd3f9fa-4c34-4e5f-a2c3-0159d8fae4b5 tempest-InstanceActionsNegativeTestJSON-244825089 tempest-InstanceActionsNegativeTestJSON-519695908] Detected vhd format for image a69d8d55-1745-492f-be26-cc75d64fc94d determine_disk_image_type /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1633 2015-08-07 17:33:40.499 DEBUG nova.virt.xenapi.vmops [req-3cd3f9fa-4c34-4e5f-a2c3-0159d8fae4b5 tempest-InstanceActionsNegativeTestJSON-244825089 tempest-InstanceActionsNegativeTestJSON-519695908] [instance: 1ed8c9a4-5157-4b0b-af78-9a5d74576f17] Updating progress to 10 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:33:41.057 DEBUG nova.virt.xenapi.vmops [req-3cd3f9fa-4c34-4e5f-a2c3-0159d8fae4b5 tempest-InstanceActionsNegativeTestJSON-244825089 tempest-InstanceActionsNegativeTestJSON-519695908] [instance: 1ed8c9a4-5157-4b0b-af78-9a5d74576f17] block device info: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} _connect_cinder_volumes /opt/stack/new/nova/nova/virt/xenapi/vmops.py:376 2015-08-07 17:33:41.084 DEBUG oslo_concurrency.lockutils [req-3cd3f9fa-4c34-4e5f-a2c3-0159d8fae4b5 tempest-InstanceActionsNegativeTestJSON-244825089 tempest-InstanceActionsNegativeTestJSON-519695908] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" acquired by "_create_cached_image_impl" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:33:41.158 DEBUG nova.virt.xenapi.vmops [req-ec855eb3-60ae-4414-9880-c3e316898cc5 tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] Using RAW or VHD, skipping kernel and ramdisk deletion _destroy_kernel_ramdisk /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1456 2015-08-07 17:33:41.174 DEBUG nova.virt.xenapi.vm_utils [req-ec855eb3-60ae-4414-9880-c3e316898cc5 tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] VM destroyed destroy_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:324 2015-08-07 17:33:41.175 DEBUG nova.compute.manager [req-ec855eb3-60ae-4414-9880-c3e316898cc5 tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] Deallocating network for instance _deallocate_network /opt/stack/new/nova/nova/compute/manager.py:1792 2015-08-07 17:33:41.216 DEBUG oslo_concurrency.lockutils [req-ea9fabbe-5f61-499b-b027-3ba726d36249 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "47eca748-3e73-4f2b-80e9-a3e058bbd8a4" released by "do_terminate_instance" :: held 19.673s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:33:42.520 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:33:42.521 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 1.00 seconds _run_loop 
/usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:33:43.525 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:33:43.526 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 2.99 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:33:43.936 13318 DEBUG nova.network.base_api [-] [instance: 1ed8c9a4-5157-4b0b-af78-9a5d74576f17] Updating instance_info_cache with network_info: [VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.2'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:77:05:5f', 'active': False, 'type': u'bridge', 'id': u'492bfbbf-d933-4edf-a161-2e0e4a960d0f', 'qbg_params': None})] update_instance_cache_with_nw_info /opt/stack/new/nova/nova/network/base_api.py:43 2015-08-07 17:33:43.983 13318 DEBUG oslo_concurrency.lockutils [-] Releasing semaphore "refresh_cache-1ed8c9a4-5157-4b0b-af78-9a5d74576f17" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:210 2015-08-07 17:33:43.984 13318 DEBUG nova.compute.manager [-] [instance: 1ed8c9a4-5157-4b0b-af78-9a5d74576f17] Instance network_info: |[VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.2'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:77:05:5f', 'active': False, 'type': u'bridge', 'id': u'492bfbbf-d933-4edf-a161-2e0e4a960d0f', 'qbg_params': None})]| _allocate_network_async /opt/stack/new/nova/nova/compute/manager.py:1531 2015-08-07 17:33:44.214 DEBUG oslo_concurrency.lockutils 
[req-af1b7459-07d7-425a-949b-ab1784cd578b tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "274c80c2-c6c1-4d76-8555-5b7a5f4a8029" acquired by "_locked_do_build_and_run_instance" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:33:44.359 INFO nova.compute.manager [req-af1b7459-07d7-425a-949b-ab1784cd578b tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 274c80c2-c6c1-4d76-8555-5b7a5f4a8029] Starting instance... 2015-08-07 17:33:44.640 DEBUG nova.compute.manager [req-af1b7459-07d7-425a-949b-ab1784cd578b tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 274c80c2-c6c1-4d76-8555-5b7a5f4a8029] Unexpected task state: expecting [u'scheduling', None] but the actual state is deleting _do_build_and_run_instance /opt/stack/new/nova/nova/compute/manager.py:1871 2015-08-07 17:33:44.778 DEBUG oslo_concurrency.lockutils [req-af1b7459-07d7-425a-949b-ab1784cd578b tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "274c80c2-c6c1-4d76-8555-5b7a5f4a8029" released by "_locked_do_build_and_run_instance" :: held 0.564s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:33:44.961 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.93 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:33:45.017 DEBUG nova.compute.manager [req-ec855eb3-60ae-4414-9880-c3e316898cc5 tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] terminating bdm BlockDeviceMapping(boot_index=0,connection_info=None,created_at=2015-08-07T17:27:41Z,delete_on_termination=True,deleted=False,deleted_at=None,destination_type='local',device_name='/dev/xvda',device_type='disk',disk_bus=None,guest_format=None,id=27,image_id='a69d8d55-1745-492f-be26-cc75d64fc94d',instance=,instance_uuid=8e5ee78c-20e2-4483-ab75-b109bb2fdca6,no_device=False,snapshot_id=None,source_type='image',updated_at=2015-08-07T17:27:43Z,volume_id=None,volume_size=None) _cleanup_volumes /opt/stack/new/nova/nova/compute/manager.py:2272 2015-08-07 17:33:45.279 DEBUG oslo_concurrency.lockutils [req-ec855eb3-60ae-4414-9880-c3e316898cc5 tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Lock "compute_resources" acquired by "update_usage" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:33:45.280 DEBUG nova.objects.instance [req-ec855eb3-60ae-4414-9880-c3e316898cc5 tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Lazy-loading `numa_topology' on Instance uuid 8e5ee78c-20e2-4483-ab75-b109bb2fdca6 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:33:45.453 DEBUG oslo_concurrency.lockutils [req-ec855eb3-60ae-4414-9880-c3e316898cc5 tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Lock "compute_resources" released by "update_usage" :: held 0.175s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:33:45.934 DEBUG oslo_concurrency.lockutils [req-ec855eb3-60ae-4414-9880-c3e316898cc5 tempest-ServerDiskConfigTestJSON-1169161174 tempest-ServerDiskConfigTestJSON-1129284864] Lock "8e5ee78c-20e2-4483-ab75-b109bb2fdca6" released by 
"do_terminate_instance" :: held 19.155s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:33:46.522 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:33:46.560 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Auditing locally available compute resources for node localhost.localdomain 2015-08-07 17:33:46.561 DEBUG nova.virt.xenapi.host [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Updating host stats update_status /opt/stack/new/nova/nova/virt/xenapi/host.py:234 2015-08-07 17:33:46.778 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" acquired by "do_scan" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:33:46.778 DEBUG nova.virt.xenapi.vm_utils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Scanning SR OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99 do_scan /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1839 2015-08-07 17:33:47.055 DEBUG oslo_concurrency.lockutils [req-763716c0-67ce-4891-9fb5-fa9ff10dda34 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "5fe04666-4b1b-41cf-bca7-ce6c2b277477" acquired by "_locked_do_build_and_run_instance" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:33:47.132 INFO nova.compute.manager [req-763716c0-67ce-4891-9fb5-fa9ff10dda34 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 5fe04666-4b1b-41cf-bca7-ce6c2b277477] Starting instance... 
2015-08-07 17:33:47.271 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" released by "do_scan" :: held 0.494s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:33:47.357 DEBUG oslo_concurrency.lockutils [req-763716c0-67ce-4891-9fb5-fa9ff10dda34 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "compute_resources" acquired by "instance_claim" :: waited 0.001s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:33:47.357 DEBUG nova.compute.resource_tracker [req-763716c0-67ce-4891-9fb5-fa9ff10dda34 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Memory overhead for 64 MB instance; 5 MB instance_claim /opt/stack/new/nova/nova/compute/resource_tracker.py:126 2015-08-07 17:33:47.365 INFO nova.compute.claims [req-763716c0-67ce-4891-9fb5-fa9ff10dda34 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 5fe04666-4b1b-41cf-bca7-ce6c2b277477] Attempting claim: memory 69 MB, disk 0 GB 2015-08-07 17:33:47.366 INFO nova.compute.claims [req-763716c0-67ce-4891-9fb5-fa9ff10dda34 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 5fe04666-4b1b-41cf-bca7-ce6c2b277477] Total memory: 8187 MB, used: 715.00 MB 2015-08-07 17:33:47.367 INFO nova.compute.claims [req-763716c0-67ce-4891-9fb5-fa9ff10dda34 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 5fe04666-4b1b-41cf-bca7-ce6c2b277477] memory limit: 12280.50 MB, free: 11565.50 MB 2015-08-07 17:33:47.367 INFO nova.compute.claims [req-763716c0-67ce-4891-9fb5-fa9ff10dda34 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 5fe04666-4b1b-41cf-bca7-ce6c2b277477] Total disk: 27 GB, used: 0.00 GB 2015-08-07 17:33:47.367 INFO nova.compute.claims [req-763716c0-67ce-4891-9fb5-fa9ff10dda34 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 5fe04666-4b1b-41cf-bca7-ce6c2b277477] disk limit not specified, defaulting to unlimited 2015-08-07 17:33:47.392 DEBUG nova.compute.resources.vcpu [req-763716c0-67ce-4891-9fb5-fa9ff10dda34 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Total CPUs: 8 VCPUs, used: 2.00 VCPUs test /opt/stack/new/nova/nova/compute/resources/vcpu.py:52 2015-08-07 17:33:47.393 DEBUG nova.compute.resources.vcpu [req-763716c0-67ce-4891-9fb5-fa9ff10dda34 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] CPUs limit not specified, defaulting to unlimited test /opt/stack/new/nova/nova/compute/resources/vcpu.py:56 2015-08-07 17:33:47.393 INFO nova.compute.claims [req-763716c0-67ce-4891-9fb5-fa9ff10dda34 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 5fe04666-4b1b-41cf-bca7-ce6c2b277477] Claim successful 2015-08-07 17:33:47.636 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor: free VCPUs: 2 _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:524 2015-08-07 17:33:47.636 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor: assignable PCI devices: [] _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:532 2015-08-07 17:33:47.637 DEBUG 
nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor/Node resource view: name=localhost.localdomain free_ram=989MB free_disk=16GB free_vcpus=2 pci_devices=[] _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:546 2015-08-07 17:33:47.776 DEBUG oslo_concurrency.lockutils [req-763716c0-67ce-4891-9fb5-fa9ff10dda34 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "compute_resources" released by "instance_claim" :: held 0.419s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:33:47.786 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "compute_resources" acquired by "_update_available_resource" :: waited 0.148s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:33:48.100 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Total usable vcpus: 8, total allocated vcpus: 6 2015-08-07 17:33:48.101 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Final resource view: name=localhost.localdomain phys_ram=8187MB used_ram=650MB phys_disk=27GB used_disk=0GB total_vcpus=8 used_vcpus=6 pci_stats=None 2015-08-07 17:33:48.195 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Compute_service record updated for devstack:localhost.localdomain 2015-08-07 17:33:48.195 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "compute_resources" released by "_update_available_resource" :: held 0.409s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:33:48.196 DEBUG oslo_concurrency.lockutils [req-763716c0-67ce-4891-9fb5-fa9ff10dda34 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "compute_resources" acquired by "update_usage" :: waited 0.147s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:33:48.198 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 3.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:33:48.297 DEBUG oslo_concurrency.lockutils [req-763716c0-67ce-4891-9fb5-fa9ff10dda34 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "compute_resources" released by "update_usage" :: held 0.101s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:33:48.298 DEBUG nova.compute.utils [req-763716c0-67ce-4891-9fb5-fa9ff10dda34 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Using /dev/xvd instead of None get_next_device_name /opt/stack/new/nova/nova/compute/utils.py:163 2015-08-07 17:33:48.302 13318 DEBUG nova.compute.manager [-] [instance: 5fe04666-4b1b-41cf-bca7-ce6c2b277477] Allocating IP information in the background. 
_allocate_network_async /opt/stack/new/nova/nova/compute/manager.py:1513 2015-08-07 17:33:48.303 13318 DEBUG oslo_concurrency.lockutils [-] Acquired semaphore "refresh_cache-5fe04666-4b1b-41cf-bca7-ce6c2b277477" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:197 2015-08-07 17:33:48.920 DEBUG nova.virt.xenapi.vmops [req-763716c0-67ce-4891-9fb5-fa9ff10dda34 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 5fe04666-4b1b-41cf-bca7-ce6c2b277477] Block device information present: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} spawn /opt/stack/new/nova/nova/virt/xenapi/vmops.py:408 2015-08-07 17:33:48.932 DEBUG nova.virt.xenapi.vm_utils [req-763716c0-67ce-4891-9fb5-fa9ff10dda34 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Detected vhd format for image a69d8d55-1745-492f-be26-cc75d64fc94d determine_disk_image_type /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1633 2015-08-07 17:33:48.933 DEBUG nova.virt.xenapi.vmops [req-763716c0-67ce-4891-9fb5-fa9ff10dda34 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 5fe04666-4b1b-41cf-bca7-ce6c2b277477] Updating progress to 10 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:33:49.215 DEBUG nova.virt.xenapi.vmops [req-763716c0-67ce-4891-9fb5-fa9ff10dda34 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 5fe04666-4b1b-41cf-bca7-ce6c2b277477] block device info: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} _connect_cinder_volumes /opt/stack/new/nova/nova/virt/xenapi/vmops.py:376 2015-08-07 17:33:50.149 DEBUG nova.virt.xenapi.vm_utils [req-3cd3f9fa-4c34-4e5f-a2c3-0159d8fae4b5 tempest-InstanceActionsNegativeTestJSON-244825089 tempest-InstanceActionsNegativeTestJSON-519695908] Cloned VDI OpaqueRef:1b29178c-daa0-ad88-8bec-d2ddea479b64 from VDI OpaqueRef:f392c64e-7886-a0cc-d3a0-780674167307 _clone_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:571 2015-08-07 17:33:50.781 13318 DEBUG nova.network.base_api [-] [instance: 5fe04666-4b1b-41cf-bca7-ce6c2b277477] Updating instance_info_cache with network_info: [VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.4'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:a1:15:3f', 'active': False, 'type': u'bridge', 'id': u'ef7c10f9-87a8-46b3-acde-c85cf230ef09', 'qbg_params': None})] update_instance_cache_with_nw_info /opt/stack/new/nova/nova/network/base_api.py:43 2015-08-07 17:33:50.800 DEBUG 
oslo_concurrency.lockutils [req-3cd3f9fa-4c34-4e5f-a2c3-0159d8fae4b5 tempest-InstanceActionsNegativeTestJSON-244825089 tempest-InstanceActionsNegativeTestJSON-519695908] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" released by "_create_cached_image_impl" :: held 9.716s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:33:50.801 INFO nova.virt.xenapi.vm_utils [req-3cd3f9fa-4c34-4e5f-a2c3-0159d8fae4b5 tempest-InstanceActionsNegativeTestJSON-244825089 tempest-InstanceActionsNegativeTestJSON-519695908] Image creation data, cacheable: True, downloaded: False duration: 9.74 secs for image a69d8d55-1745-492f-be26-cc75d64fc94d 2015-08-07 17:33:50.802 DEBUG oslo_concurrency.lockutils [req-763716c0-67ce-4891-9fb5-fa9ff10dda34 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" acquired by "_create_cached_image_impl" :: waited 1.569s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:33:50.824 13318 DEBUG oslo_concurrency.lockutils [-] Releasing semaphore "refresh_cache-5fe04666-4b1b-41cf-bca7-ce6c2b277477" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:210 2015-08-07 17:33:50.824 13318 DEBUG nova.compute.manager [-] [instance: 5fe04666-4b1b-41cf-bca7-ce6c2b277477] Instance network_info: |[VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.4'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:a1:15:3f', 'active': False, 'type': u'bridge', 'id': u'ef7c10f9-87a8-46b3-acde-c85cf230ef09', 'qbg_params': None})]| _allocate_network_async /opt/stack/new/nova/nova/compute/manager.py:1531 2015-08-07 17:33:51.195 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:33:51.216 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 42.30 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:33:51.904 DEBUG nova.virt.xenapi.vm_utils [req-763716c0-67ce-4891-9fb5-fa9ff10dda34 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Cloned VDI OpaqueRef:99a6bb08-925f-a069-4274-87baa227e5e7 from VDI OpaqueRef:f392c64e-7886-a0cc-d3a0-780674167307 _clone_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:571 2015-08-07 17:33:52.057 DEBUG nova.virt.xenapi.vmops 
[req-3cd3f9fa-4c34-4e5f-a2c3-0159d8fae4b5 tempest-InstanceActionsNegativeTestJSON-244825089 tempest-InstanceActionsNegativeTestJSON-519695908] [instance: 1ed8c9a4-5157-4b0b-af78-9a5d74576f17] Updating progress to 20 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:33:52.538 DEBUG nova.virt.xenapi.vmops [req-3cd3f9fa-4c34-4e5f-a2c3-0159d8fae4b5 tempest-InstanceActionsNegativeTestJSON-244825089 tempest-InstanceActionsNegativeTestJSON-519695908] [instance: 1ed8c9a4-5157-4b0b-af78-9a5d74576f17] Updating progress to 30 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:33:52.986 DEBUG oslo_concurrency.lockutils [req-763716c0-67ce-4891-9fb5-fa9ff10dda34 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" released by "_create_cached_image_impl" :: held 2.184s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:33:52.986 INFO nova.virt.xenapi.vm_utils [req-763716c0-67ce-4891-9fb5-fa9ff10dda34 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Image creation data, cacheable: True, downloaded: False duration: 3.77 secs for image a69d8d55-1745-492f-be26-cc75d64fc94d 2015-08-07 17:33:53.005 DEBUG nova.virt.xenapi.vmops [req-3cd3f9fa-4c34-4e5f-a2c3-0159d8fae4b5 tempest-InstanceActionsNegativeTestJSON-244825089 tempest-InstanceActionsNegativeTestJSON-519695908] [instance: 1ed8c9a4-5157-4b0b-af78-9a5d74576f17] Using PV kernel: True _create_vm_record /opt/stack/new/nova/nova/virt/xenapi/vmops.py:679 2015-08-07 17:33:53.017 DEBUG nova.virt.xenapi.vm_utils [req-3cd3f9fa-4c34-4e5f-a2c3-0159d8fae4b5 tempest-InstanceActionsNegativeTestJSON-244825089 tempest-InstanceActionsNegativeTestJSON-519695908] [instance: 1ed8c9a4-5157-4b0b-af78-9a5d74576f17] Created VM create_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:312 2015-08-07 17:33:53.018 DEBUG nova.virt.xenapi.vmops [req-3cd3f9fa-4c34-4e5f-a2c3-0159d8fae4b5 tempest-InstanceActionsNegativeTestJSON-244825089 tempest-InstanceActionsNegativeTestJSON-519695908] [instance: 1ed8c9a4-5157-4b0b-af78-9a5d74576f17] Updating progress to 40 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:33:53.299 DEBUG nova.virt.xenapi.vm_utils [req-3cd3f9fa-4c34-4e5f-a2c3-0159d8fae4b5 tempest-InstanceActionsNegativeTestJSON-244825089 tempest-InstanceActionsNegativeTestJSON-519695908] Creating disk-type VBD for VM OpaqueRef:3866d569-a88e-a8fa-e6df-a4a29321c190, VDI OpaqueRef:1b29178c-daa0-ad88-8bec-d2ddea479b64 ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:33:53.311 DEBUG nova.virt.xenapi.vm_utils [req-3cd3f9fa-4c34-4e5f-a2c3-0159d8fae4b5 tempest-InstanceActionsNegativeTestJSON-244825089 tempest-InstanceActionsNegativeTestJSON-519695908] Created VBD OpaqueRef:87eef249-b4cb-dcb1-db52-7ae03a97eb40 for VM OpaqueRef:3866d569-a88e-a8fa-e6df-a4a29321c190, VDI OpaqueRef:1b29178c-daa0-ad88-8bec-d2ddea479b64. 
create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:33:53.808 DEBUG nova.virt.xenapi.vmops [req-763716c0-67ce-4891-9fb5-fa9ff10dda34 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 5fe04666-4b1b-41cf-bca7-ce6c2b277477] Updating progress to 20 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:33:53.838 DEBUG nova.virt.xenapi.vm_utils [req-3cd3f9fa-4c34-4e5f-a2c3-0159d8fae4b5 tempest-InstanceActionsNegativeTestJSON-244825089 tempest-InstanceActionsNegativeTestJSON-519695908] Created VDI OpaqueRef:5c82815a-20eb-88c4-c8b5-2b5963421930 (config-2, 67108864, False) on OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99. create_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:509 2015-08-07 17:33:53.841 DEBUG nova.virt.xenapi.vm_utils [req-3cd3f9fa-4c34-4e5f-a2c3-0159d8fae4b5 tempest-InstanceActionsNegativeTestJSON-244825089 tempest-InstanceActionsNegativeTestJSON-519695908] Creating disk-type VBD for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:5c82815a-20eb-88c4-c8b5-2b5963421930 ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:33:53.852 DEBUG nova.virt.xenapi.vm_utils [req-3cd3f9fa-4c34-4e5f-a2c3-0159d8fae4b5 tempest-InstanceActionsNegativeTestJSON-244825089 tempest-InstanceActionsNegativeTestJSON-519695908] Created VBD OpaqueRef:f33299f9-0cb3-281d-9a15-a0d79c5f0b3a for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:5c82815a-20eb-88c4-c8b5-2b5963421930. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:33:53.854 DEBUG nova.virt.xenapi.vm_utils [req-3cd3f9fa-4c34-4e5f-a2c3-0159d8fae4b5 tempest-InstanceActionsNegativeTestJSON-244825089 tempest-InstanceActionsNegativeTestJSON-519695908] Plugging VBD OpaqueRef:f33299f9-0cb3-281d-9a15-a0d79c5f0b3a ... 
vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2191 2015-08-07 17:33:53.855 DEBUG oslo_concurrency.lockutils [req-3cd3f9fa-4c34-4e5f-a2c3-0159d8fae4b5 tempest-InstanceActionsNegativeTestJSON-244825089 tempest-InstanceActionsNegativeTestJSON-519695908] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_plug" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:33:54.088 DEBUG nova.virt.xenapi.vmops [req-763716c0-67ce-4891-9fb5-fa9ff10dda34 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 5fe04666-4b1b-41cf-bca7-ce6c2b277477] Updating progress to 30 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:33:54.336 DEBUG nova.virt.xenapi.vmops [req-763716c0-67ce-4891-9fb5-fa9ff10dda34 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 5fe04666-4b1b-41cf-bca7-ce6c2b277477] Using PV kernel: True _create_vm_record /opt/stack/new/nova/nova/virt/xenapi/vmops.py:679 2015-08-07 17:33:54.349 DEBUG nova.virt.xenapi.vm_utils [req-763716c0-67ce-4891-9fb5-fa9ff10dda34 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 5fe04666-4b1b-41cf-bca7-ce6c2b277477] Created VM create_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:312 2015-08-07 17:33:54.352 DEBUG nova.virt.xenapi.vmops [req-763716c0-67ce-4891-9fb5-fa9ff10dda34 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 5fe04666-4b1b-41cf-bca7-ce6c2b277477] Updating progress to 40 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:33:54.902 DEBUG nova.virt.xenapi.vm_utils [req-763716c0-67ce-4891-9fb5-fa9ff10dda34 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Creating disk-type VBD for VM OpaqueRef:09a99ffd-2e2d-792d-bcda-24269fa8b94c, VDI OpaqueRef:99a6bb08-925f-a069-4274-87baa227e5e7 ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:33:54.917 DEBUG nova.virt.xenapi.vm_utils [req-763716c0-67ce-4891-9fb5-fa9ff10dda34 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Created VBD OpaqueRef:39ee003f-66dd-9965-e542-be17261b3bc9 for VM OpaqueRef:09a99ffd-2e2d-792d-bcda-24269fa8b94c, VDI OpaqueRef:99a6bb08-925f-a069-4274-87baa227e5e7. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:33:54.981 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.91 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:33:55.429 DEBUG nova.virt.xenapi.vm_utils [req-763716c0-67ce-4891-9fb5-fa9ff10dda34 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Created VDI OpaqueRef:5fd1bbf4-7dd9-c203-1c91-5a8f93f0405d (config-2, 67108864, False) on OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99. create_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:509 2015-08-07 17:33:55.434 DEBUG nova.virt.xenapi.vm_utils [req-763716c0-67ce-4891-9fb5-fa9ff10dda34 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Creating disk-type VBD for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:5fd1bbf4-7dd9-c203-1c91-5a8f93f0405d ... 
create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:33:55.449 DEBUG nova.virt.xenapi.vm_utils [req-763716c0-67ce-4891-9fb5-fa9ff10dda34 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Created VBD OpaqueRef:735e4e65-351b-0534-619e-3dca50a32d9b for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:5fd1bbf4-7dd9-c203-1c91-5a8f93f0405d. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:33:55.450 DEBUG nova.virt.xenapi.vm_utils [req-763716c0-67ce-4891-9fb5-fa9ff10dda34 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Plugging VBD OpaqueRef:735e4e65-351b-0534-619e-3dca50a32d9b ... vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2191 2015-08-07 17:33:55.460 DEBUG oslo_concurrency.lockutils [req-3cd3f9fa-4c34-4e5f-a2c3-0159d8fae4b5 tempest-InstanceActionsNegativeTestJSON-244825089 tempest-InstanceActionsNegativeTestJSON-519695908] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_plug" :: held 1.606s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:33:55.461 DEBUG nova.virt.xenapi.vm_utils [req-3cd3f9fa-4c34-4e5f-a2c3-0159d8fae4b5 tempest-InstanceActionsNegativeTestJSON-244825089 tempest-InstanceActionsNegativeTestJSON-519695908] Plugging VBD OpaqueRef:f33299f9-0cb3-281d-9a15-a0d79c5f0b3a done. vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2194 2015-08-07 17:33:55.462 DEBUG oslo_concurrency.lockutils [req-763716c0-67ce-4891-9fb5-fa9ff10dda34 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_plug" :: waited 0.012s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:33:55.466 DEBUG nova.virt.xenapi.vm_utils [req-3cd3f9fa-4c34-4e5f-a2c3-0159d8fae4b5 tempest-InstanceActionsNegativeTestJSON-244825089 tempest-InstanceActionsNegativeTestJSON-519695908] VBD OpaqueRef:f33299f9-0cb3-281d-9a15-a0d79c5f0b3a plugged as xvdc vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2197 2015-08-07 17:33:55.557 WARNING nova.virt.configdrive [req-3cd3f9fa-4c34-4e5f-a2c3-0159d8fae4b5 tempest-InstanceActionsNegativeTestJSON-244825089 tempest-InstanceActionsNegativeTestJSON-519695908] The setting "always" will be deprecated in the Liberty version. 
Please use "True" instead 2015-08-07 17:33:55.558 DEBUG nova.objects.instance [req-3cd3f9fa-4c34-4e5f-a2c3-0159d8fae4b5 tempest-InstanceActionsNegativeTestJSON-244825089 tempest-InstanceActionsNegativeTestJSON-519695908] Lazy-loading `ec2_ids' on Instance uuid 1ed8c9a4-5157-4b0b-af78-9a5d74576f17 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:33:55.611 DEBUG oslo_concurrency.processutils [req-3cd3f9fa-4c34-4e5f-a2c3-0159d8fae4b5 tempest-InstanceActionsNegativeTestJSON-244825089 tempest-InstanceActionsNegativeTestJSON-519695908] Running cmd (subprocess): genisoimage -o /tmp/tmp9ahQ5Z/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmp4UmAdJ execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:33:55.766 DEBUG oslo_concurrency.processutils [req-3cd3f9fa-4c34-4e5f-a2c3-0159d8fae4b5 tempest-InstanceActionsNegativeTestJSON-244825089 tempest-InstanceActionsNegativeTestJSON-519695908] CMD "genisoimage -o /tmp/tmp9ahQ5Z/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmp4UmAdJ" returned: 0 in 0.154s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:33:55.772 DEBUG oslo_concurrency.processutils [req-3cd3f9fa-4c34-4e5f-a2c3-0159d8fae4b5 tempest-InstanceActionsNegativeTestJSON-244825089 tempest-InstanceActionsNegativeTestJSON-519695908] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmp9ahQ5Z/configdrive of=/dev/xvdc oflag=direct,sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:33:57.322 DEBUG oslo_concurrency.lockutils [req-763716c0-67ce-4891-9fb5-fa9ff10dda34 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_plug" :: held 1.859s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:33:57.324 DEBUG nova.virt.xenapi.vm_utils [req-763716c0-67ce-4891-9fb5-fa9ff10dda34 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Plugging VBD OpaqueRef:735e4e65-351b-0534-619e-3dca50a32d9b done. vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2194 2015-08-07 17:33:57.329 DEBUG nova.virt.xenapi.vm_utils [req-763716c0-67ce-4891-9fb5-fa9ff10dda34 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] VBD OpaqueRef:735e4e65-351b-0534-619e-3dca50a32d9b plugged as xvdd vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2197 2015-08-07 17:33:57.456 WARNING nova.virt.configdrive [req-763716c0-67ce-4891-9fb5-fa9ff10dda34 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] The setting "always" will be deprecated in the Liberty version. 
Please use "True" instead 2015-08-07 17:33:57.457 DEBUG nova.objects.instance [req-763716c0-67ce-4891-9fb5-fa9ff10dda34 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lazy-loading `ec2_ids' on Instance uuid 5fe04666-4b1b-41cf-bca7-ce6c2b277477 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:33:57.464 DEBUG oslo_concurrency.lockutils [req-f7491b13-bf0a-4a51-8fba-6a2efbb5b04c tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] Lock "f5ce5302-4adb-4ac6-be2f-b4371f1d3b62" acquired by "_locked_do_build_and_run_instance" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:33:57.501 DEBUG oslo_concurrency.processutils [req-763716c0-67ce-4891-9fb5-fa9ff10dda34 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Running cmd (subprocess): genisoimage -o /tmp/tmp8ofQLC/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmpPVLrAO execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:33:57.647 INFO nova.compute.manager [req-f7491b13-bf0a-4a51-8fba-6a2efbb5b04c tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] [instance: f5ce5302-4adb-4ac6-be2f-b4371f1d3b62] Starting instance... 2015-08-07 17:33:57.664 DEBUG oslo_concurrency.processutils [req-763716c0-67ce-4891-9fb5-fa9ff10dda34 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] CMD "genisoimage -o /tmp/tmp8ofQLC/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmpPVLrAO" returned: 0 in 0.163s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:33:57.668 DEBUG oslo_concurrency.processutils [req-763716c0-67ce-4891-9fb5-fa9ff10dda34 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmp8ofQLC/configdrive of=/dev/xvdd oflag=direct,sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:33:58.094 DEBUG oslo_concurrency.lockutils [req-f7491b13-bf0a-4a51-8fba-6a2efbb5b04c tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] Lock "compute_resources" acquired by "instance_claim" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:33:58.095 DEBUG nova.compute.resource_tracker [req-f7491b13-bf0a-4a51-8fba-6a2efbb5b04c tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] Memory overhead for 64 MB instance; 5 MB instance_claim /opt/stack/new/nova/nova/compute/resource_tracker.py:126 2015-08-07 17:33:58.107 INFO nova.compute.claims [req-f7491b13-bf0a-4a51-8fba-6a2efbb5b04c tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] [instance: f5ce5302-4adb-4ac6-be2f-b4371f1d3b62] Attempting claim: memory 69 MB, disk 0 GB 2015-08-07 17:33:58.108 INFO nova.compute.claims [req-f7491b13-bf0a-4a51-8fba-6a2efbb5b04c tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] [instance: f5ce5302-4adb-4ac6-be2f-b4371f1d3b62] Total memory: 8187 MB, used: 650.00 MB 2015-08-07 17:33:58.109 INFO nova.compute.claims [req-f7491b13-bf0a-4a51-8fba-6a2efbb5b04c 
tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] [instance: f5ce5302-4adb-4ac6-be2f-b4371f1d3b62] memory limit: 12280.50 MB, free: 11630.50 MB 2015-08-07 17:33:58.109 INFO nova.compute.claims [req-f7491b13-bf0a-4a51-8fba-6a2efbb5b04c tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] [instance: f5ce5302-4adb-4ac6-be2f-b4371f1d3b62] Total disk: 27 GB, used: 0.00 GB 2015-08-07 17:33:58.111 INFO nova.compute.claims [req-f7491b13-bf0a-4a51-8fba-6a2efbb5b04c tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] [instance: f5ce5302-4adb-4ac6-be2f-b4371f1d3b62] disk limit not specified, defaulting to unlimited 2015-08-07 17:33:58.138 DEBUG nova.compute.resources.vcpu [req-f7491b13-bf0a-4a51-8fba-6a2efbb5b04c tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] Total CPUs: 8 VCPUs, used: 2.00 VCPUs test /opt/stack/new/nova/nova/compute/resources/vcpu.py:52 2015-08-07 17:33:58.139 DEBUG nova.compute.resources.vcpu [req-f7491b13-bf0a-4a51-8fba-6a2efbb5b04c tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] CPUs limit not specified, defaulting to unlimited test /opt/stack/new/nova/nova/compute/resources/vcpu.py:56 2015-08-07 17:33:58.139 INFO nova.compute.claims [req-f7491b13-bf0a-4a51-8fba-6a2efbb5b04c tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] [instance: f5ce5302-4adb-4ac6-be2f-b4371f1d3b62] Claim successful 2015-08-07 17:33:58.778 DEBUG oslo_concurrency.lockutils [req-f7491b13-bf0a-4a51-8fba-6a2efbb5b04c tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] Lock "compute_resources" released by "instance_claim" :: held 0.685s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:33:59.340 DEBUG oslo_concurrency.lockutils [req-f7491b13-bf0a-4a51-8fba-6a2efbb5b04c tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] Lock "compute_resources" acquired by "update_usage" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:33:59.548 DEBUG oslo_concurrency.lockutils [req-f7491b13-bf0a-4a51-8fba-6a2efbb5b04c tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] Lock "compute_resources" released by "update_usage" :: held 0.207s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:33:59.549 DEBUG nova.compute.utils [req-f7491b13-bf0a-4a51-8fba-6a2efbb5b04c tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] Using /dev/xvd instead of None get_next_device_name /opt/stack/new/nova/nova/compute/utils.py:163 2015-08-07 17:33:59.556 13318 DEBUG nova.compute.manager [-] [instance: f5ce5302-4adb-4ac6-be2f-b4371f1d3b62] Allocating IP information in the background. 
_allocate_network_async /opt/stack/new/nova/nova/compute/manager.py:1513 2015-08-07 17:33:59.557 13318 DEBUG oslo_concurrency.lockutils [-] Acquired semaphore "refresh_cache-f5ce5302-4adb-4ac6-be2f-b4371f1d3b62" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:197 2015-08-07 17:34:01.579 DEBUG nova.virt.xenapi.vmops [req-f7491b13-bf0a-4a51-8fba-6a2efbb5b04c tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] [instance: f5ce5302-4adb-4ac6-be2f-b4371f1d3b62] Block device information present: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} spawn /opt/stack/new/nova/nova/virt/xenapi/vmops.py:408 2015-08-07 17:34:01.595 DEBUG nova.virt.xenapi.vm_utils [req-f7491b13-bf0a-4a51-8fba-6a2efbb5b04c tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] Detected vhd format for image a69d8d55-1745-492f-be26-cc75d64fc94d determine_disk_image_type /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1633 2015-08-07 17:34:01.596 DEBUG nova.virt.xenapi.vmops [req-f7491b13-bf0a-4a51-8fba-6a2efbb5b04c tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] [instance: f5ce5302-4adb-4ac6-be2f-b4371f1d3b62] Updating progress to 10 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:34:03.089 DEBUG nova.virt.xenapi.vmops [req-f7491b13-bf0a-4a51-8fba-6a2efbb5b04c tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] [instance: f5ce5302-4adb-4ac6-be2f-b4371f1d3b62] block device info: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} _connect_cinder_volumes /opt/stack/new/nova/nova/virt/xenapi/vmops.py:376 2015-08-07 17:34:03.103 DEBUG oslo_concurrency.lockutils [req-f7491b13-bf0a-4a51-8fba-6a2efbb5b04c tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" acquired by "_create_cached_image_impl" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:34:04.879 13318 DEBUG nova.network.base_api [-] [instance: f5ce5302-4adb-4ac6-be2f-b4371f1d3b62] Updating instance_info_cache with network_info: [VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.5'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:6a:1d:d3', 'active': False, 'type': u'bridge', 'id': u'7a7699f1-6340-4bf7-8a9a-8756937921d4', 'qbg_params': None})] update_instance_cache_with_nw_info /opt/stack/new/nova/nova/network/base_api.py:43 2015-08-07 
17:34:04.916 13318 DEBUG oslo_concurrency.lockutils [-] Releasing semaphore "refresh_cache-f5ce5302-4adb-4ac6-be2f-b4371f1d3b62" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:210 2015-08-07 17:34:04.916 13318 DEBUG nova.compute.manager [-] [instance: f5ce5302-4adb-4ac6-be2f-b4371f1d3b62] Instance network_info: |[VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.5'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:6a:1d:d3', 'active': False, 'type': u'bridge', 'id': u'7a7699f1-6340-4bf7-8a9a-8756937921d4', 'qbg_params': None})]| _allocate_network_async /opt/stack/new/nova/nova/compute/manager.py:1531 2015-08-07 17:34:05.610 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.29 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:34:07.658 DEBUG nova.virt.xenapi.vm_utils [req-f7491b13-bf0a-4a51-8fba-6a2efbb5b04c tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] Cloned VDI OpaqueRef:7d1a614f-4194-75df-d8ea-d847c1136342 from VDI OpaqueRef:f392c64e-7886-a0cc-d3a0-780674167307 _clone_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:571 2015-08-07 17:34:08.660 DEBUG oslo_concurrency.lockutils [req-f7491b13-bf0a-4a51-8fba-6a2efbb5b04c tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" released by "_create_cached_image_impl" :: held 5.557s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:34:08.661 INFO nova.virt.xenapi.vm_utils [req-f7491b13-bf0a-4a51-8fba-6a2efbb5b04c tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] Image creation data, cacheable: True, downloaded: False duration: 5.57 secs for image a69d8d55-1745-492f-be26-cc75d64fc94d 2015-08-07 17:34:09.887 DEBUG nova.virt.xenapi.vmops [req-f7491b13-bf0a-4a51-8fba-6a2efbb5b04c tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] [instance: f5ce5302-4adb-4ac6-be2f-b4371f1d3b62] Updating progress to 20 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:34:10.038 DEBUG oslo_concurrency.processutils [req-3cd3f9fa-4c34-4e5f-a2c3-0159d8fae4b5 tempest-InstanceActionsNegativeTestJSON-244825089 tempest-InstanceActionsNegativeTestJSON-519695908] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmp9ahQ5Z/configdrive of=/dev/xvdc oflag=direct,sync" returned: 0 in 14.267s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 
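The two "Running cmd (subprocess)" entries above trace the config-drive path for the build under req-3cd3f9fa: genisoimage packs the metadata staging directory into an ISO9660 volume labelled config-2, and the resulting image is copied onto the VBD plugged as /dev/xvdc through nova-rootwrap (the dd that just returned after 14.267s). A rough equivalent using oslo.concurrency's processutils, with placeholder paths standing in for the tmpfile names in the log and an illustrative helper name:

    from oslo_concurrency import processutils

    def write_config_drive(staging_dir, iso_path, device):
        # Pack the metadata staging directory into an ISO9660 volume;
        # the label must be 'config-2' so the guest can locate it.
        processutils.execute('genisoimage', '-o', iso_path,
                             '-ldots', '-allow-lowercase', '-allow-multidot',
                             '-l', '-quiet', '-J', '-r', '-V', 'config-2',
                             staging_dir)
        # Copy the image onto the plugged block device; dd needs root,
        # hence the nova-rootwrap root helper, as in the log.
        processutils.execute('dd', 'if=%s' % iso_path, 'of=%s' % device,
                             'oflag=direct,sync',
                             run_as_root=True,
                             root_helper='sudo nova-rootwrap /etc/nova/rootwrap.conf')

    # e.g. write_config_drive('/tmp/configdrive-data', '/tmp/configdrive.iso', '/dev/xvdc')

The long dd runtimes seen here (13-14 seconds for a 64 MB image) come from oflag=direct,sync forcing synchronous, unbuffered writes to the virtual disk.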
2015-08-07 17:34:10.041 DEBUG oslo_concurrency.processutils [req-3cd3f9fa-4c34-4e5f-a2c3-0159d8fae4b5 tempest-InstanceActionsNegativeTestJSON-244825089 tempest-InstanceActionsNegativeTestJSON-519695908] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:34:10.199 DEBUG nova.virt.xenapi.vmops [req-f7491b13-bf0a-4a51-8fba-6a2efbb5b04c tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] [instance: f5ce5302-4adb-4ac6-be2f-b4371f1d3b62] Updating progress to 30 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:34:10.817 DEBUG nova.virt.xenapi.vmops [req-f7491b13-bf0a-4a51-8fba-6a2efbb5b04c tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] [instance: f5ce5302-4adb-4ac6-be2f-b4371f1d3b62] Using PV kernel: True _create_vm_record /opt/stack/new/nova/nova/virt/xenapi/vmops.py:679 2015-08-07 17:34:10.841 DEBUG nova.virt.xenapi.vm_utils [req-f7491b13-bf0a-4a51-8fba-6a2efbb5b04c tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] [instance: f5ce5302-4adb-4ac6-be2f-b4371f1d3b62] Created VM create_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:312 2015-08-07 17:34:10.842 DEBUG nova.virt.xenapi.vmops [req-f7491b13-bf0a-4a51-8fba-6a2efbb5b04c tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] [instance: f5ce5302-4adb-4ac6-be2f-b4371f1d3b62] Updating progress to 40 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:34:10.885 DEBUG oslo_concurrency.processutils [req-3cd3f9fa-4c34-4e5f-a2c3-0159d8fae4b5 tempest-InstanceActionsNegativeTestJSON-244825089 tempest-InstanceActionsNegativeTestJSON-519695908] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf sync" returned: 0 in 0.844s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:34:10.897 DEBUG nova.virt.xenapi.vm_utils [req-3cd3f9fa-4c34-4e5f-a2c3-0159d8fae4b5 tempest-InstanceActionsNegativeTestJSON-244825089 tempest-InstanceActionsNegativeTestJSON-519695908] Destroying VBD for VDI OpaqueRef:5c82815a-20eb-88c4-c8b5-2b5963421930 ... vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2207 2015-08-07 17:34:10.898 DEBUG oslo_concurrency.lockutils [req-3cd3f9fa-4c34-4e5f-a2c3-0159d8fae4b5 tempest-InstanceActionsNegativeTestJSON-244825089 tempest-InstanceActionsNegativeTestJSON-519695908] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_unplug" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:34:11.104 DEBUG nova.virt.xenapi.vm_utils [req-f7491b13-bf0a-4a51-8fba-6a2efbb5b04c tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] Creating disk-type VBD for VM OpaqueRef:6e2a9ffd-e69a-42b0-0351-f28f88d1d006, VDI OpaqueRef:7d1a614f-4194-75df-d8ea-d847c1136342 ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:34:11.119 DEBUG nova.virt.xenapi.vm_utils [req-f7491b13-bf0a-4a51-8fba-6a2efbb5b04c tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] Created VBD OpaqueRef:5b57e5f1-5e50-fa10-9984-cf186b0ebae9 for VM OpaqueRef:6e2a9ffd-e69a-42b0-0351-f28f88d1d006, VDI OpaqueRef:7d1a614f-4194-75df-d8ea-d847c1136342. 
create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:34:11.312 DEBUG oslo_concurrency.processutils [req-763716c0-67ce-4891-9fb5-fa9ff10dda34 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmp8ofQLC/configdrive of=/dev/xvdd oflag=direct,sync" returned: 0 in 13.644s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:34:11.314 DEBUG oslo_concurrency.processutils [req-763716c0-67ce-4891-9fb5-fa9ff10dda34 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:34:11.879 DEBUG nova.virt.xenapi.vm_utils [req-f7491b13-bf0a-4a51-8fba-6a2efbb5b04c tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] Created VDI OpaqueRef:78892124-0150-32ef-98b5-cd27721d9951 (config-2, 67108864, False) on OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99. create_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:509 2015-08-07 17:34:11.887 DEBUG nova.virt.xenapi.vm_utils [req-f7491b13-bf0a-4a51-8fba-6a2efbb5b04c tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] Creating disk-type VBD for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:78892124-0150-32ef-98b5-cd27721d9951 ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:34:11.915 DEBUG nova.virt.xenapi.vm_utils [req-f7491b13-bf0a-4a51-8fba-6a2efbb5b04c tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] Created VBD OpaqueRef:ec2b6c18-c244-6f5b-f763-5bc049626486 for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:78892124-0150-32ef-98b5-cd27721d9951. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:34:11.916 DEBUG nova.virt.xenapi.vm_utils [req-f7491b13-bf0a-4a51-8fba-6a2efbb5b04c tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] Plugging VBD OpaqueRef:ec2b6c18-c244-6f5b-f763-5bc049626486 ... vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2191 2015-08-07 17:34:11.991 DEBUG oslo_concurrency.processutils [req-763716c0-67ce-4891-9fb5-fa9ff10dda34 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf sync" returned: 0 in 0.677s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:34:11.992 DEBUG nova.virt.xenapi.vm_utils [req-763716c0-67ce-4891-9fb5-fa9ff10dda34 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Destroying VBD for VDI OpaqueRef:5fd1bbf4-7dd9-c203-1c91-5a8f93f0405d ... 
vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2207 2015-08-07 17:34:12.221 DEBUG oslo_concurrency.lockutils [req-3cd3f9fa-4c34-4e5f-a2c3-0159d8fae4b5 tempest-InstanceActionsNegativeTestJSON-244825089 tempest-InstanceActionsNegativeTestJSON-519695908] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_unplug" :: held 1.322s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:34:12.222 DEBUG oslo_concurrency.lockutils [req-f7491b13-bf0a-4a51-8fba-6a2efbb5b04c tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_plug" :: waited 0.305s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:34:12.246 DEBUG nova.virt.xenapi.vm_utils [req-3cd3f9fa-4c34-4e5f-a2c3-0159d8fae4b5 tempest-InstanceActionsNegativeTestJSON-244825089 tempest-InstanceActionsNegativeTestJSON-519695908] Destroying VBD for VDI OpaqueRef:5c82815a-20eb-88c4-c8b5-2b5963421930 done. vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2215 2015-08-07 17:34:12.246 DEBUG nova.virt.xenapi.vm_utils [req-3cd3f9fa-4c34-4e5f-a2c3-0159d8fae4b5 tempest-InstanceActionsNegativeTestJSON-244825089 tempest-InstanceActionsNegativeTestJSON-519695908] Creating disk-type VBD for VM OpaqueRef:3866d569-a88e-a8fa-e6df-a4a29321c190, VDI OpaqueRef:5c82815a-20eb-88c4-c8b5-2b5963421930 ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:34:12.260 DEBUG nova.virt.xenapi.vm_utils [req-3cd3f9fa-4c34-4e5f-a2c3-0159d8fae4b5 tempest-InstanceActionsNegativeTestJSON-244825089 tempest-InstanceActionsNegativeTestJSON-519695908] Created VBD OpaqueRef:fb7af99c-041c-20ff-a83e-7ef47bd66c37 for VM OpaqueRef:3866d569-a88e-a8fa-e6df-a4a29321c190, VDI OpaqueRef:5c82815a-20eb-88c4-c8b5-2b5963421930. 
create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:34:12.261 DEBUG nova.objects.instance [req-3cd3f9fa-4c34-4e5f-a2c3-0159d8fae4b5 tempest-InstanceActionsNegativeTestJSON-244825089 tempest-InstanceActionsNegativeTestJSON-519695908] Lazy-loading `pci_devices' on Instance uuid 1ed8c9a4-5157-4b0b-af78-9a5d74576f17 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:34:12.381 DEBUG nova.virt.xenapi.vmops [req-3cd3f9fa-4c34-4e5f-a2c3-0159d8fae4b5 tempest-InstanceActionsNegativeTestJSON-244825089 tempest-InstanceActionsNegativeTestJSON-519695908] [instance: 1ed8c9a4-5157-4b0b-af78-9a5d74576f17] Updating progress to 50 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:34:12.656 DEBUG oslo_concurrency.lockutils [req-3cd3f9fa-4c34-4e5f-a2c3-0159d8fae4b5 tempest-InstanceActionsNegativeTestJSON-244825089 tempest-InstanceActionsNegativeTestJSON-519695908] Lock "xenstore-1ed8c9a4-5157-4b0b-af78-9a5d74576f17" acquired by "store_meta" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:34:12.657 DEBUG oslo_concurrency.lockutils [req-3cd3f9fa-4c34-4e5f-a2c3-0159d8fae4b5 tempest-InstanceActionsNegativeTestJSON-244825089 tempest-InstanceActionsNegativeTestJSON-519695908] Lock "xenstore-1ed8c9a4-5157-4b0b-af78-9a5d74576f17" released by "store_meta" :: held 0.001s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:34:12.657 DEBUG oslo_concurrency.lockutils [req-3cd3f9fa-4c34-4e5f-a2c3-0159d8fae4b5 tempest-InstanceActionsNegativeTestJSON-244825089 tempest-InstanceActionsNegativeTestJSON-519695908] Lock "xenstore-1ed8c9a4-5157-4b0b-af78-9a5d74576f17" acquired by "store_auto_disk_config" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:34:12.682 DEBUG oslo_concurrency.lockutils [req-3cd3f9fa-4c34-4e5f-a2c3-0159d8fae4b5 tempest-InstanceActionsNegativeTestJSON-244825089 tempest-InstanceActionsNegativeTestJSON-519695908] Lock "xenstore-1ed8c9a4-5157-4b0b-af78-9a5d74576f17" released by "store_auto_disk_config" :: held 0.024s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:34:12.683 DEBUG nova.virt.xenapi.vmops [req-3cd3f9fa-4c34-4e5f-a2c3-0159d8fae4b5 tempest-InstanceActionsNegativeTestJSON-244825089 tempest-InstanceActionsNegativeTestJSON-519695908] [instance: 1ed8c9a4-5157-4b0b-af78-9a5d74576f17] Injecting hostname (tempest.common.compute-instance-788505683) into xenstore _inject_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1940 2015-08-07 17:34:12.683 DEBUG oslo_concurrency.lockutils [req-3cd3f9fa-4c34-4e5f-a2c3-0159d8fae4b5 tempest-InstanceActionsNegativeTestJSON-244825089 tempest-InstanceActionsNegativeTestJSON-519695908] Lock "xenstore-1ed8c9a4-5157-4b0b-af78-9a5d74576f17" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:34:12.733 DEBUG oslo_concurrency.lockutils [req-3cd3f9fa-4c34-4e5f-a2c3-0159d8fae4b5 tempest-InstanceActionsNegativeTestJSON-244825089 tempest-InstanceActionsNegativeTestJSON-519695908] Lock "xenstore-1ed8c9a4-5157-4b0b-af78-9a5d74576f17" released by "update_hostname" :: held 0.049s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:34:12.734 DEBUG nova.virt.xenapi.vmops [req-3cd3f9fa-4c34-4e5f-a2c3-0159d8fae4b5 tempest-InstanceActionsNegativeTestJSON-244825089 
tempest-InstanceActionsNegativeTestJSON-519695908] [instance: 1ed8c9a4-5157-4b0b-af78-9a5d74576f17] Injecting network info to xenstore inject_network_info /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1869 2015-08-07 17:34:12.742 DEBUG oslo_concurrency.lockutils [req-3cd3f9fa-4c34-4e5f-a2c3-0159d8fae4b5 tempest-InstanceActionsNegativeTestJSON-244825089 tempest-InstanceActionsNegativeTestJSON-519695908] Lock "xenstore-1ed8c9a4-5157-4b0b-af78-9a5d74576f17" acquired by "update_nwinfo" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:34:13.062 DEBUG oslo_concurrency.lockutils [req-3cd3f9fa-4c34-4e5f-a2c3-0159d8fae4b5 tempest-InstanceActionsNegativeTestJSON-244825089 tempest-InstanceActionsNegativeTestJSON-519695908] Lock "xenstore-1ed8c9a4-5157-4b0b-af78-9a5d74576f17" released by "update_nwinfo" :: held 0.320s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:34:13.064 DEBUG nova.virt.xenapi.vmops [req-3cd3f9fa-4c34-4e5f-a2c3-0159d8fae4b5 tempest-InstanceActionsNegativeTestJSON-244825089 tempest-InstanceActionsNegativeTestJSON-519695908] [instance: 1ed8c9a4-5157-4b0b-af78-9a5d74576f17] Updating progress to 60 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:34:13.359 DEBUG nova.virt.xenapi.vmops [req-3cd3f9fa-4c34-4e5f-a2c3-0159d8fae4b5 tempest-InstanceActionsNegativeTestJSON-244825089 tempest-InstanceActionsNegativeTestJSON-519695908] [instance: 1ed8c9a4-5157-4b0b-af78-9a5d74576f17] Creating vifs _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1892 2015-08-07 17:34:13.368 DEBUG nova.virt.xenapi.vmops [req-3cd3f9fa-4c34-4e5f-a2c3-0159d8fae4b5 tempest-InstanceActionsNegativeTestJSON-244825089 tempest-InstanceActionsNegativeTestJSON-519695908] [instance: 1ed8c9a4-5157-4b0b-af78-9a5d74576f17] Creating VIF for network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1902 2015-08-07 17:34:13.377 DEBUG nova.virt.xenapi.vmops [req-3cd3f9fa-4c34-4e5f-a2c3-0159d8fae4b5 tempest-InstanceActionsNegativeTestJSON-244825089 tempest-InstanceActionsNegativeTestJSON-519695908] [instance: 1ed8c9a4-5157-4b0b-af78-9a5d74576f17] Created VIF OpaqueRef:1f9a32ee-e5bf-6b9c-ef05-a3829b65199c, network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1906 2015-08-07 17:34:13.378 DEBUG nova.virt.xenapi.vmops [req-3cd3f9fa-4c34-4e5f-a2c3-0159d8fae4b5 tempest-InstanceActionsNegativeTestJSON-244825089 tempest-InstanceActionsNegativeTestJSON-519695908] [instance: 1ed8c9a4-5157-4b0b-af78-9a5d74576f17] Updating progress to 70 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:34:13.625 DEBUG nova.virt.xenapi.vmops [req-3cd3f9fa-4c34-4e5f-a2c3-0159d8fae4b5 tempest-InstanceActionsNegativeTestJSON-244825089 tempest-InstanceActionsNegativeTestJSON-519695908] [instance: 1ed8c9a4-5157-4b0b-af78-9a5d74576f17] Starting instance _start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:327 2015-08-07 17:34:14.046 DEBUG oslo_concurrency.lockutils [req-f7491b13-bf0a-4a51-8fba-6a2efbb5b04c tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_plug" :: held 1.824s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:34:14.047 DEBUG nova.virt.xenapi.vm_utils 
[req-f7491b13-bf0a-4a51-8fba-6a2efbb5b04c tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] Plugging VBD OpaqueRef:ec2b6c18-c244-6f5b-f763-5bc049626486 done. vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2194 2015-08-07 17:34:14.048 DEBUG oslo_concurrency.lockutils [req-763716c0-67ce-4891-9fb5-fa9ff10dda34 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_unplug" :: waited 2.055s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:34:14.051 DEBUG nova.virt.xenapi.vm_utils [req-f7491b13-bf0a-4a51-8fba-6a2efbb5b04c tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] VBD OpaqueRef:ec2b6c18-c244-6f5b-f763-5bc049626486 plugged as xvde vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2197 2015-08-07 17:34:14.147 WARNING nova.virt.configdrive [req-f7491b13-bf0a-4a51-8fba-6a2efbb5b04c tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] The setting "always" will be deprecated in the Liberty version. Please use "True" instead 2015-08-07 17:34:14.148 DEBUG nova.objects.instance [req-f7491b13-bf0a-4a51-8fba-6a2efbb5b04c tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] Lazy-loading `ec2_ids' on Instance uuid f5ce5302-4adb-4ac6-be2f-b4371f1d3b62 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:34:14.198 DEBUG oslo_concurrency.processutils [req-f7491b13-bf0a-4a51-8fba-6a2efbb5b04c tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] Running cmd (subprocess): genisoimage -o /tmp/tmpMEFxKE/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmpYmXtVy execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:34:14.461 DEBUG oslo_concurrency.processutils [req-f7491b13-bf0a-4a51-8fba-6a2efbb5b04c tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] CMD "genisoimage -o /tmp/tmpMEFxKE/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmpYmXtVy" returned: 0 in 0.262s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:34:14.466 DEBUG oslo_concurrency.processutils [req-f7491b13-bf0a-4a51-8fba-6a2efbb5b04c tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmpMEFxKE/configdrive of=/dev/xvde oflag=direct,sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:34:15.071 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.83 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:34:15.627 DEBUG oslo_concurrency.lockutils [req-763716c0-67ce-4891-9fb5-fa9ff10dda34 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_unplug" :: held 1.579s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:34:15.638 DEBUG nova.virt.xenapi.vm_utils 
[req-763716c0-67ce-4891-9fb5-fa9ff10dda34 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Destroying VBD for VDI OpaqueRef:5fd1bbf4-7dd9-c203-1c91-5a8f93f0405d done. vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2215 2015-08-07 17:34:15.639 DEBUG nova.virt.xenapi.vm_utils [req-763716c0-67ce-4891-9fb5-fa9ff10dda34 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Creating disk-type VBD for VM OpaqueRef:09a99ffd-2e2d-792d-bcda-24269fa8b94c, VDI OpaqueRef:5fd1bbf4-7dd9-c203-1c91-5a8f93f0405d ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:34:15.652 DEBUG nova.virt.xenapi.vm_utils [req-763716c0-67ce-4891-9fb5-fa9ff10dda34 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Created VBD OpaqueRef:1465009f-a1cc-c4a9-85f1-16206fa459a8 for VM OpaqueRef:09a99ffd-2e2d-792d-bcda-24269fa8b94c, VDI OpaqueRef:5fd1bbf4-7dd9-c203-1c91-5a8f93f0405d. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:34:15.654 DEBUG nova.objects.instance [req-763716c0-67ce-4891-9fb5-fa9ff10dda34 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lazy-loading `pci_devices' on Instance uuid 5fe04666-4b1b-41cf-bca7-ce6c2b277477 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:34:15.777 DEBUG nova.virt.xenapi.vmops [req-763716c0-67ce-4891-9fb5-fa9ff10dda34 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 5fe04666-4b1b-41cf-bca7-ce6c2b277477] Updating progress to 50 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:34:16.799 DEBUG oslo_concurrency.lockutils [req-763716c0-67ce-4891-9fb5-fa9ff10dda34 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "xenstore-5fe04666-4b1b-41cf-bca7-ce6c2b277477" acquired by "store_meta" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:34:16.800 DEBUG oslo_concurrency.lockutils [req-763716c0-67ce-4891-9fb5-fa9ff10dda34 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "xenstore-5fe04666-4b1b-41cf-bca7-ce6c2b277477" released by "store_meta" :: held 0.001s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:34:16.801 DEBUG oslo_concurrency.lockutils [req-763716c0-67ce-4891-9fb5-fa9ff10dda34 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "xenstore-5fe04666-4b1b-41cf-bca7-ce6c2b277477" acquired by "store_auto_disk_config" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:34:16.814 DEBUG oslo_concurrency.lockutils [req-763716c0-67ce-4891-9fb5-fa9ff10dda34 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "xenstore-5fe04666-4b1b-41cf-bca7-ce6c2b277477" released by "store_auto_disk_config" :: held 0.013s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:34:16.823 DEBUG nova.virt.xenapi.vmops [req-763716c0-67ce-4891-9fb5-fa9ff10dda34 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 5fe04666-4b1b-41cf-bca7-ce6c2b277477] Injecting hostname (tempest.common.compute-instance-81441388) into xenstore _inject_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1940 2015-08-07 17:34:16.824 DEBUG 
oslo_concurrency.lockutils [req-763716c0-67ce-4891-9fb5-fa9ff10dda34 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "xenstore-5fe04666-4b1b-41cf-bca7-ce6c2b277477" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:34:16.837 DEBUG oslo_concurrency.lockutils [req-763716c0-67ce-4891-9fb5-fa9ff10dda34 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "xenstore-5fe04666-4b1b-41cf-bca7-ce6c2b277477" released by "update_hostname" :: held 0.014s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:34:16.839 DEBUG nova.virt.xenapi.vmops [req-763716c0-67ce-4891-9fb5-fa9ff10dda34 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 5fe04666-4b1b-41cf-bca7-ce6c2b277477] Injecting network info to xenstore inject_network_info /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1869 2015-08-07 17:34:16.840 DEBUG oslo_concurrency.lockutils [req-763716c0-67ce-4891-9fb5-fa9ff10dda34 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "xenstore-5fe04666-4b1b-41cf-bca7-ce6c2b277477" acquired by "update_nwinfo" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:34:17.071 DEBUG oslo_concurrency.lockutils [req-763716c0-67ce-4891-9fb5-fa9ff10dda34 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "xenstore-5fe04666-4b1b-41cf-bca7-ce6c2b277477" released by "update_nwinfo" :: held 0.231s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:34:17.073 DEBUG nova.virt.xenapi.vmops [req-763716c0-67ce-4891-9fb5-fa9ff10dda34 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 5fe04666-4b1b-41cf-bca7-ce6c2b277477] Updating progress to 60 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:34:17.591 DEBUG nova.virt.xenapi.vmops [req-763716c0-67ce-4891-9fb5-fa9ff10dda34 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 5fe04666-4b1b-41cf-bca7-ce6c2b277477] Creating vifs _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1892 2015-08-07 17:34:17.601 DEBUG nova.virt.xenapi.vmops [req-763716c0-67ce-4891-9fb5-fa9ff10dda34 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 5fe04666-4b1b-41cf-bca7-ce6c2b277477] Creating VIF for network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1902 2015-08-07 17:34:17.612 DEBUG nova.virt.xenapi.vmops [req-763716c0-67ce-4891-9fb5-fa9ff10dda34 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 5fe04666-4b1b-41cf-bca7-ce6c2b277477] Created VIF OpaqueRef:f05eb881-4d72-00ad-ab0d-0e6b3c885b0e, network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1906 2015-08-07 17:34:17.613 DEBUG nova.virt.xenapi.vmops [req-763716c0-67ce-4891-9fb5-fa9ff10dda34 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 5fe04666-4b1b-41cf-bca7-ce6c2b277477] Updating progress to 70 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:34:18.162 DEBUG nova.virt.xenapi.vmops [req-763716c0-67ce-4891-9fb5-fa9ff10dda34 
tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 5fe04666-4b1b-41cf-bca7-ce6c2b277477] Starting instance _start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:327 2015-08-07 17:34:23.104 DEBUG oslo_concurrency.processutils [req-f7491b13-bf0a-4a51-8fba-6a2efbb5b04c tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmpMEFxKE/configdrive of=/dev/xvde oflag=direct,sync" returned: 0 in 8.638s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:34:23.106 DEBUG oslo_concurrency.processutils [req-f7491b13-bf0a-4a51-8fba-6a2efbb5b04c tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:34:23.638 DEBUG oslo_concurrency.processutils [req-f7491b13-bf0a-4a51-8fba-6a2efbb5b04c tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf sync" returned: 0 in 0.533s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:34:23.644 DEBUG nova.virt.xenapi.vm_utils [req-f7491b13-bf0a-4a51-8fba-6a2efbb5b04c tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] Destroying VBD for VDI OpaqueRef:78892124-0150-32ef-98b5-cd27721d9951 ... vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2207 2015-08-07 17:34:23.645 DEBUG oslo_concurrency.lockutils [req-f7491b13-bf0a-4a51-8fba-6a2efbb5b04c tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_unplug" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:34:24.999 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.90 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:34:25.520 DEBUG oslo_concurrency.lockutils [req-f7491b13-bf0a-4a51-8fba-6a2efbb5b04c tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_unplug" :: held 1.874s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:34:25.528 DEBUG nova.virt.xenapi.vm_utils [req-f7491b13-bf0a-4a51-8fba-6a2efbb5b04c tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] Destroying VBD for VDI OpaqueRef:78892124-0150-32ef-98b5-cd27721d9951 done. vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2215 2015-08-07 17:34:25.529 DEBUG nova.virt.xenapi.vm_utils [req-f7491b13-bf0a-4a51-8fba-6a2efbb5b04c tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] Creating disk-type VBD for VM OpaqueRef:6e2a9ffd-e69a-42b0-0351-f28f88d1d006, VDI OpaqueRef:78892124-0150-32ef-98b5-cd27721d9951 ... 
create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:34:25.540 DEBUG nova.virt.xenapi.vm_utils [req-f7491b13-bf0a-4a51-8fba-6a2efbb5b04c tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] Created VBD OpaqueRef:37d940c2-e4e7-2ecf-8c30-006d73f06dd1 for VM OpaqueRef:6e2a9ffd-e69a-42b0-0351-f28f88d1d006, VDI OpaqueRef:78892124-0150-32ef-98b5-cd27721d9951. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:34:25.541 DEBUG nova.objects.instance [req-f7491b13-bf0a-4a51-8fba-6a2efbb5b04c tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] Lazy-loading `pci_devices' on Instance uuid f5ce5302-4adb-4ac6-be2f-b4371f1d3b62 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:34:25.670 DEBUG nova.virt.xenapi.vmops [req-f7491b13-bf0a-4a51-8fba-6a2efbb5b04c tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] [instance: f5ce5302-4adb-4ac6-be2f-b4371f1d3b62] Updating progress to 50 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:34:25.970 DEBUG oslo_concurrency.lockutils [req-f7491b13-bf0a-4a51-8fba-6a2efbb5b04c tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] Lock "xenstore-f5ce5302-4adb-4ac6-be2f-b4371f1d3b62" acquired by "store_meta" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:34:25.970 DEBUG oslo_concurrency.lockutils [req-f7491b13-bf0a-4a51-8fba-6a2efbb5b04c tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] Lock "xenstore-f5ce5302-4adb-4ac6-be2f-b4371f1d3b62" released by "store_meta" :: held 0.001s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:34:25.971 DEBUG oslo_concurrency.lockutils [req-f7491b13-bf0a-4a51-8fba-6a2efbb5b04c tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] Lock "xenstore-f5ce5302-4adb-4ac6-be2f-b4371f1d3b62" acquired by "store_auto_disk_config" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:34:25.982 DEBUG oslo_concurrency.lockutils [req-f7491b13-bf0a-4a51-8fba-6a2efbb5b04c tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] Lock "xenstore-f5ce5302-4adb-4ac6-be2f-b4371f1d3b62" released by "store_auto_disk_config" :: held 0.011s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:34:25.983 DEBUG nova.virt.xenapi.vmops [req-f7491b13-bf0a-4a51-8fba-6a2efbb5b04c tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] [instance: f5ce5302-4adb-4ac6-be2f-b4371f1d3b62] Injecting hostname (tempest-listserverfilterstestjson-instance-1616472520) into xenstore _inject_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1940 2015-08-07 17:34:25.984 DEBUG oslo_concurrency.lockutils [req-f7491b13-bf0a-4a51-8fba-6a2efbb5b04c tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] Lock "xenstore-f5ce5302-4adb-4ac6-be2f-b4371f1d3b62" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:34:25.994 DEBUG oslo_concurrency.lockutils [req-f7491b13-bf0a-4a51-8fba-6a2efbb5b04c tempest-ListServerFiltersTestJSON-680606332 
tempest-ListServerFiltersTestJSON-1559650729] Lock "xenstore-f5ce5302-4adb-4ac6-be2f-b4371f1d3b62" released by "update_hostname" :: held 0.010s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:34:25.995 DEBUG nova.virt.xenapi.vmops [req-f7491b13-bf0a-4a51-8fba-6a2efbb5b04c tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] [instance: f5ce5302-4adb-4ac6-be2f-b4371f1d3b62] Injecting network info to xenstore inject_network_info /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1869 2015-08-07 17:34:25.996 DEBUG oslo_concurrency.lockutils [req-f7491b13-bf0a-4a51-8fba-6a2efbb5b04c tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] Lock "xenstore-f5ce5302-4adb-4ac6-be2f-b4371f1d3b62" acquired by "update_nwinfo" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:34:26.302 DEBUG oslo_concurrency.lockutils [req-f7491b13-bf0a-4a51-8fba-6a2efbb5b04c tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] Lock "xenstore-f5ce5302-4adb-4ac6-be2f-b4371f1d3b62" released by "update_nwinfo" :: held 0.307s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:34:26.303 DEBUG nova.virt.xenapi.vmops [req-f7491b13-bf0a-4a51-8fba-6a2efbb5b04c tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] [instance: f5ce5302-4adb-4ac6-be2f-b4371f1d3b62] Updating progress to 60 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:34:26.546 DEBUG nova.virt.xenapi.vmops [req-3cd3f9fa-4c34-4e5f-a2c3-0159d8fae4b5 tempest-InstanceActionsNegativeTestJSON-244825089 tempest-InstanceActionsNegativeTestJSON-519695908] [instance: 1ed8c9a4-5157-4b0b-af78-9a5d74576f17] Waiting for instance state to become running _wait_for_instance_to_start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:777 2015-08-07 17:34:26.583 DEBUG nova.virt.xenapi.vmops [req-3cd3f9fa-4c34-4e5f-a2c3-0159d8fae4b5 tempest-InstanceActionsNegativeTestJSON-244825089 tempest-InstanceActionsNegativeTestJSON-519695908] [instance: 1ed8c9a4-5157-4b0b-af78-9a5d74576f17] Updating progress to 80 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:34:26.620 DEBUG nova.virt.xenapi.vmops [req-f7491b13-bf0a-4a51-8fba-6a2efbb5b04c tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] [instance: f5ce5302-4adb-4ac6-be2f-b4371f1d3b62] Creating vifs _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1892 2015-08-07 17:34:26.631 DEBUG nova.virt.xenapi.vmops [req-f7491b13-bf0a-4a51-8fba-6a2efbb5b04c tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] [instance: f5ce5302-4adb-4ac6-be2f-b4371f1d3b62] Creating VIF for network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1902 2015-08-07 17:34:26.649 DEBUG nova.virt.xenapi.vmops [req-f7491b13-bf0a-4a51-8fba-6a2efbb5b04c tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] [instance: f5ce5302-4adb-4ac6-be2f-b4371f1d3b62] Created VIF OpaqueRef:8d406120-6dd0-a580-945d-efdbb91c3b66, network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1906 2015-08-07 17:34:26.651 DEBUG nova.virt.xenapi.vmops [req-f7491b13-bf0a-4a51-8fba-6a2efbb5b04c 
tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] [instance: f5ce5302-4adb-4ac6-be2f-b4371f1d3b62] Updating progress to 70 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:34:27.026 DEBUG nova.virt.xenapi.vmops [req-3cd3f9fa-4c34-4e5f-a2c3-0159d8fae4b5 tempest-InstanceActionsNegativeTestJSON-244825089 tempest-InstanceActionsNegativeTestJSON-519695908] [instance: 1ed8c9a4-5157-4b0b-af78-9a5d74576f17] Skip agent setup, not enabled. _configure_new_instance_with_agent /opt/stack/new/nova/nova/virt/xenapi/vmops.py:788 2015-08-07 17:34:27.027 DEBUG nova.virt.xenapi.vmops [req-3cd3f9fa-4c34-4e5f-a2c3-0159d8fae4b5 tempest-InstanceActionsNegativeTestJSON-244825089 tempest-InstanceActionsNegativeTestJSON-519695908] [instance: 1ed8c9a4-5157-4b0b-af78-9a5d74576f17] Removing hostname from xenstore _remove_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1949 2015-08-07 17:34:27.027 DEBUG oslo_concurrency.lockutils [req-3cd3f9fa-4c34-4e5f-a2c3-0159d8fae4b5 tempest-InstanceActionsNegativeTestJSON-244825089 tempest-InstanceActionsNegativeTestJSON-519695908] Lock "xenstore-1ed8c9a4-5157-4b0b-af78-9a5d74576f17" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:34:27.032 DEBUG oslo_concurrency.lockutils [req-3cd3f9fa-4c34-4e5f-a2c3-0159d8fae4b5 tempest-InstanceActionsNegativeTestJSON-244825089 tempest-InstanceActionsNegativeTestJSON-519695908] Lock "xenstore-1ed8c9a4-5157-4b0b-af78-9a5d74576f17" released by "update_hostname" :: held 0.005s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:34:27.033 DEBUG nova.virt.xenapi.vmops [req-3cd3f9fa-4c34-4e5f-a2c3-0159d8fae4b5 tempest-InstanceActionsNegativeTestJSON-244825089 tempest-InstanceActionsNegativeTestJSON-519695908] [instance: 1ed8c9a4-5157-4b0b-af78-9a5d74576f17] Updating progress to 90 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:34:27.048 DEBUG nova.virt.xenapi.vmops [req-f7491b13-bf0a-4a51-8fba-6a2efbb5b04c tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] [instance: f5ce5302-4adb-4ac6-be2f-b4371f1d3b62] Starting instance _start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:327 2015-08-07 17:34:27.340 DEBUG nova.virt.xenapi.vmops [req-3cd3f9fa-4c34-4e5f-a2c3-0159d8fae4b5 tempest-InstanceActionsNegativeTestJSON-244825089 tempest-InstanceActionsNegativeTestJSON-519695908] [instance: 1ed8c9a4-5157-4b0b-af78-9a5d74576f17] Updating progress to 100 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:34:27.610 DEBUG nova.compute.manager [req-3cd3f9fa-4c34-4e5f-a2c3-0159d8fae4b5 tempest-InstanceActionsNegativeTestJSON-244825089 tempest-InstanceActionsNegativeTestJSON-519695908] [instance: 1ed8c9a4-5157-4b0b-af78-9a5d74576f17] Checking state _get_power_state /opt/stack/new/nova/nova/compute/manager.py:1296 2015-08-07 17:34:27.986 DEBUG oslo_concurrency.lockutils [req-3cd3f9fa-4c34-4e5f-a2c3-0159d8fae4b5 tempest-InstanceActionsNegativeTestJSON-244825089 tempest-InstanceActionsNegativeTestJSON-519695908] Lock "1ed8c9a4-5157-4b0b-af78-9a5d74576f17" released by "_locked_do_build_and_run_instance" :: held 50.014s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:34:29.112 DEBUG oslo_concurrency.lockutils [req-3d642c3a-7736-498b-8bd5-d5506e4bbf5f 
tempest-InstanceActionsNegativeTestJSON-244825089 tempest-InstanceActionsNegativeTestJSON-519695908] Lock "1ed8c9a4-5157-4b0b-af78-9a5d74576f17" acquired by "do_terminate_instance" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:34:29.115 DEBUG oslo_concurrency.lockutils [req-3d642c3a-7736-498b-8bd5-d5506e4bbf5f tempest-InstanceActionsNegativeTestJSON-244825089 tempest-InstanceActionsNegativeTestJSON-519695908] Lock "1ed8c9a4-5157-4b0b-af78-9a5d74576f17-events" acquired by "_clear_events" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:34:29.116 DEBUG oslo_concurrency.lockutils [req-3d642c3a-7736-498b-8bd5-d5506e4bbf5f tempest-InstanceActionsNegativeTestJSON-244825089 tempest-InstanceActionsNegativeTestJSON-519695908] Lock "1ed8c9a4-5157-4b0b-af78-9a5d74576f17-events" released by "_clear_events" :: held 0.001s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:34:29.118 INFO nova.compute.manager [req-3d642c3a-7736-498b-8bd5-d5506e4bbf5f tempest-InstanceActionsNegativeTestJSON-244825089 tempest-InstanceActionsNegativeTestJSON-519695908] [instance: 1ed8c9a4-5157-4b0b-af78-9a5d74576f17] Terminating instance 2015-08-07 17:34:29.123 INFO nova.virt.xenapi.vmops [req-3d642c3a-7736-498b-8bd5-d5506e4bbf5f tempest-InstanceActionsNegativeTestJSON-244825089 tempest-InstanceActionsNegativeTestJSON-519695908] [instance: 1ed8c9a4-5157-4b0b-af78-9a5d74576f17] Destroying VM 2015-08-07 17:34:29.195 DEBUG nova.virt.xenapi.vm_utils [req-3d642c3a-7736-498b-8bd5-d5506e4bbf5f tempest-InstanceActionsNegativeTestJSON-244825089 tempest-InstanceActionsNegativeTestJSON-519695908] [instance: 1ed8c9a4-5157-4b0b-af78-9a5d74576f17] Shutting down VM (hard) hard_shutdown_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:348 2015-08-07 17:34:29.963 DEBUG nova.virt.xenapi.vmops [req-763716c0-67ce-4891-9fb5-fa9ff10dda34 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 5fe04666-4b1b-41cf-bca7-ce6c2b277477] Waiting for instance state to become running _wait_for_instance_to_start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:777 2015-08-07 17:34:29.985 DEBUG nova.virt.xenapi.vmops [req-763716c0-67ce-4891-9fb5-fa9ff10dda34 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 5fe04666-4b1b-41cf-bca7-ce6c2b277477] Updating progress to 80 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:34:30.439 DEBUG nova.virt.xenapi.vmops [req-763716c0-67ce-4891-9fb5-fa9ff10dda34 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 5fe04666-4b1b-41cf-bca7-ce6c2b277477] Skip agent setup, not enabled. 
_configure_new_instance_with_agent /opt/stack/new/nova/nova/virt/xenapi/vmops.py:788 2015-08-07 17:34:30.439 DEBUG nova.virt.xenapi.vmops [req-763716c0-67ce-4891-9fb5-fa9ff10dda34 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 5fe04666-4b1b-41cf-bca7-ce6c2b277477] Removing hostname from xenstore _remove_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1949 2015-08-07 17:34:30.440 DEBUG oslo_concurrency.lockutils [req-763716c0-67ce-4891-9fb5-fa9ff10dda34 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "xenstore-5fe04666-4b1b-41cf-bca7-ce6c2b277477" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:34:30.445 DEBUG oslo_concurrency.lockutils [req-763716c0-67ce-4891-9fb5-fa9ff10dda34 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "xenstore-5fe04666-4b1b-41cf-bca7-ce6c2b277477" released by "update_hostname" :: held 0.005s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:34:30.446 DEBUG nova.virt.xenapi.vmops [req-763716c0-67ce-4891-9fb5-fa9ff10dda34 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 5fe04666-4b1b-41cf-bca7-ce6c2b277477] Updating progress to 90 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:34:30.722 DEBUG nova.virt.xenapi.vmops [req-763716c0-67ce-4891-9fb5-fa9ff10dda34 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 5fe04666-4b1b-41cf-bca7-ce6c2b277477] Updating progress to 100 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:34:31.015 DEBUG nova.compute.manager [req-763716c0-67ce-4891-9fb5-fa9ff10dda34 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 5fe04666-4b1b-41cf-bca7-ce6c2b277477] Checking state _get_power_state /opt/stack/new/nova/nova/compute/manager.py:1296 2015-08-07 17:34:31.398 DEBUG oslo_concurrency.lockutils [req-763716c0-67ce-4891-9fb5-fa9ff10dda34 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "5fe04666-4b1b-41cf-bca7-ce6c2b277477" released by "_locked_do_build_and_run_instance" :: held 44.343s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:34:32.555 INFO nova.compute.manager [req-40ac1067-ade8-4ea1-8f8d-6c21f9f4761f tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 5fe04666-4b1b-41cf-bca7-ce6c2b277477] Pausing 2015-08-07 17:34:32.677 DEBUG nova.compute.manager [req-40ac1067-ade8-4ea1-8f8d-6c21f9f4761f tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 5fe04666-4b1b-41cf-bca7-ce6c2b277477] Checking state _get_power_state /opt/stack/new/nova/nova/compute/manager.py:1296 2015-08-07 17:34:33.371 DEBUG nova.virt.xenapi.vmops [req-3d642c3a-7736-498b-8bd5-d5506e4bbf5f tempest-InstanceActionsNegativeTestJSON-244825089 tempest-InstanceActionsNegativeTestJSON-519695908] [instance: 1ed8c9a4-5157-4b0b-af78-9a5d74576f17] Destroying VDIs _destroy_vdis /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1428 2015-08-07 17:34:33.379 DEBUG nova.virt.xenapi.vm_utils [req-3d642c3a-7736-498b-8bd5-d5506e4bbf5f tempest-InstanceActionsNegativeTestJSON-244825089 tempest-InstanceActionsNegativeTestJSON-519695908] VDI 
49495708-856f-4511-a7d4-16df9e25d2bf is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:34:33.389 DEBUG nova.virt.xenapi.vm_utils [req-3d642c3a-7736-498b-8bd5-d5506e4bbf5f tempest-InstanceActionsNegativeTestJSON-244825089 tempest-InstanceActionsNegativeTestJSON-519695908] VDI b0adbb3e-faad-4f78-9515-7d285c23dacb is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:34:33.526 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:34:33.527 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 0.99 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:34:34.414 DEBUG oslo_concurrency.lockutils [req-a186a547-0f35-4d02-9831-4258d67eb44e tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "5fe04666-4b1b-41cf-bca7-ce6c2b277477" acquired by "do_terminate_instance" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:34:34.415 DEBUG oslo_concurrency.lockutils [req-a186a547-0f35-4d02-9831-4258d67eb44e tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "5fe04666-4b1b-41cf-bca7-ce6c2b277477-events" acquired by "_clear_events" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:34:34.416 DEBUG oslo_concurrency.lockutils [req-a186a547-0f35-4d02-9831-4258d67eb44e tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "5fe04666-4b1b-41cf-bca7-ce6c2b277477-events" released by "_clear_events" :: held 0.001s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:34:34.417 INFO nova.compute.manager [req-a186a547-0f35-4d02-9831-4258d67eb44e tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 5fe04666-4b1b-41cf-bca7-ce6c2b277477] Terminating instance 2015-08-07 17:34:34.419 INFO nova.virt.xenapi.vmops [req-a186a547-0f35-4d02-9831-4258d67eb44e tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 5fe04666-4b1b-41cf-bca7-ce6c2b277477] Destroying VM 2015-08-07 17:34:34.430 DEBUG nova.virt.xenapi.vmops [req-3d642c3a-7736-498b-8bd5-d5506e4bbf5f tempest-InstanceActionsNegativeTestJSON-244825089 tempest-InstanceActionsNegativeTestJSON-519695908] [instance: 1ed8c9a4-5157-4b0b-af78-9a5d74576f17] Using RAW or VHD, skipping kernel and ramdisk deletion _destroy_kernel_ramdisk /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1456 2015-08-07 17:34:34.433 DEBUG nova.virt.xenapi.vm_utils [req-a186a547-0f35-4d02-9831-4258d67eb44e tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 5fe04666-4b1b-41cf-bca7-ce6c2b277477] Shutting down VM (hard) hard_shutdown_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:348 2015-08-07 17:34:34.452 DEBUG nova.virt.xenapi.vm_utils [req-3d642c3a-7736-498b-8bd5-d5506e4bbf5f tempest-InstanceActionsNegativeTestJSON-244825089 tempest-InstanceActionsNegativeTestJSON-519695908] [instance: 1ed8c9a4-5157-4b0b-af78-9a5d74576f17] VM destroyed destroy_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:324 2015-08-07 17:34:34.453 
DEBUG nova.compute.manager [req-3d642c3a-7736-498b-8bd5-d5506e4bbf5f tempest-InstanceActionsNegativeTestJSON-244825089 tempest-InstanceActionsNegativeTestJSON-519695908] [instance: 1ed8c9a4-5157-4b0b-af78-9a5d74576f17] Deallocating network for instance _deallocate_network /opt/stack/new/nova/nova/compute/manager.py:1792 2015-08-07 17:34:34.521 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:34:34.522 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Starting heal instance info cache _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5315 2015-08-07 17:34:34.522 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Rebuilding the list of instances to heal _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5319 2015-08-07 17:34:34.602 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: 1ed8c9a4-5157-4b0b-af78-9a5d74576f17] Skipping network cache update for instance because it is being deleted. _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5333 2015-08-07 17:34:34.602 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: 5fe04666-4b1b-41cf-bca7-ce6c2b277477] Skipping network cache update for instance because it is being deleted. _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5333 2015-08-07 17:34:34.603 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: f5ce5302-4adb-4ac6-be2f-b4371f1d3b62] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5329 2015-08-07 17:34:34.603 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Didn't find any instances for network info cache update. 
_heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5386 2015-08-07 17:34:34.604 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 0.99 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:34:35.016 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.88 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:34:35.594 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:34:35.595 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 0.92 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:34:36.520 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:34:36.521 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 3.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:34:36.650 DEBUG nova.compute.manager [req-3d642c3a-7736-498b-8bd5-d5506e4bbf5f tempest-InstanceActionsNegativeTestJSON-244825089 tempest-InstanceActionsNegativeTestJSON-519695908] [instance: 1ed8c9a4-5157-4b0b-af78-9a5d74576f17] terminating bdm BlockDeviceMapping(boot_index=0,connection_info=None,created_at=2015-08-07T17:33:37Z,delete_on_termination=True,deleted=False,deleted_at=None,destination_type='local',device_name='/dev/xvda',device_type='disk',disk_bus=None,guest_format=None,id=36,image_id='a69d8d55-1745-492f-be26-cc75d64fc94d',instance=,instance_uuid=1ed8c9a4-5157-4b0b-af78-9a5d74576f17,no_device=False,snapshot_id=None,source_type='image',updated_at=2015-08-07T17:33:39Z,volume_id=None,volume_size=None) _cleanup_volumes /opt/stack/new/nova/nova/compute/manager.py:2272 2015-08-07 17:34:37.462 DEBUG oslo_concurrency.lockutils [req-3d642c3a-7736-498b-8bd5-d5506e4bbf5f tempest-InstanceActionsNegativeTestJSON-244825089 tempest-InstanceActionsNegativeTestJSON-519695908] Lock "compute_resources" acquired by "update_usage" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:34:37.464 DEBUG nova.objects.instance [req-3d642c3a-7736-498b-8bd5-d5506e4bbf5f tempest-InstanceActionsNegativeTestJSON-244825089 tempest-InstanceActionsNegativeTestJSON-519695908] Lazy-loading `numa_topology' on Instance uuid 1ed8c9a4-5157-4b0b-af78-9a5d74576f17 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:34:37.625 DEBUG nova.virt.xenapi.vmops [req-f7491b13-bf0a-4a51-8fba-6a2efbb5b04c tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] [instance: f5ce5302-4adb-4ac6-be2f-b4371f1d3b62] Waiting for instance state to become running _wait_for_instance_to_start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:777 2015-08-07 17:34:37.668 DEBUG nova.virt.xenapi.vmops [req-f7491b13-bf0a-4a51-8fba-6a2efbb5b04c tempest-ListServerFiltersTestJSON-680606332 
tempest-ListServerFiltersTestJSON-1559650729] [instance: f5ce5302-4adb-4ac6-be2f-b4371f1d3b62] Updating progress to 80 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:34:37.791 DEBUG oslo_concurrency.lockutils [req-3d642c3a-7736-498b-8bd5-d5506e4bbf5f tempest-InstanceActionsNegativeTestJSON-244825089 tempest-InstanceActionsNegativeTestJSON-519695908] Lock "compute_resources" released by "update_usage" :: held 0.329s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:34:38.407 DEBUG nova.virt.xenapi.vmops [req-f7491b13-bf0a-4a51-8fba-6a2efbb5b04c tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] [instance: f5ce5302-4adb-4ac6-be2f-b4371f1d3b62] Skip agent setup, not enabled. _configure_new_instance_with_agent /opt/stack/new/nova/nova/virt/xenapi/vmops.py:788 2015-08-07 17:34:38.408 DEBUG nova.virt.xenapi.vmops [req-f7491b13-bf0a-4a51-8fba-6a2efbb5b04c tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] [instance: f5ce5302-4adb-4ac6-be2f-b4371f1d3b62] Removing hostname from xenstore _remove_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1949 2015-08-07 17:34:38.408 DEBUG oslo_concurrency.lockutils [req-f7491b13-bf0a-4a51-8fba-6a2efbb5b04c tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] Lock "xenstore-f5ce5302-4adb-4ac6-be2f-b4371f1d3b62" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:34:38.415 DEBUG oslo_concurrency.lockutils [req-f7491b13-bf0a-4a51-8fba-6a2efbb5b04c tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] Lock "xenstore-f5ce5302-4adb-4ac6-be2f-b4371f1d3b62" released by "update_hostname" :: held 0.007s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:34:38.419 DEBUG nova.virt.xenapi.vmops [req-f7491b13-bf0a-4a51-8fba-6a2efbb5b04c tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] [instance: f5ce5302-4adb-4ac6-be2f-b4371f1d3b62] Updating progress to 90 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:34:38.448 DEBUG oslo_concurrency.lockutils [req-3d642c3a-7736-498b-8bd5-d5506e4bbf5f tempest-InstanceActionsNegativeTestJSON-244825089 tempest-InstanceActionsNegativeTestJSON-519695908] Lock "1ed8c9a4-5157-4b0b-af78-9a5d74576f17" released by "do_terminate_instance" :: held 9.336s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:34:38.740 DEBUG nova.virt.xenapi.vmops [req-f7491b13-bf0a-4a51-8fba-6a2efbb5b04c tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] [instance: f5ce5302-4adb-4ac6-be2f-b4371f1d3b62] Updating progress to 100 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:34:38.990 DEBUG nova.virt.xenapi.vmops [req-a186a547-0f35-4d02-9831-4258d67eb44e tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 5fe04666-4b1b-41cf-bca7-ce6c2b277477] Destroying VDIs _destroy_vdis /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1428 2015-08-07 17:34:39.023 DEBUG nova.virt.xenapi.vm_utils [req-a186a547-0f35-4d02-9831-4258d67eb44e tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] VDI e85e080a-c08f-4e13-a53a-6ab7e1a2fb1c is still available 
lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:34:39.035 DEBUG nova.virt.xenapi.vm_utils [req-a186a547-0f35-4d02-9831-4258d67eb44e tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] VDI 0e0c7363-b3dc-4849-826e-e5ac1f033e51 is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:34:39.134 DEBUG nova.compute.manager [req-f7491b13-bf0a-4a51-8fba-6a2efbb5b04c tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] [instance: f5ce5302-4adb-4ac6-be2f-b4371f1d3b62] Checking state _get_power_state /opt/stack/new/nova/nova/compute/manager.py:1296 2015-08-07 17:34:39.530 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:34:39.531 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /opt/stack/new/nova/nova/compute/manager.py:5976 2015-08-07 17:34:39.533 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 2.99 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:34:39.660 DEBUG oslo_concurrency.lockutils [req-f7491b13-bf0a-4a51-8fba-6a2efbb5b04c tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] Lock "f5ce5302-4adb-4ac6-be2f-b4371f1d3b62" released by "_locked_do_build_and_run_instance" :: held 42.196s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:34:40.241 DEBUG nova.virt.xenapi.vmops [req-a186a547-0f35-4d02-9831-4258d67eb44e tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 5fe04666-4b1b-41cf-bca7-ce6c2b277477] Using RAW or VHD, skipping kernel and ramdisk deletion _destroy_kernel_ramdisk /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1456 2015-08-07 17:34:40.255 DEBUG nova.virt.xenapi.vm_utils [req-a186a547-0f35-4d02-9831-4258d67eb44e tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 5fe04666-4b1b-41cf-bca7-ce6c2b277477] VM destroyed destroy_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:324 2015-08-07 17:34:40.255 DEBUG nova.compute.manager [req-a186a547-0f35-4d02-9831-4258d67eb44e tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 5fe04666-4b1b-41cf-bca7-ce6c2b277477] Deallocating network for instance _deallocate_network /opt/stack/new/nova/nova/compute/manager.py:1792 2015-08-07 17:34:41.564 DEBUG oslo_concurrency.lockutils [req-1ea9cc01-b0e0-4324-8377-4b0f66218579 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] Lock "5722ce9b-957b-4a66-bb52-a0f639736797" acquired by "_locked_do_build_and_run_instance" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:34:41.769 INFO nova.compute.manager [req-1ea9cc01-b0e0-4324-8377-4b0f66218579 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] [instance: 5722ce9b-957b-4a66-bb52-a0f639736797] Starting instance... 
2015-08-07 17:34:42.057 DEBUG oslo_concurrency.lockutils [req-1ea9cc01-b0e0-4324-8377-4b0f66218579 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] Lock "compute_resources" acquired by "instance_claim" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:34:42.058 DEBUG nova.compute.resource_tracker [req-1ea9cc01-b0e0-4324-8377-4b0f66218579 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] Memory overhead for 64 MB instance; 5 MB instance_claim /opt/stack/new/nova/nova/compute/resource_tracker.py:126 2015-08-07 17:34:42.065 INFO nova.compute.claims [req-1ea9cc01-b0e0-4324-8377-4b0f66218579 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] [instance: 5722ce9b-957b-4a66-bb52-a0f639736797] Attempting claim: memory 69 MB, disk 0 GB 2015-08-07 17:34:42.066 INFO nova.compute.claims [req-1ea9cc01-b0e0-4324-8377-4b0f66218579 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] [instance: 5722ce9b-957b-4a66-bb52-a0f639736797] Total memory: 8187 MB, used: 650.00 MB 2015-08-07 17:34:42.066 INFO nova.compute.claims [req-1ea9cc01-b0e0-4324-8377-4b0f66218579 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] [instance: 5722ce9b-957b-4a66-bb52-a0f639736797] memory limit: 12280.50 MB, free: 11630.50 MB 2015-08-07 17:34:42.067 INFO nova.compute.claims [req-1ea9cc01-b0e0-4324-8377-4b0f66218579 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] [instance: 5722ce9b-957b-4a66-bb52-a0f639736797] Total disk: 27 GB, used: 0.00 GB 2015-08-07 17:34:42.067 INFO nova.compute.claims [req-1ea9cc01-b0e0-4324-8377-4b0f66218579 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] [instance: 5722ce9b-957b-4a66-bb52-a0f639736797] disk limit not specified, defaulting to unlimited 2015-08-07 17:34:42.093 DEBUG nova.compute.resources.vcpu [req-1ea9cc01-b0e0-4324-8377-4b0f66218579 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] Total CPUs: 8 VCPUs, used: 2.00 VCPUs test /opt/stack/new/nova/nova/compute/resources/vcpu.py:52 2015-08-07 17:34:42.094 DEBUG nova.compute.resources.vcpu [req-1ea9cc01-b0e0-4324-8377-4b0f66218579 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] CPUs limit not specified, defaulting to unlimited test /opt/stack/new/nova/nova/compute/resources/vcpu.py:56 2015-08-07 17:34:42.094 INFO nova.compute.claims [req-1ea9cc01-b0e0-4324-8377-4b0f66218579 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] [instance: 5722ce9b-957b-4a66-bb52-a0f639736797] Claim successful 2015-08-07 17:34:42.520 DEBUG oslo_concurrency.lockutils [req-1ea9cc01-b0e0-4324-8377-4b0f66218579 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] Lock "compute_resources" released by "instance_claim" :: held 0.463s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:34:42.530 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:34:42.531 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic 
interval looping call > sleeping for 0.99 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:34:42.661 DEBUG nova.compute.manager [req-a186a547-0f35-4d02-9831-4258d67eb44e tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 5fe04666-4b1b-41cf-bca7-ce6c2b277477] terminating bdm BlockDeviceMapping(boot_index=0,connection_info=None,created_at=2015-08-07T17:33:46Z,delete_on_termination=True,deleted=False,deleted_at=None,destination_type='local',device_name='/dev/xvda',device_type='disk',disk_bus=None,guest_format=None,id=38,image_id='a69d8d55-1745-492f-be26-cc75d64fc94d',instance=,instance_uuid=5fe04666-4b1b-41cf-bca7-ce6c2b277477,no_device=False,snapshot_id=None,source_type='image',updated_at=2015-08-07T17:33:48Z,volume_id=None,volume_size=None) _cleanup_volumes /opt/stack/new/nova/nova/compute/manager.py:2272 2015-08-07 17:34:42.919 DEBUG oslo_concurrency.lockutils [req-1ea9cc01-b0e0-4324-8377-4b0f66218579 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] Lock "compute_resources" acquired by "update_usage" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:34:43.022 DEBUG oslo_concurrency.lockutils [req-1ea9cc01-b0e0-4324-8377-4b0f66218579 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] Lock "compute_resources" released by "update_usage" :: held 0.104s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:34:43.024 DEBUG nova.compute.utils [req-1ea9cc01-b0e0-4324-8377-4b0f66218579 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] Using /dev/xvd instead of None get_next_device_name /opt/stack/new/nova/nova/compute/utils.py:163 2015-08-07 17:34:43.028 DEBUG oslo_concurrency.lockutils [req-a186a547-0f35-4d02-9831-4258d67eb44e tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "compute_resources" acquired by "update_usage" :: waited 0.100s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:34:43.029 DEBUG nova.objects.instance [req-a186a547-0f35-4d02-9831-4258d67eb44e tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lazy-loading `numa_topology' on Instance uuid 5fe04666-4b1b-41cf-bca7-ce6c2b277477 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:34:43.032 13318 DEBUG nova.compute.manager [-] [instance: 5722ce9b-957b-4a66-bb52-a0f639736797] Allocating IP information in the background. 
_allocate_network_async /opt/stack/new/nova/nova/compute/manager.py:1513 2015-08-07 17:34:43.033 13318 DEBUG oslo_concurrency.lockutils [-] Acquired semaphore "refresh_cache-5722ce9b-957b-4a66-bb52-a0f639736797" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:197 2015-08-07 17:34:43.133 DEBUG oslo_concurrency.lockutils [req-a186a547-0f35-4d02-9831-4258d67eb44e tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "compute_resources" released by "update_usage" :: held 0.105s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:34:43.519 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:34:43.520 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 4.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:34:43.538 DEBUG oslo_concurrency.lockutils [req-a186a547-0f35-4d02-9831-4258d67eb44e tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "5fe04666-4b1b-41cf-bca7-ce6c2b277477" released by "do_terminate_instance" :: held 9.124s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:34:43.690 DEBUG nova.virt.xenapi.vmops [req-1ea9cc01-b0e0-4324-8377-4b0f66218579 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] [instance: 5722ce9b-957b-4a66-bb52-a0f639736797] Block device information present: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} spawn /opt/stack/new/nova/nova/virt/xenapi/vmops.py:408 2015-08-07 17:34:43.707 DEBUG nova.virt.xenapi.vm_utils [req-1ea9cc01-b0e0-4324-8377-4b0f66218579 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] Detected vhd format for image a69d8d55-1745-492f-be26-cc75d64fc94d determine_disk_image_type /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1633 2015-08-07 17:34:43.708 DEBUG nova.virt.xenapi.vmops [req-1ea9cc01-b0e0-4324-8377-4b0f66218579 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] [instance: 5722ce9b-957b-4a66-bb52-a0f639736797] Updating progress to 10 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:34:44.117 DEBUG nova.virt.xenapi.vmops [req-1ea9cc01-b0e0-4324-8377-4b0f66218579 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] [instance: 5722ce9b-957b-4a66-bb52-a0f639736797] block device info: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} _connect_cinder_volumes /opt/stack/new/nova/nova/virt/xenapi/vmops.py:376 2015-08-07 17:34:44.127 DEBUG oslo_concurrency.lockutils [req-1ea9cc01-b0e0-4324-8377-4b0f66218579 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" acquired by "_create_cached_image_impl" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:34:45.013 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.89 seconds _run_loop 
/usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:34:45.391 DEBUG nova.virt.xenapi.vm_utils [req-1ea9cc01-b0e0-4324-8377-4b0f66218579 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] Cloned VDI OpaqueRef:108faea2-73ef-b33d-ae86-a9551ea723f5 from VDI OpaqueRef:f392c64e-7886-a0cc-d3a0-780674167307 _clone_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:571 2015-08-07 17:34:46.263 DEBUG oslo_concurrency.lockutils [req-1ea9cc01-b0e0-4324-8377-4b0f66218579 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" released by "_create_cached_image_impl" :: held 2.136s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:34:46.265 INFO nova.virt.xenapi.vm_utils [req-1ea9cc01-b0e0-4324-8377-4b0f66218579 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] Image creation data, cacheable: True, downloaded: False duration: 2.15 secs for image a69d8d55-1745-492f-be26-cc75d64fc94d 2015-08-07 17:34:46.581 13318 DEBUG nova.network.base_api [-] [instance: 5722ce9b-957b-4a66-bb52-a0f639736797] Updating instance_info_cache with network_info: [VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.2'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:e0:2f:08', 'active': False, 'type': u'bridge', 'id': u'68b7a8ad-49fa-46ce-a873-dfe17b6bae9c', 'qbg_params': None})] update_instance_cache_with_nw_info /opt/stack/new/nova/nova/network/base_api.py:43 2015-08-07 17:34:46.615 13318 DEBUG oslo_concurrency.lockutils [-] Releasing semaphore "refresh_cache-5722ce9b-957b-4a66-bb52-a0f639736797" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:210 2015-08-07 17:34:46.621 13318 DEBUG nova.compute.manager [-] [instance: 5722ce9b-957b-4a66-bb52-a0f639736797] Instance network_info: |[VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.2'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': 
{u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:e0:2f:08', 'active': False, 'type': u'bridge', 'id': u'68b7a8ad-49fa-46ce-a873-dfe17b6bae9c', 'qbg_params': None})]| _allocate_network_async /opt/stack/new/nova/nova/compute/manager.py:1531 2015-08-07 17:34:46.658 DEBUG oslo_concurrency.lockutils [req-926956e4-7ad1-4cc9-b98f-62ac7c3fe0fd tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "ca1ed49f-ef0a-4a98-b04f-6c956be89835" acquired by "_locked_do_build_and_run_instance" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:34:46.713 INFO nova.compute.manager [req-926956e4-7ad1-4cc9-b98f-62ac7c3fe0fd tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: ca1ed49f-ef0a-4a98-b04f-6c956be89835] Starting instance... 2015-08-07 17:34:46.963 DEBUG oslo_concurrency.lockutils [req-926956e4-7ad1-4cc9-b98f-62ac7c3fe0fd tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "compute_resources" acquired by "instance_claim" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:34:46.964 DEBUG nova.compute.resource_tracker [req-926956e4-7ad1-4cc9-b98f-62ac7c3fe0fd tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Memory overhead for 64 MB instance; 5 MB instance_claim /opt/stack/new/nova/nova/compute/resource_tracker.py:126 2015-08-07 17:34:46.973 INFO nova.compute.claims [req-926956e4-7ad1-4cc9-b98f-62ac7c3fe0fd tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: ca1ed49f-ef0a-4a98-b04f-6c956be89835] Attempting claim: memory 69 MB, disk 0 GB 2015-08-07 17:34:46.975 INFO nova.compute.claims [req-926956e4-7ad1-4cc9-b98f-62ac7c3fe0fd tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: ca1ed49f-ef0a-4a98-b04f-6c956be89835] Total memory: 8187 MB, used: 650.00 MB 2015-08-07 17:34:46.975 INFO nova.compute.claims [req-926956e4-7ad1-4cc9-b98f-62ac7c3fe0fd tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: ca1ed49f-ef0a-4a98-b04f-6c956be89835] memory limit: 12280.50 MB, free: 11630.50 MB 2015-08-07 17:34:46.977 INFO nova.compute.claims [req-926956e4-7ad1-4cc9-b98f-62ac7c3fe0fd tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: ca1ed49f-ef0a-4a98-b04f-6c956be89835] Total disk: 27 GB, used: 0.00 GB 2015-08-07 17:34:46.977 INFO nova.compute.claims [req-926956e4-7ad1-4cc9-b98f-62ac7c3fe0fd tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: ca1ed49f-ef0a-4a98-b04f-6c956be89835] disk limit not specified, defaulting to unlimited 2015-08-07 17:34:47.009 DEBUG nova.compute.resources.vcpu [req-926956e4-7ad1-4cc9-b98f-62ac7c3fe0fd tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Total CPUs: 8 VCPUs, used: 2.00 VCPUs test /opt/stack/new/nova/nova/compute/resources/vcpu.py:52 2015-08-07 17:34:47.010 DEBUG nova.compute.resources.vcpu [req-926956e4-7ad1-4cc9-b98f-62ac7c3fe0fd tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] CPUs limit not specified, defaulting to unlimited test 
/opt/stack/new/nova/nova/compute/resources/vcpu.py:56 2015-08-07 17:34:47.010 INFO nova.compute.claims [req-926956e4-7ad1-4cc9-b98f-62ac7c3fe0fd tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: ca1ed49f-ef0a-4a98-b04f-6c956be89835] Claim successful 2015-08-07 17:34:47.062 DEBUG nova.virt.xenapi.vmops [req-1ea9cc01-b0e0-4324-8377-4b0f66218579 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] [instance: 5722ce9b-957b-4a66-bb52-a0f639736797] Updating progress to 20 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:34:47.202 DEBUG oslo_concurrency.lockutils [req-d6d25f44-3bdb-4506-a738-1edec5cd00d8 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Lock "e9775bc6-34f1-465c-9ea5-54d4b3d5a076" acquired by "_locked_do_build_and_run_instance" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:34:47.267 INFO nova.compute.manager [req-d6d25f44-3bdb-4506-a738-1edec5cd00d8 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] [instance: e9775bc6-34f1-465c-9ea5-54d4b3d5a076] Starting instance... 2015-08-07 17:34:47.357 DEBUG nova.virt.xenapi.vmops [req-1ea9cc01-b0e0-4324-8377-4b0f66218579 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] [instance: 5722ce9b-957b-4a66-bb52-a0f639736797] Updating progress to 30 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:34:47.431 DEBUG oslo_concurrency.lockutils [req-d6d25f44-3bdb-4506-a738-1edec5cd00d8 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Lock "962e3fc3-68c5-4019-beb9-2b9f939eb511" acquired by "_locked_do_build_and_run_instance" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:34:47.471 DEBUG oslo_concurrency.lockutils [req-926956e4-7ad1-4cc9-b98f-62ac7c3fe0fd tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "compute_resources" released by "instance_claim" :: held 0.508s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:34:47.532 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:34:47.535 INFO nova.compute.manager [req-d6d25f44-3bdb-4506-a738-1edec5cd00d8 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] [instance: 962e3fc3-68c5-4019-beb9-2b9f939eb511] Starting instance... 
2015-08-07 17:34:47.665 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Auditing locally available compute resources for node localhost.localdomain 2015-08-07 17:34:47.666 DEBUG nova.virt.xenapi.host [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Updating host stats update_status /opt/stack/new/nova/nova/virt/xenapi/host.py:234 2015-08-07 17:34:47.671 DEBUG oslo_concurrency.lockutils [req-d6d25f44-3bdb-4506-a738-1edec5cd00d8 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Lock "compute_resources" acquired by "instance_claim" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:34:47.671 DEBUG nova.compute.resource_tracker [req-d6d25f44-3bdb-4506-a738-1edec5cd00d8 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Memory overhead for 64 MB instance; 5 MB instance_claim /opt/stack/new/nova/nova/compute/resource_tracker.py:126 2015-08-07 17:34:47.683 INFO nova.compute.claims [req-d6d25f44-3bdb-4506-a738-1edec5cd00d8 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] [instance: e9775bc6-34f1-465c-9ea5-54d4b3d5a076] Attempting claim: memory 69 MB, disk 0 GB 2015-08-07 17:34:47.683 INFO nova.compute.claims [req-d6d25f44-3bdb-4506-a738-1edec5cd00d8 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] [instance: e9775bc6-34f1-465c-9ea5-54d4b3d5a076] Total memory: 8187 MB, used: 719.00 MB 2015-08-07 17:34:47.684 INFO nova.compute.claims [req-d6d25f44-3bdb-4506-a738-1edec5cd00d8 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] [instance: e9775bc6-34f1-465c-9ea5-54d4b3d5a076] memory limit: 12280.50 MB, free: 11561.50 MB 2015-08-07 17:34:47.684 INFO nova.compute.claims [req-d6d25f44-3bdb-4506-a738-1edec5cd00d8 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] [instance: e9775bc6-34f1-465c-9ea5-54d4b3d5a076] Total disk: 27 GB, used: 0.00 GB 2015-08-07 17:34:47.686 INFO nova.compute.claims [req-d6d25f44-3bdb-4506-a738-1edec5cd00d8 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] [instance: e9775bc6-34f1-465c-9ea5-54d4b3d5a076] disk limit not specified, defaulting to unlimited 2015-08-07 17:34:47.827 DEBUG nova.compute.resources.vcpu [req-d6d25f44-3bdb-4506-a738-1edec5cd00d8 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Total CPUs: 8 VCPUs, used: 3.00 VCPUs test /opt/stack/new/nova/nova/compute/resources/vcpu.py:52 2015-08-07 17:34:47.827 DEBUG nova.compute.resources.vcpu [req-d6d25f44-3bdb-4506-a738-1edec5cd00d8 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] CPUs limit not specified, defaulting to unlimited test /opt/stack/new/nova/nova/compute/resources/vcpu.py:56 2015-08-07 17:34:47.828 INFO nova.compute.claims [req-d6d25f44-3bdb-4506-a738-1edec5cd00d8 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] [instance: e9775bc6-34f1-465c-9ea5-54d4b3d5a076] Claim successful 2015-08-07 17:34:47.850 DEBUG nova.virt.xenapi.vmops [req-1ea9cc01-b0e0-4324-8377-4b0f66218579 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] [instance: 5722ce9b-957b-4a66-bb52-a0f639736797] Using PV kernel: True _create_vm_record /opt/stack/new/nova/nova/virt/xenapi/vmops.py:679 2015-08-07 17:34:47.914 DEBUG nova.virt.xenapi.vm_utils 
[req-1ea9cc01-b0e0-4324-8377-4b0f66218579 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] [instance: 5722ce9b-957b-4a66-bb52-a0f639736797] Created VM create_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:312 2015-08-07 17:34:47.931 DEBUG nova.virt.xenapi.vmops [req-1ea9cc01-b0e0-4324-8377-4b0f66218579 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] [instance: 5722ce9b-957b-4a66-bb52-a0f639736797] Updating progress to 40 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:34:47.956 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" acquired by "do_scan" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:34:47.969 DEBUG nova.virt.xenapi.vm_utils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Scanning SR OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99 do_scan /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1839 2015-08-07 17:34:48.253 DEBUG nova.virt.xenapi.vm_utils [req-1ea9cc01-b0e0-4324-8377-4b0f66218579 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] Creating disk-type VBD for VM OpaqueRef:8995081d-d812-5afd-4f79-272b013178c6, VDI OpaqueRef:108faea2-73ef-b33d-ae86-a9551ea723f5 ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:34:48.264 DEBUG nova.virt.xenapi.vm_utils [req-1ea9cc01-b0e0-4324-8377-4b0f66218579 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] Created VBD OpaqueRef:daced03c-2f3f-e16d-4c19-e6a1fc420f06 for VM OpaqueRef:8995081d-d812-5afd-4f79-272b013178c6, VDI OpaqueRef:108faea2-73ef-b33d-ae86-a9551ea723f5. 
create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:34:48.294 DEBUG oslo_concurrency.lockutils [req-d6d25f44-3bdb-4506-a738-1edec5cd00d8 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Lock "compute_resources" released by "instance_claim" :: held 0.624s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:34:48.301 DEBUG oslo_concurrency.lockutils [req-926956e4-7ad1-4cc9-b98f-62ac7c3fe0fd tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "compute_resources" acquired by "update_usage" :: waited 0.445s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:34:48.657 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" released by "do_scan" :: held 0.701s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:34:48.748 DEBUG oslo_concurrency.lockutils [req-926956e4-7ad1-4cc9-b98f-62ac7c3fe0fd tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "compute_resources" released by "update_usage" :: held 0.447s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:34:48.761 DEBUG nova.compute.utils [req-926956e4-7ad1-4cc9-b98f-62ac7c3fe0fd tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Using /dev/xvd instead of None get_next_device_name /opt/stack/new/nova/nova/compute/utils.py:163 2015-08-07 17:34:48.769 DEBUG oslo_concurrency.lockutils [req-d6d25f44-3bdb-4506-a738-1edec5cd00d8 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Lock "compute_resources" acquired by "instance_claim" :: waited 0.793s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:34:48.770 DEBUG nova.compute.resource_tracker [req-d6d25f44-3bdb-4506-a738-1edec5cd00d8 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Memory overhead for 64 MB instance; 5 MB instance_claim /opt/stack/new/nova/nova/compute/resource_tracker.py:126 2015-08-07 17:34:48.777 INFO nova.compute.claims [req-d6d25f44-3bdb-4506-a738-1edec5cd00d8 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] [instance: 962e3fc3-68c5-4019-beb9-2b9f939eb511] Attempting claim: memory 69 MB, disk 0 GB 2015-08-07 17:34:48.778 INFO nova.compute.claims [req-d6d25f44-3bdb-4506-a738-1edec5cd00d8 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] [instance: 962e3fc3-68c5-4019-beb9-2b9f939eb511] Total memory: 8187 MB, used: 788.00 MB 2015-08-07 17:34:48.779 INFO nova.compute.claims [req-d6d25f44-3bdb-4506-a738-1edec5cd00d8 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] [instance: 962e3fc3-68c5-4019-beb9-2b9f939eb511] memory limit: 12280.50 MB, free: 11492.50 MB 2015-08-07 17:34:48.779 INFO nova.compute.claims [req-d6d25f44-3bdb-4506-a738-1edec5cd00d8 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] [instance: 962e3fc3-68c5-4019-beb9-2b9f939eb511] Total disk: 27 GB, used: 0.00 GB 2015-08-07 17:34:48.779 INFO nova.compute.claims [req-d6d25f44-3bdb-4506-a738-1edec5cd00d8 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] [instance: 962e3fc3-68c5-4019-beb9-2b9f939eb511] disk limit not specified, 
defaulting to unlimited 2015-08-07 17:34:48.782 13318 DEBUG nova.compute.manager [-] [instance: ca1ed49f-ef0a-4a98-b04f-6c956be89835] Allocating IP information in the background. _allocate_network_async /opt/stack/new/nova/nova/compute/manager.py:1513 2015-08-07 17:34:48.783 13318 DEBUG oslo_concurrency.lockutils [-] Acquired semaphore "refresh_cache-ca1ed49f-ef0a-4a98-b04f-6c956be89835" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:197 2015-08-07 17:34:48.821 DEBUG nova.compute.resources.vcpu [req-d6d25f44-3bdb-4506-a738-1edec5cd00d8 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Total CPUs: 8 VCPUs, used: 4.00 VCPUs test /opt/stack/new/nova/nova/compute/resources/vcpu.py:52 2015-08-07 17:34:48.822 DEBUG nova.compute.resources.vcpu [req-d6d25f44-3bdb-4506-a738-1edec5cd00d8 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] CPUs limit not specified, defaulting to unlimited test /opt/stack/new/nova/nova/compute/resources/vcpu.py:56 2015-08-07 17:34:48.823 INFO nova.compute.claims [req-d6d25f44-3bdb-4506-a738-1edec5cd00d8 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] [instance: 962e3fc3-68c5-4019-beb9-2b9f939eb511] Claim successful 2015-08-07 17:34:49.012 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor: free VCPUs: 1 _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:524 2015-08-07 17:34:49.013 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor: assignable PCI devices: [] _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:532 2015-08-07 17:34:49.014 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor/Node resource view: name=localhost.localdomain free_ram=922MB free_disk=16GB free_vcpus=1 pci_devices=[] _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:546 2015-08-07 17:34:49.287 DEBUG oslo_concurrency.lockutils [req-d6d25f44-3bdb-4506-a738-1edec5cd00d8 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Lock "compute_resources" released by "instance_claim" :: held 0.518s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:34:49.294 DEBUG oslo_concurrency.lockutils [req-d6d25f44-3bdb-4506-a738-1edec5cd00d8 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Lock "compute_resources" acquired by "update_usage" :: waited 0.495s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:34:49.424 DEBUG oslo_concurrency.lockutils [req-d6d25f44-3bdb-4506-a738-1edec5cd00d8 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Lock "compute_resources" released by "update_usage" :: held 0.129s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:34:49.425 DEBUG nova.compute.utils [req-d6d25f44-3bdb-4506-a738-1edec5cd00d8 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Using /dev/xvd instead of None get_next_device_name /opt/stack/new/nova/nova/compute/utils.py:163 2015-08-07 17:34:49.429 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "compute_resources" acquired by "_update_available_resource" :: waited 0.415s inner 
/usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:34:49.431 13318 DEBUG nova.compute.manager [-] [instance: e9775bc6-34f1-465c-9ea5-54d4b3d5a076] Allocating IP information in the background. _allocate_network_async /opt/stack/new/nova/nova/compute/manager.py:1513 2015-08-07 17:34:49.432 13318 DEBUG oslo_concurrency.lockutils [-] Acquired semaphore "refresh_cache-e9775bc6-34f1-465c-9ea5-54d4b3d5a076" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:197 2015-08-07 17:34:49.567 DEBUG nova.virt.xenapi.vmops [req-926956e4-7ad1-4cc9-b98f-62ac7c3fe0fd tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: ca1ed49f-ef0a-4a98-b04f-6c956be89835] Block device information present: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} spawn /opt/stack/new/nova/nova/virt/xenapi/vmops.py:408 2015-08-07 17:34:49.588 DEBUG nova.virt.xenapi.vm_utils [req-926956e4-7ad1-4cc9-b98f-62ac7c3fe0fd tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Detected vhd format for image a69d8d55-1745-492f-be26-cc75d64fc94d determine_disk_image_type /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1633 2015-08-07 17:34:49.589 DEBUG nova.virt.xenapi.vmops [req-926956e4-7ad1-4cc9-b98f-62ac7c3fe0fd tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: ca1ed49f-ef0a-4a98-b04f-6c956be89835] Updating progress to 10 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:34:50.007 DEBUG nova.virt.xenapi.vmops [req-926956e4-7ad1-4cc9-b98f-62ac7c3fe0fd tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: ca1ed49f-ef0a-4a98-b04f-6c956be89835] block device info: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} _connect_cinder_volumes /opt/stack/new/nova/nova/virt/xenapi/vmops.py:376 2015-08-07 17:34:50.027 DEBUG oslo_concurrency.lockutils [req-926956e4-7ad1-4cc9-b98f-62ac7c3fe0fd tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" acquired by "_create_cached_image_impl" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:34:50.168 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Total usable vcpus: 8, total allocated vcpus: 7 2015-08-07 17:34:50.196 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Final resource view: name=localhost.localdomain phys_ram=8187MB used_ram=857MB phys_disk=27GB used_disk=0GB total_vcpus=8 used_vcpus=7 pci_stats=None 2015-08-07 17:34:50.270 DEBUG nova.virt.xenapi.vmops [req-d6d25f44-3bdb-4506-a738-1edec5cd00d8 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] [instance: e9775bc6-34f1-465c-9ea5-54d4b3d5a076] Block device information present: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} spawn /opt/stack/new/nova/nova/virt/xenapi/vmops.py:408 2015-08-07 17:34:50.292 DEBUG nova.virt.xenapi.vm_utils [req-d6d25f44-3bdb-4506-a738-1edec5cd00d8 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Detected vhd format for image a69d8d55-1745-492f-be26-cc75d64fc94d determine_disk_image_type /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1633 2015-08-07 
17:34:50.293 DEBUG nova.virt.xenapi.vmops [req-d6d25f44-3bdb-4506-a738-1edec5cd00d8 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] [instance: e9775bc6-34f1-465c-9ea5-54d4b3d5a076] Updating progress to 10 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:34:50.307 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Compute_service record updated for devstack:localhost.localdomain 2015-08-07 17:34:50.308 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "compute_resources" released by "_update_available_resource" :: held 0.879s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:34:50.308 DEBUG oslo_concurrency.lockutils [req-d6d25f44-3bdb-4506-a738-1edec5cd00d8 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Lock "compute_resources" acquired by "update_usage" :: waited 0.649s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:34:50.313 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 1.99 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:34:50.450 DEBUG oslo_concurrency.lockutils [req-d6d25f44-3bdb-4506-a738-1edec5cd00d8 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Lock "compute_resources" released by "update_usage" :: held 0.142s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:34:50.452 DEBUG nova.compute.utils [req-d6d25f44-3bdb-4506-a738-1edec5cd00d8 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Using /dev/xvd instead of None get_next_device_name /opt/stack/new/nova/nova/compute/utils.py:163 2015-08-07 17:34:50.458 13318 DEBUG nova.compute.manager [-] [instance: 962e3fc3-68c5-4019-beb9-2b9f939eb511] Allocating IP information in the background. 
_allocate_network_async /opt/stack/new/nova/nova/compute/manager.py:1513 2015-08-07 17:34:50.459 13318 DEBUG oslo_concurrency.lockutils [-] Acquired semaphore "refresh_cache-962e3fc3-68c5-4019-beb9-2b9f939eb511" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:197 2015-08-07 17:34:50.678 DEBUG nova.virt.xenapi.vmops [req-d6d25f44-3bdb-4506-a738-1edec5cd00d8 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] [instance: e9775bc6-34f1-465c-9ea5-54d4b3d5a076] block device info: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} _connect_cinder_volumes /opt/stack/new/nova/nova/virt/xenapi/vmops.py:376 2015-08-07 17:34:51.275 DEBUG nova.virt.xenapi.vmops [req-d6d25f44-3bdb-4506-a738-1edec5cd00d8 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] [instance: 962e3fc3-68c5-4019-beb9-2b9f939eb511] Block device information present: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} spawn /opt/stack/new/nova/nova/virt/xenapi/vmops.py:408 2015-08-07 17:34:51.289 DEBUG nova.virt.xenapi.vm_utils [req-d6d25f44-3bdb-4506-a738-1edec5cd00d8 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Detected vhd format for image a69d8d55-1745-492f-be26-cc75d64fc94d determine_disk_image_type /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1633 2015-08-07 17:34:51.290 DEBUG nova.virt.xenapi.vmops [req-d6d25f44-3bdb-4506-a738-1edec5cd00d8 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] [instance: 962e3fc3-68c5-4019-beb9-2b9f939eb511] Updating progress to 10 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:34:51.518 DEBUG nova.virt.xenapi.vm_utils [req-926956e4-7ad1-4cc9-b98f-62ac7c3fe0fd tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Cloned VDI OpaqueRef:0708a3e0-be6d-03cc-b0de-456f1cce611e from VDI OpaqueRef:f392c64e-7886-a0cc-d3a0-780674167307 _clone_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:571 2015-08-07 17:34:51.701 DEBUG nova.virt.xenapi.vmops [req-d6d25f44-3bdb-4506-a738-1edec5cd00d8 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] [instance: 962e3fc3-68c5-4019-beb9-2b9f939eb511] block device info: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} _connect_cinder_volumes /opt/stack/new/nova/nova/virt/xenapi/vmops.py:376 2015-08-07 17:34:52.301 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:34:52.302 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 9.22 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:34:52.407 DEBUG oslo_concurrency.lockutils [req-926956e4-7ad1-4cc9-b98f-62ac7c3fe0fd tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" released by "_create_cached_image_impl" :: held 2.380s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:34:52.407 INFO nova.virt.xenapi.vm_utils [req-926956e4-7ad1-4cc9-b98f-62ac7c3fe0fd 
tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Image creation data, cacheable: True, downloaded: False duration: 2.40 secs for image a69d8d55-1745-492f-be26-cc75d64fc94d 2015-08-07 17:34:52.408 DEBUG oslo_concurrency.lockutils [req-d6d25f44-3bdb-4506-a738-1edec5cd00d8 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" acquired by "_create_cached_image_impl" :: waited 1.679s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:34:52.496 13318 DEBUG nova.network.base_api [-] [instance: ca1ed49f-ef0a-4a98-b04f-6c956be89835] Updating instance_info_cache with network_info: [VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.4'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:6f:37:5e', 'active': False, 'type': u'bridge', 'id': u'e6c4302b-7154-4e57-9602-6a296335599b', 'qbg_params': None})] update_instance_cache_with_nw_info /opt/stack/new/nova/nova/network/base_api.py:43 2015-08-07 17:34:52.538 13318 DEBUG oslo_concurrency.lockutils [-] Releasing semaphore "refresh_cache-ca1ed49f-ef0a-4a98-b04f-6c956be89835" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:210 2015-08-07 17:34:52.539 13318 DEBUG nova.compute.manager [-] [instance: ca1ed49f-ef0a-4a98-b04f-6c956be89835] Instance network_info: |[VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.4'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:6f:37:5e', 'active': False, 'type': u'bridge', 'id': u'e6c4302b-7154-4e57-9602-6a296335599b', 'qbg_params': None})]| _allocate_network_async /opt/stack/new/nova/nova/compute/manager.py:1531 2015-08-07 17:34:53.476 13318 DEBUG nova.network.base_api [-] [instance: 
e9775bc6-34f1-465c-9ea5-54d4b3d5a076] Updating instance_info_cache with network_info: [VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.6'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:fb:50:47', 'active': False, 'type': u'bridge', 'id': u'4587afa2-0492-42fd-bf6d-79d9a181edde', 'qbg_params': None})] update_instance_cache_with_nw_info /opt/stack/new/nova/nova/network/base_api.py:43 2015-08-07 17:34:53.516 13318 DEBUG oslo_concurrency.lockutils [-] Releasing semaphore "refresh_cache-e9775bc6-34f1-465c-9ea5-54d4b3d5a076" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:210 2015-08-07 17:34:53.517 13318 DEBUG nova.compute.manager [-] [instance: e9775bc6-34f1-465c-9ea5-54d4b3d5a076] Instance network_info: |[VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.6'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:fb:50:47', 'active': False, 'type': u'bridge', 'id': u'4587afa2-0492-42fd-bf6d-79d9a181edde', 'qbg_params': None})]| _allocate_network_async /opt/stack/new/nova/nova/compute/manager.py:1531 2015-08-07 17:34:53.848 DEBUG nova.virt.xenapi.vm_utils [req-d6d25f44-3bdb-4506-a738-1edec5cd00d8 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Cloned VDI OpaqueRef:d2c2aa02-45c9-3060-5c28-336d5e600342 from VDI OpaqueRef:f392c64e-7886-a0cc-d3a0-780674167307 _clone_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:571 2015-08-07 17:34:54.005 DEBUG nova.virt.xenapi.vmops [req-926956e4-7ad1-4cc9-b98f-62ac7c3fe0fd tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: ca1ed49f-ef0a-4a98-b04f-6c956be89835] Updating progress to 20 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:34:54.076 13318 DEBUG 
nova.network.base_api [-] [instance: 962e3fc3-68c5-4019-beb9-2b9f939eb511] Updating instance_info_cache with network_info: [VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.7'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:ba:3c:74', 'active': False, 'type': u'bridge', 'id': u'f111c10a-8a41-492e-9079-1dd3213636a1', 'qbg_params': None})] update_instance_cache_with_nw_info /opt/stack/new/nova/nova/network/base_api.py:43 2015-08-07 17:34:54.118 13318 DEBUG oslo_concurrency.lockutils [-] Releasing semaphore "refresh_cache-962e3fc3-68c5-4019-beb9-2b9f939eb511" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:210 2015-08-07 17:34:54.119 13318 DEBUG nova.compute.manager [-] [instance: 962e3fc3-68c5-4019-beb9-2b9f939eb511] Instance network_info: |[VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.7'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:ba:3c:74', 'active': False, 'type': u'bridge', 'id': u'f111c10a-8a41-492e-9079-1dd3213636a1', 'qbg_params': None})]| _allocate_network_async /opt/stack/new/nova/nova/compute/manager.py:1531 2015-08-07 17:34:54.281 DEBUG nova.virt.xenapi.vmops [req-926956e4-7ad1-4cc9-b98f-62ac7c3fe0fd tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: ca1ed49f-ef0a-4a98-b04f-6c956be89835] Updating progress to 30 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:34:54.347 DEBUG nova.virt.xenapi.vm_utils [req-1ea9cc01-b0e0-4324-8377-4b0f66218579 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] Created VDI OpaqueRef:40e599b9-ba1e-419d-0149-8ae6e9df9f8e (config-2, 67108864, False) on OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99. 
create_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:509 2015-08-07 17:34:54.351 DEBUG nova.virt.xenapi.vm_utils [req-1ea9cc01-b0e0-4324-8377-4b0f66218579 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] Creating disk-type VBD for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:40e599b9-ba1e-419d-0149-8ae6e9df9f8e ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:34:54.366 DEBUG nova.virt.xenapi.vm_utils [req-1ea9cc01-b0e0-4324-8377-4b0f66218579 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] Created VBD OpaqueRef:d316f6fd-37c8-60f1-e809-1b9f16e8f47c for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:40e599b9-ba1e-419d-0149-8ae6e9df9f8e. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:34:54.367 DEBUG nova.virt.xenapi.vm_utils [req-1ea9cc01-b0e0-4324-8377-4b0f66218579 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] Plugging VBD OpaqueRef:d316f6fd-37c8-60f1-e809-1b9f16e8f47c ... vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2191 2015-08-07 17:34:54.368 DEBUG oslo_concurrency.lockutils [req-1ea9cc01-b0e0-4324-8377-4b0f66218579 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_plug" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:34:54.504 DEBUG nova.virt.xenapi.vmops [req-926956e4-7ad1-4cc9-b98f-62ac7c3fe0fd tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: ca1ed49f-ef0a-4a98-b04f-6c956be89835] Using PV kernel: True _create_vm_record /opt/stack/new/nova/nova/virt/xenapi/vmops.py:679 2015-08-07 17:34:54.519 DEBUG nova.virt.xenapi.vm_utils [req-926956e4-7ad1-4cc9-b98f-62ac7c3fe0fd tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: ca1ed49f-ef0a-4a98-b04f-6c956be89835] Created VM create_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:312 2015-08-07 17:34:54.519 DEBUG nova.virt.xenapi.vmops [req-926956e4-7ad1-4cc9-b98f-62ac7c3fe0fd tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: ca1ed49f-ef0a-4a98-b04f-6c956be89835] Updating progress to 40 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:34:54.664 DEBUG oslo_concurrency.lockutils [req-d6d25f44-3bdb-4506-a738-1edec5cd00d8 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" released by "_create_cached_image_impl" :: held 2.256s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:34:54.665 INFO nova.virt.xenapi.vm_utils [req-d6d25f44-3bdb-4506-a738-1edec5cd00d8 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Image creation data, cacheable: True, downloaded: False duration: 3.99 secs for image a69d8d55-1745-492f-be26-cc75d64fc94d 2015-08-07 17:34:54.666 DEBUG oslo_concurrency.lockutils [req-d6d25f44-3bdb-4506-a738-1edec5cd00d8 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" acquired by "_create_cached_image_impl" :: waited 2.950s inner 
/usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:34:54.771 DEBUG nova.virt.xenapi.vm_utils [req-926956e4-7ad1-4cc9-b98f-62ac7c3fe0fd tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Creating disk-type VBD for VM OpaqueRef:ae48cb5b-edd0-e9cd-5f0c-229f29d2e43d, VDI OpaqueRef:0708a3e0-be6d-03cc-b0de-456f1cce611e ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:34:54.784 DEBUG nova.virt.xenapi.vm_utils [req-926956e4-7ad1-4cc9-b98f-62ac7c3fe0fd tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Created VBD OpaqueRef:db70fc3e-6e54-132f-957c-a3a3ac45e35e for VM OpaqueRef:ae48cb5b-edd0-e9cd-5f0c-229f29d2e43d, VDI OpaqueRef:0708a3e0-be6d-03cc-b0de-456f1cce611e. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:34:55.020 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.88 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:34:55.917 DEBUG nova.virt.xenapi.vm_utils [req-d6d25f44-3bdb-4506-a738-1edec5cd00d8 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Cloned VDI OpaqueRef:2e8ebac7-b1e5-6feb-260c-18a6619c64ba from VDI OpaqueRef:f392c64e-7886-a0cc-d3a0-780674167307 _clone_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:571 2015-08-07 17:34:56.124 DEBUG nova.virt.xenapi.vmops [req-d6d25f44-3bdb-4506-a738-1edec5cd00d8 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] [instance: e9775bc6-34f1-465c-9ea5-54d4b3d5a076] Updating progress to 20 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:34:56.404 DEBUG nova.virt.xenapi.vmops [req-d6d25f44-3bdb-4506-a738-1edec5cd00d8 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] [instance: e9775bc6-34f1-465c-9ea5-54d4b3d5a076] Updating progress to 30 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:34:56.521 DEBUG oslo_concurrency.lockutils [req-1ea9cc01-b0e0-4324-8377-4b0f66218579 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_plug" :: held 2.153s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:34:56.521 DEBUG nova.virt.xenapi.vm_utils [req-1ea9cc01-b0e0-4324-8377-4b0f66218579 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] Plugging VBD OpaqueRef:d316f6fd-37c8-60f1-e809-1b9f16e8f47c done. vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2194 2015-08-07 17:34:56.525 DEBUG nova.virt.xenapi.vm_utils [req-1ea9cc01-b0e0-4324-8377-4b0f66218579 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] VBD OpaqueRef:d316f6fd-37c8-60f1-e809-1b9f16e8f47c plugged as xvdc vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2197 2015-08-07 17:34:56.620 WARNING nova.virt.configdrive [req-1ea9cc01-b0e0-4324-8377-4b0f66218579 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] The setting "always" will be deprecated in the Liberty version. 
Please use "True" instead 2015-08-07 17:34:56.625 DEBUG nova.objects.instance [req-1ea9cc01-b0e0-4324-8377-4b0f66218579 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] Lazy-loading `ec2_ids' on Instance uuid 5722ce9b-957b-4a66-bb52-a0f639736797 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:34:56.665 DEBUG oslo_concurrency.processutils [req-1ea9cc01-b0e0-4324-8377-4b0f66218579 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] Running cmd (subprocess): genisoimage -o /tmp/tmp2r5A1r/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmpqxecRl execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:34:56.767 DEBUG nova.virt.xenapi.vmops [req-d6d25f44-3bdb-4506-a738-1edec5cd00d8 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] [instance: e9775bc6-34f1-465c-9ea5-54d4b3d5a076] Using PV kernel: True _create_vm_record /opt/stack/new/nova/nova/virt/xenapi/vmops.py:679 2015-08-07 17:34:56.772 DEBUG oslo_concurrency.processutils [req-1ea9cc01-b0e0-4324-8377-4b0f66218579 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] CMD "genisoimage -o /tmp/tmp2r5A1r/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmpqxecRl" returned: 0 in 0.107s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:34:56.776 DEBUG oslo_concurrency.processutils [req-1ea9cc01-b0e0-4324-8377-4b0f66218579 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmp2r5A1r/configdrive of=/dev/xvdc oflag=direct,sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:34:56.866 DEBUG nova.virt.xenapi.vm_utils [req-d6d25f44-3bdb-4506-a738-1edec5cd00d8 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] [instance: e9775bc6-34f1-465c-9ea5-54d4b3d5a076] Created VM create_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:312 2015-08-07 17:34:56.870 DEBUG nova.virt.xenapi.vmops [req-d6d25f44-3bdb-4506-a738-1edec5cd00d8 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] [instance: e9775bc6-34f1-465c-9ea5-54d4b3d5a076] Updating progress to 40 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:34:56.889 DEBUG oslo_concurrency.lockutils [req-d6d25f44-3bdb-4506-a738-1edec5cd00d8 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" released by "_create_cached_image_impl" :: held 2.222s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:34:56.890 INFO nova.virt.xenapi.vm_utils [req-d6d25f44-3bdb-4506-a738-1edec5cd00d8 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Image creation data, cacheable: True, downloaded: False duration: 5.19 secs for image a69d8d55-1745-492f-be26-cc75d64fc94d 2015-08-07 17:34:57.229 DEBUG nova.virt.xenapi.vm_utils [req-d6d25f44-3bdb-4506-a738-1edec5cd00d8 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Creating disk-type VBD for VM 
OpaqueRef:3ea0bf04-0860-3c75-160e-ef37bbd15bb7, VDI OpaqueRef:d2c2aa02-45c9-3060-5c28-336d5e600342 ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:34:57.240 DEBUG nova.virt.xenapi.vm_utils [req-d6d25f44-3bdb-4506-a738-1edec5cd00d8 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Created VBD OpaqueRef:b2cb1cc5-fe96-df24-3176-accccd205443 for VM OpaqueRef:3ea0bf04-0860-3c75-160e-ef37bbd15bb7, VDI OpaqueRef:d2c2aa02-45c9-3060-5c28-336d5e600342. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:34:57.918 DEBUG nova.virt.xenapi.vm_utils [req-d6d25f44-3bdb-4506-a738-1edec5cd00d8 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Created VDI OpaqueRef:29ce240e-5beb-b8c8-b738-06f6eced9a7b (config-2, 67108864, False) on OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99. create_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:509 2015-08-07 17:34:57.924 DEBUG nova.virt.xenapi.vm_utils [req-d6d25f44-3bdb-4506-a738-1edec5cd00d8 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Creating disk-type VBD for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:29ce240e-5beb-b8c8-b738-06f6eced9a7b ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:34:57.935 DEBUG nova.virt.xenapi.vm_utils [req-d6d25f44-3bdb-4506-a738-1edec5cd00d8 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Created VBD OpaqueRef:cb5097d3-33b4-8006-b738-43d9c0d56a92 for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:29ce240e-5beb-b8c8-b738-06f6eced9a7b. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:34:57.936 DEBUG nova.virt.xenapi.vm_utils [req-d6d25f44-3bdb-4506-a738-1edec5cd00d8 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Plugging VBD OpaqueRef:cb5097d3-33b4-8006-b738-43d9c0d56a92 ... 
vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2191 2015-08-07 17:34:57.937 DEBUG oslo_concurrency.lockutils [req-d6d25f44-3bdb-4506-a738-1edec5cd00d8 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_plug" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:34:58.171 DEBUG nova.virt.xenapi.vmops [req-d6d25f44-3bdb-4506-a738-1edec5cd00d8 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] [instance: 962e3fc3-68c5-4019-beb9-2b9f939eb511] Updating progress to 20 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:34:58.478 DEBUG nova.virt.xenapi.vmops [req-d6d25f44-3bdb-4506-a738-1edec5cd00d8 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] [instance: 962e3fc3-68c5-4019-beb9-2b9f939eb511] Updating progress to 30 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:34:58.905 DEBUG nova.virt.xenapi.vmops [req-d6d25f44-3bdb-4506-a738-1edec5cd00d8 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] [instance: 962e3fc3-68c5-4019-beb9-2b9f939eb511] Using PV kernel: True _create_vm_record /opt/stack/new/nova/nova/virt/xenapi/vmops.py:679 2015-08-07 17:34:59.705 DEBUG nova.virt.xenapi.vm_utils [req-d6d25f44-3bdb-4506-a738-1edec5cd00d8 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] [instance: 962e3fc3-68c5-4019-beb9-2b9f939eb511] Created VM create_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:312 2015-08-07 17:34:59.706 DEBUG nova.virt.xenapi.vmops [req-d6d25f44-3bdb-4506-a738-1edec5cd00d8 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] [instance: 962e3fc3-68c5-4019-beb9-2b9f939eb511] Updating progress to 40 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:34:59.791 DEBUG nova.virt.xenapi.vm_utils [req-926956e4-7ad1-4cc9-b98f-62ac7c3fe0fd tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Created VDI OpaqueRef:11ccad4d-9644-cd8d-613f-5c2b5abfe22e (config-2, 67108864, False) on OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99. create_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:509 2015-08-07 17:34:59.795 DEBUG nova.virt.xenapi.vm_utils [req-926956e4-7ad1-4cc9-b98f-62ac7c3fe0fd tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Creating disk-type VBD for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:11ccad4d-9644-cd8d-613f-5c2b5abfe22e ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:34:59.807 DEBUG nova.virt.xenapi.vm_utils [req-926956e4-7ad1-4cc9-b98f-62ac7c3fe0fd tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Created VBD OpaqueRef:52e7f872-7752-dd1c-dbe5-b9f435bf463e for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:11ccad4d-9644-cd8d-613f-5c2b5abfe22e. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:34:59.807 DEBUG nova.virt.xenapi.vm_utils [req-926956e4-7ad1-4cc9-b98f-62ac7c3fe0fd tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Plugging VBD OpaqueRef:52e7f872-7752-dd1c-dbe5-b9f435bf463e ... 
vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2191 2015-08-07 17:35:00.250 DEBUG nova.virt.xenapi.vm_utils [req-d6d25f44-3bdb-4506-a738-1edec5cd00d8 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Creating disk-type VBD for VM OpaqueRef:96ba22ab-9534-884f-7f1b-4485754698fa, VDI OpaqueRef:2e8ebac7-b1e5-6feb-260c-18a6619c64ba ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:35:00.258 DEBUG nova.virt.xenapi.vm_utils [req-d6d25f44-3bdb-4506-a738-1edec5cd00d8 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Created VBD OpaqueRef:e5d2ef0a-d3d6-0da4-5d1a-fb189a134a4f for VM OpaqueRef:96ba22ab-9534-884f-7f1b-4485754698fa, VDI OpaqueRef:2e8ebac7-b1e5-6feb-260c-18a6619c64ba. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:35:00.826 DEBUG nova.virt.xenapi.vm_utils [req-d6d25f44-3bdb-4506-a738-1edec5cd00d8 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Created VDI OpaqueRef:0a182d62-8bdc-d1ae-cfdc-0a88e0efb528 (config-2, 67108864, False) on OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99. create_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:509 2015-08-07 17:35:00.830 DEBUG nova.virt.xenapi.vm_utils [req-d6d25f44-3bdb-4506-a738-1edec5cd00d8 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Creating disk-type VBD for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:0a182d62-8bdc-d1ae-cfdc-0a88e0efb528 ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:35:00.843 DEBUG nova.virt.xenapi.vm_utils [req-d6d25f44-3bdb-4506-a738-1edec5cd00d8 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Created VBD OpaqueRef:93e8f099-a98f-10fa-668e-c358c1adf8f6 for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:0a182d62-8bdc-d1ae-cfdc-0a88e0efb528. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:35:00.844 DEBUG nova.virt.xenapi.vm_utils [req-d6d25f44-3bdb-4506-a738-1edec5cd00d8 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Plugging VBD OpaqueRef:93e8f099-a98f-10fa-668e-c358c1adf8f6 ... vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2191 2015-08-07 17:35:01.195 DEBUG oslo_concurrency.lockutils [req-d6d25f44-3bdb-4506-a738-1edec5cd00d8 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_plug" :: held 3.258s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:35:01.195 DEBUG nova.virt.xenapi.vm_utils [req-d6d25f44-3bdb-4506-a738-1edec5cd00d8 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Plugging VBD OpaqueRef:cb5097d3-33b4-8006-b738-43d9c0d56a92 done. 
vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2194 2015-08-07 17:35:01.197 DEBUG oslo_concurrency.lockutils [req-926956e4-7ad1-4cc9-b98f-62ac7c3fe0fd tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_plug" :: waited 1.389s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:35:01.216 DEBUG nova.virt.xenapi.vm_utils [req-d6d25f44-3bdb-4506-a738-1edec5cd00d8 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] VBD OpaqueRef:cb5097d3-33b4-8006-b738-43d9c0d56a92 plugged as xvdd vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2197 2015-08-07 17:35:01.339 WARNING nova.virt.configdrive [req-d6d25f44-3bdb-4506-a738-1edec5cd00d8 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] The setting "always" will be deprecated in the Liberty version. Please use "True" instead 2015-08-07 17:35:01.340 DEBUG nova.objects.instance [req-d6d25f44-3bdb-4506-a738-1edec5cd00d8 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Lazy-loading `ec2_ids' on Instance uuid e9775bc6-34f1-465c-9ea5-54d4b3d5a076 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:35:01.378 DEBUG oslo_concurrency.processutils [req-d6d25f44-3bdb-4506-a738-1edec5cd00d8 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Running cmd (subprocess): genisoimage -o /tmp/tmpXqCWTZ/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmpRzK0jU execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:35:01.522 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._run_image_cache_manager_pass run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:35:01.525 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 25.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:35:01.531 DEBUG oslo_concurrency.processutils [req-d6d25f44-3bdb-4506-a738-1edec5cd00d8 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] CMD "genisoimage -o /tmp/tmpXqCWTZ/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmpRzK0jU" returned: 0 in 0.153s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:35:01.536 DEBUG oslo_concurrency.processutils [req-d6d25f44-3bdb-4506-a738-1edec5cd00d8 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmpXqCWTZ/configdrive of=/dev/xvdd oflag=direct,sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:35:04.028 DEBUG oslo_concurrency.lockutils [req-926956e4-7ad1-4cc9-b98f-62ac7c3fe0fd tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_plug" :: held 2.831s inner 
/usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:35:04.031 DEBUG nova.virt.xenapi.vm_utils [req-926956e4-7ad1-4cc9-b98f-62ac7c3fe0fd tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Plugging VBD OpaqueRef:52e7f872-7752-dd1c-dbe5-b9f435bf463e done. vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2194 2015-08-07 17:35:04.033 DEBUG oslo_concurrency.lockutils [req-d6d25f44-3bdb-4506-a738-1edec5cd00d8 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_plug" :: waited 3.189s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:35:04.041 DEBUG nova.virt.xenapi.vm_utils [req-926956e4-7ad1-4cc9-b98f-62ac7c3fe0fd tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] VBD OpaqueRef:52e7f872-7752-dd1c-dbe5-b9f435bf463e plugged as xvde vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2197 2015-08-07 17:35:04.155 WARNING nova.virt.configdrive [req-926956e4-7ad1-4cc9-b98f-62ac7c3fe0fd tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] The setting "always" will be deprecated in the Liberty version. Please use "True" instead 2015-08-07 17:35:04.156 DEBUG nova.objects.instance [req-926956e4-7ad1-4cc9-b98f-62ac7c3fe0fd tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lazy-loading `ec2_ids' on Instance uuid ca1ed49f-ef0a-4a98-b04f-6c956be89835 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:35:04.194 DEBUG oslo_concurrency.processutils [req-926956e4-7ad1-4cc9-b98f-62ac7c3fe0fd tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Running cmd (subprocess): genisoimage -o /tmp/tmp16fbc4/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmpgig2lz execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:35:04.442 DEBUG oslo_concurrency.processutils [req-926956e4-7ad1-4cc9-b98f-62ac7c3fe0fd tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] CMD "genisoimage -o /tmp/tmp16fbc4/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmpgig2lz" returned: 0 in 0.248s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:35:04.448 DEBUG oslo_concurrency.processutils [req-926956e4-7ad1-4cc9-b98f-62ac7c3fe0fd tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmp16fbc4/configdrive of=/dev/xvde oflag=direct,sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:35:05.151 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.75 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:35:06.689 DEBUG oslo_concurrency.processutils [req-1ea9cc01-b0e0-4324-8377-4b0f66218579 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmp2r5A1r/configdrive of=/dev/xvdc oflag=direct,sync" returned: 0 in 9.913s execute 
/usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:35:06.691 DEBUG oslo_concurrency.processutils [req-1ea9cc01-b0e0-4324-8377-4b0f66218579 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:35:08.084 DEBUG oslo_concurrency.processutils [req-1ea9cc01-b0e0-4324-8377-4b0f66218579 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf sync" returned: 0 in 1.392s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:35:08.087 DEBUG nova.virt.xenapi.vm_utils [req-1ea9cc01-b0e0-4324-8377-4b0f66218579 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] Destroying VBD for VDI OpaqueRef:40e599b9-ba1e-419d-0149-8ae6e9df9f8e ... vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2207 2015-08-07 17:35:10.702 DEBUG oslo_concurrency.lockutils [req-d6d25f44-3bdb-4506-a738-1edec5cd00d8 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_plug" :: held 6.669s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:35:10.703 DEBUG nova.virt.xenapi.vm_utils [req-d6d25f44-3bdb-4506-a738-1edec5cd00d8 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Plugging VBD OpaqueRef:93e8f099-a98f-10fa-668e-c358c1adf8f6 done. vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2194 2015-08-07 17:35:10.705 DEBUG oslo_concurrency.lockutils [req-1ea9cc01-b0e0-4324-8377-4b0f66218579 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_unplug" :: waited 2.617s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:35:10.708 DEBUG nova.virt.xenapi.vm_utils [req-d6d25f44-3bdb-4506-a738-1edec5cd00d8 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] VBD OpaqueRef:93e8f099-a98f-10fa-668e-c358c1adf8f6 plugged as xvdf vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2197 2015-08-07 17:35:10.817 WARNING nova.virt.configdrive [req-d6d25f44-3bdb-4506-a738-1edec5cd00d8 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] The setting "always" will be deprecated in the Liberty version. 
Please use "True" instead 2015-08-07 17:35:10.818 DEBUG nova.objects.instance [req-d6d25f44-3bdb-4506-a738-1edec5cd00d8 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Lazy-loading `ec2_ids' on Instance uuid 962e3fc3-68c5-4019-beb9-2b9f939eb511 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:35:10.877 DEBUG oslo_concurrency.processutils [req-d6d25f44-3bdb-4506-a738-1edec5cd00d8 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Running cmd (subprocess): genisoimage -o /tmp/tmp4ev_mR/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmp014rwe execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:35:10.977 DEBUG oslo_concurrency.processutils [req-d6d25f44-3bdb-4506-a738-1edec5cd00d8 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] CMD "genisoimage -o /tmp/tmp4ev_mR/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmp014rwe" returned: 0 in 0.100s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:35:10.983 DEBUG oslo_concurrency.processutils [req-d6d25f44-3bdb-4506-a738-1edec5cd00d8 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmp4ev_mR/configdrive of=/dev/xvdf oflag=direct,sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:35:12.507 DEBUG oslo_concurrency.lockutils [req-1ea9cc01-b0e0-4324-8377-4b0f66218579 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_unplug" :: held 1.802s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:35:12.518 DEBUG nova.virt.xenapi.vm_utils [req-1ea9cc01-b0e0-4324-8377-4b0f66218579 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] Destroying VBD for VDI OpaqueRef:40e599b9-ba1e-419d-0149-8ae6e9df9f8e done. vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2215 2015-08-07 17:35:12.519 DEBUG nova.virt.xenapi.vm_utils [req-1ea9cc01-b0e0-4324-8377-4b0f66218579 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] Creating disk-type VBD for VM OpaqueRef:8995081d-d812-5afd-4f79-272b013178c6, VDI OpaqueRef:40e599b9-ba1e-419d-0149-8ae6e9df9f8e ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:35:12.531 DEBUG nova.virt.xenapi.vm_utils [req-1ea9cc01-b0e0-4324-8377-4b0f66218579 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] Created VBD OpaqueRef:49d04d7e-c07f-03e8-552a-960c1c54e109 for VM OpaqueRef:8995081d-d812-5afd-4f79-272b013178c6, VDI OpaqueRef:40e599b9-ba1e-419d-0149-8ae6e9df9f8e. 
create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:35:12.533 DEBUG nova.objects.instance [req-1ea9cc01-b0e0-4324-8377-4b0f66218579 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] Lazy-loading `pci_devices' on Instance uuid 5722ce9b-957b-4a66-bb52-a0f639736797 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:35:12.664 DEBUG nova.virt.xenapi.vmops [req-1ea9cc01-b0e0-4324-8377-4b0f66218579 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] [instance: 5722ce9b-957b-4a66-bb52-a0f639736797] Updating progress to 50 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:35:13.077 DEBUG oslo_concurrency.lockutils [req-1ea9cc01-b0e0-4324-8377-4b0f66218579 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] Lock "xenstore-5722ce9b-957b-4a66-bb52-a0f639736797" acquired by "store_meta" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:35:13.078 DEBUG oslo_concurrency.lockutils [req-1ea9cc01-b0e0-4324-8377-4b0f66218579 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] Lock "xenstore-5722ce9b-957b-4a66-bb52-a0f639736797" released by "store_meta" :: held 0.001s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:35:13.079 DEBUG oslo_concurrency.lockutils [req-1ea9cc01-b0e0-4324-8377-4b0f66218579 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] Lock "xenstore-5722ce9b-957b-4a66-bb52-a0f639736797" acquired by "store_auto_disk_config" :: waited 0.001s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:35:13.093 DEBUG oslo_concurrency.lockutils [req-1ea9cc01-b0e0-4324-8377-4b0f66218579 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] Lock "xenstore-5722ce9b-957b-4a66-bb52-a0f639736797" released by "store_auto_disk_config" :: held 0.014s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:35:13.094 DEBUG nova.virt.xenapi.vmops [req-1ea9cc01-b0e0-4324-8377-4b0f66218579 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] [instance: 5722ce9b-957b-4a66-bb52-a0f639736797] Injecting hostname (tempest-listserverfilterstestjson-instance-1930897350) into xenstore _inject_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1940 2015-08-07 17:35:13.095 DEBUG oslo_concurrency.lockutils [req-1ea9cc01-b0e0-4324-8377-4b0f66218579 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] Lock "xenstore-5722ce9b-957b-4a66-bb52-a0f639736797" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:35:13.107 DEBUG oslo_concurrency.lockutils [req-1ea9cc01-b0e0-4324-8377-4b0f66218579 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] Lock "xenstore-5722ce9b-957b-4a66-bb52-a0f639736797" released by "update_hostname" :: held 0.012s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:35:13.108 DEBUG nova.virt.xenapi.vmops [req-1ea9cc01-b0e0-4324-8377-4b0f66218579 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] [instance: 
5722ce9b-957b-4a66-bb52-a0f639736797] Injecting network info to xenstore inject_network_info /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1869 2015-08-07 17:35:13.108 DEBUG oslo_concurrency.lockutils [req-1ea9cc01-b0e0-4324-8377-4b0f66218579 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] Lock "xenstore-5722ce9b-957b-4a66-bb52-a0f639736797" acquired by "update_nwinfo" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:35:13.395 DEBUG oslo_concurrency.lockutils [req-1ea9cc01-b0e0-4324-8377-4b0f66218579 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] Lock "xenstore-5722ce9b-957b-4a66-bb52-a0f639736797" released by "update_nwinfo" :: held 0.287s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:35:13.396 DEBUG nova.virt.xenapi.vmops [req-1ea9cc01-b0e0-4324-8377-4b0f66218579 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] [instance: 5722ce9b-957b-4a66-bb52-a0f639736797] Updating progress to 60 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:35:14.073 DEBUG nova.virt.xenapi.vmops [req-1ea9cc01-b0e0-4324-8377-4b0f66218579 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] [instance: 5722ce9b-957b-4a66-bb52-a0f639736797] Creating vifs _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1892 2015-08-07 17:35:14.081 DEBUG nova.virt.xenapi.vmops [req-1ea9cc01-b0e0-4324-8377-4b0f66218579 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] [instance: 5722ce9b-957b-4a66-bb52-a0f639736797] Creating VIF for network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1902 2015-08-07 17:35:14.090 DEBUG nova.virt.xenapi.vmops [req-1ea9cc01-b0e0-4324-8377-4b0f66218579 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] [instance: 5722ce9b-957b-4a66-bb52-a0f639736797] Created VIF OpaqueRef:396c088e-7a8f-a914-91d7-f837e3f13c41, network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1906 2015-08-07 17:35:14.091 DEBUG nova.virt.xenapi.vmops [req-1ea9cc01-b0e0-4324-8377-4b0f66218579 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] [instance: 5722ce9b-957b-4a66-bb52-a0f639736797] Updating progress to 70 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:35:14.382 DEBUG nova.virt.xenapi.vmops [req-1ea9cc01-b0e0-4324-8377-4b0f66218579 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] [instance: 5722ce9b-957b-4a66-bb52-a0f639736797] Starting instance _start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:327 2015-08-07 17:35:15.399 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.51 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:35:17.794 DEBUG oslo_concurrency.processutils [req-d6d25f44-3bdb-4506-a738-1edec5cd00d8 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmpXqCWTZ/configdrive of=/dev/xvdd oflag=direct,sync" returned: 0 in 16.259s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 
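The req-1ea9cc01 and req-d6d25f44 requests above both follow the same config-drive pattern: genisoimage builds an ISO9660 volume labelled config-2 from a temporary staging directory, the resulting image is copied onto the block device that the config-drive VBD was plugged as (xvdc, xvdd, ...) through nova-rootwrap dd, and a final sync flushes it before the VBD is unplugged. The sketch below is only an illustration of that command sequence using oslo_concurrency.processutils; the function name, paths, device argument and root_helper string are assumptions for the example, not Nova's own helper code.

    # Illustrative sketch (not Nova source): the genisoimage/dd/sync sequence
    # visible in the log, expressed with oslo_concurrency.processutils.
    from oslo_concurrency import processutils

    ROOT_HELPER = 'sudo nova-rootwrap /etc/nova/rootwrap.conf'  # assumed helper string

    def write_config_drive(staging_dir, iso_path, device):
        # 1. Build the config-drive ISO (volume label "config-2") from the
        #    staging directory holding the metadata files.
        processutils.execute('genisoimage', '-o', iso_path, '-ldots',
                             '-allow-lowercase', '-allow-multidot', '-l',
                             '-publisher', 'OpenStack Nova 12.0.0', '-quiet',
                             '-J', '-r', '-V', 'config-2', staging_dir)
        # 2. Copy the ISO onto the device the VBD was plugged as (e.g. /dev/xvdd),
        #    bypassing the page cache so the data goes straight to the disk.
        processutils.execute('dd', 'if=%s' % iso_path, 'of=%s' % device,
                             'oflag=direct,sync',
                             run_as_root=True, root_helper=ROOT_HELPER)
        # 3. Flush any remaining buffers before the VBD is unplugged.
        processutils.execute('sync', run_as_root=True, root_helper=ROOT_HELPER)

In the entries so far the dd step alone takes roughly 10 to 16 seconds for a 64 MB (67108864-byte) config-drive VDI, so it is a noticeable share of each instance's spawn time in this run.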
2015-08-07 17:35:17.797 DEBUG oslo_concurrency.processutils [req-d6d25f44-3bdb-4506-a738-1edec5cd00d8 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:35:18.801 DEBUG oslo_concurrency.processutils [req-d6d25f44-3bdb-4506-a738-1edec5cd00d8 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf sync" returned: 0 in 1.004s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:35:18.805 DEBUG nova.virt.xenapi.vm_utils [req-d6d25f44-3bdb-4506-a738-1edec5cd00d8 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Destroying VBD for VDI OpaqueRef:29ce240e-5beb-b8c8-b738-06f6eced9a7b ... vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2207 2015-08-07 17:35:18.806 DEBUG oslo_concurrency.lockutils [req-d6d25f44-3bdb-4506-a738-1edec5cd00d8 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_unplug" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:35:21.537 DEBUG oslo_concurrency.processutils [req-926956e4-7ad1-4cc9-b98f-62ac7c3fe0fd tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmp16fbc4/configdrive of=/dev/xvde oflag=direct,sync" returned: 0 in 17.089s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:35:21.539 DEBUG oslo_concurrency.processutils [req-926956e4-7ad1-4cc9-b98f-62ac7c3fe0fd tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:35:22.043 DEBUG oslo_concurrency.processutils [req-926956e4-7ad1-4cc9-b98f-62ac7c3fe0fd tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf sync" returned: 0 in 0.504s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:35:22.048 DEBUG nova.virt.xenapi.vm_utils [req-926956e4-7ad1-4cc9-b98f-62ac7c3fe0fd tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Destroying VBD for VDI OpaqueRef:11ccad4d-9644-cd8d-613f-5c2b5abfe22e ... 
vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2207 2015-08-07 17:35:23.413 DEBUG oslo_concurrency.processutils [req-d6d25f44-3bdb-4506-a738-1edec5cd00d8 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmp4ev_mR/configdrive of=/dev/xvdf oflag=direct,sync" returned: 0 in 12.430s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:35:23.422 DEBUG oslo_concurrency.processutils [req-d6d25f44-3bdb-4506-a738-1edec5cd00d8 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:35:23.949 DEBUG oslo_concurrency.processutils [req-d6d25f44-3bdb-4506-a738-1edec5cd00d8 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf sync" returned: 0 in 0.527s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:35:23.954 DEBUG nova.virt.xenapi.vm_utils [req-d6d25f44-3bdb-4506-a738-1edec5cd00d8 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Destroying VBD for VDI OpaqueRef:0a182d62-8bdc-d1ae-cfdc-0a88e0efb528 ... vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2207 2015-08-07 17:35:24.554 DEBUG oslo_concurrency.lockutils [req-d6d25f44-3bdb-4506-a738-1edec5cd00d8 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_unplug" :: held 5.748s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:35:24.557 DEBUG oslo_concurrency.lockutils [req-926956e4-7ad1-4cc9-b98f-62ac7c3fe0fd tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_unplug" :: waited 2.507s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:35:24.572 DEBUG nova.virt.xenapi.vm_utils [req-d6d25f44-3bdb-4506-a738-1edec5cd00d8 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Destroying VBD for VDI OpaqueRef:29ce240e-5beb-b8c8-b738-06f6eced9a7b done. vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2215 2015-08-07 17:35:24.573 DEBUG nova.virt.xenapi.vm_utils [req-d6d25f44-3bdb-4506-a738-1edec5cd00d8 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Creating disk-type VBD for VM OpaqueRef:3ea0bf04-0860-3c75-160e-ef37bbd15bb7, VDI OpaqueRef:29ce240e-5beb-b8c8-b738-06f6eced9a7b ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:35:24.586 DEBUG nova.virt.xenapi.vm_utils [req-d6d25f44-3bdb-4506-a738-1edec5cd00d8 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Created VBD OpaqueRef:a877e329-6f4d-dae2-352a-b415c4ce5107 for VM OpaqueRef:3ea0bf04-0860-3c75-160e-ef37bbd15bb7, VDI OpaqueRef:29ce240e-5beb-b8c8-b738-06f6eced9a7b. 
create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:35:24.588 DEBUG nova.objects.instance [req-d6d25f44-3bdb-4506-a738-1edec5cd00d8 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Lazy-loading `pci_devices' on Instance uuid e9775bc6-34f1-465c-9ea5-54d4b3d5a076 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:35:24.736 DEBUG nova.virt.xenapi.vmops [req-d6d25f44-3bdb-4506-a738-1edec5cd00d8 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] [instance: e9775bc6-34f1-465c-9ea5-54d4b3d5a076] Updating progress to 50 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:35:24.980 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.93 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:35:25.010 DEBUG oslo_concurrency.lockutils [req-d6d25f44-3bdb-4506-a738-1edec5cd00d8 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Lock "xenstore-e9775bc6-34f1-465c-9ea5-54d4b3d5a076" acquired by "store_meta" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:35:25.012 DEBUG oslo_concurrency.lockutils [req-d6d25f44-3bdb-4506-a738-1edec5cd00d8 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Lock "xenstore-e9775bc6-34f1-465c-9ea5-54d4b3d5a076" released by "store_meta" :: held 0.002s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:35:25.013 DEBUG oslo_concurrency.lockutils [req-d6d25f44-3bdb-4506-a738-1edec5cd00d8 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Lock "xenstore-e9775bc6-34f1-465c-9ea5-54d4b3d5a076" acquired by "store_auto_disk_config" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:35:25.024 DEBUG oslo_concurrency.lockutils [req-d6d25f44-3bdb-4506-a738-1edec5cd00d8 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Lock "xenstore-e9775bc6-34f1-465c-9ea5-54d4b3d5a076" released by "store_auto_disk_config" :: held 0.012s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:35:25.025 DEBUG nova.virt.xenapi.vmops [req-d6d25f44-3bdb-4506-a738-1edec5cd00d8 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] [instance: e9775bc6-34f1-465c-9ea5-54d4b3d5a076] Injecting hostname (tempest-multiple-create-test-1860282004-1) into xenstore _inject_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1940 2015-08-07 17:35:25.026 DEBUG oslo_concurrency.lockutils [req-d6d25f44-3bdb-4506-a738-1edec5cd00d8 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Lock "xenstore-e9775bc6-34f1-465c-9ea5-54d4b3d5a076" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:35:25.043 DEBUG oslo_concurrency.lockutils [req-d6d25f44-3bdb-4506-a738-1edec5cd00d8 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Lock "xenstore-e9775bc6-34f1-465c-9ea5-54d4b3d5a076" released by "update_hostname" :: held 0.017s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:35:25.044 DEBUG nova.virt.xenapi.vmops 
[req-d6d25f44-3bdb-4506-a738-1edec5cd00d8 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] [instance: e9775bc6-34f1-465c-9ea5-54d4b3d5a076] Injecting network info to xenstore inject_network_info /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1869 2015-08-07 17:35:25.044 DEBUG oslo_concurrency.lockutils [req-d6d25f44-3bdb-4506-a738-1edec5cd00d8 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Lock "xenstore-e9775bc6-34f1-465c-9ea5-54d4b3d5a076" acquired by "update_nwinfo" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:35:25.390 DEBUG oslo_concurrency.lockutils [req-d6d25f44-3bdb-4506-a738-1edec5cd00d8 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Lock "xenstore-e9775bc6-34f1-465c-9ea5-54d4b3d5a076" released by "update_nwinfo" :: held 0.346s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:35:25.391 DEBUG nova.virt.xenapi.vmops [req-d6d25f44-3bdb-4506-a738-1edec5cd00d8 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] [instance: e9775bc6-34f1-465c-9ea5-54d4b3d5a076] Updating progress to 60 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:35:25.633 DEBUG nova.virt.xenapi.vmops [req-d6d25f44-3bdb-4506-a738-1edec5cd00d8 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] [instance: e9775bc6-34f1-465c-9ea5-54d4b3d5a076] Creating vifs _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1892 2015-08-07 17:35:25.648 DEBUG nova.virt.xenapi.vmops [req-d6d25f44-3bdb-4506-a738-1edec5cd00d8 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] [instance: e9775bc6-34f1-465c-9ea5-54d4b3d5a076] Creating VIF for network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1902 2015-08-07 17:35:25.656 DEBUG nova.virt.xenapi.vmops [req-d6d25f44-3bdb-4506-a738-1edec5cd00d8 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] [instance: e9775bc6-34f1-465c-9ea5-54d4b3d5a076] Created VIF OpaqueRef:da4b47fa-7515-d57a-17b1-f6a1f75a6f6b, network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1906 2015-08-07 17:35:25.657 DEBUG nova.virt.xenapi.vmops [req-d6d25f44-3bdb-4506-a738-1edec5cd00d8 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] [instance: e9775bc6-34f1-465c-9ea5-54d4b3d5a076] Updating progress to 70 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:35:25.914 DEBUG oslo_concurrency.lockutils [req-926956e4-7ad1-4cc9-b98f-62ac7c3fe0fd tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_unplug" :: held 1.358s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:35:25.916 DEBUG oslo_concurrency.lockutils [req-d6d25f44-3bdb-4506-a738-1edec5cd00d8 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_unplug" :: waited 1.960s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:35:25.921 DEBUG nova.virt.xenapi.vmops 
[req-d6d25f44-3bdb-4506-a738-1edec5cd00d8 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] [instance: e9775bc6-34f1-465c-9ea5-54d4b3d5a076] Starting instance _start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:327 2015-08-07 17:35:25.932 DEBUG nova.virt.xenapi.vm_utils [req-926956e4-7ad1-4cc9-b98f-62ac7c3fe0fd tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Destroying VBD for VDI OpaqueRef:11ccad4d-9644-cd8d-613f-5c2b5abfe22e done. vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2215 2015-08-07 17:35:25.932 DEBUG nova.virt.xenapi.vm_utils [req-926956e4-7ad1-4cc9-b98f-62ac7c3fe0fd tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Creating disk-type VBD for VM OpaqueRef:ae48cb5b-edd0-e9cd-5f0c-229f29d2e43d, VDI OpaqueRef:11ccad4d-9644-cd8d-613f-5c2b5abfe22e ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:35:25.967 DEBUG nova.virt.xenapi.vm_utils [req-926956e4-7ad1-4cc9-b98f-62ac7c3fe0fd tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Created VBD OpaqueRef:0382fdd0-71b3-bf89-3abe-789f0f028f41 for VM OpaqueRef:ae48cb5b-edd0-e9cd-5f0c-229f29d2e43d, VDI OpaqueRef:11ccad4d-9644-cd8d-613f-5c2b5abfe22e. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:35:25.968 DEBUG nova.objects.instance [req-926956e4-7ad1-4cc9-b98f-62ac7c3fe0fd tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lazy-loading `pci_devices' on Instance uuid ca1ed49f-ef0a-4a98-b04f-6c956be89835 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:35:26.086 DEBUG nova.virt.xenapi.vmops [req-926956e4-7ad1-4cc9-b98f-62ac7c3fe0fd tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: ca1ed49f-ef0a-4a98-b04f-6c956be89835] Updating progress to 50 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:35:26.423 DEBUG oslo_concurrency.lockutils [req-926956e4-7ad1-4cc9-b98f-62ac7c3fe0fd tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "xenstore-ca1ed49f-ef0a-4a98-b04f-6c956be89835" acquired by "store_meta" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:35:26.423 DEBUG oslo_concurrency.lockutils [req-926956e4-7ad1-4cc9-b98f-62ac7c3fe0fd tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "xenstore-ca1ed49f-ef0a-4a98-b04f-6c956be89835" released by "store_meta" :: held 0.001s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:35:26.424 DEBUG oslo_concurrency.lockutils [req-926956e4-7ad1-4cc9-b98f-62ac7c3fe0fd tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "xenstore-ca1ed49f-ef0a-4a98-b04f-6c956be89835" acquired by "store_auto_disk_config" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:35:26.458 DEBUG oslo_concurrency.lockutils [req-926956e4-7ad1-4cc9-b98f-62ac7c3fe0fd tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "xenstore-ca1ed49f-ef0a-4a98-b04f-6c956be89835" released by "store_auto_disk_config" :: held 0.034s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:35:26.458 DEBUG nova.virt.xenapi.vmops [req-926956e4-7ad1-4cc9-b98f-62ac7c3fe0fd 
tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: ca1ed49f-ef0a-4a98-b04f-6c956be89835] Injecting hostname (tempest.common.compute-instance-121753128) into xenstore _inject_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1940 2015-08-07 17:35:26.459 DEBUG oslo_concurrency.lockutils [req-926956e4-7ad1-4cc9-b98f-62ac7c3fe0fd tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "xenstore-ca1ed49f-ef0a-4a98-b04f-6c956be89835" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:35:26.493 DEBUG oslo_concurrency.lockutils [req-926956e4-7ad1-4cc9-b98f-62ac7c3fe0fd tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "xenstore-ca1ed49f-ef0a-4a98-b04f-6c956be89835" released by "update_hostname" :: held 0.034s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:35:26.494 DEBUG nova.virt.xenapi.vmops [req-926956e4-7ad1-4cc9-b98f-62ac7c3fe0fd tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: ca1ed49f-ef0a-4a98-b04f-6c956be89835] Injecting network info to xenstore inject_network_info /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1869 2015-08-07 17:35:26.495 DEBUG oslo_concurrency.lockutils [req-926956e4-7ad1-4cc9-b98f-62ac7c3fe0fd tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "xenstore-ca1ed49f-ef0a-4a98-b04f-6c956be89835" acquired by "update_nwinfo" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:35:26.523 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_bandwidth_usage run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:35:26.524 INFO nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Updating bandwidth usage cache 2015-08-07 17:35:26.950 DEBUG oslo_concurrency.lockutils [req-926956e4-7ad1-4cc9-b98f-62ac7c3fe0fd tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "xenstore-ca1ed49f-ef0a-4a98-b04f-6c956be89835" released by "update_nwinfo" :: held 0.455s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:35:26.951 DEBUG nova.virt.xenapi.vmops [req-926956e4-7ad1-4cc9-b98f-62ac7c3fe0fd tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: ca1ed49f-ef0a-4a98-b04f-6c956be89835] Updating progress to 60 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918
2015-08-07 17:35:27.170 ERROR oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Error during ComputeManager._poll_bandwidth_usage
2015-08-07 17:35:27.170 13318 ERROR oslo_service.periodic_task Traceback (most recent call last):
2015-08-07 17:35:27.170 13318 ERROR oslo_service.periodic_task File "/usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py", line 218, in run_periodic_tasks
2015-08-07 17:35:27.170 13318 ERROR oslo_service.periodic_task task(self, context)
2015-08-07 17:35:27.170 13318 ERROR oslo_service.periodic_task File "/opt/stack/new/nova/nova/compute/manager.py", line 5680, in _poll_bandwidth_usage
2015-08-07 17:35:27.170 13318 ERROR oslo_service.periodic_task update_cells=update_cells)
2015-08-07 17:35:27.170 13318 ERROR oslo_service.periodic_task File "/opt/stack/new/nova/nova/objects/base.py", line 493, in wrapper
2015-08-07 17:35:27.170 13318 ERROR oslo_service.periodic_task return fn(obj, *args, **kwargs)
2015-08-07 17:35:27.170 13318 ERROR oslo_service.periodic_task File "/usr/local/lib/python2.7/dist-packages/oslo_versionedobjects/base.py", line 195, in wrapper
2015-08-07 17:35:27.170 13318 ERROR oslo_service.periodic_task ctxt, self, fn.__name__, args, kwargs)
2015-08-07 17:35:27.170 13318 ERROR oslo_service.periodic_task File "/opt/stack/new/nova/nova/conductor/rpcapi.py", line 248, in object_action
2015-08-07 17:35:27.170 13318 ERROR oslo_service.periodic_task objmethod=objmethod, args=args, kwargs=kwargs)
2015-08-07 17:35:27.170 13318 ERROR oslo_service.periodic_task File "/usr/local/lib/python2.7/dist-packages/oslo_messaging/rpc/client.py", line 158, in call
2015-08-07 17:35:27.170 13318 ERROR oslo_service.periodic_task retry=self.retry)
2015-08-07 17:35:27.170 13318 ERROR oslo_service.periodic_task File "/usr/local/lib/python2.7/dist-packages/oslo_messaging/transport.py", line 90, in _send
2015-08-07 17:35:27.170 13318 ERROR oslo_service.periodic_task timeout=timeout, retry=retry)
2015-08-07 17:35:27.170 13318 ERROR oslo_service.periodic_task File "/usr/local/lib/python2.7/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 431, in send
2015-08-07 17:35:27.170 13318 ERROR oslo_service.periodic_task retry=retry)
2015-08-07 17:35:27.170 13318 ERROR oslo_service.periodic_task File "/usr/local/lib/python2.7/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 422, in _send
2015-08-07 17:35:27.170 13318 ERROR oslo_service.periodic_task raise result
2015-08-07 17:35:27.170 13318 ERROR oslo_service.periodic_task TypeError: 'NoneType' object has no attribute '__getitem__'
2015-08-07 17:35:27.170 13318 ERROR oslo_service.periodic_task Traceback (most recent call last):
2015-08-07 17:35:27.170 13318 ERROR oslo_service.periodic_task
2015-08-07 17:35:27.170 13318 ERROR oslo_service.periodic_task File "/opt/stack/new/nova/nova/conductor/manager.py", line 442, in _object_dispatch
2015-08-07 17:35:27.170 13318 ERROR oslo_service.periodic_task return getattr(target, method)(*args, **kwargs)
2015-08-07 17:35:27.170 13318 ERROR oslo_service.periodic_task
2015-08-07 17:35:27.170 13318 ERROR oslo_service.periodic_task File "/opt/stack/new/nova/nova/objects/base.py", line 493, in wrapper
2015-08-07 17:35:27.170 13318 ERROR oslo_service.periodic_task return fn(obj, *args, **kwargs)
2015-08-07 17:35:27.170 13318 ERROR oslo_service.periodic_task
2015-08-07 17:35:27.170 13318 ERROR oslo_service.periodic_task File "/usr/local/lib/python2.7/dist-packages/oslo_versionedobjects/base.py", line 211, in wrapper
2015-08-07 17:35:27.170 13318 ERROR oslo_service.periodic_task return fn(self, *args, **kwargs)
2015-08-07 17:35:27.170 13318 ERROR oslo_service.periodic_task
2015-08-07 17:35:27.170 13318 ERROR oslo_service.periodic_task File "/opt/stack/new/nova/nova/objects/bandwidth_usage.py", line 69, in create
2015-08-07 17:35:27.170 13318 ERROR oslo_service.periodic_task self._from_db_object(self._context, self, db_bw_usage)
2015-08-07 17:35:27.170 13318 ERROR oslo_service.periodic_task
2015-08-07 17:35:27.170 13318 ERROR oslo_service.periodic_task File "/opt/stack/new/nova/nova/objects/bandwidth_usage.py", line 42, in _from_db_object
2015-08-07 17:35:27.170 13318 ERROR oslo_service.periodic_task bw_usage[field] = db_bw_usage['uuid']
2015-08-07 17:35:27.170 13318 ERROR oslo_service.periodic_task
2015-08-07 17:35:27.170 13318 ERROR oslo_service.periodic_task TypeError: 'NoneType' object has no attribute '__getitem__'
2015-08-07 17:35:27.170 13318 ERROR oslo_service.periodic_task
2015-08-07 17:35:27.170 13318 ERROR oslo_service.periodic_task
2015-08-07 17:35:27.172 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 8.35 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:35:27.247 DEBUG nova.virt.xenapi.vmops [req-926956e4-7ad1-4cc9-b98f-62ac7c3fe0fd tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: ca1ed49f-ef0a-4a98-b04f-6c956be89835] Creating vifs _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1892 2015-08-07 17:35:27.255 DEBUG nova.virt.xenapi.vmops [req-926956e4-7ad1-4cc9-b98f-62ac7c3fe0fd tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: ca1ed49f-ef0a-4a98-b04f-6c956be89835] Creating VIF for network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1902 2015-08-07 17:35:27.265 DEBUG nova.virt.xenapi.vmops [req-926956e4-7ad1-4cc9-b98f-62ac7c3fe0fd tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: ca1ed49f-ef0a-4a98-b04f-6c956be89835] Created VIF OpaqueRef:1bc65653-fcdb-ce0b-9ca0-518a5365a745, network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1906 2015-08-07 17:35:27.266 DEBUG nova.virt.xenapi.vmops [req-926956e4-7ad1-4cc9-b98f-62ac7c3fe0fd tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: ca1ed49f-ef0a-4a98-b04f-6c956be89835] Updating progress to 70 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:35:27.514 DEBUG nova.virt.xenapi.vmops [req-926956e4-7ad1-4cc9-b98f-62ac7c3fe0fd tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: ca1ed49f-ef0a-4a98-b04f-6c956be89835] Starting instance _start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:327 2015-08-07 17:35:27.664 DEBUG oslo_concurrency.lockutils [req-d6d25f44-3bdb-4506-a738-1edec5cd00d8 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_unplug" :: held 1.748s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:35:27.687 DEBUG nova.virt.xenapi.vm_utils [req-d6d25f44-3bdb-4506-a738-1edec5cd00d8 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Destroying VBD for VDI OpaqueRef:0a182d62-8bdc-d1ae-cfdc-0a88e0efb528 done. vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2215 2015-08-07 17:35:27.688 DEBUG nova.virt.xenapi.vm_utils [req-d6d25f44-3bdb-4506-a738-1edec5cd00d8 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Creating disk-type VBD for VM OpaqueRef:96ba22ab-9534-884f-7f1b-4485754698fa, VDI OpaqueRef:0a182d62-8bdc-d1ae-cfdc-0a88e0efb528 ...
create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:35:27.712 DEBUG nova.virt.xenapi.vm_utils [req-d6d25f44-3bdb-4506-a738-1edec5cd00d8 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Created VBD OpaqueRef:0871b153-3516-2e37-9dea-e6d1100b8285 for VM OpaqueRef:96ba22ab-9534-884f-7f1b-4485754698fa, VDI OpaqueRef:0a182d62-8bdc-d1ae-cfdc-0a88e0efb528. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:35:27.713 DEBUG nova.objects.instance [req-d6d25f44-3bdb-4506-a738-1edec5cd00d8 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Lazy-loading `pci_devices' on Instance uuid 962e3fc3-68c5-4019-beb9-2b9f939eb511 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:35:27.915 DEBUG nova.virt.xenapi.vmops [req-d6d25f44-3bdb-4506-a738-1edec5cd00d8 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] [instance: 962e3fc3-68c5-4019-beb9-2b9f939eb511] Updating progress to 50 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:35:28.403 DEBUG oslo_concurrency.lockutils [req-d6d25f44-3bdb-4506-a738-1edec5cd00d8 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Lock "xenstore-962e3fc3-68c5-4019-beb9-2b9f939eb511" acquired by "store_meta" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:35:28.404 DEBUG oslo_concurrency.lockutils [req-d6d25f44-3bdb-4506-a738-1edec5cd00d8 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Lock "xenstore-962e3fc3-68c5-4019-beb9-2b9f939eb511" released by "store_meta" :: held 0.001s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:35:28.405 DEBUG oslo_concurrency.lockutils [req-d6d25f44-3bdb-4506-a738-1edec5cd00d8 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Lock "xenstore-962e3fc3-68c5-4019-beb9-2b9f939eb511" acquired by "store_auto_disk_config" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:35:28.416 DEBUG oslo_concurrency.lockutils [req-d6d25f44-3bdb-4506-a738-1edec5cd00d8 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Lock "xenstore-962e3fc3-68c5-4019-beb9-2b9f939eb511" released by "store_auto_disk_config" :: held 0.011s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:35:28.416 DEBUG nova.virt.xenapi.vmops [req-d6d25f44-3bdb-4506-a738-1edec5cd00d8 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] [instance: 962e3fc3-68c5-4019-beb9-2b9f939eb511] Injecting hostname (tempest-multiple-create-test-1860282004-2) into xenstore _inject_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1940 2015-08-07 17:35:28.417 DEBUG oslo_concurrency.lockutils [req-d6d25f44-3bdb-4506-a738-1edec5cd00d8 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Lock "xenstore-962e3fc3-68c5-4019-beb9-2b9f939eb511" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:35:28.432 DEBUG oslo_concurrency.lockutils [req-d6d25f44-3bdb-4506-a738-1edec5cd00d8 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Lock "xenstore-962e3fc3-68c5-4019-beb9-2b9f939eb511" 
released by "update_hostname" :: held 0.015s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:35:28.433 DEBUG nova.virt.xenapi.vmops [req-d6d25f44-3bdb-4506-a738-1edec5cd00d8 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] [instance: 962e3fc3-68c5-4019-beb9-2b9f939eb511] Injecting network info to xenstore inject_network_info /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1869 2015-08-07 17:35:28.433 DEBUG oslo_concurrency.lockutils [req-d6d25f44-3bdb-4506-a738-1edec5cd00d8 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Lock "xenstore-962e3fc3-68c5-4019-beb9-2b9f939eb511" acquired by "update_nwinfo" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:35:28.722 DEBUG oslo_concurrency.lockutils [req-d6d25f44-3bdb-4506-a738-1edec5cd00d8 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Lock "xenstore-962e3fc3-68c5-4019-beb9-2b9f939eb511" released by "update_nwinfo" :: held 0.289s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:35:28.723 DEBUG nova.virt.xenapi.vmops [req-d6d25f44-3bdb-4506-a738-1edec5cd00d8 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] [instance: 962e3fc3-68c5-4019-beb9-2b9f939eb511] Updating progress to 60 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:35:29.007 DEBUG nova.virt.xenapi.vmops [req-d6d25f44-3bdb-4506-a738-1edec5cd00d8 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] [instance: 962e3fc3-68c5-4019-beb9-2b9f939eb511] Creating vifs _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1892 2015-08-07 17:35:29.023 DEBUG nova.virt.xenapi.vmops [req-d6d25f44-3bdb-4506-a738-1edec5cd00d8 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] [instance: 962e3fc3-68c5-4019-beb9-2b9f939eb511] Creating VIF for network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1902 2015-08-07 17:35:29.032 DEBUG nova.virt.xenapi.vmops [req-d6d25f44-3bdb-4506-a738-1edec5cd00d8 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] [instance: 962e3fc3-68c5-4019-beb9-2b9f939eb511] Created VIF OpaqueRef:9cef43ce-6d6c-a458-f280-dbb48cd84fad, network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1906 2015-08-07 17:35:29.034 DEBUG nova.virt.xenapi.vmops [req-d6d25f44-3bdb-4506-a738-1edec5cd00d8 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] [instance: 962e3fc3-68c5-4019-beb9-2b9f939eb511] Updating progress to 70 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:35:29.286 DEBUG nova.virt.xenapi.vmops [req-d6d25f44-3bdb-4506-a738-1edec5cd00d8 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] [instance: 962e3fc3-68c5-4019-beb9-2b9f939eb511] Starting instance _start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:327 2015-08-07 17:35:32.678 DEBUG nova.virt.xenapi.vmops [req-1ea9cc01-b0e0-4324-8377-4b0f66218579 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] [instance: 5722ce9b-957b-4a66-bb52-a0f639736797] Waiting for instance state to become running _wait_for_instance_to_start 
/opt/stack/new/nova/nova/virt/xenapi/vmops.py:777 2015-08-07 17:35:32.771 DEBUG nova.virt.xenapi.vmops [req-1ea9cc01-b0e0-4324-8377-4b0f66218579 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] [instance: 5722ce9b-957b-4a66-bb52-a0f639736797] Updating progress to 80 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:35:33.179 DEBUG nova.virt.xenapi.vmops [req-1ea9cc01-b0e0-4324-8377-4b0f66218579 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] [instance: 5722ce9b-957b-4a66-bb52-a0f639736797] Skip agent setup, not enabled. _configure_new_instance_with_agent /opt/stack/new/nova/nova/virt/xenapi/vmops.py:788 2015-08-07 17:35:33.180 DEBUG nova.virt.xenapi.vmops [req-1ea9cc01-b0e0-4324-8377-4b0f66218579 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] [instance: 5722ce9b-957b-4a66-bb52-a0f639736797] Removing hostname from xenstore _remove_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1949 2015-08-07 17:35:33.180 DEBUG oslo_concurrency.lockutils [req-1ea9cc01-b0e0-4324-8377-4b0f66218579 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] Lock "xenstore-5722ce9b-957b-4a66-bb52-a0f639736797" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:35:33.186 DEBUG oslo_concurrency.lockutils [req-1ea9cc01-b0e0-4324-8377-4b0f66218579 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] Lock "xenstore-5722ce9b-957b-4a66-bb52-a0f639736797" released by "update_hostname" :: held 0.006s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:35:33.187 DEBUG nova.virt.xenapi.vmops [req-1ea9cc01-b0e0-4324-8377-4b0f66218579 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] [instance: 5722ce9b-957b-4a66-bb52-a0f639736797] Updating progress to 90 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:35:33.527 DEBUG nova.virt.xenapi.vmops [req-1ea9cc01-b0e0-4324-8377-4b0f66218579 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] [instance: 5722ce9b-957b-4a66-bb52-a0f639736797] Updating progress to 100 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:35:34.009 DEBUG nova.compute.manager [req-1ea9cc01-b0e0-4324-8377-4b0f66218579 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] [instance: 5722ce9b-957b-4a66-bb52-a0f639736797] Checking state _get_power_state /opt/stack/new/nova/nova/compute/manager.py:1296 2015-08-07 17:35:34.444 DEBUG oslo_concurrency.lockutils [req-1ea9cc01-b0e0-4324-8377-4b0f66218579 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] Lock "5722ce9b-957b-4a66-bb52-a0f639736797" released by "_locked_do_build_and_run_instance" :: held 52.880s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:35:35.011 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.90 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:35:35.520 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks 
/usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:35:35.521 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:35:35.521 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Starting heal instance info cache _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5315 2015-08-07 17:35:35.522 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Rebuilding the list of instances to heal _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5319 2015-08-07 17:35:35.609 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: ca1ed49f-ef0a-4a98-b04f-6c956be89835] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5329 2015-08-07 17:35:35.610 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: e9775bc6-34f1-465c-9ea5-54d4b3d5a076] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5329 2015-08-07 17:35:35.611 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: 962e3fc3-68c5-4019-beb9-2b9f939eb511] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5329 2015-08-07 17:35:35.612 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Acquired semaphore "refresh_cache-f5ce5302-4adb-4ac6-be2f-b4371f1d3b62" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:197 2015-08-07 17:35:35.613 DEBUG nova.objects.instance [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lazy-loading `flavor' on Instance uuid f5ce5302-4adb-4ac6-be2f-b4371f1d3b62 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:35:35.954 DEBUG nova.network.base_api [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: f5ce5302-4adb-4ac6-be2f-b4371f1d3b62] Updating instance_info_cache with network_info: [VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.5'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:6a:1d:d3', 'active': False, 'type': u'bridge', 'id': u'7a7699f1-6340-4bf7-8a9a-8756937921d4', 'qbg_params': None})] update_instance_cache_with_nw_info 
/opt/stack/new/nova/nova/network/base_api.py:43 2015-08-07 17:35:36.013 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Releasing semaphore "refresh_cache-f5ce5302-4adb-4ac6-be2f-b4371f1d3b62" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:210 2015-08-07 17:35:36.014 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: f5ce5302-4adb-4ac6-be2f-b4371f1d3b62] Updated the network info_cache for instance _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5376 2015-08-07 17:35:36.014 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 0.50 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:35:36.512 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:35:36.606 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 0.01 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:35:36.614 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:35:36.615 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 0.90 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:35:37.512 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:35:37.513 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 3.01 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:35:37.766 DEBUG oslo_concurrency.lockutils [req-d019d9ea-acd9-4753-8cda-9de0536965d0 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] Lock "c864db01-8fad-4d62-9c8d-652d30dbaf6e" acquired by "_locked_do_build_and_run_instance" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:35:37.875 INFO nova.compute.manager [req-d019d9ea-acd9-4753-8cda-9de0536965d0 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] [instance: c864db01-8fad-4d62-9c8d-652d30dbaf6e] Starting instance... 
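(Note on the lock lines above: the repeated `Lock "xenstore-<uuid>" acquired by "store_meta" :: waited ...` / `released ... :: held ...` pairs are emitted by oslo.concurrency's lock wrapper, which serializes the xenstore writes for one instance. As a rough illustration only -- this is not Nova's source, and the lock name is simply copied from the log -- a function guarded by such a named lock looks like this:)

    # Illustrative sketch, not Nova code: oslo.concurrency's synchronized
    # decorator serializes callers on a named semaphore and logs the
    # "acquired by ... :: waited" / "released by ... :: held" pairs above.
    from oslo_concurrency import lockutils

    @lockutils.synchronized('xenstore-962e3fc3-68c5-4019-beb9-2b9f939eb511')
    def store_meta():
        # Runs only while the named lock is held; concurrent writers to the
        # same instance's xenstore block here until the lock is released.
        pass

    store_meta()
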
2015-08-07 17:35:38.118 DEBUG oslo_concurrency.lockutils [req-d019d9ea-acd9-4753-8cda-9de0536965d0 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] Lock "compute_resources" acquired by "instance_claim" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:35:38.119 DEBUG nova.compute.resource_tracker [req-d019d9ea-acd9-4753-8cda-9de0536965d0 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] Memory overhead for 128 MB instance; 6 MB instance_claim /opt/stack/new/nova/nova/compute/resource_tracker.py:126 2015-08-07 17:35:38.128 INFO nova.compute.claims [req-d019d9ea-acd9-4753-8cda-9de0536965d0 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] [instance: c864db01-8fad-4d62-9c8d-652d30dbaf6e] Attempting claim: memory 134 MB, disk 0 GB 2015-08-07 17:35:38.129 INFO nova.compute.claims [req-d019d9ea-acd9-4753-8cda-9de0536965d0 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] [instance: c864db01-8fad-4d62-9c8d-652d30dbaf6e] Total memory: 8187 MB, used: 857.00 MB 2015-08-07 17:35:38.130 INFO nova.compute.claims [req-d019d9ea-acd9-4753-8cda-9de0536965d0 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] [instance: c864db01-8fad-4d62-9c8d-652d30dbaf6e] memory limit: 12280.50 MB, free: 11423.50 MB 2015-08-07 17:35:38.130 INFO nova.compute.claims [req-d019d9ea-acd9-4753-8cda-9de0536965d0 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] [instance: c864db01-8fad-4d62-9c8d-652d30dbaf6e] Total disk: 27 GB, used: 0.00 GB 2015-08-07 17:35:38.131 INFO nova.compute.claims [req-d019d9ea-acd9-4753-8cda-9de0536965d0 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] [instance: c864db01-8fad-4d62-9c8d-652d30dbaf6e] disk limit not specified, defaulting to unlimited 2015-08-07 17:35:38.154 DEBUG nova.compute.resources.vcpu [req-d019d9ea-acd9-4753-8cda-9de0536965d0 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] Total CPUs: 8 VCPUs, used: 5.00 VCPUs test /opt/stack/new/nova/nova/compute/resources/vcpu.py:52 2015-08-07 17:35:38.155 DEBUG nova.compute.resources.vcpu [req-d019d9ea-acd9-4753-8cda-9de0536965d0 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] CPUs limit not specified, defaulting to unlimited test /opt/stack/new/nova/nova/compute/resources/vcpu.py:56 2015-08-07 17:35:38.155 INFO nova.compute.claims [req-d019d9ea-acd9-4753-8cda-9de0536965d0 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] [instance: c864db01-8fad-4d62-9c8d-652d30dbaf6e] Claim successful 2015-08-07 17:35:38.717 DEBUG oslo_concurrency.lockutils [req-d019d9ea-acd9-4753-8cda-9de0536965d0 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] Lock "compute_resources" released by "instance_claim" :: held 0.599s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:35:39.221 DEBUG nova.virt.xenapi.vmops [req-d6d25f44-3bdb-4506-a738-1edec5cd00d8 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] [instance: e9775bc6-34f1-465c-9ea5-54d4b3d5a076] Waiting for instance state to become running _wait_for_instance_to_start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:777 2015-08-07 17:35:39.304 
DEBUG nova.virt.xenapi.vmops [req-d6d25f44-3bdb-4506-a738-1edec5cd00d8 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] [instance: e9775bc6-34f1-465c-9ea5-54d4b3d5a076] Updating progress to 80 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:35:39.339 DEBUG oslo_concurrency.lockutils [req-d019d9ea-acd9-4753-8cda-9de0536965d0 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] Lock "compute_resources" acquired by "update_usage" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:35:39.611 DEBUG oslo_concurrency.lockutils [req-d019d9ea-acd9-4753-8cda-9de0536965d0 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] Lock "compute_resources" released by "update_usage" :: held 0.272s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:35:39.612 DEBUG nova.compute.utils [req-d019d9ea-acd9-4753-8cda-9de0536965d0 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] Using /dev/xvd instead of None get_next_device_name /opt/stack/new/nova/nova/compute/utils.py:163 2015-08-07 17:35:39.616 13318 DEBUG nova.compute.manager [-] [instance: c864db01-8fad-4d62-9c8d-652d30dbaf6e] Allocating IP information in the background. _allocate_network_async /opt/stack/new/nova/nova/compute/manager.py:1513 2015-08-07 17:35:39.618 13318 DEBUG oslo_concurrency.lockutils [-] Acquired semaphore "refresh_cache-c864db01-8fad-4d62-9c8d-652d30dbaf6e" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:197 2015-08-07 17:35:40.359 DEBUG nova.virt.xenapi.vmops [req-d6d25f44-3bdb-4506-a738-1edec5cd00d8 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] [instance: e9775bc6-34f1-465c-9ea5-54d4b3d5a076] Skip agent setup, not enabled. 
_configure_new_instance_with_agent /opt/stack/new/nova/nova/virt/xenapi/vmops.py:788 2015-08-07 17:35:40.360 DEBUG nova.virt.xenapi.vmops [req-d6d25f44-3bdb-4506-a738-1edec5cd00d8 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] [instance: e9775bc6-34f1-465c-9ea5-54d4b3d5a076] Removing hostname from xenstore _remove_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1949 2015-08-07 17:35:40.360 DEBUG oslo_concurrency.lockutils [req-d6d25f44-3bdb-4506-a738-1edec5cd00d8 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Lock "xenstore-e9775bc6-34f1-465c-9ea5-54d4b3d5a076" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:35:40.364 DEBUG oslo_concurrency.lockutils [req-d6d25f44-3bdb-4506-a738-1edec5cd00d8 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Lock "xenstore-e9775bc6-34f1-465c-9ea5-54d4b3d5a076" released by "update_hostname" :: held 0.004s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:35:40.371 DEBUG nova.virt.xenapi.vmops [req-d6d25f44-3bdb-4506-a738-1edec5cd00d8 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] [instance: e9775bc6-34f1-465c-9ea5-54d4b3d5a076] Updating progress to 90 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:35:40.521 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:35:40.522 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /opt/stack/new/nova/nova/compute/manager.py:5976 2015-08-07 17:35:40.522 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 3.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:35:40.800 DEBUG nova.virt.xenapi.vmops [req-926956e4-7ad1-4cc9-b98f-62ac7c3fe0fd tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: ca1ed49f-ef0a-4a98-b04f-6c956be89835] Waiting for instance state to become running _wait_for_instance_to_start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:777 2015-08-07 17:35:40.837 DEBUG nova.virt.xenapi.vmops [req-926956e4-7ad1-4cc9-b98f-62ac7c3fe0fd tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: ca1ed49f-ef0a-4a98-b04f-6c956be89835] Updating progress to 80 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:35:40.984 DEBUG nova.virt.xenapi.vmops [req-d019d9ea-acd9-4753-8cda-9de0536965d0 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] [instance: c864db01-8fad-4d62-9c8d-652d30dbaf6e] Block device information present: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} spawn /opt/stack/new/nova/nova/virt/xenapi/vmops.py:408 2015-08-07 17:35:41.028 DEBUG nova.virt.xenapi.vm_utils [req-d019d9ea-acd9-4753-8cda-9de0536965d0 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] Detected vhd format for image a69d8d55-1745-492f-be26-cc75d64fc94d determine_disk_image_type /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1633 2015-08-07 17:35:41.028 DEBUG nova.virt.xenapi.vmops [req-d019d9ea-acd9-4753-8cda-9de0536965d0 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] [instance: c864db01-8fad-4d62-9c8d-652d30dbaf6e] Updating progress to 10 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:35:41.074 DEBUG nova.virt.xenapi.vmops [req-d6d25f44-3bdb-4506-a738-1edec5cd00d8 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] [instance: e9775bc6-34f1-465c-9ea5-54d4b3d5a076] Updating progress to 100 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:35:41.393 DEBUG nova.virt.xenapi.vmops [req-926956e4-7ad1-4cc9-b98f-62ac7c3fe0fd tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: ca1ed49f-ef0a-4a98-b04f-6c956be89835] Skip agent setup, not enabled. 
_configure_new_instance_with_agent /opt/stack/new/nova/nova/virt/xenapi/vmops.py:788 2015-08-07 17:35:41.393 DEBUG nova.virt.xenapi.vmops [req-926956e4-7ad1-4cc9-b98f-62ac7c3fe0fd tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: ca1ed49f-ef0a-4a98-b04f-6c956be89835] Removing hostname from xenstore _remove_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1949 2015-08-07 17:35:41.398 DEBUG oslo_concurrency.lockutils [req-926956e4-7ad1-4cc9-b98f-62ac7c3fe0fd tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "xenstore-ca1ed49f-ef0a-4a98-b04f-6c956be89835" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:35:41.405 DEBUG oslo_concurrency.lockutils [req-926956e4-7ad1-4cc9-b98f-62ac7c3fe0fd tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "xenstore-ca1ed49f-ef0a-4a98-b04f-6c956be89835" released by "update_hostname" :: held 0.007s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:35:41.406 DEBUG nova.virt.xenapi.vmops [req-926956e4-7ad1-4cc9-b98f-62ac7c3fe0fd tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: ca1ed49f-ef0a-4a98-b04f-6c956be89835] Updating progress to 90 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:35:41.569 DEBUG nova.virt.xenapi.vmops [req-d019d9ea-acd9-4753-8cda-9de0536965d0 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] [instance: c864db01-8fad-4d62-9c8d-652d30dbaf6e] block device info: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} _connect_cinder_volumes /opt/stack/new/nova/nova/virt/xenapi/vmops.py:376 2015-08-07 17:35:41.587 DEBUG oslo_concurrency.lockutils [req-d019d9ea-acd9-4753-8cda-9de0536965d0 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" acquired by "_create_cached_image_impl" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:35:41.794 DEBUG nova.compute.manager [req-d6d25f44-3bdb-4506-a738-1edec5cd00d8 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] [instance: e9775bc6-34f1-465c-9ea5-54d4b3d5a076] Checking state _get_power_state /opt/stack/new/nova/nova/compute/manager.py:1296 2015-08-07 17:35:42.209 DEBUG nova.virt.xenapi.vmops [req-926956e4-7ad1-4cc9-b98f-62ac7c3fe0fd tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: ca1ed49f-ef0a-4a98-b04f-6c956be89835] Updating progress to 100 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:35:43.080 DEBUG nova.compute.manager [req-926956e4-7ad1-4cc9-b98f-62ac7c3fe0fd tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: ca1ed49f-ef0a-4a98-b04f-6c956be89835] Checking state _get_power_state /opt/stack/new/nova/nova/compute/manager.py:1296 2015-08-07 17:35:43.520 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:35:43.521 DEBUG oslo_service.periodic_task 
[req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:35:43.521 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 4.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:35:44.044 DEBUG oslo_concurrency.lockutils [req-d6d25f44-3bdb-4506-a738-1edec5cd00d8 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Lock "e9775bc6-34f1-465c-9ea5-54d4b3d5a076" released by "_locked_do_build_and_run_instance" :: held 56.842s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:35:45.339 DEBUG oslo_concurrency.lockutils [req-926956e4-7ad1-4cc9-b98f-62ac7c3fe0fd tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "ca1ed49f-ef0a-4a98-b04f-6c956be89835" released by "_locked_do_build_and_run_instance" :: held 58.681s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:35:45.794 DEBUG nova.virt.xenapi.vmops [req-d6d25f44-3bdb-4506-a738-1edec5cd00d8 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] [instance: 962e3fc3-68c5-4019-beb9-2b9f939eb511] Waiting for instance state to become running _wait_for_instance_to_start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:777 2015-08-07 17:35:45.860 DEBUG nova.virt.xenapi.vmops [req-d6d25f44-3bdb-4506-a738-1edec5cd00d8 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] [instance: 962e3fc3-68c5-4019-beb9-2b9f939eb511] Updating progress to 80 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:35:45.962 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 8.95 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:35:46.747 DEBUG nova.virt.xenapi.vmops [req-d6d25f44-3bdb-4506-a738-1edec5cd00d8 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] [instance: 962e3fc3-68c5-4019-beb9-2b9f939eb511] Skip agent setup, not enabled. 
_configure_new_instance_with_agent /opt/stack/new/nova/nova/virt/xenapi/vmops.py:788 2015-08-07 17:35:46.748 DEBUG nova.virt.xenapi.vmops [req-d6d25f44-3bdb-4506-a738-1edec5cd00d8 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] [instance: 962e3fc3-68c5-4019-beb9-2b9f939eb511] Removing hostname from xenstore _remove_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1949 2015-08-07 17:35:46.748 DEBUG oslo_concurrency.lockutils [req-d6d25f44-3bdb-4506-a738-1edec5cd00d8 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Lock "xenstore-962e3fc3-68c5-4019-beb9-2b9f939eb511" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:35:46.753 DEBUG oslo_concurrency.lockutils [req-d6d25f44-3bdb-4506-a738-1edec5cd00d8 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Lock "xenstore-962e3fc3-68c5-4019-beb9-2b9f939eb511" released by "update_hostname" :: held 0.005s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:35:46.757 DEBUG nova.virt.xenapi.vmops [req-d6d25f44-3bdb-4506-a738-1edec5cd00d8 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] [instance: 962e3fc3-68c5-4019-beb9-2b9f939eb511] Updating progress to 90 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:35:46.905 DEBUG nova.virt.xenapi.vm_utils [req-d019d9ea-acd9-4753-8cda-9de0536965d0 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] Cloned VDI OpaqueRef:507814bd-9f6b-fbc2-a8c7-8c59bafe5b75 from VDI OpaqueRef:f392c64e-7886-a0cc-d3a0-780674167307 _clone_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:571 2015-08-07 17:35:46.916 13318 DEBUG nova.network.base_api [-] [instance: c864db01-8fad-4d62-9c8d-652d30dbaf6e] Updating instance_info_cache with network_info: [VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.11'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:7f:1c:d9', 'active': False, 'type': u'bridge', 'id': u'935bc641-13ca-43d5-8c89-9e679906d3eb', 'qbg_params': None})] update_instance_cache_with_nw_info /opt/stack/new/nova/nova/network/base_api.py:43 2015-08-07 17:35:46.945 13318 DEBUG oslo_concurrency.lockutils [-] Releasing semaphore "refresh_cache-c864db01-8fad-4d62-9c8d-652d30dbaf6e" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:210 2015-08-07 17:35:46.953 13318 DEBUG nova.compute.manager [-] [instance: c864db01-8fad-4d62-9c8d-652d30dbaf6e] Instance network_info: 
|[VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.11'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:7f:1c:d9', 'active': False, 'type': u'bridge', 'id': u'935bc641-13ca-43d5-8c89-9e679906d3eb', 'qbg_params': None})]| _allocate_network_async /opt/stack/new/nova/nova/compute/manager.py:1531 2015-08-07 17:35:46.967 DEBUG nova.virt.xenapi.vm_utils [req-d91b8695-d7fc-4bab-af77-6f6ad014834d tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: ca1ed49f-ef0a-4a98-b04f-6c956be89835] Shutting down VM (hard) hard_shutdown_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:348 2015-08-07 17:35:47.264 DEBUG nova.virt.xenapi.vmops [req-d6d25f44-3bdb-4506-a738-1edec5cd00d8 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] [instance: 962e3fc3-68c5-4019-beb9-2b9f939eb511] Updating progress to 100 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:35:47.521 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:35:47.558 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Auditing locally available compute resources for node localhost.localdomain 2015-08-07 17:35:47.559 DEBUG nova.virt.xenapi.host [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Updating host stats update_status /opt/stack/new/nova/nova/virt/xenapi/host.py:234 2015-08-07 17:35:47.653 DEBUG nova.compute.manager [req-d6d25f44-3bdb-4506-a738-1edec5cd00d8 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] [instance: 962e3fc3-68c5-4019-beb9-2b9f939eb511] Checking state _get_power_state /opt/stack/new/nova/nova/compute/manager.py:1296 2015-08-07 17:35:48.090 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" acquired by "do_scan" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:35:48.091 DEBUG nova.virt.xenapi.vm_utils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Scanning SR OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99 do_scan /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1839 2015-08-07 17:35:48.397 DEBUG oslo_concurrency.lockutils [req-d6d25f44-3bdb-4506-a738-1edec5cd00d8 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Lock "962e3fc3-68c5-4019-beb9-2b9f939eb511" released by 
"_locked_do_build_and_run_instance" :: held 60.967s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:35:48.451 DEBUG oslo_concurrency.lockutils [req-d019d9ea-acd9-4753-8cda-9de0536965d0 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" released by "_create_cached_image_impl" :: held 6.864s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:35:48.452 INFO nova.virt.xenapi.vm_utils [req-d019d9ea-acd9-4753-8cda-9de0536965d0 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] Image creation data, cacheable: True, downloaded: False duration: 6.88 secs for image a69d8d55-1745-492f-be26-cc75d64fc94d 2015-08-07 17:35:48.944 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" released by "do_scan" :: held 0.854s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:35:49.276 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor: free VCPUs: -3 _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:524 2015-08-07 17:35:49.276 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor: assignable PCI devices: [] _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:532 2015-08-07 17:35:49.277 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor/Node resource view: name=localhost.localdomain free_ram=654MB free_disk=16GB free_vcpus=-3 pci_devices=[] _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:546 2015-08-07 17:35:49.277 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "compute_resources" acquired by "_update_available_resource" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:35:49.618 DEBUG nova.virt.xenapi.vmops [req-d019d9ea-acd9-4753-8cda-9de0536965d0 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] [instance: c864db01-8fad-4d62-9c8d-652d30dbaf6e] Updating progress to 20 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:35:49.825 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Total usable vcpus: 8, total allocated vcpus: 11 2015-08-07 17:35:49.826 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Final resource view: name=localhost.localdomain phys_ram=8187MB used_ram=991MB phys_disk=27GB used_disk=0GB total_vcpus=8 used_vcpus=11 pci_stats=None 2015-08-07 17:35:49.930 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Compute_service record updated for devstack:localhost.localdomain 2015-08-07 17:35:49.931 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "compute_resources" released by "_update_available_resource" :: held 0.653s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:35:49.932 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 4.00 seconds 
_run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:35:49.990 DEBUG nova.virt.xenapi.vmops [req-d019d9ea-acd9-4753-8cda-9de0536965d0 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] [instance: c864db01-8fad-4d62-9c8d-652d30dbaf6e] Updating progress to 30 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:35:50.226 DEBUG nova.virt.xenapi.vmops [req-d019d9ea-acd9-4753-8cda-9de0536965d0 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] [instance: c864db01-8fad-4d62-9c8d-652d30dbaf6e] Using PV kernel: True _create_vm_record /opt/stack/new/nova/nova/virt/xenapi/vmops.py:679 2015-08-07 17:35:50.237 DEBUG nova.virt.xenapi.vm_utils [req-d019d9ea-acd9-4753-8cda-9de0536965d0 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] [instance: c864db01-8fad-4d62-9c8d-652d30dbaf6e] Created VM create_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:312 2015-08-07 17:35:50.238 DEBUG nova.virt.xenapi.vmops [req-d019d9ea-acd9-4753-8cda-9de0536965d0 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] [instance: c864db01-8fad-4d62-9c8d-652d30dbaf6e] Updating progress to 40 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:35:50.572 DEBUG nova.virt.xenapi.vm_utils [req-d019d9ea-acd9-4753-8cda-9de0536965d0 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] Creating disk-type VBD for VM OpaqueRef:de19843c-a5e8-5029-f9f2-cce6fc8bf95f, VDI OpaqueRef:507814bd-9f6b-fbc2-a8c7-8c59bafe5b75 ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:35:50.583 DEBUG nova.virt.xenapi.vm_utils [req-d019d9ea-acd9-4753-8cda-9de0536965d0 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] Created VBD OpaqueRef:8e3e4d67-02d6-8e06-b084-6944581dab15 for VM OpaqueRef:de19843c-a5e8-5029-f9f2-cce6fc8bf95f, VDI OpaqueRef:507814bd-9f6b-fbc2-a8c7-8c59bafe5b75. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:35:50.978 DEBUG nova.virt.xenapi.vm_utils [req-d91b8695-d7fc-4bab-af77-6f6ad014834d tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: ca1ed49f-ef0a-4a98-b04f-6c956be89835] Starting snapshot for VM _snapshot_attached_here_impl /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:714 2015-08-07 17:35:51.013 DEBUG oslo_concurrency.lockutils [req-d91b8695-d7fc-4bab-af77-6f6ad014834d tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" acquired by "do_scan" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:35:51.014 DEBUG nova.virt.xenapi.vm_utils [req-d91b8695-d7fc-4bab-af77-6f6ad014834d tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Scanning SR OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99 do_scan /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1839 2015-08-07 17:35:51.075 DEBUG nova.virt.xenapi.vm_utils [req-d019d9ea-acd9-4753-8cda-9de0536965d0 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] Created VDI OpaqueRef:c52fd62e-cd7b-7729-f0c5-e894cdd28f0b (config-2, 67108864, False) on OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99. 
create_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:509 2015-08-07 17:35:51.081 DEBUG nova.virt.xenapi.vm_utils [req-d019d9ea-acd9-4753-8cda-9de0536965d0 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] Creating disk-type VBD for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:c52fd62e-cd7b-7729-f0c5-e894cdd28f0b ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:35:51.096 DEBUG nova.virt.xenapi.vm_utils [req-d019d9ea-acd9-4753-8cda-9de0536965d0 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] Created VBD OpaqueRef:64055c41-2004-2a63-72aa-51421ef54423 for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:c52fd62e-cd7b-7729-f0c5-e894cdd28f0b. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:35:51.096 DEBUG nova.virt.xenapi.vm_utils [req-d019d9ea-acd9-4753-8cda-9de0536965d0 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] Plugging VBD OpaqueRef:64055c41-2004-2a63-72aa-51421ef54423 ... vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2191 2015-08-07 17:35:51.097 DEBUG oslo_concurrency.lockutils [req-d019d9ea-acd9-4753-8cda-9de0536965d0 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_plug" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:35:52.278 DEBUG oslo_concurrency.lockutils [req-bfded4dc-6b96-4800-91c6-d96cad32016a tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Lock "5ac604dd-d88a-4813-8a3b-bd37a907eee7" acquired by "_locked_do_build_and_run_instance" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:35:52.360 INFO nova.compute.manager [req-bfded4dc-6b96-4800-91c6-d96cad32016a tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] [instance: 5ac604dd-d88a-4813-8a3b-bd37a907eee7] Starting instance... 2015-08-07 17:35:52.439 DEBUG oslo_concurrency.lockutils [req-bfded4dc-6b96-4800-91c6-d96cad32016a tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Lock "717eb8d1-816f-4ae1-9e4e-fecf5e9205ae" acquired by "_locked_do_build_and_run_instance" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:35:52.487 INFO nova.compute.manager [req-bfded4dc-6b96-4800-91c6-d96cad32016a tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] [instance: 717eb8d1-816f-4ae1-9e4e-fecf5e9205ae] Starting instance... 
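(Note on the claim figures logged at 17:35:38 above: they are internally consistent -- the 128 MB flavor plus the logged 6 MB overhead gives the 134 MB claim, and the 12280.50 MB memory limit equals the 8187 MB host total scaled by 1.5, which matches the logged limit; Nova's actual multiplier comes from its RAM allocation ratio setting. A quick arithmetic check, with the ratio treated as an assumption rather than read from the config:)

    # Sanity check of the resource-claim numbers quoted in the log (not Nova
    # code). The 1.5 ratio is assumed; it is what the logged limit implies.
    total_mb = 8187.0            # "Total memory: 8187 MB"
    used_mb = 857.0              # "used: 857.00 MB" at the 17:35:38 claim
    ram_allocation_ratio = 1.5   # assumed (logged limit / total == 1.5)

    limit_mb = total_mb * ram_allocation_ratio   # 12280.5 -> "memory limit: 12280.50 MB"
    free_mb = limit_mb - used_mb                 # 11423.5 -> "free: 11423.50 MB"
    claim_mb = 128 + 6                           # flavor RAM + logged overhead = 134 MB

    assert claim_mb <= free_mb                   # claim fits -> "Claim successful"
    print(limit_mb, free_mb, claim_mb)
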
2015-08-07 17:35:52.605 DEBUG oslo_concurrency.lockutils [req-bfded4dc-6b96-4800-91c6-d96cad32016a tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Lock "compute_resources" acquired by "instance_claim" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:35:52.606 DEBUG nova.compute.resource_tracker [req-bfded4dc-6b96-4800-91c6-d96cad32016a tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Memory overhead for 64 MB instance; 5 MB instance_claim /opt/stack/new/nova/nova/compute/resource_tracker.py:126 2015-08-07 17:35:52.615 INFO nova.compute.claims [req-bfded4dc-6b96-4800-91c6-d96cad32016a tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] [instance: 5ac604dd-d88a-4813-8a3b-bd37a907eee7] Attempting claim: memory 69 MB, disk 0 GB 2015-08-07 17:35:52.616 INFO nova.compute.claims [req-bfded4dc-6b96-4800-91c6-d96cad32016a tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] [instance: 5ac604dd-d88a-4813-8a3b-bd37a907eee7] Total memory: 8187 MB, used: 991.00 MB 2015-08-07 17:35:52.616 INFO nova.compute.claims [req-bfded4dc-6b96-4800-91c6-d96cad32016a tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] [instance: 5ac604dd-d88a-4813-8a3b-bd37a907eee7] memory limit: 12280.50 MB, free: 11289.50 MB 2015-08-07 17:35:52.616 INFO nova.compute.claims [req-bfded4dc-6b96-4800-91c6-d96cad32016a tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] [instance: 5ac604dd-d88a-4813-8a3b-bd37a907eee7] Total disk: 27 GB, used: 0.00 GB 2015-08-07 17:35:52.617 INFO nova.compute.claims [req-bfded4dc-6b96-4800-91c6-d96cad32016a tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] [instance: 5ac604dd-d88a-4813-8a3b-bd37a907eee7] disk limit not specified, defaulting to unlimited 2015-08-07 17:35:52.645 DEBUG nova.compute.resources.vcpu [req-bfded4dc-6b96-4800-91c6-d96cad32016a tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Total CPUs: 8 VCPUs, used: 6.00 VCPUs test /opt/stack/new/nova/nova/compute/resources/vcpu.py:52 2015-08-07 17:35:52.646 DEBUG nova.compute.resources.vcpu [req-bfded4dc-6b96-4800-91c6-d96cad32016a tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] CPUs limit not specified, defaulting to unlimited test /opt/stack/new/nova/nova/compute/resources/vcpu.py:56 2015-08-07 17:35:52.647 INFO nova.compute.claims [req-bfded4dc-6b96-4800-91c6-d96cad32016a tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] [instance: 5ac604dd-d88a-4813-8a3b-bd37a907eee7] Claim successful 2015-08-07 17:35:52.986 DEBUG oslo_concurrency.lockutils [req-d019d9ea-acd9-4753-8cda-9de0536965d0 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_plug" :: held 1.889s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:35:52.987 DEBUG nova.virt.xenapi.vm_utils [req-d019d9ea-acd9-4753-8cda-9de0536965d0 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] Plugging VBD OpaqueRef:64055c41-2004-2a63-72aa-51421ef54423 done. 
vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2194 2015-08-07 17:35:52.992 DEBUG nova.virt.xenapi.vm_utils [req-d019d9ea-acd9-4753-8cda-9de0536965d0 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] VBD OpaqueRef:64055c41-2004-2a63-72aa-51421ef54423 plugged as xvdc vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2197 2015-08-07 17:35:53.027 DEBUG oslo_concurrency.lockutils [req-bfded4dc-6b96-4800-91c6-d96cad32016a tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Lock "compute_resources" released by "instance_claim" :: held 0.422s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:35:53.035 DEBUG oslo_concurrency.lockutils [req-bfded4dc-6b96-4800-91c6-d96cad32016a tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Lock "compute_resources" acquired by "instance_claim" :: waited 0.222s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:35:53.036 DEBUG nova.compute.resource_tracker [req-bfded4dc-6b96-4800-91c6-d96cad32016a tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Memory overhead for 64 MB instance; 5 MB instance_claim /opt/stack/new/nova/nova/compute/resource_tracker.py:126 2015-08-07 17:35:53.046 INFO nova.compute.claims [req-bfded4dc-6b96-4800-91c6-d96cad32016a tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] [instance: 717eb8d1-816f-4ae1-9e4e-fecf5e9205ae] Attempting claim: memory 69 MB, disk 0 GB 2015-08-07 17:35:53.047 INFO nova.compute.claims [req-bfded4dc-6b96-4800-91c6-d96cad32016a tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] [instance: 717eb8d1-816f-4ae1-9e4e-fecf5e9205ae] Total memory: 8187 MB, used: 1060.00 MB 2015-08-07 17:35:53.047 INFO nova.compute.claims [req-bfded4dc-6b96-4800-91c6-d96cad32016a tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] [instance: 717eb8d1-816f-4ae1-9e4e-fecf5e9205ae] memory limit: 12280.50 MB, free: 11220.50 MB 2015-08-07 17:35:53.048 INFO nova.compute.claims [req-bfded4dc-6b96-4800-91c6-d96cad32016a tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] [instance: 717eb8d1-816f-4ae1-9e4e-fecf5e9205ae] Total disk: 27 GB, used: 0.00 GB 2015-08-07 17:35:53.048 INFO nova.compute.claims [req-bfded4dc-6b96-4800-91c6-d96cad32016a tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] [instance: 717eb8d1-816f-4ae1-9e4e-fecf5e9205ae] disk limit not specified, defaulting to unlimited 2015-08-07 17:35:53.071 DEBUG nova.compute.resources.vcpu [req-bfded4dc-6b96-4800-91c6-d96cad32016a tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Total CPUs: 8 VCPUs, used: 7.00 VCPUs test /opt/stack/new/nova/nova/compute/resources/vcpu.py:52 2015-08-07 17:35:53.071 DEBUG nova.compute.resources.vcpu [req-bfded4dc-6b96-4800-91c6-d96cad32016a tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] CPUs limit not specified, defaulting to unlimited test /opt/stack/new/nova/nova/compute/resources/vcpu.py:56 2015-08-07 17:35:53.072 INFO nova.compute.claims [req-bfded4dc-6b96-4800-91c6-d96cad32016a tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] [instance: 717eb8d1-816f-4ae1-9e4e-fecf5e9205ae] Claim successful 2015-08-07 17:35:53.120 WARNING 
nova.virt.configdrive [req-d019d9ea-acd9-4753-8cda-9de0536965d0 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] The setting "always" will be deprecated in the Liberty version. Please use "True" instead 2015-08-07 17:35:53.120 DEBUG nova.objects.instance [req-d019d9ea-acd9-4753-8cda-9de0536965d0 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] Lazy-loading `ec2_ids' on Instance uuid c864db01-8fad-4d62-9c8d-652d30dbaf6e obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:35:53.163 DEBUG oslo_concurrency.processutils [req-d019d9ea-acd9-4753-8cda-9de0536965d0 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] Running cmd (subprocess): genisoimage -o /tmp/tmpfkSJ0u/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmpALUYy8 execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:35:53.343 DEBUG oslo_concurrency.processutils [req-d019d9ea-acd9-4753-8cda-9de0536965d0 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] CMD "genisoimage -o /tmp/tmpfkSJ0u/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmpALUYy8" returned: 0 in 0.180s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:35:53.357 DEBUG oslo_concurrency.processutils [req-d019d9ea-acd9-4753-8cda-9de0536965d0 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmpfkSJ0u/configdrive of=/dev/xvdc oflag=direct,sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:35:53.537 DEBUG oslo_concurrency.lockutils [req-d91b8695-d7fc-4bab-af77-6f6ad014834d tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" released by "do_scan" :: held 2.524s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:35:53.552 DEBUG nova.virt.xenapi.vm_utils [req-d91b8695-d7fc-4bab-af77-6f6ad014834d tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] VHD 52c67121-63c2-459f-b10e-0a277b14d2e3 has parent 4027f457-a9bb-499a-8844-79fc67f11377 _get_vhd_parent_uuid /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2006 2015-08-07 17:35:53.579 DEBUG nova.virt.xenapi.vm_utils [req-d91b8695-d7fc-4bab-af77-6f6ad014834d tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] VHD c75bd30e-eac5-4d9e-a302-6fa5445e73e0 has parent 4027f457-a9bb-499a-8844-79fc67f11377 _get_vhd_parent_uuid /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2006 2015-08-07 17:35:53.631 DEBUG nova.virt.xenapi.vm_utils [req-d91b8695-d7fc-4bab-af77-6f6ad014834d tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] VHD ad940792-9883-4f39-b3cc-49b2c22af740 has parent 4027f457-a9bb-499a-8844-79fc67f11377 _get_vhd_parent_uuid /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2006 2015-08-07 17:35:53.669 DEBUG nova.virt.xenapi.vm_utils [req-d91b8695-d7fc-4bab-af77-6f6ad014834d tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] VHD 481069f8-fd98-480b-adbb-a245fcbf3cd9 has parent 
4027f457-a9bb-499a-8844-79fc67f11377 _get_vhd_parent_uuid /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2006 2015-08-07 17:35:53.678 DEBUG nova.virt.xenapi.vm_utils [req-d91b8695-d7fc-4bab-af77-6f6ad014834d tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] VHD 52c67121-63c2-459f-b10e-0a277b14d2e3 has parent 4027f457-a9bb-499a-8844-79fc67f11377 _get_vhd_parent_uuid /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2006 2015-08-07 17:35:53.685 DEBUG nova.virt.xenapi.vm_utils [req-d91b8695-d7fc-4bab-af77-6f6ad014834d tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] VHD f338fd43-60de-4395-9f67-81cc26a7bb64 has parent 4027f457-a9bb-499a-8844-79fc67f11377 _get_vhd_parent_uuid /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2006 2015-08-07 17:35:53.696 DEBUG nova.virt.xenapi.vm_utils [req-d91b8695-d7fc-4bab-af77-6f6ad014834d tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] VHD 333d2ec5-fd6a-4068-9c7d-70d2731640da has parent 4027f457-a9bb-499a-8844-79fc67f11377 _get_vhd_parent_uuid /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2006 2015-08-07 17:35:53.712 DEBUG nova.virt.xenapi.vm_utils [req-d91b8695-d7fc-4bab-af77-6f6ad014834d tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] VHD adc68f5c-f9ed-4dfe-862f-8738fb519f84 has parent 4027f457-a9bb-499a-8844-79fc67f11377 _get_vhd_parent_uuid /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2006 2015-08-07 17:35:53.726 DEBUG oslo_concurrency.lockutils [req-bfded4dc-6b96-4800-91c6-d96cad32016a tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Lock "compute_resources" released by "instance_claim" :: held 0.691s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:35:53.737 DEBUG nova.virt.xenapi.vm_utils [req-d91b8695-d7fc-4bab-af77-6f6ad014834d tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: ca1ed49f-ef0a-4a98-b04f-6c956be89835] No snapshots to remove. 
_delete_snapshots_in_vdi_chain /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:677 2015-08-07 17:35:53.738 DEBUG oslo_concurrency.lockutils [req-bfded4dc-6b96-4800-91c6-d96cad32016a tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Lock "compute_resources" acquired by "update_usage" :: waited 0.385s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:35:53.930 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:35:53.932 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 4.59 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:35:53.938 DEBUG oslo_concurrency.lockutils [req-bfded4dc-6b96-4800-91c6-d96cad32016a tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Lock "compute_resources" released by "update_usage" :: held 0.199s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:35:53.939 DEBUG nova.compute.utils [req-bfded4dc-6b96-4800-91c6-d96cad32016a tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Using /dev/xvd instead of None get_next_device_name /opt/stack/new/nova/nova/compute/utils.py:163 2015-08-07 17:35:53.944 13318 DEBUG nova.compute.manager [-] [instance: 5ac604dd-d88a-4813-8a3b-bd37a907eee7] Allocating IP information in the background. _allocate_network_async /opt/stack/new/nova/nova/compute/manager.py:1513 2015-08-07 17:35:53.946 13318 DEBUG oslo_concurrency.lockutils [-] Acquired semaphore "refresh_cache-5ac604dd-d88a-4813-8a3b-bd37a907eee7" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:197 2015-08-07 17:35:54.197 DEBUG oslo_concurrency.lockutils [req-bfded4dc-6b96-4800-91c6-d96cad32016a tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Lock "compute_resources" acquired by "update_usage" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:35:54.362 DEBUG oslo_concurrency.lockutils [req-bfded4dc-6b96-4800-91c6-d96cad32016a tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Lock "compute_resources" released by "update_usage" :: held 0.165s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:35:54.363 DEBUG nova.compute.utils [req-bfded4dc-6b96-4800-91c6-d96cad32016a tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Using /dev/xvd instead of None get_next_device_name /opt/stack/new/nova/nova/compute/utils.py:163 2015-08-07 17:35:54.367 13318 DEBUG nova.compute.manager [-] [instance: 717eb8d1-816f-4ae1-9e4e-fecf5e9205ae] Allocating IP information in the background. 
_allocate_network_async /opt/stack/new/nova/nova/compute/manager.py:1513 2015-08-07 17:35:54.368 13318 DEBUG oslo_concurrency.lockutils [-] Acquired semaphore "refresh_cache-717eb8d1-816f-4ae1-9e4e-fecf5e9205ae" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:197 2015-08-07 17:35:55.145 DEBUG nova.virt.xenapi.vmops [req-bfded4dc-6b96-4800-91c6-d96cad32016a tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] [instance: 5ac604dd-d88a-4813-8a3b-bd37a907eee7] Block device information present: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} spawn /opt/stack/new/nova/nova/virt/xenapi/vmops.py:408 2015-08-07 17:35:55.151 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.77 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:35:55.177 DEBUG nova.virt.xenapi.vm_utils [req-bfded4dc-6b96-4800-91c6-d96cad32016a tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Detected vhd format for image a69d8d55-1745-492f-be26-cc75d64fc94d determine_disk_image_type /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1633 2015-08-07 17:35:55.178 DEBUG nova.virt.xenapi.vmops [req-bfded4dc-6b96-4800-91c6-d96cad32016a tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] [instance: 5ac604dd-d88a-4813-8a3b-bd37a907eee7] Updating progress to 10 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:35:55.594 DEBUG nova.virt.xenapi.vmops [req-bfded4dc-6b96-4800-91c6-d96cad32016a tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] [instance: 717eb8d1-816f-4ae1-9e4e-fecf5e9205ae] Block device information present: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} spawn /opt/stack/new/nova/nova/virt/xenapi/vmops.py:408 2015-08-07 17:35:55.612 DEBUG nova.virt.xenapi.vmops [req-bfded4dc-6b96-4800-91c6-d96cad32016a tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] [instance: 5ac604dd-d88a-4813-8a3b-bd37a907eee7] block device info: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} _connect_cinder_volumes /opt/stack/new/nova/nova/virt/xenapi/vmops.py:376 2015-08-07 17:35:55.625 DEBUG nova.virt.xenapi.vm_utils [req-bfded4dc-6b96-4800-91c6-d96cad32016a tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Detected vhd format for image a69d8d55-1745-492f-be26-cc75d64fc94d determine_disk_image_type /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1633 2015-08-07 17:35:55.626 DEBUG nova.virt.xenapi.vmops [req-bfded4dc-6b96-4800-91c6-d96cad32016a tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] [instance: 717eb8d1-816f-4ae1-9e4e-fecf5e9205ae] Updating progress to 10 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:35:55.634 DEBUG oslo_concurrency.lockutils [req-bfded4dc-6b96-4800-91c6-d96cad32016a tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" acquired by "_create_cached_image_impl" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:35:56.156 DEBUG nova.virt.xenapi.vmops [req-bfded4dc-6b96-4800-91c6-d96cad32016a 
tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] [instance: 717eb8d1-816f-4ae1-9e4e-fecf5e9205ae] block device info: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} _connect_cinder_volumes /opt/stack/new/nova/nova/virt/xenapi/vmops.py:376 2015-08-07 17:35:57.006 13318 DEBUG nova.network.base_api [-] [instance: 5ac604dd-d88a-4813-8a3b-bd37a907eee7] Updating instance_info_cache with network_info: [VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.12'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:7e:db:5f', 'active': False, 'type': u'bridge', 'id': u'ed513ab2-db5f-464c-87f4-d17c64e7778a', 'qbg_params': None})] update_instance_cache_with_nw_info /opt/stack/new/nova/nova/network/base_api.py:43 2015-08-07 17:35:57.039 13318 DEBUG oslo_concurrency.lockutils [-] Releasing semaphore "refresh_cache-5ac604dd-d88a-4813-8a3b-bd37a907eee7" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:210 2015-08-07 17:35:57.039 13318 DEBUG nova.compute.manager [-] [instance: 5ac604dd-d88a-4813-8a3b-bd37a907eee7] Instance network_info: |[VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.12'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:7e:db:5f', 'active': False, 'type': u'bridge', 'id': u'ed513ab2-db5f-464c-87f4-d17c64e7778a', 'qbg_params': None})]| _allocate_network_async /opt/stack/new/nova/nova/compute/manager.py:1531 2015-08-07 17:35:58.266 DEBUG nova.virt.xenapi.vm_utils [req-d91b8695-d7fc-4bab-af77-6f6ad014834d tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: ca1ed49f-ef0a-4a98-b04f-6c956be89835] Parent has other children, coalesce is unlikely. 
_wait_for_vhd_coalesce /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2092 2015-08-07 17:35:58.269 13318 DEBUG nova.network.base_api [-] [instance: 717eb8d1-816f-4ae1-9e4e-fecf5e9205ae] Updating instance_info_cache with network_info: [VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.13'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:f3:6a:40', 'active': False, 'type': u'bridge', 'id': u'b80c87b9-5684-426a-9cd1-6d4481e903f9', 'qbg_params': None})] update_instance_cache_with_nw_info /opt/stack/new/nova/nova/network/base_api.py:43 2015-08-07 17:35:58.281 DEBUG oslo_concurrency.lockutils [req-d91b8695-d7fc-4bab-af77-6f6ad014834d tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" acquired by "do_scan" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:35:58.282 DEBUG nova.virt.xenapi.vm_utils [req-d91b8695-d7fc-4bab-af77-6f6ad014834d tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Scanning SR OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99 do_scan /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1839 2015-08-07 17:35:58.299 13318 DEBUG oslo_concurrency.lockutils [-] Releasing semaphore "refresh_cache-717eb8d1-816f-4ae1-9e4e-fecf5e9205ae" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:210 2015-08-07 17:35:58.300 13318 DEBUG nova.compute.manager [-] [instance: 717eb8d1-816f-4ae1-9e4e-fecf5e9205ae] Instance network_info: |[VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.13'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:f3:6a:40', 'active': False, 'type': u'bridge', 'id': u'b80c87b9-5684-426a-9cd1-6d4481e903f9', 'qbg_params': 
None})]| _allocate_network_async /opt/stack/new/nova/nova/compute/manager.py:1531 2015-08-07 17:35:58.521 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:35:58.522 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Cleaning up deleted instances _run_pending_deletes /opt/stack/new/nova/nova/compute/manager.py:6276 2015-08-07 17:35:58.742 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] There are 8 instances to clean _run_pending_deletes /opt/stack/new/nova/nova/compute/manager.py:6285 2015-08-07 17:35:58.743 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: 5fe04666-4b1b-41cf-bca7-ce6c2b277477] Instance has had 0 of 5 cleanup attempts _run_pending_deletes /opt/stack/new/nova/nova/compute/manager.py:6293 2015-08-07 17:35:59.275 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: 1ed8c9a4-5157-4b0b-af78-9a5d74576f17] Instance has had 0 of 5 cleanup attempts _run_pending_deletes /opt/stack/new/nova/nova/compute/manager.py:6293 2015-08-07 17:35:59.316 DEBUG oslo_concurrency.lockutils [req-d91b8695-d7fc-4bab-af77-6f6ad014834d tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" released by "do_scan" :: held 1.035s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:35:59.329 DEBUG nova.virt.xenapi.vm_utils [req-d91b8695-d7fc-4bab-af77-6f6ad014834d tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] VHD bab22223-de0f-40d2-af31-9a026492069a has parent d9fd8ca0-ee67-4d16-a450-a1ecc891f4ff _get_vhd_parent_uuid /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2006 2015-08-07 17:35:59.345 DEBUG nova.virt.xenapi.vm_utils [req-d91b8695-d7fc-4bab-af77-6f6ad014834d tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] VHD d9fd8ca0-ee67-4d16-a450-a1ecc891f4ff has parent 4027f457-a9bb-499a-8844-79fc67f11377 _get_vhd_parent_uuid /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2006 2015-08-07 17:35:59.559 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: 1b9e1564-c8e1-4966-8922-0b8bbf38f0f8] Instance has had 0 of 5 cleanup attempts _run_pending_deletes /opt/stack/new/nova/nova/compute/manager.py:6293 2015-08-07 17:35:59.607 DEBUG nova.virt.xenapi.client.session [req-d91b8695-d7fc-4bab-af77-6f6ad014834d tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] glance.upload_vhd attempt 1/1, callback_result: 192.168.33.1 call_plugin_serialized_with_retry /opt/stack/new/nova/nova/virt/xenapi/client/session.py:245 2015-08-07 17:35:59.758 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: 47eca748-3e73-4f2b-80e9-a3e058bbd8a4] Instance has had 0 of 5 cleanup attempts _run_pending_deletes /opt/stack/new/nova/nova/compute/manager.py:6293 2015-08-07 17:36:00.077 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: d28988ae-6848-475f-b43e-bc63166c3c1e] Instance has had 0 of 5 cleanup attempts _run_pending_deletes /opt/stack/new/nova/nova/compute/manager.py:6293 2015-08-07 17:36:00.457 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 
None None] [instance: e4c599da-3cac-4bfb-a1b5-2b8828fdb0b9] Instance has had 0 of 5 cleanup attempts _run_pending_deletes /opt/stack/new/nova/nova/compute/manager.py:6293 2015-08-07 17:36:00.746 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: 8f22b72a-a408-4796-8637-4dedc84a367a] Instance has had 0 of 5 cleanup attempts _run_pending_deletes /opt/stack/new/nova/nova/compute/manager.py:6293 2015-08-07 17:36:00.993 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: 8e5ee78c-20e2-4483-ab75-b109bb2fdca6] Instance has had 0 of 5 cleanup attempts _run_pending_deletes /opt/stack/new/nova/nova/compute/manager.py:6293 2015-08-07 17:36:01.314 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 36.20 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:36:03.893 DEBUG oslo_concurrency.processutils [req-d019d9ea-acd9-4753-8cda-9de0536965d0 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmpfkSJ0u/configdrive of=/dev/xvdc oflag=direct,sync" returned: 0 in 10.536s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:36:03.894 DEBUG oslo_concurrency.processutils [req-d019d9ea-acd9-4753-8cda-9de0536965d0 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:36:04.688 DEBUG oslo_concurrency.processutils [req-d019d9ea-acd9-4753-8cda-9de0536965d0 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf sync" returned: 0 in 0.793s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:36:04.692 DEBUG nova.virt.xenapi.vm_utils [req-d019d9ea-acd9-4753-8cda-9de0536965d0 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] Destroying VBD for VDI OpaqueRef:c52fd62e-cd7b-7729-f0c5-e894cdd28f0b ... 
vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2207 2015-08-07 17:36:04.694 DEBUG oslo_concurrency.lockutils [req-d019d9ea-acd9-4753-8cda-9de0536965d0 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_unplug" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:36:04.976 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.95 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:36:05.377 DEBUG nova.virt.xenapi.vmops [req-d91b8695-d7fc-4bab-af77-6f6ad014834d tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: ca1ed49f-ef0a-4a98-b04f-6c956be89835] Finished snapshot and upload for VM, duration: 14.40 secs for image 859fd1b8-ca83-4f0d-b1ce-f501c74ff255 snapshot /opt/stack/new/nova/nova/virt/xenapi/vmops.py:894 2015-08-07 17:36:05.379 DEBUG nova.compute.manager [req-d91b8695-d7fc-4bab-af77-6f6ad014834d tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: ca1ed49f-ef0a-4a98-b04f-6c956be89835] Checking state _get_power_state /opt/stack/new/nova/nova/compute/manager.py:1296 2015-08-07 17:36:05.668 DEBUG oslo_concurrency.lockutils [req-d019d9ea-acd9-4753-8cda-9de0536965d0 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_unplug" :: held 0.974s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:36:05.676 DEBUG nova.virt.xenapi.vm_utils [req-d019d9ea-acd9-4753-8cda-9de0536965d0 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] Destroying VBD for VDI OpaqueRef:c52fd62e-cd7b-7729-f0c5-e894cdd28f0b done. vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2215 2015-08-07 17:36:05.677 DEBUG nova.virt.xenapi.vm_utils [req-d019d9ea-acd9-4753-8cda-9de0536965d0 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] Creating disk-type VBD for VM OpaqueRef:de19843c-a5e8-5029-f9f2-cce6fc8bf95f, VDI OpaqueRef:c52fd62e-cd7b-7729-f0c5-e894cdd28f0b ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:36:05.678 WARNING nova.virt.xenapi.vm_utils [req-d91b8695-d7fc-4bab-af77-6f6ad014834d tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: ca1ed49f-ef0a-4a98-b04f-6c956be89835] VM already halted, skipping shutdown... 2015-08-07 17:36:05.679 DEBUG nova.compute.manager [req-d91b8695-d7fc-4bab-af77-6f6ad014834d tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: ca1ed49f-ef0a-4a98-b04f-6c956be89835] Checking state _get_power_state /opt/stack/new/nova/nova/compute/manager.py:1296 2015-08-07 17:36:05.689 DEBUG nova.virt.xenapi.vm_utils [req-d019d9ea-acd9-4753-8cda-9de0536965d0 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] Created VBD OpaqueRef:e3c8e33a-85c6-1b43-ee84-b6e327128f49 for VM OpaqueRef:de19843c-a5e8-5029-f9f2-cce6fc8bf95f, VDI OpaqueRef:c52fd62e-cd7b-7729-f0c5-e894cdd28f0b. 
create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:36:05.690 DEBUG nova.objects.instance [req-d019d9ea-acd9-4753-8cda-9de0536965d0 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] Lazy-loading `pci_devices' on Instance uuid c864db01-8fad-4d62-9c8d-652d30dbaf6e obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:36:05.892 DEBUG nova.virt.xenapi.vmops [req-d019d9ea-acd9-4753-8cda-9de0536965d0 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] [instance: c864db01-8fad-4d62-9c8d-652d30dbaf6e] Updating progress to 50 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:36:06.098 DEBUG oslo_concurrency.lockutils [req-d91b8695-d7fc-4bab-af77-6f6ad014834d tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Acquired semaphore "refresh_cache-ca1ed49f-ef0a-4a98-b04f-6c956be89835" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:197 2015-08-07 17:36:06.157 DEBUG oslo_concurrency.lockutils [req-d019d9ea-acd9-4753-8cda-9de0536965d0 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] Lock "xenstore-c864db01-8fad-4d62-9c8d-652d30dbaf6e" acquired by "store_meta" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:36:06.158 DEBUG oslo_concurrency.lockutils [req-d019d9ea-acd9-4753-8cda-9de0536965d0 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] Lock "xenstore-c864db01-8fad-4d62-9c8d-652d30dbaf6e" released by "store_meta" :: held 0.001s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:36:06.159 DEBUG oslo_concurrency.lockutils [req-d019d9ea-acd9-4753-8cda-9de0536965d0 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] Lock "xenstore-c864db01-8fad-4d62-9c8d-652d30dbaf6e" acquired by "store_auto_disk_config" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:36:06.168 DEBUG oslo_concurrency.lockutils [req-d019d9ea-acd9-4753-8cda-9de0536965d0 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] Lock "xenstore-c864db01-8fad-4d62-9c8d-652d30dbaf6e" released by "store_auto_disk_config" :: held 0.009s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:36:06.169 DEBUG nova.virt.xenapi.vmops [req-d019d9ea-acd9-4753-8cda-9de0536965d0 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] [instance: c864db01-8fad-4d62-9c8d-652d30dbaf6e] Injecting hostname (tempest-listserverfilterstestjson-instance-906881881) into xenstore _inject_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1940 2015-08-07 17:36:06.169 DEBUG oslo_concurrency.lockutils [req-d019d9ea-acd9-4753-8cda-9de0536965d0 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] Lock "xenstore-c864db01-8fad-4d62-9c8d-652d30dbaf6e" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:36:06.178 DEBUG oslo_concurrency.lockutils [req-d019d9ea-acd9-4753-8cda-9de0536965d0 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] Lock "xenstore-c864db01-8fad-4d62-9c8d-652d30dbaf6e" released by 
"update_hostname" :: held 0.009s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:36:06.179 DEBUG nova.virt.xenapi.vmops [req-d019d9ea-acd9-4753-8cda-9de0536965d0 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] [instance: c864db01-8fad-4d62-9c8d-652d30dbaf6e] Injecting network info to xenstore inject_network_info /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1869 2015-08-07 17:36:06.179 DEBUG oslo_concurrency.lockutils [req-d019d9ea-acd9-4753-8cda-9de0536965d0 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] Lock "xenstore-c864db01-8fad-4d62-9c8d-652d30dbaf6e" acquired by "update_nwinfo" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:36:06.302 DEBUG nova.network.base_api [req-d91b8695-d7fc-4bab-af77-6f6ad014834d tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: ca1ed49f-ef0a-4a98-b04f-6c956be89835] Updating instance_info_cache with network_info: [VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.4'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:6f:37:5e', 'active': False, 'type': u'bridge', 'id': u'e6c4302b-7154-4e57-9602-6a296335599b', 'qbg_params': None})] update_instance_cache_with_nw_info /opt/stack/new/nova/nova/network/base_api.py:43 2015-08-07 17:36:06.343 DEBUG oslo_concurrency.lockutils [req-d91b8695-d7fc-4bab-af77-6f6ad014834d tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Releasing semaphore "refresh_cache-ca1ed49f-ef0a-4a98-b04f-6c956be89835" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:210 2015-08-07 17:36:06.393 INFO nova.virt.xenapi.vmops [req-d91b8695-d7fc-4bab-af77-6f6ad014834d tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: ca1ed49f-ef0a-4a98-b04f-6c956be89835] Destroying VM 2015-08-07 17:36:06.396 DEBUG oslo_concurrency.lockutils [req-d019d9ea-acd9-4753-8cda-9de0536965d0 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] Lock "xenstore-c864db01-8fad-4d62-9c8d-652d30dbaf6e" released by "update_nwinfo" :: held 0.216s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:36:06.396 DEBUG nova.virt.xenapi.vmops [req-d019d9ea-acd9-4753-8cda-9de0536965d0 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] [instance: c864db01-8fad-4d62-9c8d-652d30dbaf6e] Updating progress to 60 _update_instance_progress 
/opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:36:06.417 WARNING nova.virt.xenapi.vm_utils [req-d91b8695-d7fc-4bab-af77-6f6ad014834d tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: ca1ed49f-ef0a-4a98-b04f-6c956be89835] VM already halted, skipping shutdown... 2015-08-07 17:36:06.429 DEBUG nova.virt.xenapi.vmops [req-d91b8695-d7fc-4bab-af77-6f6ad014834d tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: ca1ed49f-ef0a-4a98-b04f-6c956be89835] Destroying VDIs _destroy_vdis /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1428 2015-08-07 17:36:06.440 DEBUG nova.virt.xenapi.vm_utils [req-d91b8695-d7fc-4bab-af77-6f6ad014834d tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] VDI 52c67121-63c2-459f-b10e-0a277b14d2e3 is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:36:06.455 DEBUG nova.virt.xenapi.vm_utils [req-d91b8695-d7fc-4bab-af77-6f6ad014834d tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] VDI 5ae80de5-8054-463c-b0ed-af0b9b4b0980 is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:36:06.716 DEBUG nova.virt.xenapi.vmops [req-d019d9ea-acd9-4753-8cda-9de0536965d0 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] [instance: c864db01-8fad-4d62-9c8d-652d30dbaf6e] Creating vifs _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1892 2015-08-07 17:36:06.752 DEBUG nova.virt.xenapi.vmops [req-d019d9ea-acd9-4753-8cda-9de0536965d0 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] [instance: c864db01-8fad-4d62-9c8d-652d30dbaf6e] Creating VIF for network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1902 2015-08-07 17:36:06.766 DEBUG nova.virt.xenapi.vmops [req-d019d9ea-acd9-4753-8cda-9de0536965d0 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] [instance: c864db01-8fad-4d62-9c8d-652d30dbaf6e] Created VIF OpaqueRef:8fdf251d-5aad-ab1a-a19f-09e7d1461476, network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1906 2015-08-07 17:36:06.766 DEBUG nova.virt.xenapi.vmops [req-d019d9ea-acd9-4753-8cda-9de0536965d0 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] [instance: c864db01-8fad-4d62-9c8d-652d30dbaf6e] Updating progress to 70 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:36:07.059 DEBUG nova.virt.xenapi.vmops [req-d019d9ea-acd9-4753-8cda-9de0536965d0 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] [instance: c864db01-8fad-4d62-9c8d-652d30dbaf6e] Starting instance _start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:327 2015-08-07 17:36:07.536 DEBUG nova.virt.xenapi.vmops [req-d91b8695-d7fc-4bab-af77-6f6ad014834d tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: ca1ed49f-ef0a-4a98-b04f-6c956be89835] Using RAW or VHD, skipping kernel and ramdisk deletion _destroy_kernel_ramdisk /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1456 2015-08-07 17:36:07.551 DEBUG nova.virt.xenapi.vm_utils [req-d91b8695-d7fc-4bab-af77-6f6ad014834d tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 
ca1ed49f-ef0a-4a98-b04f-6c956be89835] VM destroyed destroy_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:324 2015-08-07 17:36:09.104 DEBUG nova.virt.xenapi.vm_utils [req-bfded4dc-6b96-4800-91c6-d96cad32016a tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Cloned VDI OpaqueRef:1b646b7f-67b6-5b91-e48a-93a0826adb3b from VDI OpaqueRef:f392c64e-7886-a0cc-d3a0-780674167307 _clone_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:571 2015-08-07 17:36:10.057 DEBUG oslo_concurrency.lockutils [req-bfded4dc-6b96-4800-91c6-d96cad32016a tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" released by "_create_cached_image_impl" :: held 14.423s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:36:10.058 INFO nova.virt.xenapi.vm_utils [req-bfded4dc-6b96-4800-91c6-d96cad32016a tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Image creation data, cacheable: True, downloaded: False duration: 14.45 secs for image a69d8d55-1745-492f-be26-cc75d64fc94d 2015-08-07 17:36:10.059 DEBUG oslo_concurrency.lockutils [req-bfded4dc-6b96-4800-91c6-d96cad32016a tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" acquired by "_create_cached_image_impl" :: waited 13.890s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:36:11.517 DEBUG nova.virt.xenapi.vm_utils [req-bfded4dc-6b96-4800-91c6-d96cad32016a tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Cloned VDI OpaqueRef:419b0b68-40bd-0bed-472d-e1f59e774ecd from VDI OpaqueRef:f392c64e-7886-a0cc-d3a0-780674167307 _clone_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:571 2015-08-07 17:36:11.745 DEBUG nova.virt.xenapi.vmops [req-bfded4dc-6b96-4800-91c6-d96cad32016a tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] [instance: 5ac604dd-d88a-4813-8a3b-bd37a907eee7] Updating progress to 20 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:36:12.097 DEBUG nova.virt.xenapi.vmops [req-bfded4dc-6b96-4800-91c6-d96cad32016a tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] [instance: 5ac604dd-d88a-4813-8a3b-bd37a907eee7] Updating progress to 30 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:36:12.360 DEBUG nova.virt.xenapi.vmops [req-bfded4dc-6b96-4800-91c6-d96cad32016a tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] [instance: 5ac604dd-d88a-4813-8a3b-bd37a907eee7] Using PV kernel: True _create_vm_record /opt/stack/new/nova/nova/virt/xenapi/vmops.py:679 2015-08-07 17:36:12.376 DEBUG nova.virt.xenapi.vm_utils [req-bfded4dc-6b96-4800-91c6-d96cad32016a tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] [instance: 5ac604dd-d88a-4813-8a3b-bd37a907eee7] Created VM create_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:312 2015-08-07 17:36:12.377 DEBUG nova.virt.xenapi.vmops [req-bfded4dc-6b96-4800-91c6-d96cad32016a tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] [instance: 5ac604dd-d88a-4813-8a3b-bd37a907eee7] Updating progress to 40 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:36:12.416 DEBUG 
oslo_concurrency.lockutils [req-bfded4dc-6b96-4800-91c6-d96cad32016a tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" released by "_create_cached_image_impl" :: held 2.356s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:36:12.421 INFO nova.virt.xenapi.vm_utils [req-bfded4dc-6b96-4800-91c6-d96cad32016a tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Image creation data, cacheable: True, downloaded: False duration: 16.26 secs for image a69d8d55-1745-492f-be26-cc75d64fc94d 2015-08-07 17:36:12.679 DEBUG nova.virt.xenapi.vm_utils [req-bfded4dc-6b96-4800-91c6-d96cad32016a tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Creating disk-type VBD for VM OpaqueRef:3d0fb519-9c8a-a2fa-d15f-ce53142b25cc, VDI OpaqueRef:1b646b7f-67b6-5b91-e48a-93a0826adb3b ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:36:12.688 DEBUG nova.virt.xenapi.vm_utils [req-bfded4dc-6b96-4800-91c6-d96cad32016a tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Created VBD OpaqueRef:67fdb1a5-75fe-25a4-5d6f-132a80f34fd9 for VM OpaqueRef:3d0fb519-9c8a-a2fa-d15f-ce53142b25cc, VDI OpaqueRef:1b646b7f-67b6-5b91-e48a-93a0826adb3b. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:36:13.399 DEBUG nova.virt.xenapi.vm_utils [req-bfded4dc-6b96-4800-91c6-d96cad32016a tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Created VDI OpaqueRef:2867d912-ebd1-5276-0014-8cf8807f0b8e (config-2, 67108864, False) on OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99. create_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:509 2015-08-07 17:36:13.403 DEBUG nova.virt.xenapi.vm_utils [req-bfded4dc-6b96-4800-91c6-d96cad32016a tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Creating disk-type VBD for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:2867d912-ebd1-5276-0014-8cf8807f0b8e ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:36:13.417 DEBUG nova.virt.xenapi.vm_utils [req-bfded4dc-6b96-4800-91c6-d96cad32016a tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Created VBD OpaqueRef:00748fa0-55a2-96ab-3875-883645df58c9 for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:2867d912-ebd1-5276-0014-8cf8807f0b8e. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:36:13.418 DEBUG nova.virt.xenapi.vm_utils [req-bfded4dc-6b96-4800-91c6-d96cad32016a tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Plugging VBD OpaqueRef:00748fa0-55a2-96ab-3875-883645df58c9 ... 
vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2191 2015-08-07 17:36:13.418 DEBUG oslo_concurrency.lockutils [req-bfded4dc-6b96-4800-91c6-d96cad32016a tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_plug" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:36:13.604 DEBUG nova.virt.xenapi.vmops [req-bfded4dc-6b96-4800-91c6-d96cad32016a tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] [instance: 717eb8d1-816f-4ae1-9e4e-fecf5e9205ae] Updating progress to 20 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:36:13.921 DEBUG nova.virt.xenapi.vmops [req-bfded4dc-6b96-4800-91c6-d96cad32016a tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] [instance: 717eb8d1-816f-4ae1-9e4e-fecf5e9205ae] Updating progress to 30 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:36:14.253 DEBUG nova.virt.xenapi.vmops [req-bfded4dc-6b96-4800-91c6-d96cad32016a tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] [instance: 717eb8d1-816f-4ae1-9e4e-fecf5e9205ae] Using PV kernel: True _create_vm_record /opt/stack/new/nova/nova/virt/xenapi/vmops.py:679 2015-08-07 17:36:14.277 DEBUG nova.virt.xenapi.vm_utils [req-bfded4dc-6b96-4800-91c6-d96cad32016a tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] [instance: 717eb8d1-816f-4ae1-9e4e-fecf5e9205ae] Created VM create_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:312 2015-08-07 17:36:14.277 DEBUG nova.virt.xenapi.vmops [req-bfded4dc-6b96-4800-91c6-d96cad32016a tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] [instance: 717eb8d1-816f-4ae1-9e4e-fecf5e9205ae] Updating progress to 40 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:36:15.069 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.85 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:36:15.121 DEBUG nova.virt.xenapi.vm_utils [req-bfded4dc-6b96-4800-91c6-d96cad32016a tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Creating disk-type VBD for VM OpaqueRef:b6f24efe-f5c4-008f-b735-45e1996b5d27, VDI OpaqueRef:419b0b68-40bd-0bed-472d-e1f59e774ecd ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:36:15.135 DEBUG nova.virt.xenapi.vm_utils [req-bfded4dc-6b96-4800-91c6-d96cad32016a tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Created VBD OpaqueRef:15ea9bf5-ae71-e116-19b6-c1a7933bff0a for VM OpaqueRef:b6f24efe-f5c4-008f-b735-45e1996b5d27, VDI OpaqueRef:419b0b68-40bd-0bed-472d-e1f59e774ecd. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:36:15.983 DEBUG nova.virt.xenapi.vm_utils [req-bfded4dc-6b96-4800-91c6-d96cad32016a tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Created VDI OpaqueRef:5c5c397b-ab79-159e-4e62-1a8c5f647fde (config-2, 67108864, False) on OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99. 
create_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:509 2015-08-07 17:36:15.996 DEBUG nova.virt.xenapi.vm_utils [req-bfded4dc-6b96-4800-91c6-d96cad32016a tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Creating disk-type VBD for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:5c5c397b-ab79-159e-4e62-1a8c5f647fde ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:36:16.014 DEBUG nova.virt.xenapi.vm_utils [req-bfded4dc-6b96-4800-91c6-d96cad32016a tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Created VBD OpaqueRef:ca294090-94ea-e641-dca7-c0f8734c033d for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:5c5c397b-ab79-159e-4e62-1a8c5f647fde. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:36:16.014 DEBUG nova.virt.xenapi.vm_utils [req-bfded4dc-6b96-4800-91c6-d96cad32016a tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Plugging VBD OpaqueRef:ca294090-94ea-e641-dca7-c0f8734c033d ... vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2191 2015-08-07 17:36:16.269 DEBUG oslo_concurrency.lockutils [req-bfded4dc-6b96-4800-91c6-d96cad32016a tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_plug" :: held 2.850s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:36:16.269 DEBUG nova.virt.xenapi.vm_utils [req-bfded4dc-6b96-4800-91c6-d96cad32016a tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Plugging VBD OpaqueRef:00748fa0-55a2-96ab-3875-883645df58c9 done. vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2194 2015-08-07 17:36:16.270 DEBUG oslo_concurrency.lockutils [req-bfded4dc-6b96-4800-91c6-d96cad32016a tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_plug" :: waited 0.255s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:36:16.282 DEBUG nova.virt.xenapi.vm_utils [req-bfded4dc-6b96-4800-91c6-d96cad32016a tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] VBD OpaqueRef:00748fa0-55a2-96ab-3875-883645df58c9 plugged as xvdc vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2197 2015-08-07 17:36:16.322 DEBUG oslo_concurrency.lockutils [req-d6e42899-63d5-4b95-87ca-f44003219653 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "6b6cb831-76dd-4428-ab66-c020997bc153" acquired by "_locked_do_build_and_run_instance" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:36:16.388 WARNING nova.virt.configdrive [req-bfded4dc-6b96-4800-91c6-d96cad32016a tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] The setting "always" will be deprecated in the Liberty version. 
Please use "True" instead 2015-08-07 17:36:16.389 DEBUG nova.objects.instance [req-bfded4dc-6b96-4800-91c6-d96cad32016a tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Lazy-loading `ec2_ids' on Instance uuid 5ac604dd-d88a-4813-8a3b-bd37a907eee7 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:36:16.409 INFO nova.compute.manager [req-d6e42899-63d5-4b95-87ca-f44003219653 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 6b6cb831-76dd-4428-ab66-c020997bc153] Starting instance... 2015-08-07 17:36:16.432 DEBUG oslo_concurrency.processutils [req-bfded4dc-6b96-4800-91c6-d96cad32016a tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Running cmd (subprocess): genisoimage -o /tmp/tmp1JuBSM/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmp8XWjSX execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:36:16.717 DEBUG oslo_concurrency.processutils [req-bfded4dc-6b96-4800-91c6-d96cad32016a tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] CMD "genisoimage -o /tmp/tmp1JuBSM/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmp8XWjSX" returned: 0 in 0.285s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:36:16.724 DEBUG oslo_concurrency.processutils [req-bfded4dc-6b96-4800-91c6-d96cad32016a tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmp1JuBSM/configdrive of=/dev/xvdc oflag=direct,sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:36:16.909 DEBUG oslo_concurrency.lockutils [req-d6e42899-63d5-4b95-87ca-f44003219653 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "compute_resources" acquired by "instance_claim" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:36:16.911 DEBUG nova.compute.resource_tracker [req-d6e42899-63d5-4b95-87ca-f44003219653 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Memory overhead for 64 MB instance; 5 MB instance_claim /opt/stack/new/nova/nova/compute/resource_tracker.py:126 2015-08-07 17:36:16.921 INFO nova.compute.claims [req-d6e42899-63d5-4b95-87ca-f44003219653 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 6b6cb831-76dd-4428-ab66-c020997bc153] Attempting claim: memory 69 MB, disk 0 GB 2015-08-07 17:36:16.922 INFO nova.compute.claims [req-d6e42899-63d5-4b95-87ca-f44003219653 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 6b6cb831-76dd-4428-ab66-c020997bc153] Total memory: 8187 MB, used: 1129.00 MB 2015-08-07 17:36:16.922 INFO nova.compute.claims [req-d6e42899-63d5-4b95-87ca-f44003219653 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 6b6cb831-76dd-4428-ab66-c020997bc153] memory limit: 12280.50 MB, free: 11151.50 MB 2015-08-07 17:36:16.923 INFO nova.compute.claims [req-d6e42899-63d5-4b95-87ca-f44003219653 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 6b6cb831-76dd-4428-ab66-c020997bc153] Total 
disk: 27 GB, used: 0.00 GB 2015-08-07 17:36:16.923 INFO nova.compute.claims [req-d6e42899-63d5-4b95-87ca-f44003219653 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 6b6cb831-76dd-4428-ab66-c020997bc153] disk limit not specified, defaulting to unlimited 2015-08-07 17:36:16.948 DEBUG nova.compute.resources.vcpu [req-d6e42899-63d5-4b95-87ca-f44003219653 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Total CPUs: 8 VCPUs, used: 8.00 VCPUs test /opt/stack/new/nova/nova/compute/resources/vcpu.py:52 2015-08-07 17:36:16.949 DEBUG nova.compute.resources.vcpu [req-d6e42899-63d5-4b95-87ca-f44003219653 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] CPUs limit not specified, defaulting to unlimited test /opt/stack/new/nova/nova/compute/resources/vcpu.py:56 2015-08-07 17:36:16.949 INFO nova.compute.claims [req-d6e42899-63d5-4b95-87ca-f44003219653 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 6b6cb831-76dd-4428-ab66-c020997bc153] Claim successful 2015-08-07 17:36:17.681 DEBUG oslo_concurrency.lockutils [req-d6e42899-63d5-4b95-87ca-f44003219653 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "compute_resources" released by "instance_claim" :: held 0.772s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:36:17.980 DEBUG nova.virt.xenapi.vmops [req-d019d9ea-acd9-4753-8cda-9de0536965d0 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] [instance: c864db01-8fad-4d62-9c8d-652d30dbaf6e] Waiting for instance state to become running _wait_for_instance_to_start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:777 2015-08-07 17:36:18.010 DEBUG nova.virt.xenapi.vmops [req-d019d9ea-acd9-4753-8cda-9de0536965d0 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] [instance: c864db01-8fad-4d62-9c8d-652d30dbaf6e] Updating progress to 80 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:36:18.059 DEBUG oslo_concurrency.lockutils [req-d6e42899-63d5-4b95-87ca-f44003219653 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "compute_resources" acquired by "update_usage" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:36:18.392 DEBUG oslo_concurrency.lockutils [req-d6e42899-63d5-4b95-87ca-f44003219653 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "compute_resources" released by "update_usage" :: held 0.333s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:36:18.394 DEBUG nova.compute.utils [req-d6e42899-63d5-4b95-87ca-f44003219653 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Using /dev/xvd instead of None get_next_device_name /opt/stack/new/nova/nova/compute/utils.py:163 2015-08-07 17:36:18.398 13318 DEBUG nova.compute.manager [-] [instance: 6b6cb831-76dd-4428-ab66-c020997bc153] Allocating IP information in the background. 
_allocate_network_async /opt/stack/new/nova/nova/compute/manager.py:1513 2015-08-07 17:36:18.400 13318 DEBUG oslo_concurrency.lockutils [-] Acquired semaphore "refresh_cache-6b6cb831-76dd-4428-ab66-c020997bc153" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:197 2015-08-07 17:36:18.425 DEBUG nova.virt.xenapi.vmops [req-d019d9ea-acd9-4753-8cda-9de0536965d0 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] [instance: c864db01-8fad-4d62-9c8d-652d30dbaf6e] Skip agent setup, not enabled. _configure_new_instance_with_agent /opt/stack/new/nova/nova/virt/xenapi/vmops.py:788 2015-08-07 17:36:18.426 DEBUG nova.virt.xenapi.vmops [req-d019d9ea-acd9-4753-8cda-9de0536965d0 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] [instance: c864db01-8fad-4d62-9c8d-652d30dbaf6e] Removing hostname from xenstore _remove_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1949 2015-08-07 17:36:18.427 DEBUG oslo_concurrency.lockutils [req-d019d9ea-acd9-4753-8cda-9de0536965d0 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] Lock "xenstore-c864db01-8fad-4d62-9c8d-652d30dbaf6e" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:36:18.634 DEBUG oslo_concurrency.lockutils [req-d019d9ea-acd9-4753-8cda-9de0536965d0 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] Lock "xenstore-c864db01-8fad-4d62-9c8d-652d30dbaf6e" released by "update_hostname" :: held 0.207s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:36:18.635 DEBUG nova.virt.xenapi.vmops [req-d019d9ea-acd9-4753-8cda-9de0536965d0 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] [instance: c864db01-8fad-4d62-9c8d-652d30dbaf6e] Updating progress to 90 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:36:19.223 DEBUG nova.virt.xenapi.vmops [req-d019d9ea-acd9-4753-8cda-9de0536965d0 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] [instance: c864db01-8fad-4d62-9c8d-652d30dbaf6e] Updating progress to 100 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:36:19.515 DEBUG nova.compute.manager [req-d019d9ea-acd9-4753-8cda-9de0536965d0 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] [instance: c864db01-8fad-4d62-9c8d-652d30dbaf6e] Checking state _get_power_state /opt/stack/new/nova/nova/compute/manager.py:1296 2015-08-07 17:36:19.520 DEBUG nova.virt.xenapi.vmops [req-d6e42899-63d5-4b95-87ca-f44003219653 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 6b6cb831-76dd-4428-ab66-c020997bc153] Block device information present: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} spawn /opt/stack/new/nova/nova/virt/xenapi/vmops.py:408 2015-08-07 17:36:19.564 DEBUG nova.virt.xenapi.vm_utils [req-d6e42899-63d5-4b95-87ca-f44003219653 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Detected vhd format for image a69d8d55-1745-492f-be26-cc75d64fc94d determine_disk_image_type /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1633 2015-08-07 17:36:19.564 DEBUG nova.virt.xenapi.vmops [req-d6e42899-63d5-4b95-87ca-f44003219653 
tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 6b6cb831-76dd-4428-ab66-c020997bc153] Updating progress to 10 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:36:19.870 DEBUG oslo_concurrency.lockutils [req-bfded4dc-6b96-4800-91c6-d96cad32016a tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_plug" :: held 3.600s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:36:19.871 DEBUG nova.virt.xenapi.vm_utils [req-bfded4dc-6b96-4800-91c6-d96cad32016a tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Plugging VBD OpaqueRef:ca294090-94ea-e641-dca7-c0f8734c033d done. vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2194 2015-08-07 17:36:19.878 DEBUG nova.virt.xenapi.vm_utils [req-bfded4dc-6b96-4800-91c6-d96cad32016a tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] VBD OpaqueRef:ca294090-94ea-e641-dca7-c0f8734c033d plugged as xvdd vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2197 2015-08-07 17:36:20.001 DEBUG nova.virt.xenapi.vmops [req-d6e42899-63d5-4b95-87ca-f44003219653 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 6b6cb831-76dd-4428-ab66-c020997bc153] block device info: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} _connect_cinder_volumes /opt/stack/new/nova/nova/virt/xenapi/vmops.py:376 2015-08-07 17:36:20.040 DEBUG oslo_concurrency.lockutils [req-d6e42899-63d5-4b95-87ca-f44003219653 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" acquired by "_create_cached_image_impl" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:36:20.049 WARNING nova.virt.configdrive [req-bfded4dc-6b96-4800-91c6-d96cad32016a tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] The setting "always" will be deprecated in the Liberty version. 
Please use "True" instead 2015-08-07 17:36:20.052 DEBUG nova.objects.instance [req-bfded4dc-6b96-4800-91c6-d96cad32016a tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Lazy-loading `ec2_ids' on Instance uuid 717eb8d1-816f-4ae1-9e4e-fecf5e9205ae obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:36:20.120 DEBUG oslo_concurrency.processutils [req-bfded4dc-6b96-4800-91c6-d96cad32016a tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Running cmd (subprocess): genisoimage -o /tmp/tmpAWEF1J/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmpRXU1Md execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:36:20.298 DEBUG oslo_concurrency.lockutils [req-d019d9ea-acd9-4753-8cda-9de0536965d0 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] Lock "c864db01-8fad-4d62-9c8d-652d30dbaf6e" released by "_locked_do_build_and_run_instance" :: held 42.532s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:36:20.379 DEBUG oslo_concurrency.processutils [req-bfded4dc-6b96-4800-91c6-d96cad32016a tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] CMD "genisoimage -o /tmp/tmpAWEF1J/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmpRXU1Md" returned: 0 in 0.259s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:36:20.383 DEBUG oslo_concurrency.processutils [req-bfded4dc-6b96-4800-91c6-d96cad32016a tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmpAWEF1J/configdrive of=/dev/xvdd oflag=direct,sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:36:22.083 13318 DEBUG nova.network.base_api [-] [instance: 6b6cb831-76dd-4428-ab66-c020997bc153] Updating instance_info_cache with network_info: [VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.4'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:d9:f1:f2', 'active': False, 'type': u'bridge', 'id': u'08d3460b-1516-42ae-bf85-a866a4828b0d', 'qbg_params': None})] update_instance_cache_with_nw_info /opt/stack/new/nova/nova/network/base_api.py:43 2015-08-07 17:36:22.123 13318 DEBUG oslo_concurrency.lockutils [-] Releasing semaphore "refresh_cache-6b6cb831-76dd-4428-ab66-c020997bc153" lock 
/usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:210 2015-08-07 17:36:22.124 13318 DEBUG nova.compute.manager [-] [instance: 6b6cb831-76dd-4428-ab66-c020997bc153] Instance network_info: |[VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.4'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:d9:f1:f2', 'active': False, 'type': u'bridge', 'id': u'08d3460b-1516-42ae-bf85-a866a4828b0d', 'qbg_params': None})]| _allocate_network_async /opt/stack/new/nova/nova/compute/manager.py:1531 2015-08-07 17:36:23.053 DEBUG oslo_concurrency.lockutils [req-feda2467-3374-49a7-a38a-c85129ab9007 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] Lock "f5ce5302-4adb-4ac6-be2f-b4371f1d3b62" acquired by "do_stop_instance" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:36:23.054 DEBUG nova.compute.manager [req-feda2467-3374-49a7-a38a-c85129ab9007 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] [instance: f5ce5302-4adb-4ac6-be2f-b4371f1d3b62] Checking state _get_power_state /opt/stack/new/nova/nova/compute/manager.py:1296 2015-08-07 17:36:23.073 DEBUG nova.compute.manager [req-feda2467-3374-49a7-a38a-c85129ab9007 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] [instance: f5ce5302-4adb-4ac6-be2f-b4371f1d3b62] Stopping instance; current vm_state: active, current task_state: powering-off, current DB power_state: 1, current VM power_state: 1 do_stop_instance /opt/stack/new/nova/nova/compute/manager.py:2404 2015-08-07 17:36:23.082 DEBUG nova.virt.xenapi.vm_utils [req-feda2467-3374-49a7-a38a-c85129ab9007 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] [instance: f5ce5302-4adb-4ac6-be2f-b4371f1d3b62] Shutting down VM (hard) hard_shutdown_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:348 2015-08-07 17:36:23.960 DEBUG nova.virt.xenapi.vm_utils [req-d6e42899-63d5-4b95-87ca-f44003219653 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Cloned VDI OpaqueRef:5f334e83-b2bf-a96f-6e05-6dacd9048cbc from VDI OpaqueRef:f392c64e-7886-a0cc-d3a0-780674167307 _clone_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:571 2015-08-07 17:36:24.983 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.94 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:36:27.874 DEBUG oslo_concurrency.lockutils [req-d6e42899-63d5-4b95-87ca-f44003219653 tempest-DeleteServersTestJSON-1440721679 
tempest-DeleteServersTestJSON-285208709] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" released by "_create_cached_image_impl" :: held 7.833s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:36:27.875 INFO nova.virt.xenapi.vm_utils [req-d6e42899-63d5-4b95-87ca-f44003219653 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Image creation data, cacheable: True, downloaded: False duration: 7.87 secs for image a69d8d55-1745-492f-be26-cc75d64fc94d 2015-08-07 17:36:30.684 DEBUG nova.virt.xenapi.vmops [req-d6e42899-63d5-4b95-87ca-f44003219653 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 6b6cb831-76dd-4428-ab66-c020997bc153] Updating progress to 20 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:36:31.192 DEBUG nova.virt.xenapi.vmops [req-d6e42899-63d5-4b95-87ca-f44003219653 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 6b6cb831-76dd-4428-ab66-c020997bc153] Updating progress to 30 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:36:31.440 DEBUG nova.virt.xenapi.vmops [req-d6e42899-63d5-4b95-87ca-f44003219653 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 6b6cb831-76dd-4428-ab66-c020997bc153] Using PV kernel: True _create_vm_record /opt/stack/new/nova/nova/virt/xenapi/vmops.py:679 2015-08-07 17:36:31.452 DEBUG nova.virt.xenapi.vm_utils [req-d6e42899-63d5-4b95-87ca-f44003219653 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 6b6cb831-76dd-4428-ab66-c020997bc153] Created VM create_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:312 2015-08-07 17:36:31.453 DEBUG nova.virt.xenapi.vmops [req-d6e42899-63d5-4b95-87ca-f44003219653 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 6b6cb831-76dd-4428-ab66-c020997bc153] Updating progress to 40 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:36:31.724 DEBUG oslo_concurrency.processutils [req-bfded4dc-6b96-4800-91c6-d96cad32016a tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmp1JuBSM/configdrive of=/dev/xvdc oflag=direct,sync" returned: 0 in 15.000s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:36:31.742 DEBUG oslo_concurrency.processutils [req-bfded4dc-6b96-4800-91c6-d96cad32016a tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:36:31.876 DEBUG nova.virt.xenapi.vm_utils [req-d6e42899-63d5-4b95-87ca-f44003219653 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Creating disk-type VBD for VM OpaqueRef:d81bd2dd-d2f3-9a2f-2083-66708d2b5c74, VDI OpaqueRef:5f334e83-b2bf-a96f-6e05-6dacd9048cbc ... 
create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:36:31.890 DEBUG nova.virt.xenapi.vm_utils [req-d6e42899-63d5-4b95-87ca-f44003219653 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Created VBD OpaqueRef:4b322421-8d85-40e8-8e41-4f99b518de5a for VM OpaqueRef:d81bd2dd-d2f3-9a2f-2083-66708d2b5c74, VDI OpaqueRef:5f334e83-b2bf-a96f-6e05-6dacd9048cbc. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:36:32.247 DEBUG nova.compute.manager [req-feda2467-3374-49a7-a38a-c85129ab9007 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] [instance: f5ce5302-4adb-4ac6-be2f-b4371f1d3b62] Checking state _get_power_state /opt/stack/new/nova/nova/compute/manager.py:1296 2015-08-07 17:36:32.506 DEBUG oslo_concurrency.processutils [req-bfded4dc-6b96-4800-91c6-d96cad32016a tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf sync" returned: 0 in 0.764s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:36:32.507 DEBUG nova.virt.xenapi.vm_utils [req-bfded4dc-6b96-4800-91c6-d96cad32016a tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Destroying VBD for VDI OpaqueRef:2867d912-ebd1-5276-0014-8cf8807f0b8e ... vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2207 2015-08-07 17:36:32.508 DEBUG oslo_concurrency.lockutils [req-bfded4dc-6b96-4800-91c6-d96cad32016a tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_unplug" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:36:32.636 DEBUG oslo_concurrency.lockutils [req-feda2467-3374-49a7-a38a-c85129ab9007 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] Lock "f5ce5302-4adb-4ac6-be2f-b4371f1d3b62" released by "do_stop_instance" :: held 9.583s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:36:33.727 DEBUG nova.virt.xenapi.vm_utils [req-d6e42899-63d5-4b95-87ca-f44003219653 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Created VDI OpaqueRef:f1428073-bd2a-2f00-f95d-432ca6af043e (config-2, 67108864, False) on OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99. create_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:509 2015-08-07 17:36:33.741 DEBUG nova.virt.xenapi.vm_utils [req-d6e42899-63d5-4b95-87ca-f44003219653 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Creating disk-type VBD for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:f1428073-bd2a-2f00-f95d-432ca6af043e ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:36:33.755 DEBUG nova.virt.xenapi.vm_utils [req-d6e42899-63d5-4b95-87ca-f44003219653 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Created VBD OpaqueRef:b05d624f-2799-d27b-a413-e37d37e63bf1 for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:f1428073-bd2a-2f00-f95d-432ca6af043e. 
create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:36:33.756 DEBUG nova.virt.xenapi.vm_utils [req-d6e42899-63d5-4b95-87ca-f44003219653 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Plugging VBD OpaqueRef:b05d624f-2799-d27b-a413-e37d37e63bf1 ... vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2191 2015-08-07 17:36:33.933 DEBUG oslo_concurrency.lockutils [req-1a5e92ea-a4c2-4474-9c7b-ee1e785fb6e7 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] Acquired semaphore "refresh_cache-f5ce5302-4adb-4ac6-be2f-b4371f1d3b62" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:197 2015-08-07 17:36:34.122 DEBUG nova.network.base_api [req-1a5e92ea-a4c2-4474-9c7b-ee1e785fb6e7 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] [instance: f5ce5302-4adb-4ac6-be2f-b4371f1d3b62] Updating instance_info_cache with network_info: [VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.5'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:6a:1d:d3', 'active': False, 'type': u'bridge', 'id': u'7a7699f1-6340-4bf7-8a9a-8756937921d4', 'qbg_params': None})] update_instance_cache_with_nw_info /opt/stack/new/nova/nova/network/base_api.py:43 2015-08-07 17:36:34.151 DEBUG oslo_concurrency.lockutils [req-1a5e92ea-a4c2-4474-9c7b-ee1e785fb6e7 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] Releasing semaphore "refresh_cache-f5ce5302-4adb-4ac6-be2f-b4371f1d3b62" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:210 2015-08-07 17:36:34.189 DEBUG nova.virt.xenapi.vmops [req-1a5e92ea-a4c2-4474-9c7b-ee1e785fb6e7 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] [instance: f5ce5302-4adb-4ac6-be2f-b4371f1d3b62] Starting instance _start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:327 2015-08-07 17:36:35.002 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.92 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:36:35.047 DEBUG oslo_concurrency.lockutils [req-bfded4dc-6b96-4800-91c6-d96cad32016a tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_unplug" :: held 2.539s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:36:35.049 DEBUG oslo_concurrency.lockutils [req-d6e42899-63d5-4b95-87ca-f44003219653 
tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_plug" :: waited 1.292s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:36:35.082 DEBUG nova.virt.xenapi.vm_utils [req-bfded4dc-6b96-4800-91c6-d96cad32016a tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Destroying VBD for VDI OpaqueRef:2867d912-ebd1-5276-0014-8cf8807f0b8e done. vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2215 2015-08-07 17:36:35.083 DEBUG nova.virt.xenapi.vm_utils [req-bfded4dc-6b96-4800-91c6-d96cad32016a tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Creating disk-type VBD for VM OpaqueRef:3d0fb519-9c8a-a2fa-d15f-ce53142b25cc, VDI OpaqueRef:2867d912-ebd1-5276-0014-8cf8807f0b8e ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:36:35.101 DEBUG nova.virt.xenapi.vm_utils [req-bfded4dc-6b96-4800-91c6-d96cad32016a tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Created VBD OpaqueRef:f89cbdee-e4b2-1952-2335-58d9f36a7312 for VM OpaqueRef:3d0fb519-9c8a-a2fa-d15f-ce53142b25cc, VDI OpaqueRef:2867d912-ebd1-5276-0014-8cf8807f0b8e. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:36:35.102 DEBUG nova.objects.instance [req-bfded4dc-6b96-4800-91c6-d96cad32016a tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Lazy-loading `pci_devices' on Instance uuid 5ac604dd-d88a-4813-8a3b-bd37a907eee7 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:36:35.249 DEBUG nova.virt.xenapi.vmops [req-bfded4dc-6b96-4800-91c6-d96cad32016a tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] [instance: 5ac604dd-d88a-4813-8a3b-bd37a907eee7] Updating progress to 50 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:36:35.558 DEBUG oslo_concurrency.lockutils [req-bfded4dc-6b96-4800-91c6-d96cad32016a tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Lock "xenstore-5ac604dd-d88a-4813-8a3b-bd37a907eee7" acquired by "store_meta" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:36:35.559 DEBUG oslo_concurrency.lockutils [req-bfded4dc-6b96-4800-91c6-d96cad32016a tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Lock "xenstore-5ac604dd-d88a-4813-8a3b-bd37a907eee7" released by "store_meta" :: held 0.001s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:36:35.559 DEBUG oslo_concurrency.lockutils [req-bfded4dc-6b96-4800-91c6-d96cad32016a tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Lock "xenstore-5ac604dd-d88a-4813-8a3b-bd37a907eee7" acquired by "store_auto_disk_config" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:36:35.593 DEBUG oslo_concurrency.lockutils [req-bfded4dc-6b96-4800-91c6-d96cad32016a tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Lock "xenstore-5ac604dd-d88a-4813-8a3b-bd37a907eee7" released by "store_auto_disk_config" :: held 0.033s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:36:35.593 DEBUG 
nova.virt.xenapi.vmops [req-bfded4dc-6b96-4800-91c6-d96cad32016a tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] [instance: 5ac604dd-d88a-4813-8a3b-bd37a907eee7] Injecting hostname (tempest-multiple-create-test-1434196113-1) into xenstore _inject_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1940 2015-08-07 17:36:35.594 DEBUG oslo_concurrency.lockutils [req-bfded4dc-6b96-4800-91c6-d96cad32016a tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Lock "xenstore-5ac604dd-d88a-4813-8a3b-bd37a907eee7" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:36:35.605 DEBUG oslo_concurrency.lockutils [req-bfded4dc-6b96-4800-91c6-d96cad32016a tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Lock "xenstore-5ac604dd-d88a-4813-8a3b-bd37a907eee7" released by "update_hostname" :: held 0.011s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:36:35.606 DEBUG nova.virt.xenapi.vmops [req-bfded4dc-6b96-4800-91c6-d96cad32016a tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] [instance: 5ac604dd-d88a-4813-8a3b-bd37a907eee7] Injecting network info to xenstore inject_network_info /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1869 2015-08-07 17:36:35.607 DEBUG oslo_concurrency.lockutils [req-bfded4dc-6b96-4800-91c6-d96cad32016a tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Lock "xenstore-5ac604dd-d88a-4813-8a3b-bd37a907eee7" acquired by "update_nwinfo" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:36:36.151 DEBUG oslo_concurrency.lockutils [req-bfded4dc-6b96-4800-91c6-d96cad32016a tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Lock "xenstore-5ac604dd-d88a-4813-8a3b-bd37a907eee7" released by "update_nwinfo" :: held 0.545s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:36:36.152 DEBUG nova.virt.xenapi.vmops [req-bfded4dc-6b96-4800-91c6-d96cad32016a tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] [instance: 5ac604dd-d88a-4813-8a3b-bd37a907eee7] Updating progress to 60 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:36:36.312 DEBUG oslo_concurrency.processutils [req-bfded4dc-6b96-4800-91c6-d96cad32016a tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmpAWEF1J/configdrive of=/dev/xvdd oflag=direct,sync" returned: 0 in 15.929s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:36:36.314 DEBUG oslo_concurrency.processutils [req-bfded4dc-6b96-4800-91c6-d96cad32016a tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:36:36.505 DEBUG nova.virt.xenapi.vmops [req-bfded4dc-6b96-4800-91c6-d96cad32016a tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] [instance: 5ac604dd-d88a-4813-8a3b-bd37a907eee7] Creating vifs _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1892 2015-08-07 17:36:36.529 DEBUG 
nova.virt.xenapi.vmops [req-bfded4dc-6b96-4800-91c6-d96cad32016a tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] [instance: 5ac604dd-d88a-4813-8a3b-bd37a907eee7] Creating VIF for network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1902 2015-08-07 17:36:36.587 DEBUG nova.virt.xenapi.vmops [req-bfded4dc-6b96-4800-91c6-d96cad32016a tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] [instance: 5ac604dd-d88a-4813-8a3b-bd37a907eee7] Created VIF OpaqueRef:9c128167-30d7-eb33-c5be-066084247c23, network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1906 2015-08-07 17:36:36.587 DEBUG nova.virt.xenapi.vmops [req-bfded4dc-6b96-4800-91c6-d96cad32016a tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] [instance: 5ac604dd-d88a-4813-8a3b-bd37a907eee7] Updating progress to 70 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:36:36.935 DEBUG nova.virt.xenapi.vmops [req-bfded4dc-6b96-4800-91c6-d96cad32016a tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] [instance: 5ac604dd-d88a-4813-8a3b-bd37a907eee7] Starting instance _start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:327 2015-08-07 17:36:36.937 DEBUG oslo_concurrency.processutils [req-bfded4dc-6b96-4800-91c6-d96cad32016a tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf sync" returned: 0 in 0.623s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:36:36.938 DEBUG nova.virt.xenapi.vm_utils [req-bfded4dc-6b96-4800-91c6-d96cad32016a tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Destroying VBD for VDI OpaqueRef:5c5c397b-ab79-159e-4e62-1a8c5f647fde ... 
vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2207 2015-08-07 17:36:37.520 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:36:37.521 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:36:37.522 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:36:37.523 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Starting heal instance info cache _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5315 2015-08-07 17:36:37.657 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Acquired semaphore "refresh_cache-5722ce9b-957b-4a66-bb52-a0f639736797" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:197 2015-08-07 17:36:37.880 DEBUG nova.network.base_api [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: 5722ce9b-957b-4a66-bb52-a0f639736797] Updating instance_info_cache with network_info: [VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.2'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:e0:2f:08', 'active': False, 'type': u'bridge', 'id': u'68b7a8ad-49fa-46ce-a873-dfe17b6bae9c', 'qbg_params': None})] update_instance_cache_with_nw_info /opt/stack/new/nova/nova/network/base_api.py:43 2015-08-07 17:36:37.907 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Releasing semaphore "refresh_cache-5722ce9b-957b-4a66-bb52-a0f639736797" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:210 2015-08-07 17:36:37.908 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: 5722ce9b-957b-4a66-bb52-a0f639736797] Updated the network info_cache for instance _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5376 2015-08-07 17:36:37.908 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 1.99 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:36:38.432 DEBUG 
oslo_concurrency.lockutils [req-d6e42899-63d5-4b95-87ca-f44003219653 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_plug" :: held 3.384s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:36:38.433 DEBUG nova.virt.xenapi.vm_utils [req-d6e42899-63d5-4b95-87ca-f44003219653 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Plugging VBD OpaqueRef:b05d624f-2799-d27b-a413-e37d37e63bf1 done. vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2194 2015-08-07 17:36:38.435 DEBUG oslo_concurrency.lockutils [req-bfded4dc-6b96-4800-91c6-d96cad32016a tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_unplug" :: waited 1.496s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:36:38.440 DEBUG nova.virt.xenapi.vm_utils [req-d6e42899-63d5-4b95-87ca-f44003219653 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] VBD OpaqueRef:b05d624f-2799-d27b-a413-e37d37e63bf1 plugged as xvde vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2197 2015-08-07 17:36:38.559 WARNING nova.virt.configdrive [req-d6e42899-63d5-4b95-87ca-f44003219653 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] The setting "always" will be deprecated in the Liberty version. Please use "True" instead 2015-08-07 17:36:38.560 DEBUG nova.objects.instance [req-d6e42899-63d5-4b95-87ca-f44003219653 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lazy-loading `ec2_ids' on Instance uuid 6b6cb831-76dd-4428-ab66-c020997bc153 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:36:38.612 DEBUG oslo_concurrency.processutils [req-d6e42899-63d5-4b95-87ca-f44003219653 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Running cmd (subprocess): genisoimage -o /tmp/tmpXQtjHC/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmpk0Wz_b execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:36:38.984 DEBUG oslo_concurrency.processutils [req-d6e42899-63d5-4b95-87ca-f44003219653 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] CMD "genisoimage -o /tmp/tmpXQtjHC/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmpk0Wz_b" returned: 0 in 0.371s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:36:38.990 DEBUG oslo_concurrency.processutils [req-d6e42899-63d5-4b95-87ca-f44003219653 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmpXQtjHC/configdrive of=/dev/xvde oflag=direct,sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:36:39.900 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:36:39.905 DEBUG 
oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 1.62 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:36:41.526 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:36:41.527 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /opt/stack/new/nova/nova/compute/manager.py:5976 2015-08-07 17:36:41.527 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 2.99 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:36:42.632 DEBUG oslo_concurrency.lockutils [req-bfded4dc-6b96-4800-91c6-d96cad32016a tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_unplug" :: held 4.198s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:36:42.750 DEBUG nova.virt.xenapi.vm_utils [req-bfded4dc-6b96-4800-91c6-d96cad32016a tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Destroying VBD for VDI OpaqueRef:5c5c397b-ab79-159e-4e62-1a8c5f647fde done. vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2215 2015-08-07 17:36:42.751 DEBUG nova.virt.xenapi.vm_utils [req-bfded4dc-6b96-4800-91c6-d96cad32016a tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Creating disk-type VBD for VM OpaqueRef:b6f24efe-f5c4-008f-b735-45e1996b5d27, VDI OpaqueRef:5c5c397b-ab79-159e-4e62-1a8c5f647fde ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:36:42.766 DEBUG nova.virt.xenapi.vm_utils [req-bfded4dc-6b96-4800-91c6-d96cad32016a tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Created VBD OpaqueRef:2c479af3-8ddb-ac95-631c-caa40755d7e4 for VM OpaqueRef:b6f24efe-f5c4-008f-b735-45e1996b5d27, VDI OpaqueRef:5c5c397b-ab79-159e-4e62-1a8c5f647fde. 
create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:36:42.767 DEBUG nova.objects.instance [req-bfded4dc-6b96-4800-91c6-d96cad32016a tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Lazy-loading `pci_devices' on Instance uuid 717eb8d1-816f-4ae1-9e4e-fecf5e9205ae obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:36:42.924 DEBUG nova.virt.xenapi.vmops [req-bfded4dc-6b96-4800-91c6-d96cad32016a tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] [instance: 717eb8d1-816f-4ae1-9e4e-fecf5e9205ae] Updating progress to 50 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:36:43.394 DEBUG oslo_concurrency.lockutils [req-bfded4dc-6b96-4800-91c6-d96cad32016a tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Lock "xenstore-717eb8d1-816f-4ae1-9e4e-fecf5e9205ae" acquired by "store_meta" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:36:43.395 DEBUG oslo_concurrency.lockutils [req-bfded4dc-6b96-4800-91c6-d96cad32016a tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Lock "xenstore-717eb8d1-816f-4ae1-9e4e-fecf5e9205ae" released by "store_meta" :: held 0.001s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:36:43.395 DEBUG oslo_concurrency.lockutils [req-bfded4dc-6b96-4800-91c6-d96cad32016a tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Lock "xenstore-717eb8d1-816f-4ae1-9e4e-fecf5e9205ae" acquired by "store_auto_disk_config" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:36:43.407 DEBUG oslo_concurrency.lockutils [req-bfded4dc-6b96-4800-91c6-d96cad32016a tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Lock "xenstore-717eb8d1-816f-4ae1-9e4e-fecf5e9205ae" released by "store_auto_disk_config" :: held 0.012s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:36:43.408 DEBUG nova.virt.xenapi.vmops [req-bfded4dc-6b96-4800-91c6-d96cad32016a tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] [instance: 717eb8d1-816f-4ae1-9e4e-fecf5e9205ae] Injecting hostname (tempest-multiple-create-test-1434196113-2) into xenstore _inject_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1940 2015-08-07 17:36:43.409 DEBUG oslo_concurrency.lockutils [req-bfded4dc-6b96-4800-91c6-d96cad32016a tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Lock "xenstore-717eb8d1-816f-4ae1-9e4e-fecf5e9205ae" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:36:43.418 DEBUG oslo_concurrency.lockutils [req-bfded4dc-6b96-4800-91c6-d96cad32016a tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Lock "xenstore-717eb8d1-816f-4ae1-9e4e-fecf5e9205ae" released by "update_hostname" :: held 0.009s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:36:43.419 DEBUG nova.virt.xenapi.vmops [req-bfded4dc-6b96-4800-91c6-d96cad32016a tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] [instance: 717eb8d1-816f-4ae1-9e4e-fecf5e9205ae] Injecting network info to xenstore inject_network_info 
/opt/stack/new/nova/nova/virt/xenapi/vmops.py:1869 2015-08-07 17:36:43.419 DEBUG oslo_concurrency.lockutils [req-bfded4dc-6b96-4800-91c6-d96cad32016a tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Lock "xenstore-717eb8d1-816f-4ae1-9e4e-fecf5e9205ae" acquired by "update_nwinfo" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:36:44.520 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:36:44.520 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 1.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:36:44.830 DEBUG oslo_concurrency.lockutils [req-bfded4dc-6b96-4800-91c6-d96cad32016a tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Lock "xenstore-717eb8d1-816f-4ae1-9e4e-fecf5e9205ae" released by "update_nwinfo" :: held 1.410s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:36:44.831 DEBUG nova.virt.xenapi.vmops [req-bfded4dc-6b96-4800-91c6-d96cad32016a tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] [instance: 717eb8d1-816f-4ae1-9e4e-fecf5e9205ae] Updating progress to 60 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:36:44.979 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.95 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:36:45.083 DEBUG nova.virt.xenapi.vmops [req-bfded4dc-6b96-4800-91c6-d96cad32016a tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] [instance: 717eb8d1-816f-4ae1-9e4e-fecf5e9205ae] Creating vifs _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1892 2015-08-07 17:36:45.092 DEBUG nova.virt.xenapi.vmops [req-bfded4dc-6b96-4800-91c6-d96cad32016a tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] [instance: 717eb8d1-816f-4ae1-9e4e-fecf5e9205ae] Creating VIF for network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1902 2015-08-07 17:36:45.100 DEBUG nova.virt.xenapi.vmops [req-bfded4dc-6b96-4800-91c6-d96cad32016a tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] [instance: 717eb8d1-816f-4ae1-9e4e-fecf5e9205ae] Created VIF OpaqueRef:90d6f872-068a-bc47-c7d3-62794e3eaf98, network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1906 2015-08-07 17:36:45.101 DEBUG nova.virt.xenapi.vmops [req-bfded4dc-6b96-4800-91c6-d96cad32016a tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] [instance: 717eb8d1-816f-4ae1-9e4e-fecf5e9205ae] Updating progress to 70 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:36:45.402 DEBUG nova.virt.xenapi.vmops [req-bfded4dc-6b96-4800-91c6-d96cad32016a tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] [instance: 717eb8d1-816f-4ae1-9e4e-fecf5e9205ae] Starting instance _start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:327 2015-08-07 17:36:45.528 
DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:36:45.529 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 0.96 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:36:46.494 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:36:46.699 WARNING nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] While synchronizing instance power states, found 8 instances in the database and 5 instances on the hypervisor. 2015-08-07 17:36:46.700 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Triggering sync for uuid f5ce5302-4adb-4ac6-be2f-b4371f1d3b62 _sync_power_states /opt/stack/new/nova/nova/compute/manager.py:5780 2015-08-07 17:36:46.701 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Triggering sync for uuid 5722ce9b-957b-4a66-bb52-a0f639736797 _sync_power_states /opt/stack/new/nova/nova/compute/manager.py:5780 2015-08-07 17:36:46.701 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Triggering sync for uuid e9775bc6-34f1-465c-9ea5-54d4b3d5a076 _sync_power_states /opt/stack/new/nova/nova/compute/manager.py:5780 2015-08-07 17:36:46.702 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Triggering sync for uuid 962e3fc3-68c5-4019-beb9-2b9f939eb511 _sync_power_states /opt/stack/new/nova/nova/compute/manager.py:5780 2015-08-07 17:36:46.702 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Triggering sync for uuid c864db01-8fad-4d62-9c8d-652d30dbaf6e _sync_power_states /opt/stack/new/nova/nova/compute/manager.py:5780 2015-08-07 17:36:46.703 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Triggering sync for uuid 5ac604dd-d88a-4813-8a3b-bd37a907eee7 _sync_power_states /opt/stack/new/nova/nova/compute/manager.py:5780 2015-08-07 17:36:46.703 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Triggering sync for uuid 717eb8d1-816f-4ae1-9e4e-fecf5e9205ae _sync_power_states /opt/stack/new/nova/nova/compute/manager.py:5780 2015-08-07 17:36:46.703 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Triggering sync for uuid 6b6cb831-76dd-4428-ab66-c020997bc153 _sync_power_states /opt/stack/new/nova/nova/compute/manager.py:5780 2015-08-07 17:36:46.704 13318 DEBUG oslo_concurrency.lockutils [-] Lock "f5ce5302-4adb-4ac6-be2f-b4371f1d3b62" acquired by "query_driver_power_state_and_sync" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:36:46.705 13318 INFO nova.compute.manager [-] [instance: f5ce5302-4adb-4ac6-be2f-b4371f1d3b62] During sync_power_state the instance has a pending task (powering-on). Skip. 
2015-08-07 17:36:46.705 13318 DEBUG oslo_concurrency.lockutils [-] Lock "f5ce5302-4adb-4ac6-be2f-b4371f1d3b62" released by "query_driver_power_state_and_sync" :: held 0.001s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:36:46.706 13318 DEBUG oslo_concurrency.lockutils [-] Lock "5722ce9b-957b-4a66-bb52-a0f639736797" acquired by "query_driver_power_state_and_sync" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:36:46.707 13318 DEBUG oslo_concurrency.lockutils [-] Lock "e9775bc6-34f1-465c-9ea5-54d4b3d5a076" acquired by "query_driver_power_state_and_sync" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:36:46.707 13318 DEBUG oslo_concurrency.lockutils [-] Lock "962e3fc3-68c5-4019-beb9-2b9f939eb511" acquired by "query_driver_power_state_and_sync" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:36:46.709 13318 DEBUG oslo_concurrency.lockutils [-] Lock "c864db01-8fad-4d62-9c8d-652d30dbaf6e" acquired by "query_driver_power_state_and_sync" :: waited 0.001s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:36:46.710 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 1.02 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:36:47.026 13318 DEBUG oslo_concurrency.lockutils [-] Lock "962e3fc3-68c5-4019-beb9-2b9f939eb511" released by "query_driver_power_state_and_sync" :: held 0.319s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:36:47.042 13318 DEBUG oslo_concurrency.lockutils [-] Lock "e9775bc6-34f1-465c-9ea5-54d4b3d5a076" released by "query_driver_power_state_and_sync" :: held 0.335s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:36:47.049 13318 DEBUG oslo_concurrency.lockutils [-] Lock "5722ce9b-957b-4a66-bb52-a0f639736797" released by "query_driver_power_state_and_sync" :: held 0.344s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:36:47.183 13318 DEBUG oslo_concurrency.lockutils [-] Lock "c864db01-8fad-4d62-9c8d-652d30dbaf6e" released by "query_driver_power_state_and_sync" :: held 0.475s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:36:47.779 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:36:47.848 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Auditing locally available compute resources for node localhost.localdomain 2015-08-07 17:36:47.849 DEBUG nova.virt.xenapi.host [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Updating host stats update_status /opt/stack/new/nova/nova/virt/xenapi/host.py:234 2015-08-07 17:36:49.043 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" acquired by "do_scan" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:36:49.044 DEBUG nova.virt.xenapi.vm_utils 
[req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Scanning SR OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99 do_scan /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1839 2015-08-07 17:36:50.388 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" released by "do_scan" :: held 1.345s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:36:50.920 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor: free VCPUs: -2 _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:524 2015-08-07 17:36:50.921 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor: assignable PCI devices: [] _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:532 2015-08-07 17:36:50.921 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor/Node resource view: name=localhost.localdomain free_ram=456MB free_disk=16GB free_vcpus=-2 pci_devices=[] _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:546 2015-08-07 17:36:50.922 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "compute_resources" acquired by "_update_available_resource" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:36:51.495 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Total usable vcpus: 8, total allocated vcpus: 10 2015-08-07 17:36:51.496 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Final resource view: name=localhost.localdomain phys_ram=8187MB used_ram=1129MB phys_disk=27GB used_disk=0GB total_vcpus=8 used_vcpus=10 pci_stats=None 2015-08-07 17:36:51.654 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Compute_service record updated for devstack:localhost.localdomain 2015-08-07 17:36:51.655 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "compute_resources" released by "_update_available_resource" :: held 0.733s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:36:51.655 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 5.74 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:36:53.328 DEBUG oslo_concurrency.processutils [req-d6e42899-63d5-4b95-87ca-f44003219653 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmpXQtjHC/configdrive of=/dev/xvde oflag=direct,sync" returned: 0 in 14.338s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:36:53.330 DEBUG oslo_concurrency.processutils [req-d6e42899-63d5-4b95-87ca-f44003219653 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:36:54.459 DEBUG oslo_concurrency.processutils [req-d6e42899-63d5-4b95-87ca-f44003219653 tempest-DeleteServersTestJSON-1440721679 
tempest-DeleteServersTestJSON-285208709] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf sync" returned: 0 in 1.129s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:36:54.464 DEBUG nova.virt.xenapi.vm_utils [req-d6e42899-63d5-4b95-87ca-f44003219653 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Destroying VBD for VDI OpaqueRef:f1428073-bd2a-2f00-f95d-432ca6af043e ... vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2207 2015-08-07 17:36:54.466 DEBUG oslo_concurrency.lockutils [req-d6e42899-63d5-4b95-87ca-f44003219653 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_unplug" :: waited 0.001s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:36:54.997 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.93 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:36:55.576 DEBUG nova.compute.manager [req-1a5e92ea-a4c2-4474-9c7b-ee1e785fb6e7 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] [instance: f5ce5302-4adb-4ac6-be2f-b4371f1d3b62] Checking state _get_power_state /opt/stack/new/nova/nova/compute/manager.py:1296 2015-08-07 17:36:57.396 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:36:57.397 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 40.12 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:36:57.709 DEBUG oslo_concurrency.lockutils [req-d6e42899-63d5-4b95-87ca-f44003219653 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_unplug" :: held 3.243s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:36:57.718 DEBUG nova.virt.xenapi.vm_utils [req-d6e42899-63d5-4b95-87ca-f44003219653 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Destroying VBD for VDI OpaqueRef:f1428073-bd2a-2f00-f95d-432ca6af043e done. vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2215 2015-08-07 17:36:57.719 DEBUG nova.virt.xenapi.vm_utils [req-d6e42899-63d5-4b95-87ca-f44003219653 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Creating disk-type VBD for VM OpaqueRef:d81bd2dd-d2f3-9a2f-2083-66708d2b5c74, VDI OpaqueRef:f1428073-bd2a-2f00-f95d-432ca6af043e ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:36:57.740 DEBUG nova.virt.xenapi.vm_utils [req-d6e42899-63d5-4b95-87ca-f44003219653 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Created VBD OpaqueRef:6a7b2785-397a-5565-8ac2-10d635888424 for VM OpaqueRef:d81bd2dd-d2f3-9a2f-2083-66708d2b5c74, VDI OpaqueRef:f1428073-bd2a-2f00-f95d-432ca6af043e. 
create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:36:57.741 DEBUG nova.objects.instance [req-d6e42899-63d5-4b95-87ca-f44003219653 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lazy-loading `pci_devices' on Instance uuid 6b6cb831-76dd-4428-ab66-c020997bc153 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:36:57.895 DEBUG nova.virt.xenapi.vmops [req-d6e42899-63d5-4b95-87ca-f44003219653 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 6b6cb831-76dd-4428-ab66-c020997bc153] Updating progress to 50 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:36:57.927 DEBUG nova.virt.xenapi.vmops [req-bfded4dc-6b96-4800-91c6-d96cad32016a tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] [instance: 5ac604dd-d88a-4813-8a3b-bd37a907eee7] Waiting for instance state to become running _wait_for_instance_to_start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:777 2015-08-07 17:36:58.013 DEBUG nova.virt.xenapi.vmops [req-bfded4dc-6b96-4800-91c6-d96cad32016a tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] [instance: 5ac604dd-d88a-4813-8a3b-bd37a907eee7] Updating progress to 80 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:36:58.384 DEBUG oslo_concurrency.lockutils [req-d6e42899-63d5-4b95-87ca-f44003219653 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "xenstore-6b6cb831-76dd-4428-ab66-c020997bc153" acquired by "store_meta" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:36:58.385 DEBUG oslo_concurrency.lockutils [req-d6e42899-63d5-4b95-87ca-f44003219653 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "xenstore-6b6cb831-76dd-4428-ab66-c020997bc153" released by "store_meta" :: held 0.001s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:36:58.385 DEBUG oslo_concurrency.lockutils [req-d6e42899-63d5-4b95-87ca-f44003219653 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "xenstore-6b6cb831-76dd-4428-ab66-c020997bc153" acquired by "store_auto_disk_config" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:36:58.398 DEBUG oslo_concurrency.lockutils [req-d6e42899-63d5-4b95-87ca-f44003219653 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "xenstore-6b6cb831-76dd-4428-ab66-c020997bc153" released by "store_auto_disk_config" :: held 0.013s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:36:58.400 DEBUG nova.virt.xenapi.vmops [req-d6e42899-63d5-4b95-87ca-f44003219653 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 6b6cb831-76dd-4428-ab66-c020997bc153] Injecting hostname (tempest.common.compute-instance-2112084964) into xenstore _inject_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1940 2015-08-07 17:36:58.401 DEBUG oslo_concurrency.lockutils [req-d6e42899-63d5-4b95-87ca-f44003219653 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "xenstore-6b6cb831-76dd-4428-ab66-c020997bc153" acquired by "update_hostname" :: waited 0.000s inner 
/usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:36:58.410 DEBUG oslo_concurrency.lockutils [req-d6e42899-63d5-4b95-87ca-f44003219653 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "xenstore-6b6cb831-76dd-4428-ab66-c020997bc153" released by "update_hostname" :: held 0.009s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:36:58.411 DEBUG nova.virt.xenapi.vmops [req-d6e42899-63d5-4b95-87ca-f44003219653 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 6b6cb831-76dd-4428-ab66-c020997bc153] Injecting network info to xenstore inject_network_info /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1869 2015-08-07 17:36:58.412 DEBUG oslo_concurrency.lockutils [req-d6e42899-63d5-4b95-87ca-f44003219653 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "xenstore-6b6cb831-76dd-4428-ab66-c020997bc153" acquired by "update_nwinfo" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:36:58.455 DEBUG nova.virt.xenapi.vmops [req-bfded4dc-6b96-4800-91c6-d96cad32016a tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] [instance: 5ac604dd-d88a-4813-8a3b-bd37a907eee7] Skip agent setup, not enabled. _configure_new_instance_with_agent /opt/stack/new/nova/nova/virt/xenapi/vmops.py:788 2015-08-07 17:36:58.455 DEBUG nova.virt.xenapi.vmops [req-bfded4dc-6b96-4800-91c6-d96cad32016a tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] [instance: 5ac604dd-d88a-4813-8a3b-bd37a907eee7] Removing hostname from xenstore _remove_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1949 2015-08-07 17:36:58.456 DEBUG oslo_concurrency.lockutils [req-bfded4dc-6b96-4800-91c6-d96cad32016a tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Lock "xenstore-5ac604dd-d88a-4813-8a3b-bd37a907eee7" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:36:58.465 DEBUG oslo_concurrency.lockutils [req-bfded4dc-6b96-4800-91c6-d96cad32016a tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Lock "xenstore-5ac604dd-d88a-4813-8a3b-bd37a907eee7" released by "update_hostname" :: held 0.008s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:36:58.465 DEBUG nova.virt.xenapi.vmops [req-bfded4dc-6b96-4800-91c6-d96cad32016a tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] [instance: 5ac604dd-d88a-4813-8a3b-bd37a907eee7] Updating progress to 90 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:36:58.942 DEBUG oslo_concurrency.lockutils [req-d6e42899-63d5-4b95-87ca-f44003219653 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "xenstore-6b6cb831-76dd-4428-ab66-c020997bc153" released by "update_nwinfo" :: held 0.531s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:36:58.943 DEBUG nova.virt.xenapi.vmops [req-d6e42899-63d5-4b95-87ca-f44003219653 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 6b6cb831-76dd-4428-ab66-c020997bc153] Updating progress to 60 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:36:58.987 
DEBUG nova.virt.xenapi.vmops [req-bfded4dc-6b96-4800-91c6-d96cad32016a tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] [instance: 5ac604dd-d88a-4813-8a3b-bd37a907eee7] Updating progress to 100 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:36:59.450 DEBUG nova.virt.xenapi.vmops [req-d6e42899-63d5-4b95-87ca-f44003219653 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 6b6cb831-76dd-4428-ab66-c020997bc153] Creating vifs _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1892 2015-08-07 17:36:59.468 DEBUG nova.virt.xenapi.vmops [req-d6e42899-63d5-4b95-87ca-f44003219653 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 6b6cb831-76dd-4428-ab66-c020997bc153] Creating VIF for network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1902 2015-08-07 17:36:59.478 DEBUG nova.virt.xenapi.vmops [req-d6e42899-63d5-4b95-87ca-f44003219653 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 6b6cb831-76dd-4428-ab66-c020997bc153] Created VIF OpaqueRef:c0bb9239-a927-56b0-7385-b35fa248468e, network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1906 2015-08-07 17:36:59.480 DEBUG nova.virt.xenapi.vmops [req-d6e42899-63d5-4b95-87ca-f44003219653 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 6b6cb831-76dd-4428-ab66-c020997bc153] Updating progress to 70 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:36:59.574 DEBUG oslo_concurrency.lockutils [req-0c4983a6-c192-4691-8151-efac6e228dec tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] Lock "f5ce5302-4adb-4ac6-be2f-b4371f1d3b62" acquired by "do_terminate_instance" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:36:59.575 DEBUG oslo_concurrency.lockutils [req-0c4983a6-c192-4691-8151-efac6e228dec tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] Lock "f5ce5302-4adb-4ac6-be2f-b4371f1d3b62-events" acquired by "_clear_events" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:36:59.576 DEBUG oslo_concurrency.lockutils [req-0c4983a6-c192-4691-8151-efac6e228dec tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] Lock "f5ce5302-4adb-4ac6-be2f-b4371f1d3b62-events" released by "_clear_events" :: held 0.001s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:36:59.578 INFO nova.compute.manager [req-0c4983a6-c192-4691-8151-efac6e228dec tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] [instance: f5ce5302-4adb-4ac6-be2f-b4371f1d3b62] Terminating instance 2015-08-07 17:36:59.581 INFO nova.virt.xenapi.vmops [req-0c4983a6-c192-4691-8151-efac6e228dec tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] [instance: f5ce5302-4adb-4ac6-be2f-b4371f1d3b62] Destroying VM 2015-08-07 17:36:59.599 DEBUG nova.virt.xenapi.vm_utils [req-0c4983a6-c192-4691-8151-efac6e228dec tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] [instance: f5ce5302-4adb-4ac6-be2f-b4371f1d3b62] Shutting down VM (hard) 
hard_shutdown_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:348 2015-08-07 17:36:59.798 DEBUG nova.compute.manager [req-bfded4dc-6b96-4800-91c6-d96cad32016a tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] [instance: 5ac604dd-d88a-4813-8a3b-bd37a907eee7] Checking state _get_power_state /opt/stack/new/nova/nova/compute/manager.py:1296 2015-08-07 17:36:59.906 DEBUG nova.virt.xenapi.vmops [req-d6e42899-63d5-4b95-87ca-f44003219653 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 6b6cb831-76dd-4428-ab66-c020997bc153] Starting instance _start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:327 2015-08-07 17:37:00.403 DEBUG oslo_concurrency.lockutils [req-50ee7af6-7ad1-4a67-b923-0ec4cc8d9ec1 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] Lock "5722ce9b-957b-4a66-bb52-a0f639736797" acquired by "do_terminate_instance" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:37:00.404 DEBUG oslo_concurrency.lockutils [req-50ee7af6-7ad1-4a67-b923-0ec4cc8d9ec1 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] Lock "5722ce9b-957b-4a66-bb52-a0f639736797-events" acquired by "_clear_events" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:37:00.405 DEBUG oslo_concurrency.lockutils [req-50ee7af6-7ad1-4a67-b923-0ec4cc8d9ec1 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] Lock "5722ce9b-957b-4a66-bb52-a0f639736797-events" released by "_clear_events" :: held 0.001s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:37:00.406 INFO nova.compute.manager [req-50ee7af6-7ad1-4a67-b923-0ec4cc8d9ec1 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] [instance: 5722ce9b-957b-4a66-bb52-a0f639736797] Terminating instance 2015-08-07 17:37:00.506 INFO nova.virt.xenapi.vmops [req-50ee7af6-7ad1-4a67-b923-0ec4cc8d9ec1 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] [instance: 5722ce9b-957b-4a66-bb52-a0f639736797] Destroying VM 2015-08-07 17:37:00.572 DEBUG nova.virt.xenapi.vm_utils [req-50ee7af6-7ad1-4a67-b923-0ec4cc8d9ec1 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] [instance: 5722ce9b-957b-4a66-bb52-a0f639736797] Shutting down VM (hard) hard_shutdown_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:348 2015-08-07 17:37:00.613 DEBUG oslo_concurrency.lockutils [req-bfded4dc-6b96-4800-91c6-d96cad32016a tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Lock "5ac604dd-d88a-4813-8a3b-bd37a907eee7" released by "_locked_do_build_and_run_instance" :: held 68.335s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:37:00.614 13318 DEBUG oslo_concurrency.lockutils [-] Lock "5ac604dd-d88a-4813-8a3b-bd37a907eee7" acquired by "query_driver_power_state_and_sync" :: waited 13.905s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:37:00.615 13318 INFO nova.compute.manager [-] [instance: 5ac604dd-d88a-4813-8a3b-bd37a907eee7] During sync_power_state the instance has a pending task (spawning). Skip. 
2015-08-07 17:37:00.615 13318 DEBUG oslo_concurrency.lockutils [-] Lock "5ac604dd-d88a-4813-8a3b-bd37a907eee7" released by "query_driver_power_state_and_sync" :: held 0.001s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:37:01.567 DEBUG oslo_concurrency.lockutils [req-7eb7d5c6-3048-4c4c-be8c-d6942da4f67a tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] Lock "c864db01-8fad-4d62-9c8d-652d30dbaf6e" acquired by "do_terminate_instance" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:37:01.568 DEBUG oslo_concurrency.lockutils [req-7eb7d5c6-3048-4c4c-be8c-d6942da4f67a tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] Lock "c864db01-8fad-4d62-9c8d-652d30dbaf6e-events" acquired by "_clear_events" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:37:01.568 DEBUG oslo_concurrency.lockutils [req-7eb7d5c6-3048-4c4c-be8c-d6942da4f67a tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] Lock "c864db01-8fad-4d62-9c8d-652d30dbaf6e-events" released by "_clear_events" :: held 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:37:01.570 INFO nova.compute.manager [req-7eb7d5c6-3048-4c4c-be8c-d6942da4f67a tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] [instance: c864db01-8fad-4d62-9c8d-652d30dbaf6e] Terminating instance 2015-08-07 17:37:01.573 INFO nova.virt.xenapi.vmops [req-7eb7d5c6-3048-4c4c-be8c-d6942da4f67a tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] [instance: c864db01-8fad-4d62-9c8d-652d30dbaf6e] Destroying VM 2015-08-07 17:37:01.627 DEBUG nova.virt.xenapi.vm_utils [req-7eb7d5c6-3048-4c4c-be8c-d6942da4f67a tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] [instance: c864db01-8fad-4d62-9c8d-652d30dbaf6e] Shutting down VM (hard) hard_shutdown_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:348 2015-08-07 17:37:05.177 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.76 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:37:07.125 DEBUG nova.virt.xenapi.vmops [req-bfded4dc-6b96-4800-91c6-d96cad32016a tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] [instance: 717eb8d1-816f-4ae1-9e4e-fecf5e9205ae] Waiting for instance state to become running _wait_for_instance_to_start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:777 2015-08-07 17:37:07.149 DEBUG nova.virt.xenapi.vmops [req-bfded4dc-6b96-4800-91c6-d96cad32016a tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] [instance: 717eb8d1-816f-4ae1-9e4e-fecf5e9205ae] Updating progress to 80 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:37:07.526 DEBUG nova.virt.xenapi.vmops [req-bfded4dc-6b96-4800-91c6-d96cad32016a tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] [instance: 717eb8d1-816f-4ae1-9e4e-fecf5e9205ae] Skip agent setup, not enabled. 
_configure_new_instance_with_agent /opt/stack/new/nova/nova/virt/xenapi/vmops.py:788 2015-08-07 17:37:07.526 DEBUG nova.virt.xenapi.vmops [req-bfded4dc-6b96-4800-91c6-d96cad32016a tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] [instance: 717eb8d1-816f-4ae1-9e4e-fecf5e9205ae] Removing hostname from xenstore _remove_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1949 2015-08-07 17:37:07.532 DEBUG oslo_concurrency.lockutils [req-bfded4dc-6b96-4800-91c6-d96cad32016a tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Lock "xenstore-717eb8d1-816f-4ae1-9e4e-fecf5e9205ae" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:37:07.539 DEBUG oslo_concurrency.lockutils [req-bfded4dc-6b96-4800-91c6-d96cad32016a tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Lock "xenstore-717eb8d1-816f-4ae1-9e4e-fecf5e9205ae" released by "update_hostname" :: held 0.007s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:37:07.540 DEBUG nova.virt.xenapi.vmops [req-bfded4dc-6b96-4800-91c6-d96cad32016a tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] [instance: 717eb8d1-816f-4ae1-9e4e-fecf5e9205ae] Updating progress to 90 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:37:08.043 DEBUG nova.virt.xenapi.vmops [req-bfded4dc-6b96-4800-91c6-d96cad32016a tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] [instance: 717eb8d1-816f-4ae1-9e4e-fecf5e9205ae] Updating progress to 100 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:37:08.433 DEBUG nova.compute.manager [req-bfded4dc-6b96-4800-91c6-d96cad32016a tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] [instance: 717eb8d1-816f-4ae1-9e4e-fecf5e9205ae] Checking state _get_power_state /opt/stack/new/nova/nova/compute/manager.py:1296 2015-08-07 17:37:09.001 DEBUG oslo_concurrency.lockutils [req-bfded4dc-6b96-4800-91c6-d96cad32016a tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Lock "717eb8d1-816f-4ae1-9e4e-fecf5e9205ae" released by "_locked_do_build_and_run_instance" :: held 76.562s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:37:09.003 13318 DEBUG oslo_concurrency.lockutils [-] Lock "717eb8d1-816f-4ae1-9e4e-fecf5e9205ae" acquired by "query_driver_power_state_and_sync" :: waited 22.294s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:37:09.003 13318 INFO nova.compute.manager [-] [instance: 717eb8d1-816f-4ae1-9e4e-fecf5e9205ae] During sync_power_state the instance has a pending task (spawning). Skip. 
2015-08-07 17:37:09.004 13318 DEBUG oslo_concurrency.lockutils [-] Lock "717eb8d1-816f-4ae1-9e4e-fecf5e9205ae" released by "query_driver_power_state_and_sync" :: held 0.001s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:37:10.066 DEBUG nova.virt.xenapi.vmops [req-0c4983a6-c192-4691-8151-efac6e228dec tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] [instance: f5ce5302-4adb-4ac6-be2f-b4371f1d3b62] Destroying VDIs _destroy_vdis /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1428 2015-08-07 17:37:10.107 DEBUG nova.virt.xenapi.vm_utils [req-0c4983a6-c192-4691-8151-efac6e228dec tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] VDI 333d2ec5-fd6a-4068-9c7d-70d2731640da is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:37:10.149 DEBUG nova.virt.xenapi.vm_utils [req-0c4983a6-c192-4691-8151-efac6e228dec tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] VDI 8a812837-a6fe-445b-9815-e9bc29d3f3c4 is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:37:11.485 DEBUG oslo_concurrency.lockutils [req-50606ec8-a668-4b1f-9db5-7399bad42325 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Lock "962e3fc3-68c5-4019-beb9-2b9f939eb511" acquired by "do_terminate_instance" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:37:11.486 DEBUG oslo_concurrency.lockutils [req-50606ec8-a668-4b1f-9db5-7399bad42325 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Lock "962e3fc3-68c5-4019-beb9-2b9f939eb511-events" acquired by "_clear_events" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:37:11.486 DEBUG oslo_concurrency.lockutils [req-50606ec8-a668-4b1f-9db5-7399bad42325 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Lock "962e3fc3-68c5-4019-beb9-2b9f939eb511-events" released by "_clear_events" :: held 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:37:11.488 INFO nova.compute.manager [req-50606ec8-a668-4b1f-9db5-7399bad42325 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] [instance: 962e3fc3-68c5-4019-beb9-2b9f939eb511] Terminating instance 2015-08-07 17:37:11.490 INFO nova.virt.xenapi.vmops [req-50606ec8-a668-4b1f-9db5-7399bad42325 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] [instance: 962e3fc3-68c5-4019-beb9-2b9f939eb511] Destroying VM 2015-08-07 17:37:11.508 DEBUG nova.virt.xenapi.vm_utils [req-50606ec8-a668-4b1f-9db5-7399bad42325 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] [instance: 962e3fc3-68c5-4019-beb9-2b9f939eb511] Shutting down VM (hard) hard_shutdown_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:348 2015-08-07 17:37:12.091 DEBUG oslo_concurrency.lockutils [req-c6802fe5-9366-4eb1-8b76-6c4f5320b445 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Lock "e9775bc6-34f1-465c-9ea5-54d4b3d5a076" acquired by "do_terminate_instance" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:37:12.093 DEBUG oslo_concurrency.lockutils [req-c6802fe5-9366-4eb1-8b76-6c4f5320b445 
tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Lock "e9775bc6-34f1-465c-9ea5-54d4b3d5a076-events" acquired by "_clear_events" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:37:12.093 DEBUG oslo_concurrency.lockutils [req-c6802fe5-9366-4eb1-8b76-6c4f5320b445 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Lock "e9775bc6-34f1-465c-9ea5-54d4b3d5a076-events" released by "_clear_events" :: held 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:37:12.095 INFO nova.compute.manager [req-c6802fe5-9366-4eb1-8b76-6c4f5320b445 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] [instance: e9775bc6-34f1-465c-9ea5-54d4b3d5a076] Terminating instance 2015-08-07 17:37:12.097 INFO nova.virt.xenapi.vmops [req-c6802fe5-9366-4eb1-8b76-6c4f5320b445 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] [instance: e9775bc6-34f1-465c-9ea5-54d4b3d5a076] Destroying VM 2015-08-07 17:37:12.418 DEBUG nova.virt.xenapi.vmops [req-0c4983a6-c192-4691-8151-efac6e228dec tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] [instance: f5ce5302-4adb-4ac6-be2f-b4371f1d3b62] Using RAW or VHD, skipping kernel and ramdisk deletion _destroy_kernel_ramdisk /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1456 2015-08-07 17:37:12.439 DEBUG nova.virt.xenapi.vm_utils [req-0c4983a6-c192-4691-8151-efac6e228dec tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] [instance: f5ce5302-4adb-4ac6-be2f-b4371f1d3b62] VM destroyed destroy_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:324 2015-08-07 17:37:12.439 DEBUG nova.compute.manager [req-0c4983a6-c192-4691-8151-efac6e228dec tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] [instance: f5ce5302-4adb-4ac6-be2f-b4371f1d3b62] Deallocating network for instance _deallocate_network /opt/stack/new/nova/nova/compute/manager.py:1792 2015-08-07 17:37:12.502 DEBUG nova.virt.xenapi.vm_utils [req-c6802fe5-9366-4eb1-8b76-6c4f5320b445 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] [instance: e9775bc6-34f1-465c-9ea5-54d4b3d5a076] Shutting down VM (hard) hard_shutdown_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:348 2015-08-07 17:37:12.760 DEBUG oslo_concurrency.lockutils [req-ce353d18-3a65-4ae4-accc-6535f33a7f26 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Lock "717eb8d1-816f-4ae1-9e4e-fecf5e9205ae" acquired by "do_terminate_instance" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:37:12.761 DEBUG oslo_concurrency.lockutils [req-ce353d18-3a65-4ae4-accc-6535f33a7f26 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Lock "717eb8d1-816f-4ae1-9e4e-fecf5e9205ae-events" acquired by "_clear_events" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:37:12.762 DEBUG oslo_concurrency.lockutils [req-ce353d18-3a65-4ae4-accc-6535f33a7f26 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Lock "717eb8d1-816f-4ae1-9e4e-fecf5e9205ae-events" released by "_clear_events" :: held 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:37:12.764 INFO 
nova.compute.manager [req-ce353d18-3a65-4ae4-accc-6535f33a7f26 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] [instance: 717eb8d1-816f-4ae1-9e4e-fecf5e9205ae] Terminating instance 2015-08-07 17:37:12.765 INFO nova.virt.xenapi.vmops [req-ce353d18-3a65-4ae4-accc-6535f33a7f26 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] [instance: 717eb8d1-816f-4ae1-9e4e-fecf5e9205ae] Destroying VM 2015-08-07 17:37:13.574 DEBUG oslo_concurrency.lockutils [req-fe42cfa1-1906-485a-aa0d-02d743aff59d tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Lock "5ac604dd-d88a-4813-8a3b-bd37a907eee7" acquired by "do_terminate_instance" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:37:13.575 DEBUG oslo_concurrency.lockutils [req-fe42cfa1-1906-485a-aa0d-02d743aff59d tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Lock "5ac604dd-d88a-4813-8a3b-bd37a907eee7-events" acquired by "_clear_events" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:37:13.576 DEBUG oslo_concurrency.lockutils [req-fe42cfa1-1906-485a-aa0d-02d743aff59d tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Lock "5ac604dd-d88a-4813-8a3b-bd37a907eee7-events" released by "_clear_events" :: held 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:37:13.579 INFO nova.compute.manager [req-fe42cfa1-1906-485a-aa0d-02d743aff59d tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] [instance: 5ac604dd-d88a-4813-8a3b-bd37a907eee7] Terminating instance 2015-08-07 17:37:13.582 INFO nova.virt.xenapi.vmops [req-fe42cfa1-1906-485a-aa0d-02d743aff59d tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] [instance: 5ac604dd-d88a-4813-8a3b-bd37a907eee7] Destroying VM 2015-08-07 17:37:15.004 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.93 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:37:15.293 DEBUG nova.compute.manager [req-0c4983a6-c192-4691-8151-efac6e228dec tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] [instance: f5ce5302-4adb-4ac6-be2f-b4371f1d3b62] terminating bdm BlockDeviceMapping(boot_index=0,connection_info=None,created_at=2015-08-07T17:33:56Z,delete_on_termination=True,deleted=False,deleted_at=None,destination_type='local',device_name='/dev/xvda',device_type='disk',disk_bus=None,guest_format=None,id=39,image_id='a69d8d55-1745-492f-be26-cc75d64fc94d',instance=,instance_uuid=f5ce5302-4adb-4ac6-be2f-b4371f1d3b62,no_device=False,snapshot_id=None,source_type='image',updated_at=2015-08-07T17:33:59Z,volume_id=None,volume_size=None) _cleanup_volumes /opt/stack/new/nova/nova/compute/manager.py:2272 2015-08-07 17:37:15.506 DEBUG oslo_concurrency.lockutils [req-0c4983a6-c192-4691-8151-efac6e228dec tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] Lock "compute_resources" acquired by "update_usage" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:37:15.508 DEBUG nova.objects.instance [req-0c4983a6-c192-4691-8151-efac6e228dec tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] Lazy-loading 
`numa_topology' on Instance uuid f5ce5302-4adb-4ac6-be2f-b4371f1d3b62 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:37:15.659 DEBUG oslo_concurrency.lockutils [req-0c4983a6-c192-4691-8151-efac6e228dec tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] Lock "compute_resources" released by "update_usage" :: held 0.152s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:37:16.355 DEBUG oslo_concurrency.lockutils [req-0c4983a6-c192-4691-8151-efac6e228dec tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] Lock "f5ce5302-4adb-4ac6-be2f-b4371f1d3b62" released by "do_terminate_instance" :: held 16.781s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:37:19.330 DEBUG nova.virt.xenapi.vmops [req-50ee7af6-7ad1-4a67-b923-0ec4cc8d9ec1 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] [instance: 5722ce9b-957b-4a66-bb52-a0f639736797] Destroying VDIs _destroy_vdis /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1428 2015-08-07 17:37:19.346 DEBUG nova.virt.xenapi.vm_utils [req-50ee7af6-7ad1-4a67-b923-0ec4cc8d9ec1 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] VDI 24aadfe8-2d1b-4bdf-8fd1-7dc8503f02a3 is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:37:19.360 DEBUG nova.virt.xenapi.vm_utils [req-50ee7af6-7ad1-4a67-b923-0ec4cc8d9ec1 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] VDI f338fd43-60de-4395-9f67-81cc26a7bb64 is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:37:19.490 DEBUG nova.virt.xenapi.vmops [req-50606ec8-a668-4b1f-9db5-7399bad42325 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] [instance: 962e3fc3-68c5-4019-beb9-2b9f939eb511] Destroying VDIs _destroy_vdis /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1428 2015-08-07 17:37:19.521 DEBUG nova.virt.xenapi.vm_utils [req-50606ec8-a668-4b1f-9db5-7399bad42325 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] VDI ad940792-9883-4f39-b3cc-49b2c22af740 is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:37:19.566 DEBUG nova.virt.xenapi.vm_utils [req-50606ec8-a668-4b1f-9db5-7399bad42325 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] VDI 3fe1e4d8-faa3-4eda-933a-6c71835a93e8 is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:37:20.794 DEBUG nova.virt.xenapi.vmops [req-50ee7af6-7ad1-4a67-b923-0ec4cc8d9ec1 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] [instance: 5722ce9b-957b-4a66-bb52-a0f639736797] Using RAW or VHD, skipping kernel and ramdisk deletion _destroy_kernel_ramdisk /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1456 2015-08-07 17:37:20.819 DEBUG nova.virt.xenapi.vm_utils [req-50ee7af6-7ad1-4a67-b923-0ec4cc8d9ec1 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] [instance: 5722ce9b-957b-4a66-bb52-a0f639736797] VM destroyed destroy_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:324 2015-08-07 17:37:20.819 DEBUG nova.compute.manager [req-50ee7af6-7ad1-4a67-b923-0ec4cc8d9ec1 tempest-ListServerFiltersTestJSON-680606332 
tempest-ListServerFiltersTestJSON-1559650729] [instance: 5722ce9b-957b-4a66-bb52-a0f639736797] Deallocating network for instance _deallocate_network /opt/stack/new/nova/nova/compute/manager.py:1792 2015-08-07 17:37:20.865 DEBUG nova.virt.xenapi.vm_utils [req-fe42cfa1-1906-485a-aa0d-02d743aff59d tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] [instance: 5ac604dd-d88a-4813-8a3b-bd37a907eee7] Shutting down VM (hard) hard_shutdown_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:348 2015-08-07 17:37:22.510 DEBUG nova.virt.xenapi.vmops [req-c6802fe5-9366-4eb1-8b76-6c4f5320b445 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] [instance: e9775bc6-34f1-465c-9ea5-54d4b3d5a076] Destroying VDIs _destroy_vdis /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1428 2015-08-07 17:37:22.543 DEBUG nova.virt.xenapi.vm_utils [req-c6802fe5-9366-4eb1-8b76-6c4f5320b445 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] VDI 20d3b09d-4ce3-4b31-bb22-bd3c75773711 is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:37:22.587 DEBUG nova.virt.xenapi.vm_utils [req-c6802fe5-9366-4eb1-8b76-6c4f5320b445 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] VDI 481069f8-fd98-480b-adbb-a245fcbf3cd9 is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:37:23.714 DEBUG nova.compute.manager [req-50ee7af6-7ad1-4a67-b923-0ec4cc8d9ec1 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] [instance: 5722ce9b-957b-4a66-bb52-a0f639736797] terminating bdm BlockDeviceMapping(boot_index=0,connection_info=None,created_at=2015-08-07T17:34:41Z,delete_on_termination=True,deleted=False,deleted_at=None,destination_type='local',device_name='/dev/xvda',device_type='disk',disk_bus=None,guest_format=None,id=40,image_id='a69d8d55-1745-492f-be26-cc75d64fc94d',instance=,instance_uuid=5722ce9b-957b-4a66-bb52-a0f639736797,no_device=False,snapshot_id=None,source_type='image',updated_at=2015-08-07T17:34:43Z,volume_id=None,volume_size=None) _cleanup_volumes /opt/stack/new/nova/nova/compute/manager.py:2272 2015-08-07 17:37:24.115 DEBUG oslo_concurrency.lockutils [req-50ee7af6-7ad1-4a67-b923-0ec4cc8d9ec1 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] Lock "compute_resources" acquired by "update_usage" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:37:24.116 DEBUG nova.objects.instance [req-50ee7af6-7ad1-4a67-b923-0ec4cc8d9ec1 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] Lazy-loading `numa_topology' on Instance uuid 5722ce9b-957b-4a66-bb52-a0f639736797 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:37:24.223 DEBUG oslo_concurrency.lockutils [req-50ee7af6-7ad1-4a67-b923-0ec4cc8d9ec1 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] Lock "compute_resources" released by "update_usage" :: held 0.108s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:37:24.453 DEBUG nova.virt.xenapi.vmops [req-d6e42899-63d5-4b95-87ca-f44003219653 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 6b6cb831-76dd-4428-ab66-c020997bc153] Waiting for instance state to become running _wait_for_instance_to_start 
/opt/stack/new/nova/nova/virt/xenapi/vmops.py:777 2015-08-07 17:37:24.483 DEBUG nova.virt.xenapi.vmops [req-d6e42899-63d5-4b95-87ca-f44003219653 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 6b6cb831-76dd-4428-ab66-c020997bc153] Updating progress to 80 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:37:24.506 DEBUG nova.virt.xenapi.vm_utils [req-ce353d18-3a65-4ae4-accc-6535f33a7f26 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] [instance: 717eb8d1-816f-4ae1-9e4e-fecf5e9205ae] Shutting down VM (hard) hard_shutdown_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:348 2015-08-07 17:37:24.557 DEBUG oslo_concurrency.lockutils [req-50ee7af6-7ad1-4a67-b923-0ec4cc8d9ec1 tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] Lock "5722ce9b-957b-4a66-bb52-a0f639736797" released by "do_terminate_instance" :: held 24.154s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:37:24.640 DEBUG nova.virt.xenapi.vmops [req-c6802fe5-9366-4eb1-8b76-6c4f5320b445 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] [instance: e9775bc6-34f1-465c-9ea5-54d4b3d5a076] Using RAW or VHD, skipping kernel and ramdisk deletion _destroy_kernel_ramdisk /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1456 2015-08-07 17:37:24.667 DEBUG nova.virt.xenapi.vm_utils [req-c6802fe5-9366-4eb1-8b76-6c4f5320b445 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] [instance: e9775bc6-34f1-465c-9ea5-54d4b3d5a076] VM destroyed destroy_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:324 2015-08-07 17:37:24.667 DEBUG nova.compute.manager [req-c6802fe5-9366-4eb1-8b76-6c4f5320b445 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] [instance: e9775bc6-34f1-465c-9ea5-54d4b3d5a076] Deallocating network for instance _deallocate_network /opt/stack/new/nova/nova/compute/manager.py:1792 2015-08-07 17:37:24.783 DEBUG nova.virt.xenapi.vmops [req-d6e42899-63d5-4b95-87ca-f44003219653 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 6b6cb831-76dd-4428-ab66-c020997bc153] Skip agent setup, not enabled. 
_configure_new_instance_with_agent /opt/stack/new/nova/nova/virt/xenapi/vmops.py:788 2015-08-07 17:37:24.783 DEBUG nova.virt.xenapi.vmops [req-d6e42899-63d5-4b95-87ca-f44003219653 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 6b6cb831-76dd-4428-ab66-c020997bc153] Removing hostname from xenstore _remove_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1949 2015-08-07 17:37:24.784 DEBUG oslo_concurrency.lockutils [req-d6e42899-63d5-4b95-87ca-f44003219653 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "xenstore-6b6cb831-76dd-4428-ab66-c020997bc153" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:37:24.795 DEBUG oslo_concurrency.lockutils [req-d6e42899-63d5-4b95-87ca-f44003219653 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "xenstore-6b6cb831-76dd-4428-ab66-c020997bc153" released by "update_hostname" :: held 0.011s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:37:24.796 DEBUG nova.virt.xenapi.vmops [req-d6e42899-63d5-4b95-87ca-f44003219653 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 6b6cb831-76dd-4428-ab66-c020997bc153] Updating progress to 90 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:37:24.995 DEBUG nova.virt.xenapi.vmops [req-7eb7d5c6-3048-4c4c-be8c-d6942da4f67a tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] [instance: c864db01-8fad-4d62-9c8d-652d30dbaf6e] Destroying VDIs _destroy_vdis /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1428 2015-08-07 17:37:25.051 DEBUG nova.virt.xenapi.vm_utils [req-7eb7d5c6-3048-4c4c-be8c-d6942da4f67a tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] VDI c75bd30e-eac5-4d9e-a302-6fa5445e73e0 is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:37:25.069 DEBUG nova.virt.xenapi.vm_utils [req-7eb7d5c6-3048-4c4c-be8c-d6942da4f67a tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] VDI e3b0dbee-ea67-42eb-901e-f6a2a90e75a2 is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:37:25.087 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.85 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:37:25.136 DEBUG nova.virt.xenapi.vmops [req-d6e42899-63d5-4b95-87ca-f44003219653 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 6b6cb831-76dd-4428-ab66-c020997bc153] Updating progress to 100 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:37:25.438 DEBUG nova.compute.manager [req-d6e42899-63d5-4b95-87ca-f44003219653 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 6b6cb831-76dd-4428-ab66-c020997bc153] Checking state _get_power_state /opt/stack/new/nova/nova/compute/manager.py:1296 2015-08-07 17:37:25.959 DEBUG oslo_concurrency.lockutils [req-d6e42899-63d5-4b95-87ca-f44003219653 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "6b6cb831-76dd-4428-ab66-c020997bc153" released by "_locked_do_build_and_run_instance" :: held 69.637s inner 
/usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:37:25.960 13318 DEBUG oslo_concurrency.lockutils [-] Lock "6b6cb831-76dd-4428-ab66-c020997bc153" acquired by "query_driver_power_state_and_sync" :: waited 39.250s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:37:25.960 13318 INFO nova.compute.manager [-] [instance: 6b6cb831-76dd-4428-ab66-c020997bc153] During sync_power_state the instance has a pending task (spawning). Skip. 2015-08-07 17:37:25.960 13318 DEBUG oslo_concurrency.lockutils [-] Lock "6b6cb831-76dd-4428-ab66-c020997bc153" released by "query_driver_power_state_and_sync" :: held 0.001s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:37:26.287 DEBUG nova.virt.xenapi.vmops [req-7eb7d5c6-3048-4c4c-be8c-d6942da4f67a tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] [instance: c864db01-8fad-4d62-9c8d-652d30dbaf6e] Using RAW or VHD, skipping kernel and ramdisk deletion _destroy_kernel_ramdisk /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1456 2015-08-07 17:37:26.301 DEBUG nova.virt.xenapi.vm_utils [req-7eb7d5c6-3048-4c4c-be8c-d6942da4f67a tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] [instance: c864db01-8fad-4d62-9c8d-652d30dbaf6e] VM destroyed destroy_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:324 2015-08-07 17:37:26.301 DEBUG nova.compute.manager [req-7eb7d5c6-3048-4c4c-be8c-d6942da4f67a tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] [instance: c864db01-8fad-4d62-9c8d-652d30dbaf6e] Deallocating network for instance _deallocate_network /opt/stack/new/nova/nova/compute/manager.py:1792 2015-08-07 17:37:26.841 DEBUG oslo_concurrency.lockutils [req-b7dbe265-a351-4da9-988a-d259f2b8fdda tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "6b6cb831-76dd-4428-ab66-c020997bc153" acquired by "do_stop_instance" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:37:26.842 DEBUG nova.compute.manager [req-b7dbe265-a351-4da9-988a-d259f2b8fdda tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 6b6cb831-76dd-4428-ab66-c020997bc153] Checking state _get_power_state /opt/stack/new/nova/nova/compute/manager.py:1296 2015-08-07 17:37:26.853 DEBUG nova.compute.manager [req-b7dbe265-a351-4da9-988a-d259f2b8fdda tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 6b6cb831-76dd-4428-ab66-c020997bc153] Stopping instance; current vm_state: active, current task_state: powering-off, current DB power_state: 1, current VM power_state: 1 do_stop_instance /opt/stack/new/nova/nova/compute/manager.py:2404 2015-08-07 17:37:26.874 DEBUG nova.compute.manager [req-c6802fe5-9366-4eb1-8b76-6c4f5320b445 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] [instance: e9775bc6-34f1-465c-9ea5-54d4b3d5a076] terminating bdm 
BlockDeviceMapping(boot_index=0,connection_info=None,created_at=2015-08-07T17:34:46Z,delete_on_termination=True,deleted=False,deleted_at=None,destination_type='local',device_name='/dev/xvda',device_type='disk',disk_bus=None,guest_format=None,id=42,image_id='a69d8d55-1745-492f-be26-cc75d64fc94d',instance=,instance_uuid=e9775bc6-34f1-465c-9ea5-54d4b3d5a076,no_device=False,snapshot_id=None,source_type='image',updated_at=2015-08-07T17:34:49Z,volume_id=None,volume_size=None) _cleanup_volumes /opt/stack/new/nova/nova/compute/manager.py:2272 2015-08-07 17:37:26.899 DEBUG nova.virt.xenapi.vm_utils [req-b7dbe265-a351-4da9-988a-d259f2b8fdda tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 6b6cb831-76dd-4428-ab66-c020997bc153] Shutting down VM (hard) hard_shutdown_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:348 2015-08-07 17:37:27.105 DEBUG oslo_concurrency.lockutils [req-c6802fe5-9366-4eb1-8b76-6c4f5320b445 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Lock "compute_resources" acquired by "update_usage" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:37:27.106 DEBUG nova.objects.instance [req-c6802fe5-9366-4eb1-8b76-6c4f5320b445 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Lazy-loading `numa_topology' on Instance uuid e9775bc6-34f1-465c-9ea5-54d4b3d5a076 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:37:27.216 DEBUG oslo_concurrency.lockutils [req-c6802fe5-9366-4eb1-8b76-6c4f5320b445 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Lock "compute_resources" released by "update_usage" :: held 0.111s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:37:27.350 DEBUG nova.virt.xenapi.vmops [req-fe42cfa1-1906-485a-aa0d-02d743aff59d tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] [instance: 5ac604dd-d88a-4813-8a3b-bd37a907eee7] Destroying VDIs _destroy_vdis /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1428 2015-08-07 17:37:27.360 DEBUG nova.virt.xenapi.vm_utils [req-fe42cfa1-1906-485a-aa0d-02d743aff59d tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] VDI 695d1331-3fa1-4df7-8409-699bb850cbc4 is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:37:27.375 DEBUG nova.virt.xenapi.vm_utils [req-fe42cfa1-1906-485a-aa0d-02d743aff59d tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] VDI 564e69a7-9903-4cf5-a822-39df84a7b0c2 is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:37:28.034 DEBUG oslo_concurrency.lockutils [req-c6802fe5-9366-4eb1-8b76-6c4f5320b445 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Lock "e9775bc6-34f1-465c-9ea5-54d4b3d5a076" released by "do_terminate_instance" :: held 15.945s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:37:28.298 DEBUG nova.compute.manager [req-7eb7d5c6-3048-4c4c-be8c-d6942da4f67a tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] [instance: c864db01-8fad-4d62-9c8d-652d30dbaf6e] terminating bdm 
BlockDeviceMapping(boot_index=0,connection_info=None,created_at=2015-08-07T17:35:37Z,delete_on_termination=True,deleted=False,deleted_at=None,destination_type='local',device_name='/dev/xvda',device_type='disk',disk_bus=None,guest_format=None,id=44,image_id='a69d8d55-1745-492f-be26-cc75d64fc94d',instance=,instance_uuid=c864db01-8fad-4d62-9c8d-652d30dbaf6e,no_device=False,snapshot_id=None,source_type='image',updated_at=2015-08-07T17:35:39Z,volume_id=None,volume_size=None) _cleanup_volumes /opt/stack/new/nova/nova/compute/manager.py:2272 2015-08-07 17:37:28.508 DEBUG nova.virt.xenapi.vmops [req-fe42cfa1-1906-485a-aa0d-02d743aff59d tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] [instance: 5ac604dd-d88a-4813-8a3b-bd37a907eee7] Using RAW or VHD, skipping kernel and ramdisk deletion _destroy_kernel_ramdisk /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1456 2015-08-07 17:37:28.521 DEBUG nova.virt.xenapi.vm_utils [req-fe42cfa1-1906-485a-aa0d-02d743aff59d tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] [instance: 5ac604dd-d88a-4813-8a3b-bd37a907eee7] VM destroyed destroy_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:324 2015-08-07 17:37:28.522 DEBUG nova.compute.manager [req-fe42cfa1-1906-485a-aa0d-02d743aff59d tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] [instance: 5ac604dd-d88a-4813-8a3b-bd37a907eee7] Deallocating network for instance _deallocate_network /opt/stack/new/nova/nova/compute/manager.py:1792 2015-08-07 17:37:28.691 DEBUG oslo_concurrency.lockutils [req-7eb7d5c6-3048-4c4c-be8c-d6942da4f67a tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] Lock "compute_resources" acquired by "update_usage" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:37:28.692 DEBUG nova.objects.instance [req-7eb7d5c6-3048-4c4c-be8c-d6942da4f67a tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] Lazy-loading `numa_topology' on Instance uuid c864db01-8fad-4d62-9c8d-652d30dbaf6e obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:37:28.784 DEBUG oslo_concurrency.lockutils [req-7eb7d5c6-3048-4c4c-be8c-d6942da4f67a tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] Lock "compute_resources" released by "update_usage" :: held 0.093s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:37:29.152 DEBUG nova.virt.xenapi.vmops [req-ce353d18-3a65-4ae4-accc-6535f33a7f26 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] [instance: 717eb8d1-816f-4ae1-9e4e-fecf5e9205ae] Destroying VDIs _destroy_vdis /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1428 2015-08-07 17:37:29.169 DEBUG nova.virt.xenapi.vm_utils [req-ce353d18-3a65-4ae4-accc-6535f33a7f26 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] VDI a8a00f3d-bb95-49cd-89c4-ffd859e8288a is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:37:29.192 DEBUG nova.virt.xenapi.vm_utils [req-ce353d18-3a65-4ae4-accc-6535f33a7f26 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] VDI 05b1af68-73a2-4a31-9b27-0259fa5363ba is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:37:29.202 DEBUG oslo_concurrency.lockutils 
[req-7eb7d5c6-3048-4c4c-be8c-d6942da4f67a tempest-ListServerFiltersTestJSON-680606332 tempest-ListServerFiltersTestJSON-1559650729] Lock "c864db01-8fad-4d62-9c8d-652d30dbaf6e" released by "do_terminate_instance" :: held 27.635s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:37:29.982 DEBUG nova.virt.xenapi.vmops [req-ce353d18-3a65-4ae4-accc-6535f33a7f26 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] [instance: 717eb8d1-816f-4ae1-9e4e-fecf5e9205ae] Using RAW or VHD, skipping kernel and ramdisk deletion _destroy_kernel_ramdisk /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1456 2015-08-07 17:37:29.994 DEBUG nova.virt.xenapi.vm_utils [req-ce353d18-3a65-4ae4-accc-6535f33a7f26 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] [instance: 717eb8d1-816f-4ae1-9e4e-fecf5e9205ae] VM destroyed destroy_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:324 2015-08-07 17:37:29.995 DEBUG nova.compute.manager [req-ce353d18-3a65-4ae4-accc-6535f33a7f26 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] [instance: 717eb8d1-816f-4ae1-9e4e-fecf5e9205ae] Deallocating network for instance _deallocate_network /opt/stack/new/nova/nova/compute/manager.py:1792 2015-08-07 17:37:30.098 DEBUG nova.compute.manager [req-b7dbe265-a351-4da9-988a-d259f2b8fdda tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 6b6cb831-76dd-4428-ab66-c020997bc153] Checking state _get_power_state /opt/stack/new/nova/nova/compute/manager.py:1296 2015-08-07 17:37:30.195 DEBUG nova.compute.manager [req-fe42cfa1-1906-485a-aa0d-02d743aff59d tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] [instance: 5ac604dd-d88a-4813-8a3b-bd37a907eee7] terminating bdm BlockDeviceMapping(boot_index=0,connection_info=None,created_at=2015-08-07T17:35:51Z,delete_on_termination=True,deleted=False,deleted_at=None,destination_type='local',device_name='/dev/xvda',device_type='disk',disk_bus=None,guest_format=None,id=45,image_id='a69d8d55-1745-492f-be26-cc75d64fc94d',instance=,instance_uuid=5ac604dd-d88a-4813-8a3b-bd37a907eee7,no_device=False,snapshot_id=None,source_type='image',updated_at=2015-08-07T17:35:53Z,volume_id=None,volume_size=None) _cleanup_volumes /opt/stack/new/nova/nova/compute/manager.py:2272 2015-08-07 17:37:30.305 DEBUG oslo_concurrency.lockutils [req-b7dbe265-a351-4da9-988a-d259f2b8fdda tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "6b6cb831-76dd-4428-ab66-c020997bc153" released by "do_stop_instance" :: held 3.464s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:37:30.409 DEBUG oslo_concurrency.lockutils [req-fe42cfa1-1906-485a-aa0d-02d743aff59d tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Lock "compute_resources" acquired by "update_usage" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:37:30.412 DEBUG nova.objects.instance [req-fe42cfa1-1906-485a-aa0d-02d743aff59d tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Lazy-loading `numa_topology' on Instance uuid 5ac604dd-d88a-4813-8a3b-bd37a907eee7 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:37:30.526 DEBUG oslo_concurrency.lockutils [req-fe42cfa1-1906-485a-aa0d-02d743aff59d tempest-MultipleCreateTestJSON-1784764366 
tempest-MultipleCreateTestJSON-1402821900] Lock "compute_resources" released by "update_usage" :: held 0.117s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:37:30.856 DEBUG oslo_concurrency.lockutils [req-fe42cfa1-1906-485a-aa0d-02d743aff59d tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Lock "5ac604dd-d88a-4813-8a3b-bd37a907eee7" released by "do_terminate_instance" :: held 17.282s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:37:31.322 DEBUG oslo_concurrency.lockutils [req-fe80d669-6f78-4882-93ac-b259caac0d52 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "6b6cb831-76dd-4428-ab66-c020997bc153" acquired by "do_terminate_instance" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:37:31.323 DEBUG oslo_concurrency.lockutils [req-fe80d669-6f78-4882-93ac-b259caac0d52 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "6b6cb831-76dd-4428-ab66-c020997bc153-events" acquired by "_clear_events" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:37:31.324 DEBUG oslo_concurrency.lockutils [req-fe80d669-6f78-4882-93ac-b259caac0d52 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "6b6cb831-76dd-4428-ab66-c020997bc153-events" released by "_clear_events" :: held 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:37:31.326 INFO nova.compute.manager [req-fe80d669-6f78-4882-93ac-b259caac0d52 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 6b6cb831-76dd-4428-ab66-c020997bc153] Terminating instance 2015-08-07 17:37:31.328 INFO nova.virt.xenapi.vmops [req-fe80d669-6f78-4882-93ac-b259caac0d52 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 6b6cb831-76dd-4428-ab66-c020997bc153] Destroying VM 2015-08-07 17:37:31.339 WARNING nova.virt.xenapi.vm_utils [req-fe80d669-6f78-4882-93ac-b259caac0d52 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 6b6cb831-76dd-4428-ab66-c020997bc153] VM already halted, skipping shutdown... 
2015-08-07 17:37:31.349 DEBUG nova.virt.xenapi.vmops [req-fe80d669-6f78-4882-93ac-b259caac0d52 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 6b6cb831-76dd-4428-ab66-c020997bc153] Destroying VDIs _destroy_vdis /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1428 2015-08-07 17:37:31.358 DEBUG nova.virt.xenapi.vm_utils [req-fe80d669-6f78-4882-93ac-b259caac0d52 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] VDI 1fe2141f-0a6c-433d-a983-931c7854a779 is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:37:31.370 DEBUG nova.virt.xenapi.vm_utils [req-fe80d669-6f78-4882-93ac-b259caac0d52 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] VDI ab804555-a8f0-44da-86a1-aeb08ec2869b is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:37:31.750 DEBUG nova.compute.manager [req-ce353d18-3a65-4ae4-accc-6535f33a7f26 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] [instance: 717eb8d1-816f-4ae1-9e4e-fecf5e9205ae] terminating bdm BlockDeviceMapping(boot_index=0,connection_info=None,created_at=2015-08-07T17:35:51Z,delete_on_termination=True,deleted=False,deleted_at=None,destination_type='local',device_name='/dev/xvda',device_type='disk',disk_bus=None,guest_format=None,id=46,image_id='a69d8d55-1745-492f-be26-cc75d64fc94d',instance=,instance_uuid=717eb8d1-816f-4ae1-9e4e-fecf5e9205ae,no_device=False,snapshot_id=None,source_type='image',updated_at=2015-08-07T17:35:54Z,volume_id=None,volume_size=None) _cleanup_volumes /opt/stack/new/nova/nova/compute/manager.py:2272 2015-08-07 17:37:31.942 DEBUG oslo_concurrency.lockutils [req-ce353d18-3a65-4ae4-accc-6535f33a7f26 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Lock "compute_resources" acquired by "update_usage" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:37:31.943 DEBUG nova.objects.instance [req-ce353d18-3a65-4ae4-accc-6535f33a7f26 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Lazy-loading `numa_topology' on Instance uuid 717eb8d1-816f-4ae1-9e4e-fecf5e9205ae obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:37:32.027 DEBUG oslo_concurrency.lockutils [req-ce353d18-3a65-4ae4-accc-6535f33a7f26 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Lock "compute_resources" released by "update_usage" :: held 0.085s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:37:32.072 DEBUG nova.virt.xenapi.vmops [req-fe80d669-6f78-4882-93ac-b259caac0d52 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 6b6cb831-76dd-4428-ab66-c020997bc153] Using RAW or VHD, skipping kernel and ramdisk deletion _destroy_kernel_ramdisk /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1456 2015-08-07 17:37:32.084 DEBUG nova.virt.xenapi.vm_utils [req-fe80d669-6f78-4882-93ac-b259caac0d52 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 6b6cb831-76dd-4428-ab66-c020997bc153] VM destroyed destroy_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:324 2015-08-07 17:37:32.085 DEBUG nova.compute.manager [req-fe80d669-6f78-4882-93ac-b259caac0d52 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 
6b6cb831-76dd-4428-ab66-c020997bc153] Deallocating network for instance _deallocate_network /opt/stack/new/nova/nova/compute/manager.py:1792 2015-08-07 17:37:32.304 DEBUG oslo_concurrency.lockutils [req-ce353d18-3a65-4ae4-accc-6535f33a7f26 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Lock "717eb8d1-816f-4ae1-9e4e-fecf5e9205ae" released by "do_terminate_instance" :: held 19.544s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:37:33.652 DEBUG nova.compute.manager [req-fe80d669-6f78-4882-93ac-b259caac0d52 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 6b6cb831-76dd-4428-ab66-c020997bc153] terminating bdm BlockDeviceMapping(boot_index=0,connection_info=None,created_at=2015-08-07T17:36:15Z,delete_on_termination=True,deleted=False,deleted_at=None,destination_type='local',device_name='/dev/xvda',device_type='disk',disk_bus=None,guest_format=None,id=47,image_id='a69d8d55-1745-492f-be26-cc75d64fc94d',instance=,instance_uuid=6b6cb831-76dd-4428-ab66-c020997bc153,no_device=False,snapshot_id=None,source_type='image',updated_at=2015-08-07T17:36:18Z,volume_id=None,volume_size=None) _cleanup_volumes /opt/stack/new/nova/nova/compute/manager.py:2272 2015-08-07 17:37:33.833 DEBUG oslo_concurrency.lockutils [req-fe80d669-6f78-4882-93ac-b259caac0d52 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "compute_resources" acquired by "update_usage" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:37:33.835 DEBUG nova.objects.instance [req-fe80d669-6f78-4882-93ac-b259caac0d52 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lazy-loading `numa_topology' on Instance uuid 6b6cb831-76dd-4428-ab66-c020997bc153 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:37:33.931 DEBUG oslo_concurrency.lockutils [req-fe80d669-6f78-4882-93ac-b259caac0d52 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "compute_resources" released by "update_usage" :: held 0.098s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:37:34.250 DEBUG oslo_concurrency.lockutils [req-fe80d669-6f78-4882-93ac-b259caac0d52 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "6b6cb831-76dd-4428-ab66-c020997bc153" released by "do_terminate_instance" :: held 2.927s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:37:34.990 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.94 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:37:35.197 DEBUG oslo_concurrency.lockutils [req-8c8b8a16-6a57-4e6a-94ef-e76b7301b92d tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "aa0c0819-08a6-4a79-93eb-f58885697f5b" acquired by "_locked_do_build_and_run_instance" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:37:35.274 INFO nova.compute.manager [req-8c8b8a16-6a57-4e6a-94ef-e76b7301b92d tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Starting instance... 
2015-08-07 17:37:35.497 DEBUG oslo_concurrency.lockutils [req-8c8b8a16-6a57-4e6a-94ef-e76b7301b92d tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "compute_resources" acquired by "instance_claim" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:37:35.498 DEBUG nova.compute.resource_tracker [req-8c8b8a16-6a57-4e6a-94ef-e76b7301b92d tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Memory overhead for 64 MB instance; 5 MB instance_claim /opt/stack/new/nova/nova/compute/resource_tracker.py:126 2015-08-07 17:37:35.506 INFO nova.compute.claims [req-8c8b8a16-6a57-4e6a-94ef-e76b7301b92d tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Attempting claim: memory 69 MB, disk 0 GB 2015-08-07 17:37:35.506 INFO nova.compute.claims [req-8c8b8a16-6a57-4e6a-94ef-e76b7301b92d tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Total memory: 8187 MB, used: 581.00 MB 2015-08-07 17:37:35.507 INFO nova.compute.claims [req-8c8b8a16-6a57-4e6a-94ef-e76b7301b92d tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] memory limit: 12280.50 MB, free: 11699.50 MB 2015-08-07 17:37:35.507 INFO nova.compute.claims [req-8c8b8a16-6a57-4e6a-94ef-e76b7301b92d tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Total disk: 27 GB, used: 0.00 GB 2015-08-07 17:37:35.508 INFO nova.compute.claims [req-8c8b8a16-6a57-4e6a-94ef-e76b7301b92d tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] disk limit not specified, defaulting to unlimited 2015-08-07 17:37:35.528 DEBUG nova.compute.resources.vcpu [req-8c8b8a16-6a57-4e6a-94ef-e76b7301b92d tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Total CPUs: 8 VCPUs, used: 1.00 VCPUs test /opt/stack/new/nova/nova/compute/resources/vcpu.py:52 2015-08-07 17:37:35.529 DEBUG nova.compute.resources.vcpu [req-8c8b8a16-6a57-4e6a-94ef-e76b7301b92d tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] CPUs limit not specified, defaulting to unlimited test /opt/stack/new/nova/nova/compute/resources/vcpu.py:56 2015-08-07 17:37:35.529 INFO nova.compute.claims [req-8c8b8a16-6a57-4e6a-94ef-e76b7301b92d tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Claim successful 2015-08-07 17:37:35.835 DEBUG oslo_concurrency.lockutils [req-8c8b8a16-6a57-4e6a-94ef-e76b7301b92d tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "compute_resources" released by "instance_claim" :: held 0.337s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:37:36.039 DEBUG oslo_concurrency.lockutils [req-8c8b8a16-6a57-4e6a-94ef-e76b7301b92d tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "compute_resources" acquired by "update_usage" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:37:36.121 DEBUG oslo_concurrency.lockutils [req-8c8b8a16-6a57-4e6a-94ef-e76b7301b92d 
tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "compute_resources" released by "update_usage" :: held 0.081s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:37:36.122 DEBUG nova.compute.utils [req-8c8b8a16-6a57-4e6a-94ef-e76b7301b92d tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Using /dev/xvd instead of None get_next_device_name /opt/stack/new/nova/nova/compute/utils.py:163 2015-08-07 17:37:36.127 13318 DEBUG nova.compute.manager [-] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Allocating IP information in the background. _allocate_network_async /opt/stack/new/nova/nova/compute/manager.py:1513 2015-08-07 17:37:36.128 13318 DEBUG oslo_concurrency.lockutils [-] Acquired semaphore "refresh_cache-aa0c0819-08a6-4a79-93eb-f58885697f5b" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:197 2015-08-07 17:37:36.697 DEBUG oslo_concurrency.lockutils [req-2108947d-ad5a-4bff-8a98-cdacf0fd5a34 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "c1e3b756-49fe-4e6c-9016-c4eb2e06a2cd" acquired by "_locked_do_build_and_run_instance" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:37:36.745 INFO nova.compute.manager [req-2108947d-ad5a-4bff-8a98-cdacf0fd5a34 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: c1e3b756-49fe-4e6c-9016-c4eb2e06a2cd] Starting instance... 2015-08-07 17:37:36.985 DEBUG nova.virt.xenapi.vmops [req-8c8b8a16-6a57-4e6a-94ef-e76b7301b92d tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Block device information present: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} spawn /opt/stack/new/nova/nova/virt/xenapi/vmops.py:408 2015-08-07 17:37:36.999 DEBUG nova.virt.xenapi.vm_utils [req-8c8b8a16-6a57-4e6a-94ef-e76b7301b92d tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Detected vhd format for image a69d8d55-1745-492f-be26-cc75d64fc94d determine_disk_image_type /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1633 2015-08-07 17:37:37.000 DEBUG nova.virt.xenapi.vmops [req-8c8b8a16-6a57-4e6a-94ef-e76b7301b92d tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Updating progress to 10 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:37:37.030 DEBUG oslo_concurrency.lockutils [req-2108947d-ad5a-4bff-8a98-cdacf0fd5a34 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "compute_resources" acquired by "instance_claim" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:37:37.031 DEBUG nova.compute.resource_tracker [req-2108947d-ad5a-4bff-8a98-cdacf0fd5a34 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Memory overhead for 64 MB instance; 5 MB instance_claim /opt/stack/new/nova/nova/compute/resource_tracker.py:126 2015-08-07 17:37:37.039 INFO nova.compute.claims [req-2108947d-ad5a-4bff-8a98-cdacf0fd5a34 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: c1e3b756-49fe-4e6c-9016-c4eb2e06a2cd] Attempting claim: memory 69 MB, disk 0 GB 2015-08-07 17:37:37.040 INFO 
nova.compute.claims [req-2108947d-ad5a-4bff-8a98-cdacf0fd5a34 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: c1e3b756-49fe-4e6c-9016-c4eb2e06a2cd] Total memory: 8187 MB, used: 650.00 MB 2015-08-07 17:37:37.040 INFO nova.compute.claims [req-2108947d-ad5a-4bff-8a98-cdacf0fd5a34 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: c1e3b756-49fe-4e6c-9016-c4eb2e06a2cd] memory limit: 12280.50 MB, free: 11630.50 MB 2015-08-07 17:37:37.041 INFO nova.compute.claims [req-2108947d-ad5a-4bff-8a98-cdacf0fd5a34 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: c1e3b756-49fe-4e6c-9016-c4eb2e06a2cd] Total disk: 27 GB, used: 0.00 GB 2015-08-07 17:37:37.044 INFO nova.compute.claims [req-2108947d-ad5a-4bff-8a98-cdacf0fd5a34 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: c1e3b756-49fe-4e6c-9016-c4eb2e06a2cd] disk limit not specified, defaulting to unlimited 2015-08-07 17:37:37.089 DEBUG nova.compute.resources.vcpu [req-2108947d-ad5a-4bff-8a98-cdacf0fd5a34 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Total CPUs: 8 VCPUs, used: 2.00 VCPUs test /opt/stack/new/nova/nova/compute/resources/vcpu.py:52 2015-08-07 17:37:37.089 DEBUG nova.compute.resources.vcpu [req-2108947d-ad5a-4bff-8a98-cdacf0fd5a34 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] CPUs limit not specified, defaulting to unlimited test /opt/stack/new/nova/nova/compute/resources/vcpu.py:56 2015-08-07 17:37:37.090 INFO nova.compute.claims [req-2108947d-ad5a-4bff-8a98-cdacf0fd5a34 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: c1e3b756-49fe-4e6c-9016-c4eb2e06a2cd] Claim successful 2015-08-07 17:37:37.216 DEBUG nova.virt.xenapi.vmops [req-8c8b8a16-6a57-4e6a-94ef-e76b7301b92d tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] block device info: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} _connect_cinder_volumes /opt/stack/new/nova/nova/virt/xenapi/vmops.py:376 2015-08-07 17:37:37.230 DEBUG oslo_concurrency.lockutils [req-8c8b8a16-6a57-4e6a-94ef-e76b7301b92d tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" acquired by "_create_cached_image_impl" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:37:37.292 DEBUG nova.virt.xenapi.vmops [req-50606ec8-a668-4b1f-9db5-7399bad42325 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] [instance: 962e3fc3-68c5-4019-beb9-2b9f939eb511] Using RAW or VHD, skipping kernel and ramdisk deletion _destroy_kernel_ramdisk /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1456 2015-08-07 17:37:37.307 DEBUG nova.virt.xenapi.vm_utils [req-50606ec8-a668-4b1f-9db5-7399bad42325 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] [instance: 962e3fc3-68c5-4019-beb9-2b9f939eb511] VM destroyed destroy_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:324 2015-08-07 17:37:37.308 DEBUG nova.compute.manager [req-50606ec8-a668-4b1f-9db5-7399bad42325 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] [instance: 962e3fc3-68c5-4019-beb9-2b9f939eb511] Deallocating network for 
instance _deallocate_network /opt/stack/new/nova/nova/compute/manager.py:1792 2015-08-07 17:37:37.396 DEBUG oslo_concurrency.lockutils [req-2108947d-ad5a-4bff-8a98-cdacf0fd5a34 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "compute_resources" released by "instance_claim" :: held 0.366s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:37:37.520 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:37:37.520 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:37:37.521 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 0.99 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:37:37.676 DEBUG oslo_concurrency.lockutils [req-2108947d-ad5a-4bff-8a98-cdacf0fd5a34 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "compute_resources" acquired by "update_usage" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:37:37.797 DEBUG oslo_concurrency.lockutils [req-2108947d-ad5a-4bff-8a98-cdacf0fd5a34 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "compute_resources" released by "update_usage" :: held 0.122s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:37:37.798 DEBUG nova.compute.utils [req-2108947d-ad5a-4bff-8a98-cdacf0fd5a34 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Using /dev/xvd instead of None get_next_device_name /opt/stack/new/nova/nova/compute/utils.py:163 2015-08-07 17:37:37.804 13318 DEBUG nova.compute.manager [-] [instance: c1e3b756-49fe-4e6c-9016-c4eb2e06a2cd] Allocating IP information in the background. 
_allocate_network_async /opt/stack/new/nova/nova/compute/manager.py:1513 2015-08-07 17:37:37.805 13318 DEBUG oslo_concurrency.lockutils [-] Acquired semaphore "refresh_cache-c1e3b756-49fe-4e6c-9016-c4eb2e06a2cd" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:197 2015-08-07 17:37:38.512 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:37:38.608 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 1.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:37:38.636 13318 DEBUG nova.network.base_api [-] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Updating instance_info_cache with network_info: [VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.2'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:d0:30:ae', 'active': False, 'type': u'bridge', 'id': u'f50b3fec-3100-4377-bded-e2cd8fa38482', 'qbg_params': None})] update_instance_cache_with_nw_info /opt/stack/new/nova/nova/network/base_api.py:43 2015-08-07 17:37:38.688 13318 DEBUG oslo_concurrency.lockutils [-] Releasing semaphore "refresh_cache-aa0c0819-08a6-4a79-93eb-f58885697f5b" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:210 2015-08-07 17:37:38.688 13318 DEBUG nova.compute.manager [-] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Instance network_info: |[VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.2'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:d0:30:ae', 'active': False, 'type': 
u'bridge', 'id': u'f50b3fec-3100-4377-bded-e2cd8fa38482', 'qbg_params': None})]| _allocate_network_async /opt/stack/new/nova/nova/compute/manager.py:1531 2015-08-07 17:37:38.701 DEBUG nova.virt.xenapi.vmops [req-2108947d-ad5a-4bff-8a98-cdacf0fd5a34 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: c1e3b756-49fe-4e6c-9016-c4eb2e06a2cd] Block device information present: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} spawn /opt/stack/new/nova/nova/virt/xenapi/vmops.py:408 2015-08-07 17:37:38.722 DEBUG nova.virt.xenapi.vm_utils [req-2108947d-ad5a-4bff-8a98-cdacf0fd5a34 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Detected vhd format for image a69d8d55-1745-492f-be26-cc75d64fc94d determine_disk_image_type /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1633 2015-08-07 17:37:38.723 DEBUG nova.virt.xenapi.vmops [req-2108947d-ad5a-4bff-8a98-cdacf0fd5a34 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: c1e3b756-49fe-4e6c-9016-c4eb2e06a2cd] Updating progress to 10 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:37:39.086 DEBUG nova.virt.xenapi.vmops [req-2108947d-ad5a-4bff-8a98-cdacf0fd5a34 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: c1e3b756-49fe-4e6c-9016-c4eb2e06a2cd] block device info: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} _connect_cinder_volumes /opt/stack/new/nova/nova/virt/xenapi/vmops.py:376 2015-08-07 17:37:39.608 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:37:39.608 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:37:39.609 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Starting heal instance info cache _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5315 2015-08-07 17:37:39.609 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Rebuilding the list of instances to heal _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5319 2015-08-07 17:37:39.686 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: 962e3fc3-68c5-4019-beb9-2b9f939eb511] Skipping network cache update for instance because it is being deleted. _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5333 2015-08-07 17:37:39.688 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5329 2015-08-07 17:37:39.688 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: c1e3b756-49fe-4e6c-9016-c4eb2e06a2cd] Skipping network cache update for instance because it is Building. 
_heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5329 2015-08-07 17:37:39.688 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Didn't find any instances for network info cache update. _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5386 2015-08-07 17:37:39.690 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 3.91 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:37:39.752 DEBUG nova.compute.manager [req-50606ec8-a668-4b1f-9db5-7399bad42325 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] [instance: 962e3fc3-68c5-4019-beb9-2b9f939eb511] terminating bdm BlockDeviceMapping(boot_index=0,connection_info=None,created_at=2015-08-07T17:34:46Z,delete_on_termination=True,deleted=False,deleted_at=None,destination_type='local',device_name='/dev/xvda',device_type='disk',disk_bus=None,guest_format=None,id=43,image_id='a69d8d55-1745-492f-be26-cc75d64fc94d',instance=,instance_uuid=962e3fc3-68c5-4019-beb9-2b9f939eb511,no_device=False,snapshot_id=None,source_type='image',updated_at=2015-08-07T17:34:50Z,volume_id=None,volume_size=None) _cleanup_volumes /opt/stack/new/nova/nova/compute/manager.py:2272 2015-08-07 17:37:39.983 DEBUG oslo_concurrency.lockutils [req-50606ec8-a668-4b1f-9db5-7399bad42325 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Lock "compute_resources" acquired by "update_usage" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:37:39.984 DEBUG nova.objects.instance [req-50606ec8-a668-4b1f-9db5-7399bad42325 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Lazy-loading `numa_topology' on Instance uuid 962e3fc3-68c5-4019-beb9-2b9f939eb511 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:37:40.097 DEBUG oslo_concurrency.lockutils [req-50606ec8-a668-4b1f-9db5-7399bad42325 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Lock "compute_resources" released by "update_usage" :: held 0.114s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:37:40.428 DEBUG oslo_concurrency.lockutils [req-50606ec8-a668-4b1f-9db5-7399bad42325 tempest-MultipleCreateTestJSON-1784764366 tempest-MultipleCreateTestJSON-1402821900] Lock "962e3fc3-68c5-4019-beb9-2b9f939eb511" released by "do_terminate_instance" :: held 28.943s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:37:40.650 13318 DEBUG nova.network.base_api [-] [instance: c1e3b756-49fe-4e6c-9016-c4eb2e06a2cd] Updating instance_info_cache with network_info: [VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.4'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 
'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:a8:11:04', 'active': False, 'type': u'bridge', 'id': u'66477340-4466-4220-84c8-a9496f910453', 'qbg_params': None})] update_instance_cache_with_nw_info /opt/stack/new/nova/nova/network/base_api.py:43 2015-08-07 17:37:40.679 13318 DEBUG oslo_concurrency.lockutils [-] Releasing semaphore "refresh_cache-c1e3b756-49fe-4e6c-9016-c4eb2e06a2cd" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:210 2015-08-07 17:37:40.680 13318 DEBUG nova.compute.manager [-] [instance: c1e3b756-49fe-4e6c-9016-c4eb2e06a2cd] Instance network_info: |[VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.4'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:a8:11:04', 'active': False, 'type': u'bridge', 'id': u'66477340-4466-4220-84c8-a9496f910453', 'qbg_params': None})]| _allocate_network_async /opt/stack/new/nova/nova/compute/manager.py:1531 2015-08-07 17:37:41.392 DEBUG nova.virt.xenapi.vm_utils [req-8c8b8a16-6a57-4e6a-94ef-e76b7301b92d tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Cloned VDI OpaqueRef:e969628b-56e2-d41a-0fe2-1ea12447c0c5 from VDI OpaqueRef:f392c64e-7886-a0cc-d3a0-780674167307 _clone_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:571 2015-08-07 17:37:42.010 DEBUG oslo_concurrency.lockutils [req-8c8b8a16-6a57-4e6a-94ef-e76b7301b92d tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" released by "_create_cached_image_impl" :: held 4.780s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:37:42.010 INFO nova.virt.xenapi.vm_utils [req-8c8b8a16-6a57-4e6a-94ef-e76b7301b92d tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Image creation data, cacheable: True, downloaded: False duration: 4.79 secs for image a69d8d55-1745-492f-be26-cc75d64fc94d 2015-08-07 17:37:42.011 DEBUG oslo_concurrency.lockutils [req-2108947d-ad5a-4bff-8a98-cdacf0fd5a34 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" acquired by "_create_cached_image_impl" :: waited 2.915s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:37:43.077 DEBUG nova.virt.xenapi.vm_utils [req-2108947d-ad5a-4bff-8a98-cdacf0fd5a34 
tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Cloned VDI OpaqueRef:9d24c403-d5ff-477a-727f-a7af812eb0a3 from VDI OpaqueRef:f392c64e-7886-a0cc-d3a0-780674167307 _clone_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:571 2015-08-07 17:37:43.176 DEBUG nova.virt.xenapi.vmops [req-8c8b8a16-6a57-4e6a-94ef-e76b7301b92d tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Updating progress to 20 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:37:43.408 DEBUG nova.virt.xenapi.vmops [req-8c8b8a16-6a57-4e6a-94ef-e76b7301b92d tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Updating progress to 30 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:37:43.602 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:37:43.603 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /opt/stack/new/nova/nova/compute/manager.py:5976 2015-08-07 17:37:43.604 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 1.91 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:37:43.677 DEBUG nova.virt.xenapi.vmops [req-8c8b8a16-6a57-4e6a-94ef-e76b7301b92d tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Using PV kernel: True _create_vm_record /opt/stack/new/nova/nova/virt/xenapi/vmops.py:679 2015-08-07 17:37:43.688 DEBUG nova.virt.xenapi.vm_utils [req-8c8b8a16-6a57-4e6a-94ef-e76b7301b92d tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Created VM create_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:312 2015-08-07 17:37:43.689 DEBUG nova.virt.xenapi.vmops [req-8c8b8a16-6a57-4e6a-94ef-e76b7301b92d tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Updating progress to 40 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:37:43.721 DEBUG oslo_concurrency.lockutils [req-2108947d-ad5a-4bff-8a98-cdacf0fd5a34 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" released by "_create_cached_image_impl" :: held 1.710s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:37:43.722 INFO nova.virt.xenapi.vm_utils [req-2108947d-ad5a-4bff-8a98-cdacf0fd5a34 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Image creation data, cacheable: True, downloaded: False duration: 4.64 secs for image a69d8d55-1745-492f-be26-cc75d64fc94d 2015-08-07 17:37:43.932 DEBUG nova.virt.xenapi.vm_utils [req-8c8b8a16-6a57-4e6a-94ef-e76b7301b92d tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Creating disk-type VBD for VM OpaqueRef:cab06afa-6943-60a8-77cb-7aa635dc63cb, VDI 
OpaqueRef:e969628b-56e2-d41a-0fe2-1ea12447c0c5 ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:37:43.940 DEBUG nova.virt.xenapi.vm_utils [req-8c8b8a16-6a57-4e6a-94ef-e76b7301b92d tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Created VBD OpaqueRef:3b3e0116-ec77-7dd0-ebd1-8002aa9d6133 for VM OpaqueRef:cab06afa-6943-60a8-77cb-7aa635dc63cb, VDI OpaqueRef:e969628b-56e2-d41a-0fe2-1ea12447c0c5. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:37:44.373 DEBUG nova.virt.xenapi.vm_utils [req-8c8b8a16-6a57-4e6a-94ef-e76b7301b92d tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Created VDI OpaqueRef:6735e473-c9a4-0155-0fc4-b63c2cfc63e1 (config-2, 67108864, False) on OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99. create_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:509 2015-08-07 17:37:44.376 DEBUG nova.virt.xenapi.vm_utils [req-8c8b8a16-6a57-4e6a-94ef-e76b7301b92d tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Creating disk-type VBD for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:6735e473-c9a4-0155-0fc4-b63c2cfc63e1 ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:37:44.393 DEBUG nova.virt.xenapi.vm_utils [req-8c8b8a16-6a57-4e6a-94ef-e76b7301b92d tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Created VBD OpaqueRef:0e5e568c-0194-70cb-faf2-71518885b8f4 for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:6735e473-c9a4-0155-0fc4-b63c2cfc63e1. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:37:44.393 DEBUG nova.virt.xenapi.vm_utils [req-8c8b8a16-6a57-4e6a-94ef-e76b7301b92d tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Plugging VBD OpaqueRef:0e5e568c-0194-70cb-faf2-71518885b8f4 ... 
vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2191 2015-08-07 17:37:44.394 DEBUG oslo_concurrency.lockutils [req-8c8b8a16-6a57-4e6a-94ef-e76b7301b92d tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_plug" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:37:44.509 DEBUG nova.virt.xenapi.vmops [req-2108947d-ad5a-4bff-8a98-cdacf0fd5a34 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: c1e3b756-49fe-4e6c-9016-c4eb2e06a2cd] Updating progress to 20 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:37:44.751 DEBUG nova.virt.xenapi.vmops [req-2108947d-ad5a-4bff-8a98-cdacf0fd5a34 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: c1e3b756-49fe-4e6c-9016-c4eb2e06a2cd] Updating progress to 30 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:37:44.973 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.96 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:37:45.006 DEBUG nova.virt.xenapi.vmops [req-2108947d-ad5a-4bff-8a98-cdacf0fd5a34 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: c1e3b756-49fe-4e6c-9016-c4eb2e06a2cd] Using PV kernel: True _create_vm_record /opt/stack/new/nova/nova/virt/xenapi/vmops.py:679 2015-08-07 17:37:45.018 DEBUG nova.virt.xenapi.vm_utils [req-2108947d-ad5a-4bff-8a98-cdacf0fd5a34 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: c1e3b756-49fe-4e6c-9016-c4eb2e06a2cd] Created VM create_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:312 2015-08-07 17:37:45.019 DEBUG nova.virt.xenapi.vmops [req-2108947d-ad5a-4bff-8a98-cdacf0fd5a34 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: c1e3b756-49fe-4e6c-9016-c4eb2e06a2cd] Updating progress to 40 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:37:45.240 DEBUG nova.virt.xenapi.vm_utils [req-2108947d-ad5a-4bff-8a98-cdacf0fd5a34 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Creating disk-type VBD for VM OpaqueRef:0761bfe0-2888-9816-5f27-1d443e70c099, VDI OpaqueRef:9d24c403-d5ff-477a-727f-a7af812eb0a3 ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:37:45.250 DEBUG nova.virt.xenapi.vm_utils [req-2108947d-ad5a-4bff-8a98-cdacf0fd5a34 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Created VBD OpaqueRef:fb72fd05-c5f6-c2da-2e0f-951b58cf361f for VM OpaqueRef:0761bfe0-2888-9816-5f27-1d443e70c099, VDI OpaqueRef:9d24c403-d5ff-477a-727f-a7af812eb0a3. 
create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:37:45.519 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:37:45.520 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:37:45.521 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 3.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:37:45.640 DEBUG nova.virt.xenapi.vm_utils [req-2108947d-ad5a-4bff-8a98-cdacf0fd5a34 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Created VDI OpaqueRef:95190d45-6090-cf5b-f4cd-560250f88824 (config-2, 67108864, False) on OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99. create_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:509 2015-08-07 17:37:45.645 DEBUG nova.virt.xenapi.vm_utils [req-2108947d-ad5a-4bff-8a98-cdacf0fd5a34 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Creating disk-type VBD for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:95190d45-6090-cf5b-f4cd-560250f88824 ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:37:45.658 DEBUG nova.virt.xenapi.vm_utils [req-2108947d-ad5a-4bff-8a98-cdacf0fd5a34 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Created VBD OpaqueRef:56e11426-e8a5-85bc-4518-ed68dfcaa613 for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:95190d45-6090-cf5b-f4cd-560250f88824. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:37:45.659 DEBUG nova.virt.xenapi.vm_utils [req-2108947d-ad5a-4bff-8a98-cdacf0fd5a34 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Plugging VBD OpaqueRef:56e11426-e8a5-85bc-4518-ed68dfcaa613 ... vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2191 2015-08-07 17:37:45.703 DEBUG oslo_concurrency.lockutils [req-8c8b8a16-6a57-4e6a-94ef-e76b7301b92d tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_plug" :: held 1.309s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:37:45.704 DEBUG nova.virt.xenapi.vm_utils [req-8c8b8a16-6a57-4e6a-94ef-e76b7301b92d tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Plugging VBD OpaqueRef:0e5e568c-0194-70cb-faf2-71518885b8f4 done. 
vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2194 2015-08-07 17:37:45.705 DEBUG oslo_concurrency.lockutils [req-2108947d-ad5a-4bff-8a98-cdacf0fd5a34 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_plug" :: waited 0.045s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:37:45.714 DEBUG nova.virt.xenapi.vm_utils [req-8c8b8a16-6a57-4e6a-94ef-e76b7301b92d tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] VBD OpaqueRef:0e5e568c-0194-70cb-faf2-71518885b8f4 plugged as xvdc vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2197 2015-08-07 17:37:45.809 WARNING nova.virt.configdrive [req-8c8b8a16-6a57-4e6a-94ef-e76b7301b92d tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] The setting "always" will be deprecated in the Liberty version. Please use "True" instead 2015-08-07 17:37:45.811 DEBUG nova.objects.instance [req-8c8b8a16-6a57-4e6a-94ef-e76b7301b92d tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lazy-loading `ec2_ids' on Instance uuid aa0c0819-08a6-4a79-93eb-f58885697f5b obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:37:45.849 DEBUG oslo_concurrency.processutils [req-8c8b8a16-6a57-4e6a-94ef-e76b7301b92d tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Running cmd (subprocess): genisoimage -o /tmp/tmpYMC4pk/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmpets_vB execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:37:45.949 DEBUG oslo_concurrency.processutils [req-8c8b8a16-6a57-4e6a-94ef-e76b7301b92d tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] CMD "genisoimage -o /tmp/tmpYMC4pk/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmpets_vB" returned: 0 in 0.100s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:37:45.956 DEBUG oslo_concurrency.processutils [req-8c8b8a16-6a57-4e6a-94ef-e76b7301b92d tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmpYMC4pk/configdrive of=/dev/xvdc oflag=direct,sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:37:47.150 DEBUG oslo_concurrency.lockutils [req-2108947d-ad5a-4bff-8a98-cdacf0fd5a34 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_plug" :: held 1.445s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:37:47.153 DEBUG nova.virt.xenapi.vm_utils [req-2108947d-ad5a-4bff-8a98-cdacf0fd5a34 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Plugging VBD OpaqueRef:56e11426-e8a5-85bc-4518-ed68dfcaa613 done. 
vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2194 2015-08-07 17:37:47.157 DEBUG nova.virt.xenapi.vm_utils [req-2108947d-ad5a-4bff-8a98-cdacf0fd5a34 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] VBD OpaqueRef:56e11426-e8a5-85bc-4518-ed68dfcaa613 plugged as xvdd vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2197 2015-08-07 17:37:47.264 WARNING nova.virt.configdrive [req-2108947d-ad5a-4bff-8a98-cdacf0fd5a34 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] The setting "always" will be deprecated in the Liberty version. Please use "True" instead 2015-08-07 17:37:47.265 DEBUG nova.objects.instance [req-2108947d-ad5a-4bff-8a98-cdacf0fd5a34 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lazy-loading `ec2_ids' on Instance uuid c1e3b756-49fe-4e6c-9016-c4eb2e06a2cd obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:37:47.305 DEBUG oslo_concurrency.processutils [req-2108947d-ad5a-4bff-8a98-cdacf0fd5a34 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Running cmd (subprocess): genisoimage -o /tmp/tmpprSMUp/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmpdPRvRL execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:37:47.486 DEBUG oslo_concurrency.processutils [req-2108947d-ad5a-4bff-8a98-cdacf0fd5a34 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] CMD "genisoimage -o /tmp/tmpprSMUp/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmpdPRvRL" returned: 0 in 0.180s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:37:47.492 DEBUG oslo_concurrency.processutils [req-2108947d-ad5a-4bff-8a98-cdacf0fd5a34 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmpprSMUp/configdrive of=/dev/xvdd oflag=direct,sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:37:48.353 DEBUG oslo_concurrency.lockutils [req-fcbbd617-69ab-4e56-995d-7d5d14a23d19 tempest-ServerAddressesTestJSON-1855640315 tempest-ServerAddressesTestJSON-1854570065] Lock "fb35bfe9-6df7-4eee-8e97-1d149632b872" acquired by "_locked_do_build_and_run_instance" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:37:48.522 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:37:48.529 INFO nova.compute.manager [req-fcbbd617-69ab-4e56-995d-7d5d14a23d19 tempest-ServerAddressesTestJSON-1855640315 tempest-ServerAddressesTestJSON-1854570065] [instance: fb35bfe9-6df7-4eee-8e97-1d149632b872] Starting instance... 
2015-08-07 17:37:48.561 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Auditing locally available compute resources for node localhost.localdomain 2015-08-07 17:37:48.562 DEBUG nova.virt.xenapi.host [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Updating host stats update_status /opt/stack/new/nova/nova/virt/xenapi/host.py:234 2015-08-07 17:37:48.795 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" acquired by "do_scan" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:37:48.796 DEBUG nova.virt.xenapi.vm_utils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Scanning SR OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99 do_scan /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1839 2015-08-07 17:37:49.239 DEBUG oslo_concurrency.lockutils [req-fcbbd617-69ab-4e56-995d-7d5d14a23d19 tempest-ServerAddressesTestJSON-1855640315 tempest-ServerAddressesTestJSON-1854570065] Lock "compute_resources" acquired by "instance_claim" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:37:49.240 DEBUG nova.compute.resource_tracker [req-fcbbd617-69ab-4e56-995d-7d5d14a23d19 tempest-ServerAddressesTestJSON-1855640315 tempest-ServerAddressesTestJSON-1854570065] Memory overhead for 64 MB instance; 5 MB instance_claim /opt/stack/new/nova/nova/compute/resource_tracker.py:126 2015-08-07 17:37:49.253 INFO nova.compute.claims [req-fcbbd617-69ab-4e56-995d-7d5d14a23d19 tempest-ServerAddressesTestJSON-1855640315 tempest-ServerAddressesTestJSON-1854570065] [instance: fb35bfe9-6df7-4eee-8e97-1d149632b872] Attempting claim: memory 69 MB, disk 0 GB 2015-08-07 17:37:49.256 INFO nova.compute.claims [req-fcbbd617-69ab-4e56-995d-7d5d14a23d19 tempest-ServerAddressesTestJSON-1855640315 tempest-ServerAddressesTestJSON-1854570065] [instance: fb35bfe9-6df7-4eee-8e97-1d149632b872] Total memory: 8187 MB, used: 650.00 MB 2015-08-07 17:37:49.257 INFO nova.compute.claims [req-fcbbd617-69ab-4e56-995d-7d5d14a23d19 tempest-ServerAddressesTestJSON-1855640315 tempest-ServerAddressesTestJSON-1854570065] [instance: fb35bfe9-6df7-4eee-8e97-1d149632b872] memory limit: 12280.50 MB, free: 11630.50 MB 2015-08-07 17:37:49.257 INFO nova.compute.claims [req-fcbbd617-69ab-4e56-995d-7d5d14a23d19 tempest-ServerAddressesTestJSON-1855640315 tempest-ServerAddressesTestJSON-1854570065] [instance: fb35bfe9-6df7-4eee-8e97-1d149632b872] Total disk: 27 GB, used: 0.00 GB 2015-08-07 17:37:49.257 INFO nova.compute.claims [req-fcbbd617-69ab-4e56-995d-7d5d14a23d19 tempest-ServerAddressesTestJSON-1855640315 tempest-ServerAddressesTestJSON-1854570065] [instance: fb35bfe9-6df7-4eee-8e97-1d149632b872] disk limit not specified, defaulting to unlimited 2015-08-07 17:37:49.297 DEBUG nova.compute.resources.vcpu [req-fcbbd617-69ab-4e56-995d-7d5d14a23d19 tempest-ServerAddressesTestJSON-1855640315 tempest-ServerAddressesTestJSON-1854570065] Total CPUs: 8 VCPUs, used: 2.00 VCPUs test /opt/stack/new/nova/nova/compute/resources/vcpu.py:52 2015-08-07 17:37:49.298 DEBUG nova.compute.resources.vcpu [req-fcbbd617-69ab-4e56-995d-7d5d14a23d19 tempest-ServerAddressesTestJSON-1855640315 tempest-ServerAddressesTestJSON-1854570065] CPUs limit not specified, defaulting to unlimited test /opt/stack/new/nova/nova/compute/resources/vcpu.py:56 2015-08-07 17:37:49.299 INFO nova.compute.claims [req-fcbbd617-69ab-4e56-995d-7d5d14a23d19 
tempest-ServerAddressesTestJSON-1855640315 tempest-ServerAddressesTestJSON-1854570065] [instance: fb35bfe9-6df7-4eee-8e97-1d149632b872] Claim successful 2015-08-07 17:37:49.965 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" released by "do_scan" :: held 1.170s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:37:50.318 DEBUG oslo_concurrency.lockutils [req-fcbbd617-69ab-4e56-995d-7d5d14a23d19 tempest-ServerAddressesTestJSON-1855640315 tempest-ServerAddressesTestJSON-1854570065] Lock "compute_resources" released by "instance_claim" :: held 1.079s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:37:50.539 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor: free VCPUs: 2 _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:524 2015-08-07 17:37:50.540 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor: assignable PCI devices: [] _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:532 2015-08-07 17:37:50.542 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor/Node resource view: name=localhost.localdomain free_ram=989MB free_disk=16GB free_vcpus=2 pci_devices=[] _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:546 2015-08-07 17:37:50.542 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "compute_resources" acquired by "_update_available_resource" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:37:50.812 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Total usable vcpus: 8, total allocated vcpus: 6 2015-08-07 17:37:50.814 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Final resource view: name=localhost.localdomain phys_ram=8187MB used_ram=719MB phys_disk=27GB used_disk=0GB total_vcpus=8 used_vcpus=6 pci_stats=None 2015-08-07 17:37:51.230 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Compute_service record updated for devstack:localhost.localdomain 2015-08-07 17:37:51.230 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "compute_resources" released by "_update_available_resource" :: held 0.688s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:37:51.231 DEBUG oslo_concurrency.lockutils [req-fcbbd617-69ab-4e56-995d-7d5d14a23d19 tempest-ServerAddressesTestJSON-1855640315 tempest-ServerAddressesTestJSON-1854570065] Lock "compute_resources" acquired by "update_usage" :: waited 0.009s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:37:51.233 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 5.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:37:51.546 DEBUG oslo_concurrency.lockutils [req-fcbbd617-69ab-4e56-995d-7d5d14a23d19 tempest-ServerAddressesTestJSON-1855640315 tempest-ServerAddressesTestJSON-1854570065] Lock "compute_resources" released by "update_usage" :: held 0.315s inner 
/usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:37:51.549 DEBUG nova.compute.utils [req-fcbbd617-69ab-4e56-995d-7d5d14a23d19 tempest-ServerAddressesTestJSON-1855640315 tempest-ServerAddressesTestJSON-1854570065] Using /dev/xvd instead of None get_next_device_name /opt/stack/new/nova/nova/compute/utils.py:163 2015-08-07 17:37:51.554 13318 DEBUG nova.compute.manager [-] [instance: fb35bfe9-6df7-4eee-8e97-1d149632b872] Allocating IP information in the background. _allocate_network_async /opt/stack/new/nova/nova/compute/manager.py:1513 2015-08-07 17:37:51.555 13318 DEBUG oslo_concurrency.lockutils [-] Acquired semaphore "refresh_cache-fb35bfe9-6df7-4eee-8e97-1d149632b872" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:197 2015-08-07 17:37:52.658 DEBUG nova.virt.xenapi.vmops [req-fcbbd617-69ab-4e56-995d-7d5d14a23d19 tempest-ServerAddressesTestJSON-1855640315 tempest-ServerAddressesTestJSON-1854570065] [instance: fb35bfe9-6df7-4eee-8e97-1d149632b872] Block device information present: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} spawn /opt/stack/new/nova/nova/virt/xenapi/vmops.py:408 2015-08-07 17:37:52.672 DEBUG nova.virt.xenapi.vm_utils [req-fcbbd617-69ab-4e56-995d-7d5d14a23d19 tempest-ServerAddressesTestJSON-1855640315 tempest-ServerAddressesTestJSON-1854570065] Detected vhd format for image a69d8d55-1745-492f-be26-cc75d64fc94d determine_disk_image_type /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1633 2015-08-07 17:37:52.673 DEBUG nova.virt.xenapi.vmops [req-fcbbd617-69ab-4e56-995d-7d5d14a23d19 tempest-ServerAddressesTestJSON-1855640315 tempest-ServerAddressesTestJSON-1854570065] [instance: fb35bfe9-6df7-4eee-8e97-1d149632b872] Updating progress to 10 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:37:53.269 DEBUG nova.virt.xenapi.vmops [req-fcbbd617-69ab-4e56-995d-7d5d14a23d19 tempest-ServerAddressesTestJSON-1855640315 tempest-ServerAddressesTestJSON-1854570065] [instance: fb35bfe9-6df7-4eee-8e97-1d149632b872] block device info: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} _connect_cinder_volumes /opt/stack/new/nova/nova/virt/xenapi/vmops.py:376 2015-08-07 17:37:53.280 DEBUG oslo_concurrency.lockutils [req-fcbbd617-69ab-4e56-995d-7d5d14a23d19 tempest-ServerAddressesTestJSON-1855640315 tempest-ServerAddressesTestJSON-1854570065] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" acquired by "_create_cached_image_impl" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:37:54.504 13318 DEBUG nova.network.base_api [-] [instance: fb35bfe9-6df7-4eee-8e97-1d149632b872] Updating instance_info_cache with network_info: [VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.5'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': 
{u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:3b:ff:49', 'active': False, 'type': u'bridge', 'id': u'3cbd975c-f0f5-4f1f-8a6e-9e7c36738c7f', 'qbg_params': None})] update_instance_cache_with_nw_info /opt/stack/new/nova/nova/network/base_api.py:43 2015-08-07 17:37:54.534 13318 DEBUG oslo_concurrency.lockutils [-] Releasing semaphore "refresh_cache-fb35bfe9-6df7-4eee-8e97-1d149632b872" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:210 2015-08-07 17:37:54.535 13318 DEBUG nova.compute.manager [-] [instance: fb35bfe9-6df7-4eee-8e97-1d149632b872] Instance network_info: |[VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.5'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:3b:ff:49', 'active': False, 'type': u'bridge', 'id': u'3cbd975c-f0f5-4f1f-8a6e-9e7c36738c7f', 'qbg_params': None})]| _allocate_network_async /opt/stack/new/nova/nova/compute/manager.py:1531 2015-08-07 17:37:55.172 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.76 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:37:56.112 DEBUG oslo_concurrency.processutils [req-8c8b8a16-6a57-4e6a-94ef-e76b7301b92d tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmpYMC4pk/configdrive of=/dev/xvdc oflag=direct,sync" returned: 0 in 10.157s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:37:56.114 DEBUG oslo_concurrency.processutils [req-8c8b8a16-6a57-4e6a-94ef-e76b7301b92d tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:37:56.232 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:37:56.236 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 41.28 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:37:56.290 DEBUG nova.virt.xenapi.vm_utils 
[req-fcbbd617-69ab-4e56-995d-7d5d14a23d19 tempest-ServerAddressesTestJSON-1855640315 tempest-ServerAddressesTestJSON-1854570065] Cloned VDI OpaqueRef:c30d167e-f76e-294f-9cc8-88518c16cdb5 from VDI OpaqueRef:f392c64e-7886-a0cc-d3a0-780674167307 _clone_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:571 2015-08-07 17:37:56.744 DEBUG oslo_concurrency.processutils [req-8c8b8a16-6a57-4e6a-94ef-e76b7301b92d tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf sync" returned: 0 in 0.629s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:37:56.745 DEBUG nova.virt.xenapi.vm_utils [req-8c8b8a16-6a57-4e6a-94ef-e76b7301b92d tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Destroying VBD for VDI OpaqueRef:6735e473-c9a4-0155-0fc4-b63c2cfc63e1 ... vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2207 2015-08-07 17:37:56.746 DEBUG oslo_concurrency.lockutils [req-8c8b8a16-6a57-4e6a-94ef-e76b7301b92d tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_unplug" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:37:57.076 DEBUG oslo_concurrency.lockutils [req-fcbbd617-69ab-4e56-995d-7d5d14a23d19 tempest-ServerAddressesTestJSON-1855640315 tempest-ServerAddressesTestJSON-1854570065] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" released by "_create_cached_image_impl" :: held 3.796s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:37:57.079 INFO nova.virt.xenapi.vm_utils [req-fcbbd617-69ab-4e56-995d-7d5d14a23d19 tempest-ServerAddressesTestJSON-1855640315 tempest-ServerAddressesTestJSON-1854570065] Image creation data, cacheable: True, downloaded: False duration: 3.81 secs for image a69d8d55-1745-492f-be26-cc75d64fc94d 2015-08-07 17:37:57.528 DEBUG oslo_concurrency.processutils [req-2108947d-ad5a-4bff-8a98-cdacf0fd5a34 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmpprSMUp/configdrive of=/dev/xvdd oflag=direct,sync" returned: 0 in 10.036s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:37:57.530 DEBUG oslo_concurrency.processutils [req-2108947d-ad5a-4bff-8a98-cdacf0fd5a34 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:37:57.963 DEBUG oslo_concurrency.processutils [req-2108947d-ad5a-4bff-8a98-cdacf0fd5a34 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf sync" returned: 0 in 0.432s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:37:57.966 DEBUG nova.virt.xenapi.vm_utils [req-2108947d-ad5a-4bff-8a98-cdacf0fd5a34 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Destroying VBD for VDI OpaqueRef:95190d45-6090-cf5b-f4cd-560250f88824 ... 
vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2207 2015-08-07 17:37:57.989 DEBUG oslo_concurrency.lockutils [req-8c8b8a16-6a57-4e6a-94ef-e76b7301b92d tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_unplug" :: held 1.242s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:37:57.990 DEBUG oslo_concurrency.lockutils [req-2108947d-ad5a-4bff-8a98-cdacf0fd5a34 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_unplug" :: waited 0.023s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:37:58.007 DEBUG nova.virt.xenapi.vmops [req-fcbbd617-69ab-4e56-995d-7d5d14a23d19 tempest-ServerAddressesTestJSON-1855640315 tempest-ServerAddressesTestJSON-1854570065] [instance: fb35bfe9-6df7-4eee-8e97-1d149632b872] Updating progress to 20 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:37:58.021 DEBUG nova.virt.xenapi.vm_utils [req-8c8b8a16-6a57-4e6a-94ef-e76b7301b92d tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Destroying VBD for VDI OpaqueRef:6735e473-c9a4-0155-0fc4-b63c2cfc63e1 done. vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2215 2015-08-07 17:37:58.022 DEBUG nova.virt.xenapi.vm_utils [req-8c8b8a16-6a57-4e6a-94ef-e76b7301b92d tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Creating disk-type VBD for VM OpaqueRef:cab06afa-6943-60a8-77cb-7aa635dc63cb, VDI OpaqueRef:6735e473-c9a4-0155-0fc4-b63c2cfc63e1 ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:37:58.035 DEBUG nova.virt.xenapi.vm_utils [req-8c8b8a16-6a57-4e6a-94ef-e76b7301b92d tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Created VBD OpaqueRef:000860c9-749a-b906-e01c-13f882776709 for VM OpaqueRef:cab06afa-6943-60a8-77cb-7aa635dc63cb, VDI OpaqueRef:6735e473-c9a4-0155-0fc4-b63c2cfc63e1. 
create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:37:58.036 DEBUG nova.objects.instance [req-8c8b8a16-6a57-4e6a-94ef-e76b7301b92d tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lazy-loading `pci_devices' on Instance uuid aa0c0819-08a6-4a79-93eb-f58885697f5b obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:37:58.160 DEBUG nova.virt.xenapi.vmops [req-8c8b8a16-6a57-4e6a-94ef-e76b7301b92d tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Updating progress to 50 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:37:58.310 DEBUG nova.virt.xenapi.vmops [req-fcbbd617-69ab-4e56-995d-7d5d14a23d19 tempest-ServerAddressesTestJSON-1855640315 tempest-ServerAddressesTestJSON-1854570065] [instance: fb35bfe9-6df7-4eee-8e97-1d149632b872] Updating progress to 30 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:37:58.449 DEBUG oslo_concurrency.lockutils [req-8c8b8a16-6a57-4e6a-94ef-e76b7301b92d tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "xenstore-aa0c0819-08a6-4a79-93eb-f58885697f5b" acquired by "store_meta" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:37:58.450 DEBUG oslo_concurrency.lockutils [req-8c8b8a16-6a57-4e6a-94ef-e76b7301b92d tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "xenstore-aa0c0819-08a6-4a79-93eb-f58885697f5b" released by "store_meta" :: held 0.001s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:37:58.451 DEBUG oslo_concurrency.lockutils [req-8c8b8a16-6a57-4e6a-94ef-e76b7301b92d tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "xenstore-aa0c0819-08a6-4a79-93eb-f58885697f5b" acquired by "store_auto_disk_config" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:37:58.461 DEBUG oslo_concurrency.lockutils [req-8c8b8a16-6a57-4e6a-94ef-e76b7301b92d tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "xenstore-aa0c0819-08a6-4a79-93eb-f58885697f5b" released by "store_auto_disk_config" :: held 0.010s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:37:58.462 DEBUG nova.virt.xenapi.vmops [req-8c8b8a16-6a57-4e6a-94ef-e76b7301b92d tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Injecting hostname (tempest.common.compute-instance-822491913) into xenstore _inject_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1940 2015-08-07 17:37:58.463 DEBUG oslo_concurrency.lockutils [req-8c8b8a16-6a57-4e6a-94ef-e76b7301b92d tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "xenstore-aa0c0819-08a6-4a79-93eb-f58885697f5b" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:37:58.472 DEBUG oslo_concurrency.lockutils [req-8c8b8a16-6a57-4e6a-94ef-e76b7301b92d tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "xenstore-aa0c0819-08a6-4a79-93eb-f58885697f5b" released by "update_hostname" :: held 0.010s inner 
/usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:37:58.473 DEBUG nova.virt.xenapi.vmops [req-8c8b8a16-6a57-4e6a-94ef-e76b7301b92d tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Injecting network info to xenstore inject_network_info /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1869 2015-08-07 17:37:58.474 DEBUG oslo_concurrency.lockutils [req-8c8b8a16-6a57-4e6a-94ef-e76b7301b92d tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "xenstore-aa0c0819-08a6-4a79-93eb-f58885697f5b" acquired by "update_nwinfo" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:37:58.544 DEBUG nova.virt.xenapi.vmops [req-fcbbd617-69ab-4e56-995d-7d5d14a23d19 tempest-ServerAddressesTestJSON-1855640315 tempest-ServerAddressesTestJSON-1854570065] [instance: fb35bfe9-6df7-4eee-8e97-1d149632b872] Using PV kernel: True _create_vm_record /opt/stack/new/nova/nova/virt/xenapi/vmops.py:679 2015-08-07 17:37:58.557 DEBUG nova.virt.xenapi.vm_utils [req-fcbbd617-69ab-4e56-995d-7d5d14a23d19 tempest-ServerAddressesTestJSON-1855640315 tempest-ServerAddressesTestJSON-1854570065] [instance: fb35bfe9-6df7-4eee-8e97-1d149632b872] Created VM create_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:312 2015-08-07 17:37:58.558 DEBUG nova.virt.xenapi.vmops [req-fcbbd617-69ab-4e56-995d-7d5d14a23d19 tempest-ServerAddressesTestJSON-1855640315 tempest-ServerAddressesTestJSON-1854570065] [instance: fb35bfe9-6df7-4eee-8e97-1d149632b872] Updating progress to 40 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:37:58.713 DEBUG oslo_concurrency.lockutils [req-8c8b8a16-6a57-4e6a-94ef-e76b7301b92d tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "xenstore-aa0c0819-08a6-4a79-93eb-f58885697f5b" released by "update_nwinfo" :: held 0.239s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:37:58.714 DEBUG nova.virt.xenapi.vmops [req-8c8b8a16-6a57-4e6a-94ef-e76b7301b92d tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Updating progress to 60 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:37:58.850 DEBUG nova.virt.xenapi.vm_utils [req-fcbbd617-69ab-4e56-995d-7d5d14a23d19 tempest-ServerAddressesTestJSON-1855640315 tempest-ServerAddressesTestJSON-1854570065] Creating disk-type VBD for VM OpaqueRef:a714109c-9440-132f-f1e3-a115d4c4f571, VDI OpaqueRef:c30d167e-f76e-294f-9cc8-88518c16cdb5 ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:37:58.862 DEBUG nova.virt.xenapi.vm_utils [req-fcbbd617-69ab-4e56-995d-7d5d14a23d19 tempest-ServerAddressesTestJSON-1855640315 tempest-ServerAddressesTestJSON-1854570065] Created VBD OpaqueRef:a742042d-9cb1-09b3-d600-f52e921eb3da for VM OpaqueRef:a714109c-9440-132f-f1e3-a115d4c4f571, VDI OpaqueRef:c30d167e-f76e-294f-9cc8-88518c16cdb5. 
create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:37:59.007 DEBUG nova.virt.xenapi.vmops [req-8c8b8a16-6a57-4e6a-94ef-e76b7301b92d tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Creating vifs _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1892 2015-08-07 17:37:59.027 DEBUG nova.virt.xenapi.vmops [req-8c8b8a16-6a57-4e6a-94ef-e76b7301b92d tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Creating VIF for network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1902 2015-08-07 17:37:59.036 DEBUG nova.virt.xenapi.vmops [req-8c8b8a16-6a57-4e6a-94ef-e76b7301b92d tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Created VIF OpaqueRef:b36ff9e5-8b13-fb8e-3d59-da0354811b55, network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1906 2015-08-07 17:37:59.036 DEBUG nova.virt.xenapi.vmops [req-8c8b8a16-6a57-4e6a-94ef-e76b7301b92d tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Updating progress to 70 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:37:59.055 DEBUG oslo_concurrency.lockutils [req-2108947d-ad5a-4bff-8a98-cdacf0fd5a34 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_unplug" :: held 1.064s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:37:59.063 DEBUG nova.virt.xenapi.vm_utils [req-2108947d-ad5a-4bff-8a98-cdacf0fd5a34 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Destroying VBD for VDI OpaqueRef:95190d45-6090-cf5b-f4cd-560250f88824 done. vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2215 2015-08-07 17:37:59.064 DEBUG nova.virt.xenapi.vm_utils [req-2108947d-ad5a-4bff-8a98-cdacf0fd5a34 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Creating disk-type VBD for VM OpaqueRef:0761bfe0-2888-9816-5f27-1d443e70c099, VDI OpaqueRef:95190d45-6090-cf5b-f4cd-560250f88824 ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:37:59.073 DEBUG nova.virt.xenapi.vm_utils [req-2108947d-ad5a-4bff-8a98-cdacf0fd5a34 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Created VBD OpaqueRef:234c385b-5bc4-1330-f399-af58b616df2d for VM OpaqueRef:0761bfe0-2888-9816-5f27-1d443e70c099, VDI OpaqueRef:95190d45-6090-cf5b-f4cd-560250f88824. 
create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:37:59.073 DEBUG nova.objects.instance [req-2108947d-ad5a-4bff-8a98-cdacf0fd5a34 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lazy-loading `pci_devices' on Instance uuid c1e3b756-49fe-4e6c-9016-c4eb2e06a2cd obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:37:59.198 DEBUG nova.virt.xenapi.vmops [req-2108947d-ad5a-4bff-8a98-cdacf0fd5a34 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: c1e3b756-49fe-4e6c-9016-c4eb2e06a2cd] Updating progress to 50 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:37:59.336 DEBUG nova.virt.xenapi.vmops [req-8c8b8a16-6a57-4e6a-94ef-e76b7301b92d tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Starting instance _start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:327 2015-08-07 17:37:59.338 DEBUG nova.virt.xenapi.vm_utils [req-fcbbd617-69ab-4e56-995d-7d5d14a23d19 tempest-ServerAddressesTestJSON-1855640315 tempest-ServerAddressesTestJSON-1854570065] Created VDI OpaqueRef:bab8fdf3-df5c-98ef-a672-3f08a7c439ef (config-2, 67108864, False) on OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99. create_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:509 2015-08-07 17:37:59.346 DEBUG nova.virt.xenapi.vm_utils [req-fcbbd617-69ab-4e56-995d-7d5d14a23d19 tempest-ServerAddressesTestJSON-1855640315 tempest-ServerAddressesTestJSON-1854570065] Creating disk-type VBD for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:bab8fdf3-df5c-98ef-a672-3f08a7c439ef ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:37:59.382 DEBUG nova.virt.xenapi.vm_utils [req-fcbbd617-69ab-4e56-995d-7d5d14a23d19 tempest-ServerAddressesTestJSON-1855640315 tempest-ServerAddressesTestJSON-1854570065] Created VBD OpaqueRef:f9b1c208-4556-a8d1-061a-0a5b1bfcd032 for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:bab8fdf3-df5c-98ef-a672-3f08a7c439ef. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:37:59.382 DEBUG nova.virt.xenapi.vm_utils [req-fcbbd617-69ab-4e56-995d-7d5d14a23d19 tempest-ServerAddressesTestJSON-1855640315 tempest-ServerAddressesTestJSON-1854570065] Plugging VBD OpaqueRef:f9b1c208-4556-a8d1-061a-0a5b1bfcd032 ... 
vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2191 2015-08-07 17:37:59.383 DEBUG oslo_concurrency.lockutils [req-fcbbd617-69ab-4e56-995d-7d5d14a23d19 tempest-ServerAddressesTestJSON-1855640315 tempest-ServerAddressesTestJSON-1854570065] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_plug" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:37:59.447 DEBUG oslo_concurrency.lockutils [req-2108947d-ad5a-4bff-8a98-cdacf0fd5a34 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "xenstore-c1e3b756-49fe-4e6c-9016-c4eb2e06a2cd" acquired by "store_meta" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:37:59.448 DEBUG oslo_concurrency.lockutils [req-2108947d-ad5a-4bff-8a98-cdacf0fd5a34 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "xenstore-c1e3b756-49fe-4e6c-9016-c4eb2e06a2cd" released by "store_meta" :: held 0.001s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:37:59.448 DEBUG oslo_concurrency.lockutils [req-2108947d-ad5a-4bff-8a98-cdacf0fd5a34 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "xenstore-c1e3b756-49fe-4e6c-9016-c4eb2e06a2cd" acquired by "store_auto_disk_config" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:37:59.460 DEBUG oslo_concurrency.lockutils [req-2108947d-ad5a-4bff-8a98-cdacf0fd5a34 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "xenstore-c1e3b756-49fe-4e6c-9016-c4eb2e06a2cd" released by "store_auto_disk_config" :: held 0.012s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:37:59.461 DEBUG nova.virt.xenapi.vmops [req-2108947d-ad5a-4bff-8a98-cdacf0fd5a34 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: c1e3b756-49fe-4e6c-9016-c4eb2e06a2cd] Injecting hostname (tempest.common.compute-instance-1753264082) into xenstore _inject_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1940 2015-08-07 17:37:59.461 DEBUG oslo_concurrency.lockutils [req-2108947d-ad5a-4bff-8a98-cdacf0fd5a34 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "xenstore-c1e3b756-49fe-4e6c-9016-c4eb2e06a2cd" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:37:59.485 DEBUG oslo_concurrency.lockutils [req-2108947d-ad5a-4bff-8a98-cdacf0fd5a34 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "xenstore-c1e3b756-49fe-4e6c-9016-c4eb2e06a2cd" released by "update_hostname" :: held 0.023s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:37:59.486 DEBUG nova.virt.xenapi.vmops [req-2108947d-ad5a-4bff-8a98-cdacf0fd5a34 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: c1e3b756-49fe-4e6c-9016-c4eb2e06a2cd] Injecting network info to xenstore inject_network_info /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1869 2015-08-07 17:37:59.488 DEBUG oslo_concurrency.lockutils [req-2108947d-ad5a-4bff-8a98-cdacf0fd5a34 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "xenstore-c1e3b756-49fe-4e6c-9016-c4eb2e06a2cd" 
acquired by "update_nwinfo" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:37:59.758 DEBUG oslo_concurrency.lockutils [req-2108947d-ad5a-4bff-8a98-cdacf0fd5a34 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "xenstore-c1e3b756-49fe-4e6c-9016-c4eb2e06a2cd" released by "update_nwinfo" :: held 0.270s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:37:59.759 DEBUG nova.virt.xenapi.vmops [req-2108947d-ad5a-4bff-8a98-cdacf0fd5a34 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: c1e3b756-49fe-4e6c-9016-c4eb2e06a2cd] Updating progress to 60 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:38:00.047 DEBUG nova.virt.xenapi.vmops [req-2108947d-ad5a-4bff-8a98-cdacf0fd5a34 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: c1e3b756-49fe-4e6c-9016-c4eb2e06a2cd] Creating vifs _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1892 2015-08-07 17:38:00.057 DEBUG nova.virt.xenapi.vmops [req-2108947d-ad5a-4bff-8a98-cdacf0fd5a34 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: c1e3b756-49fe-4e6c-9016-c4eb2e06a2cd] Creating VIF for network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1902 2015-08-07 17:38:00.067 DEBUG nova.virt.xenapi.vmops [req-2108947d-ad5a-4bff-8a98-cdacf0fd5a34 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: c1e3b756-49fe-4e6c-9016-c4eb2e06a2cd] Created VIF OpaqueRef:cacf07ea-b79f-d671-c5f6-0074006c8451, network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1906 2015-08-07 17:38:00.067 DEBUG nova.virt.xenapi.vmops [req-2108947d-ad5a-4bff-8a98-cdacf0fd5a34 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: c1e3b756-49fe-4e6c-9016-c4eb2e06a2cd] Updating progress to 70 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:38:00.337 DEBUG nova.virt.xenapi.vmops [req-2108947d-ad5a-4bff-8a98-cdacf0fd5a34 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: c1e3b756-49fe-4e6c-9016-c4eb2e06a2cd] Starting instance _start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:327 2015-08-07 17:38:03.687 DEBUG oslo_concurrency.lockutils [req-fcbbd617-69ab-4e56-995d-7d5d14a23d19 tempest-ServerAddressesTestJSON-1855640315 tempest-ServerAddressesTestJSON-1854570065] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_plug" :: held 4.304s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:38:03.688 DEBUG nova.virt.xenapi.vm_utils [req-fcbbd617-69ab-4e56-995d-7d5d14a23d19 tempest-ServerAddressesTestJSON-1855640315 tempest-ServerAddressesTestJSON-1854570065] Plugging VBD OpaqueRef:f9b1c208-4556-a8d1-061a-0a5b1bfcd032 done. 
vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2194 2015-08-07 17:38:03.692 DEBUG nova.virt.xenapi.vm_utils [req-fcbbd617-69ab-4e56-995d-7d5d14a23d19 tempest-ServerAddressesTestJSON-1855640315 tempest-ServerAddressesTestJSON-1854570065] VBD OpaqueRef:f9b1c208-4556-a8d1-061a-0a5b1bfcd032 plugged as xvdc vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2197 2015-08-07 17:38:03.784 WARNING nova.virt.configdrive [req-fcbbd617-69ab-4e56-995d-7d5d14a23d19 tempest-ServerAddressesTestJSON-1855640315 tempest-ServerAddressesTestJSON-1854570065] The setting "always" will be deprecated in the Liberty version. Please use "True" instead 2015-08-07 17:38:03.785 DEBUG nova.objects.instance [req-fcbbd617-69ab-4e56-995d-7d5d14a23d19 tempest-ServerAddressesTestJSON-1855640315 tempest-ServerAddressesTestJSON-1854570065] Lazy-loading `ec2_ids' on Instance uuid fb35bfe9-6df7-4eee-8e97-1d149632b872 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:38:03.841 DEBUG oslo_concurrency.processutils [req-fcbbd617-69ab-4e56-995d-7d5d14a23d19 tempest-ServerAddressesTestJSON-1855640315 tempest-ServerAddressesTestJSON-1854570065] Running cmd (subprocess): genisoimage -o /tmp/tmp7rrkGX/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmpbsItF2 execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:38:03.981 DEBUG oslo_concurrency.processutils [req-fcbbd617-69ab-4e56-995d-7d5d14a23d19 tempest-ServerAddressesTestJSON-1855640315 tempest-ServerAddressesTestJSON-1854570065] CMD "genisoimage -o /tmp/tmp7rrkGX/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmpbsItF2" returned: 0 in 0.140s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:38:03.988 DEBUG oslo_concurrency.processutils [req-fcbbd617-69ab-4e56-995d-7d5d14a23d19 tempest-ServerAddressesTestJSON-1855640315 tempest-ServerAddressesTestJSON-1854570065] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmp7rrkGX/configdrive of=/dev/xvdc oflag=direct,sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:38:05.229 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.71 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:38:13.601 DEBUG oslo_concurrency.processutils [req-fcbbd617-69ab-4e56-995d-7d5d14a23d19 tempest-ServerAddressesTestJSON-1855640315 tempest-ServerAddressesTestJSON-1854570065] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmp7rrkGX/configdrive of=/dev/xvdc oflag=direct,sync" returned: 0 in 9.613s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:38:13.603 DEBUG oslo_concurrency.processutils [req-fcbbd617-69ab-4e56-995d-7d5d14a23d19 tempest-ServerAddressesTestJSON-1855640315 tempest-ServerAddressesTestJSON-1854570065] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:38:14.166 DEBUG nova.virt.xenapi.vmops [req-8c8b8a16-6a57-4e6a-94ef-e76b7301b92d tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Waiting for instance state to become running 
_wait_for_instance_to_start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:777 2015-08-07 17:38:14.232 DEBUG nova.virt.xenapi.vmops [req-8c8b8a16-6a57-4e6a-94ef-e76b7301b92d tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Updating progress to 80 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:38:14.686 DEBUG oslo_concurrency.processutils [req-fcbbd617-69ab-4e56-995d-7d5d14a23d19 tempest-ServerAddressesTestJSON-1855640315 tempest-ServerAddressesTestJSON-1854570065] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf sync" returned: 0 in 1.083s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:38:14.687 DEBUG nova.virt.xenapi.vm_utils [req-fcbbd617-69ab-4e56-995d-7d5d14a23d19 tempest-ServerAddressesTestJSON-1855640315 tempest-ServerAddressesTestJSON-1854570065] Destroying VBD for VDI OpaqueRef:bab8fdf3-df5c-98ef-a672-3f08a7c439ef ... vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2207 2015-08-07 17:38:14.688 DEBUG oslo_concurrency.lockutils [req-fcbbd617-69ab-4e56-995d-7d5d14a23d19 tempest-ServerAddressesTestJSON-1855640315 tempest-ServerAddressesTestJSON-1854570065] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_unplug" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:38:15.141 DEBUG nova.virt.xenapi.vmops [req-8c8b8a16-6a57-4e6a-94ef-e76b7301b92d tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Skip agent setup, not enabled. _configure_new_instance_with_agent /opt/stack/new/nova/nova/virt/xenapi/vmops.py:788 2015-08-07 17:38:15.144 DEBUG nova.virt.xenapi.vmops [req-8c8b8a16-6a57-4e6a-94ef-e76b7301b92d tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Removing hostname from xenstore _remove_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1949 2015-08-07 17:38:15.146 DEBUG oslo_concurrency.lockutils [req-8c8b8a16-6a57-4e6a-94ef-e76b7301b92d tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "xenstore-aa0c0819-08a6-4a79-93eb-f58885697f5b" acquired by "update_hostname" :: waited 0.002s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:38:15.215 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.72 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:38:15.233 DEBUG oslo_concurrency.lockutils [req-8c8b8a16-6a57-4e6a-94ef-e76b7301b92d tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "xenstore-aa0c0819-08a6-4a79-93eb-f58885697f5b" released by "update_hostname" :: held 0.087s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:38:15.234 DEBUG nova.virt.xenapi.vmops [req-8c8b8a16-6a57-4e6a-94ef-e76b7301b92d tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Updating progress to 90 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:38:15.975 DEBUG nova.virt.xenapi.vmops [req-2108947d-ad5a-4bff-8a98-cdacf0fd5a34 tempest-DeleteServersTestJSON-1440721679 
tempest-DeleteServersTestJSON-285208709] [instance: c1e3b756-49fe-4e6c-9016-c4eb2e06a2cd] Waiting for instance state to become running _wait_for_instance_to_start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:777 2015-08-07 17:38:16.110 DEBUG nova.virt.xenapi.vmops [req-2108947d-ad5a-4bff-8a98-cdacf0fd5a34 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: c1e3b756-49fe-4e6c-9016-c4eb2e06a2cd] Updating progress to 80 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:38:16.125 DEBUG nova.virt.xenapi.vmops [req-8c8b8a16-6a57-4e6a-94ef-e76b7301b92d tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Updating progress to 100 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:38:16.616 DEBUG nova.virt.xenapi.vmops [req-2108947d-ad5a-4bff-8a98-cdacf0fd5a34 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: c1e3b756-49fe-4e6c-9016-c4eb2e06a2cd] Skip agent setup, not enabled. _configure_new_instance_with_agent /opt/stack/new/nova/nova/virt/xenapi/vmops.py:788 2015-08-07 17:38:16.617 DEBUG nova.virt.xenapi.vmops [req-2108947d-ad5a-4bff-8a98-cdacf0fd5a34 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: c1e3b756-49fe-4e6c-9016-c4eb2e06a2cd] Removing hostname from xenstore _remove_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1949 2015-08-07 17:38:16.618 DEBUG oslo_concurrency.lockutils [req-2108947d-ad5a-4bff-8a98-cdacf0fd5a34 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "xenstore-c1e3b756-49fe-4e6c-9016-c4eb2e06a2cd" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:38:16.632 DEBUG oslo_concurrency.lockutils [req-2108947d-ad5a-4bff-8a98-cdacf0fd5a34 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "xenstore-c1e3b756-49fe-4e6c-9016-c4eb2e06a2cd" released by "update_hostname" :: held 0.014s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:38:16.633 DEBUG nova.virt.xenapi.vmops [req-2108947d-ad5a-4bff-8a98-cdacf0fd5a34 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: c1e3b756-49fe-4e6c-9016-c4eb2e06a2cd] Updating progress to 90 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:38:16.853 DEBUG nova.compute.manager [req-8c8b8a16-6a57-4e6a-94ef-e76b7301b92d tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Checking state _get_power_state /opt/stack/new/nova/nova/compute/manager.py:1296 2015-08-07 17:38:17.123 DEBUG nova.virt.xenapi.vmops [req-2108947d-ad5a-4bff-8a98-cdacf0fd5a34 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: c1e3b756-49fe-4e6c-9016-c4eb2e06a2cd] Updating progress to 100 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:38:17.309 DEBUG oslo_concurrency.lockutils [req-fcbbd617-69ab-4e56-995d-7d5d14a23d19 tempest-ServerAddressesTestJSON-1855640315 tempest-ServerAddressesTestJSON-1854570065] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_unplug" :: held 2.621s inner 
/usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:38:17.319 DEBUG nova.virt.xenapi.vm_utils [req-fcbbd617-69ab-4e56-995d-7d5d14a23d19 tempest-ServerAddressesTestJSON-1855640315 tempest-ServerAddressesTestJSON-1854570065] Destroying VBD for VDI OpaqueRef:bab8fdf3-df5c-98ef-a672-3f08a7c439ef done. vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2215 2015-08-07 17:38:17.320 DEBUG nova.virt.xenapi.vm_utils [req-fcbbd617-69ab-4e56-995d-7d5d14a23d19 tempest-ServerAddressesTestJSON-1855640315 tempest-ServerAddressesTestJSON-1854570065] Creating disk-type VBD for VM OpaqueRef:a714109c-9440-132f-f1e3-a115d4c4f571, VDI OpaqueRef:bab8fdf3-df5c-98ef-a672-3f08a7c439ef ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:38:17.336 DEBUG nova.virt.xenapi.vm_utils [req-fcbbd617-69ab-4e56-995d-7d5d14a23d19 tempest-ServerAddressesTestJSON-1855640315 tempest-ServerAddressesTestJSON-1854570065] Created VBD OpaqueRef:b323ecc6-20bf-c1a7-2714-3ea3f551bc6d for VM OpaqueRef:a714109c-9440-132f-f1e3-a115d4c4f571, VDI OpaqueRef:bab8fdf3-df5c-98ef-a672-3f08a7c439ef. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:38:17.344 DEBUG nova.objects.instance [req-fcbbd617-69ab-4e56-995d-7d5d14a23d19 tempest-ServerAddressesTestJSON-1855640315 tempest-ServerAddressesTestJSON-1854570065] Lazy-loading `pci_devices' on Instance uuid fb35bfe9-6df7-4eee-8e97-1d149632b872 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:38:17.364 DEBUG oslo_concurrency.lockutils [req-8c8b8a16-6a57-4e6a-94ef-e76b7301b92d tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "aa0c0819-08a6-4a79-93eb-f58885697f5b" released by "_locked_do_build_and_run_instance" :: held 42.166s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:38:17.419 DEBUG nova.compute.manager [req-2108947d-ad5a-4bff-8a98-cdacf0fd5a34 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: c1e3b756-49fe-4e6c-9016-c4eb2e06a2cd] Checking state _get_power_state /opt/stack/new/nova/nova/compute/manager.py:1296 2015-08-07 17:38:17.505 DEBUG nova.virt.xenapi.vmops [req-fcbbd617-69ab-4e56-995d-7d5d14a23d19 tempest-ServerAddressesTestJSON-1855640315 tempest-ServerAddressesTestJSON-1854570065] [instance: fb35bfe9-6df7-4eee-8e97-1d149632b872] Updating progress to 50 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:38:17.975 DEBUG oslo_concurrency.lockutils [req-fcbbd617-69ab-4e56-995d-7d5d14a23d19 tempest-ServerAddressesTestJSON-1855640315 tempest-ServerAddressesTestJSON-1854570065] Lock "xenstore-fb35bfe9-6df7-4eee-8e97-1d149632b872" acquired by "store_meta" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:38:17.976 DEBUG oslo_concurrency.lockutils [req-fcbbd617-69ab-4e56-995d-7d5d14a23d19 tempest-ServerAddressesTestJSON-1855640315 tempest-ServerAddressesTestJSON-1854570065] Lock "xenstore-fb35bfe9-6df7-4eee-8e97-1d149632b872" released by "store_meta" :: held 0.001s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:38:17.977 DEBUG oslo_concurrency.lockutils [req-fcbbd617-69ab-4e56-995d-7d5d14a23d19 tempest-ServerAddressesTestJSON-1855640315 tempest-ServerAddressesTestJSON-1854570065] Lock "xenstore-fb35bfe9-6df7-4eee-8e97-1d149632b872" acquired by "store_auto_disk_config" :: waited 0.000s inner 
/usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:38:17.995 DEBUG oslo_concurrency.lockutils [req-fcbbd617-69ab-4e56-995d-7d5d14a23d19 tempest-ServerAddressesTestJSON-1855640315 tempest-ServerAddressesTestJSON-1854570065] Lock "xenstore-fb35bfe9-6df7-4eee-8e97-1d149632b872" released by "store_auto_disk_config" :: held 0.019s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:38:18.016 DEBUG nova.virt.xenapi.vmops [req-fcbbd617-69ab-4e56-995d-7d5d14a23d19 tempest-ServerAddressesTestJSON-1855640315 tempest-ServerAddressesTestJSON-1854570065] [instance: fb35bfe9-6df7-4eee-8e97-1d149632b872] Injecting hostname (tempest.common.compute-instance-1801093627) into xenstore _inject_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1940 2015-08-07 17:38:18.017 DEBUG oslo_concurrency.lockutils [req-fcbbd617-69ab-4e56-995d-7d5d14a23d19 tempest-ServerAddressesTestJSON-1855640315 tempest-ServerAddressesTestJSON-1854570065] Lock "xenstore-fb35bfe9-6df7-4eee-8e97-1d149632b872" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:38:18.032 DEBUG oslo_concurrency.lockutils [req-fcbbd617-69ab-4e56-995d-7d5d14a23d19 tempest-ServerAddressesTestJSON-1855640315 tempest-ServerAddressesTestJSON-1854570065] Lock "xenstore-fb35bfe9-6df7-4eee-8e97-1d149632b872" released by "update_hostname" :: held 0.015s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:38:18.033 DEBUG nova.virt.xenapi.vmops [req-fcbbd617-69ab-4e56-995d-7d5d14a23d19 tempest-ServerAddressesTestJSON-1855640315 tempest-ServerAddressesTestJSON-1854570065] [instance: fb35bfe9-6df7-4eee-8e97-1d149632b872] Injecting network info to xenstore inject_network_info /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1869 2015-08-07 17:38:18.034 DEBUG oslo_concurrency.lockutils [req-fcbbd617-69ab-4e56-995d-7d5d14a23d19 tempest-ServerAddressesTestJSON-1855640315 tempest-ServerAddressesTestJSON-1854570065] Lock "xenstore-fb35bfe9-6df7-4eee-8e97-1d149632b872" acquired by "update_nwinfo" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:38:18.043 DEBUG oslo_concurrency.lockutils [req-2108947d-ad5a-4bff-8a98-cdacf0fd5a34 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "c1e3b756-49fe-4e6c-9016-c4eb2e06a2cd" released by "_locked_do_build_and_run_instance" :: held 41.346s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:38:18.261 DEBUG oslo_concurrency.lockutils [req-fcbbd617-69ab-4e56-995d-7d5d14a23d19 tempest-ServerAddressesTestJSON-1855640315 tempest-ServerAddressesTestJSON-1854570065] Lock "xenstore-fb35bfe9-6df7-4eee-8e97-1d149632b872" released by "update_nwinfo" :: held 0.227s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:38:18.262 DEBUG nova.virt.xenapi.vmops [req-fcbbd617-69ab-4e56-995d-7d5d14a23d19 tempest-ServerAddressesTestJSON-1855640315 tempest-ServerAddressesTestJSON-1854570065] [instance: fb35bfe9-6df7-4eee-8e97-1d149632b872] Updating progress to 60 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:38:18.596 DEBUG nova.virt.xenapi.vmops [req-fcbbd617-69ab-4e56-995d-7d5d14a23d19 tempest-ServerAddressesTestJSON-1855640315 tempest-ServerAddressesTestJSON-1854570065] [instance: fb35bfe9-6df7-4eee-8e97-1d149632b872] Creating vifs 
_create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1892 2015-08-07 17:38:18.606 DEBUG nova.virt.xenapi.vmops [req-fcbbd617-69ab-4e56-995d-7d5d14a23d19 tempest-ServerAddressesTestJSON-1855640315 tempest-ServerAddressesTestJSON-1854570065] [instance: fb35bfe9-6df7-4eee-8e97-1d149632b872] Creating VIF for network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1902 2015-08-07 17:38:18.616 DEBUG nova.virt.xenapi.vmops [req-fcbbd617-69ab-4e56-995d-7d5d14a23d19 tempest-ServerAddressesTestJSON-1855640315 tempest-ServerAddressesTestJSON-1854570065] [instance: fb35bfe9-6df7-4eee-8e97-1d149632b872] Created VIF OpaqueRef:7350ebcf-5f79-ffbf-4215-55911eec9ea2, network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1906 2015-08-07 17:38:18.616 DEBUG nova.virt.xenapi.vmops [req-fcbbd617-69ab-4e56-995d-7d5d14a23d19 tempest-ServerAddressesTestJSON-1855640315 tempest-ServerAddressesTestJSON-1854570065] [instance: fb35bfe9-6df7-4eee-8e97-1d149632b872] Updating progress to 70 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:38:18.973 DEBUG nova.virt.xenapi.vmops [req-fcbbd617-69ab-4e56-995d-7d5d14a23d19 tempest-ServerAddressesTestJSON-1855640315 tempest-ServerAddressesTestJSON-1854570065] [instance: fb35bfe9-6df7-4eee-8e97-1d149632b872] Starting instance _start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:327 2015-08-07 17:38:19.962 DEBUG nova.compute.manager [req-2a37101c-43a1-49bc-ad84-10875e6df044 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Checking state _get_power_state /opt/stack/new/nova/nova/compute/manager.py:1296 2015-08-07 17:38:20.255 INFO nova.compute.manager [req-2a37101c-43a1-49bc-ad84-10875e6df044 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] instance snapshotting 2015-08-07 17:38:20.260 DEBUG nova.virt.xenapi.vm_utils [req-2a37101c-43a1-49bc-ad84-10875e6df044 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Starting snapshot for VM _snapshot_attached_here_impl /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:714 2015-08-07 17:38:20.287 DEBUG oslo_concurrency.lockutils [req-2a37101c-43a1-49bc-ad84-10875e6df044 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" acquired by "do_scan" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:38:20.288 DEBUG nova.virt.xenapi.vm_utils [req-2a37101c-43a1-49bc-ad84-10875e6df044 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Scanning SR OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99 do_scan /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1839 2015-08-07 17:38:21.094 DEBUG oslo_concurrency.lockutils [req-2a37101c-43a1-49bc-ad84-10875e6df044 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" released by "do_scan" :: held 0.806s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:38:21.105 DEBUG nova.virt.xenapi.vm_utils [req-2a37101c-43a1-49bc-ad84-10875e6df044 tempest-ServerActionsTestJSON-2080221373 
tempest-ServerActionsTestJSON-2017217826] VHD d096a8bd-3d0c-4ece-91ed-aa9f1f8917cc has parent 4027f457-a9bb-499a-8844-79fc67f11377 _get_vhd_parent_uuid /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2006 2015-08-07 17:38:21.139 DEBUG nova.virt.xenapi.vm_utils [req-2a37101c-43a1-49bc-ad84-10875e6df044 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] VHD 2a319024-f9a8-4bee-9c30-22f8a8f2e280 has parent 4027f457-a9bb-499a-8844-79fc67f11377 _get_vhd_parent_uuid /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2006 2015-08-07 17:38:21.165 DEBUG nova.virt.xenapi.vm_utils [req-2a37101c-43a1-49bc-ad84-10875e6df044 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] VHD 2584b787-92a9-431e-9da3-83072f3e6e7b has parent 4027f457-a9bb-499a-8844-79fc67f11377 _get_vhd_parent_uuid /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2006 2015-08-07 17:38:21.179 DEBUG nova.virt.xenapi.vm_utils [req-2a37101c-43a1-49bc-ad84-10875e6df044 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] VHD d096a8bd-3d0c-4ece-91ed-aa9f1f8917cc has parent 4027f457-a9bb-499a-8844-79fc67f11377 _get_vhd_parent_uuid /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2006 2015-08-07 17:38:21.199 DEBUG nova.virt.xenapi.vm_utils [req-2a37101c-43a1-49bc-ad84-10875e6df044 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] VHD adc68f5c-f9ed-4dfe-862f-8738fb519f84 has parent 4027f457-a9bb-499a-8844-79fc67f11377 _get_vhd_parent_uuid /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2006 2015-08-07 17:38:21.230 DEBUG nova.virt.xenapi.vm_utils [req-2a37101c-43a1-49bc-ad84-10875e6df044 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] No snapshots to remove. _delete_snapshots_in_vdi_chain /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:677 2015-08-07 17:38:25.029 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.91 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:38:26.204 DEBUG nova.virt.xenapi.vm_utils [req-2a37101c-43a1-49bc-ad84-10875e6df044 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Parent has other children, coalesce is unlikely. 
_wait_for_vhd_coalesce /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2092 2015-08-07 17:38:26.218 DEBUG oslo_concurrency.lockutils [req-2a37101c-43a1-49bc-ad84-10875e6df044 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" acquired by "do_scan" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:38:26.219 DEBUG nova.virt.xenapi.vm_utils [req-2a37101c-43a1-49bc-ad84-10875e6df044 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Scanning SR OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99 do_scan /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1839 2015-08-07 17:38:26.999 DEBUG oslo_concurrency.lockutils [req-2a37101c-43a1-49bc-ad84-10875e6df044 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" released by "do_scan" :: held 0.781s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:38:27.013 DEBUG nova.virt.xenapi.vm_utils [req-2a37101c-43a1-49bc-ad84-10875e6df044 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] VHD 8310f293-81ae-4a71-8e71-9115e63af96c has parent 236ca383-573a-4bb3-a64c-2c4f7d42d86c _get_vhd_parent_uuid /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2006 2015-08-07 17:38:27.025 DEBUG nova.virt.xenapi.vm_utils [req-2a37101c-43a1-49bc-ad84-10875e6df044 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] VHD 236ca383-573a-4bb3-a64c-2c4f7d42d86c has parent 4027f457-a9bb-499a-8844-79fc67f11377 _get_vhd_parent_uuid /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2006 2015-08-07 17:38:27.316 DEBUG nova.virt.xenapi.client.session [req-2a37101c-43a1-49bc-ad84-10875e6df044 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] glance.upload_vhd attempt 1/1, callback_result: 192.168.33.1 call_plugin_serialized_with_retry /opt/stack/new/nova/nova/virt/xenapi/client/session.py:245 2015-08-07 17:38:31.227 DEBUG nova.compute.manager [req-d2f6827e-baa1-44fd-a430-573a8f2bc9f8 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: c1e3b756-49fe-4e6c-9016-c4eb2e06a2cd] Checking state _get_power_state /opt/stack/new/nova/nova/compute/manager.py:1296 2015-08-07 17:38:33.327 DEBUG oslo_concurrency.lockutils [req-1e8b5ee6-da73-43fe-8a7e-4d5458b48598 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "c1e3b756-49fe-4e6c-9016-c4eb2e06a2cd" acquired by "do_terminate_instance" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:38:33.328 DEBUG oslo_concurrency.lockutils [req-1e8b5ee6-da73-43fe-8a7e-4d5458b48598 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "c1e3b756-49fe-4e6c-9016-c4eb2e06a2cd-events" acquired by "_clear_events" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:38:33.328 DEBUG oslo_concurrency.lockutils [req-1e8b5ee6-da73-43fe-8a7e-4d5458b48598 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "c1e3b756-49fe-4e6c-9016-c4eb2e06a2cd-events" released by "_clear_events" :: held 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:38:33.330 INFO nova.compute.manager 
[req-1e8b5ee6-da73-43fe-8a7e-4d5458b48598 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: c1e3b756-49fe-4e6c-9016-c4eb2e06a2cd] Terminating instance 2015-08-07 17:38:33.334 INFO nova.virt.xenapi.vmops [req-1e8b5ee6-da73-43fe-8a7e-4d5458b48598 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: c1e3b756-49fe-4e6c-9016-c4eb2e06a2cd] Destroying VM 2015-08-07 17:38:33.347 DEBUG nova.virt.xenapi.vm_utils [req-1e8b5ee6-da73-43fe-8a7e-4d5458b48598 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: c1e3b756-49fe-4e6c-9016-c4eb2e06a2cd] Shutting down VM (hard) hard_shutdown_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:348 2015-08-07 17:38:33.725 DEBUG nova.virt.xenapi.vmops [req-fcbbd617-69ab-4e56-995d-7d5d14a23d19 tempest-ServerAddressesTestJSON-1855640315 tempest-ServerAddressesTestJSON-1854570065] [instance: fb35bfe9-6df7-4eee-8e97-1d149632b872] Waiting for instance state to become running _wait_for_instance_to_start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:777 2015-08-07 17:38:33.758 DEBUG nova.virt.xenapi.vmops [req-fcbbd617-69ab-4e56-995d-7d5d14a23d19 tempest-ServerAddressesTestJSON-1855640315 tempest-ServerAddressesTestJSON-1854570065] [instance: fb35bfe9-6df7-4eee-8e97-1d149632b872] Updating progress to 80 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:38:33.844 DEBUG nova.virt.xenapi.vmops [req-2a37101c-43a1-49bc-ad84-10875e6df044 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Finished snapshot and upload for VM, duration: 13.58 secs for image d7d1d44c-7b2e-4371-9daf-db4e08dc02b3 snapshot /opt/stack/new/nova/nova/virt/xenapi/vmops.py:894 2015-08-07 17:38:34.034 DEBUG nova.virt.xenapi.vmops [req-fcbbd617-69ab-4e56-995d-7d5d14a23d19 tempest-ServerAddressesTestJSON-1855640315 tempest-ServerAddressesTestJSON-1854570065] [instance: fb35bfe9-6df7-4eee-8e97-1d149632b872] Skip agent setup, not enabled. 
_configure_new_instance_with_agent /opt/stack/new/nova/nova/virt/xenapi/vmops.py:788 2015-08-07 17:38:34.035 DEBUG nova.virt.xenapi.vmops [req-fcbbd617-69ab-4e56-995d-7d5d14a23d19 tempest-ServerAddressesTestJSON-1855640315 tempest-ServerAddressesTestJSON-1854570065] [instance: fb35bfe9-6df7-4eee-8e97-1d149632b872] Removing hostname from xenstore _remove_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1949 2015-08-07 17:38:34.035 DEBUG oslo_concurrency.lockutils [req-fcbbd617-69ab-4e56-995d-7d5d14a23d19 tempest-ServerAddressesTestJSON-1855640315 tempest-ServerAddressesTestJSON-1854570065] Lock "xenstore-fb35bfe9-6df7-4eee-8e97-1d149632b872" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:38:34.041 DEBUG oslo_concurrency.lockutils [req-fcbbd617-69ab-4e56-995d-7d5d14a23d19 tempest-ServerAddressesTestJSON-1855640315 tempest-ServerAddressesTestJSON-1854570065] Lock "xenstore-fb35bfe9-6df7-4eee-8e97-1d149632b872" released by "update_hostname" :: held 0.006s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:38:34.042 DEBUG nova.virt.xenapi.vmops [req-fcbbd617-69ab-4e56-995d-7d5d14a23d19 tempest-ServerAddressesTestJSON-1855640315 tempest-ServerAddressesTestJSON-1854570065] [instance: fb35bfe9-6df7-4eee-8e97-1d149632b872] Updating progress to 90 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:38:34.185 DEBUG nova.compute.manager [req-2a37101c-43a1-49bc-ad84-10875e6df044 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Found 1 images (rotation: 2) _rotate_backups /opt/stack/new/nova/nova/compute/manager.py:3001 2015-08-07 17:38:34.666 DEBUG nova.virt.xenapi.vmops [req-fcbbd617-69ab-4e56-995d-7d5d14a23d19 tempest-ServerAddressesTestJSON-1855640315 tempest-ServerAddressesTestJSON-1854570065] [instance: fb35bfe9-6df7-4eee-8e97-1d149632b872] Updating progress to 100 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:38:34.914 DEBUG nova.compute.manager [req-fcbbd617-69ab-4e56-995d-7d5d14a23d19 tempest-ServerAddressesTestJSON-1855640315 tempest-ServerAddressesTestJSON-1854570065] [instance: fb35bfe9-6df7-4eee-8e97-1d149632b872] Checking state _get_power_state /opt/stack/new/nova/nova/compute/manager.py:1296 2015-08-07 17:38:35.003 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.94 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:38:35.318 DEBUG oslo_concurrency.lockutils [req-fcbbd617-69ab-4e56-995d-7d5d14a23d19 tempest-ServerAddressesTestJSON-1855640315 tempest-ServerAddressesTestJSON-1854570065] Lock "fb35bfe9-6df7-4eee-8e97-1d149632b872" released by "_locked_do_build_and_run_instance" :: held 46.965s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:38:36.637 DEBUG nova.compute.manager [req-68e1ab7c-9255-4c5b-bb92-033977792a30 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Checking state _get_power_state /opt/stack/new/nova/nova/compute/manager.py:1296 2015-08-07 17:38:36.868 DEBUG oslo_concurrency.lockutils [req-dec129ac-6ab8-4698-95b1-7febe1af00d7 tempest-ServerAddressesTestJSON-1855640315 tempest-ServerAddressesTestJSON-1854570065] Lock "fb35bfe9-6df7-4eee-8e97-1d149632b872" 
acquired by "do_terminate_instance" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:38:36.869 DEBUG oslo_concurrency.lockutils [req-dec129ac-6ab8-4698-95b1-7febe1af00d7 tempest-ServerAddressesTestJSON-1855640315 tempest-ServerAddressesTestJSON-1854570065] Lock "fb35bfe9-6df7-4eee-8e97-1d149632b872-events" acquired by "_clear_events" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:38:36.869 DEBUG oslo_concurrency.lockutils [req-dec129ac-6ab8-4698-95b1-7febe1af00d7 tempest-ServerAddressesTestJSON-1855640315 tempest-ServerAddressesTestJSON-1854570065] Lock "fb35bfe9-6df7-4eee-8e97-1d149632b872-events" released by "_clear_events" :: held 0.001s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:38:36.871 INFO nova.compute.manager [req-dec129ac-6ab8-4698-95b1-7febe1af00d7 tempest-ServerAddressesTestJSON-1855640315 tempest-ServerAddressesTestJSON-1854570065] [instance: fb35bfe9-6df7-4eee-8e97-1d149632b872] Terminating instance 2015-08-07 17:38:36.873 INFO nova.virt.xenapi.vmops [req-dec129ac-6ab8-4698-95b1-7febe1af00d7 tempest-ServerAddressesTestJSON-1855640315 tempest-ServerAddressesTestJSON-1854570065] [instance: fb35bfe9-6df7-4eee-8e97-1d149632b872] Destroying VM 2015-08-07 17:38:36.888 DEBUG nova.virt.xenapi.vm_utils [req-dec129ac-6ab8-4698-95b1-7febe1af00d7 tempest-ServerAddressesTestJSON-1855640315 tempest-ServerAddressesTestJSON-1854570065] [instance: fb35bfe9-6df7-4eee-8e97-1d149632b872] Shutting down VM (hard) hard_shutdown_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:348 2015-08-07 17:38:36.936 INFO nova.compute.manager [req-68e1ab7c-9255-4c5b-bb92-033977792a30 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] instance snapshotting 2015-08-07 17:38:36.943 DEBUG nova.virt.xenapi.vm_utils [req-68e1ab7c-9255-4c5b-bb92-033977792a30 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Starting snapshot for VM _snapshot_attached_here_impl /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:714 2015-08-07 17:38:37.010 DEBUG oslo_concurrency.lockutils [req-68e1ab7c-9255-4c5b-bb92-033977792a30 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" acquired by "do_scan" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:38:37.010 DEBUG nova.virt.xenapi.vm_utils [req-68e1ab7c-9255-4c5b-bb92-033977792a30 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Scanning SR OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99 do_scan /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1839 2015-08-07 17:38:37.520 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:38:37.522 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 1.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:38:37.647 DEBUG oslo_concurrency.lockutils [req-68e1ab7c-9255-4c5b-bb92-033977792a30 
tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" released by "do_scan" :: held 0.637s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:38:37.655 DEBUG nova.virt.xenapi.vm_utils [req-68e1ab7c-9255-4c5b-bb92-033977792a30 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] VHD d096a8bd-3d0c-4ece-91ed-aa9f1f8917cc has parent 236ca383-573a-4bb3-a64c-2c4f7d42d86c _get_vhd_parent_uuid /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2006 2015-08-07 17:38:37.662 DEBUG nova.virt.xenapi.vm_utils [req-68e1ab7c-9255-4c5b-bb92-033977792a30 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] VHD 236ca383-573a-4bb3-a64c-2c4f7d42d86c has parent 4027f457-a9bb-499a-8844-79fc67f11377 _get_vhd_parent_uuid /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2006 2015-08-07 17:38:37.701 DEBUG nova.virt.xenapi.vm_utils [req-68e1ab7c-9255-4c5b-bb92-033977792a30 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] VHD 2a319024-f9a8-4bee-9c30-22f8a8f2e280 has parent 4027f457-a9bb-499a-8844-79fc67f11377 _get_vhd_parent_uuid /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2006 2015-08-07 17:38:37.720 DEBUG nova.virt.xenapi.vm_utils [req-68e1ab7c-9255-4c5b-bb92-033977792a30 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] VHD 2584b787-92a9-431e-9da3-83072f3e6e7b has parent 4027f457-a9bb-499a-8844-79fc67f11377 _get_vhd_parent_uuid /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2006 2015-08-07 17:38:37.727 DEBUG nova.virt.xenapi.vm_utils [req-68e1ab7c-9255-4c5b-bb92-033977792a30 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] VHD d096a8bd-3d0c-4ece-91ed-aa9f1f8917cc has parent 236ca383-573a-4bb3-a64c-2c4f7d42d86c _get_vhd_parent_uuid /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2006 2015-08-07 17:38:37.750 DEBUG nova.virt.xenapi.vm_utils [req-68e1ab7c-9255-4c5b-bb92-033977792a30 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] VHD adc68f5c-f9ed-4dfe-862f-8738fb519f84 has parent 4027f457-a9bb-499a-8844-79fc67f11377 _get_vhd_parent_uuid /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2006 2015-08-07 17:38:37.766 DEBUG nova.virt.xenapi.vm_utils [req-68e1ab7c-9255-4c5b-bb92-033977792a30 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] No snapshots to remove. 
_delete_snapshots_in_vdi_chain /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:677 2015-08-07 17:38:38.521 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:38:38.522 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 1.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:38:39.519 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:38:39.520 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Starting heal instance info cache _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5315 2015-08-07 17:38:39.520 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Rebuilding the list of instances to heal _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5319 2015-08-07 17:38:39.601 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: c1e3b756-49fe-4e6c-9016-c4eb2e06a2cd] Skipping network cache update for instance because it is being deleted. _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5333 2015-08-07 17:38:39.601 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: fb35bfe9-6df7-4eee-8e97-1d149632b872] Skipping network cache update for instance because it is being deleted. 
_heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5333 2015-08-07 17:38:39.603 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Acquired semaphore "refresh_cache-aa0c0819-08a6-4a79-93eb-f58885697f5b" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:197 2015-08-07 17:38:39.603 DEBUG nova.objects.instance [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lazy-loading `flavor' on Instance uuid aa0c0819-08a6-4a79-93eb-f58885697f5b obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:38:39.961 DEBUG nova.network.base_api [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Updating instance_info_cache with network_info: [VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.2'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:d0:30:ae', 'active': False, 'type': u'bridge', 'id': u'f50b3fec-3100-4377-bded-e2cd8fa38482', 'qbg_params': None})] update_instance_cache_with_nw_info /opt/stack/new/nova/nova/network/base_api.py:43 2015-08-07 17:38:39.998 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Releasing semaphore "refresh_cache-aa0c0819-08a6-4a79-93eb-f58885697f5b" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:210 2015-08-07 17:38:39.998 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Updated the network info_cache for instance _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5376 2015-08-07 17:38:40.002 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 0.99 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:38:40.014 DEBUG oslo_concurrency.lockutils [req-68e1ab7c-9255-4c5b-bb92-033977792a30 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" acquired by "do_scan" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:38:40.015 DEBUG nova.virt.xenapi.vm_utils [req-68e1ab7c-9255-4c5b-bb92-033977792a30 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Scanning SR OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99 do_scan /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1839 2015-08-07 17:38:40.695 DEBUG oslo_concurrency.lockutils [req-68e1ab7c-9255-4c5b-bb92-033977792a30 
tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" released by "do_scan" :: held 0.681s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:38:40.701 DEBUG nova.virt.xenapi.vm_utils [req-68e1ab7c-9255-4c5b-bb92-033977792a30 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] VHD d096a8bd-3d0c-4ece-91ed-aa9f1f8917cc has parent 8f88e745-a471-41ce-8c19-22928cca8925 _get_vhd_parent_uuid /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2006 2015-08-07 17:38:40.702 DEBUG nova.virt.xenapi.vm_utils [req-68e1ab7c-9255-4c5b-bb92-033977792a30 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Parent 8f88e745-a471-41ce-8c19-22928cca8925 not yet in parent list ['236ca383-573a-4bb3-a64c-2c4f7d42d86c', '4027f457-a9bb-499a-8844-79fc67f11377'], waiting for coalesce... _wait_for_vhd_coalesce /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2115 2015-08-07 17:38:40.995 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:38:40.996 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 2.52 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:38:41.097 DEBUG nova.virt.xenapi.vmops [req-dec129ac-6ab8-4698-95b1-7febe1af00d7 tempest-ServerAddressesTestJSON-1855640315 tempest-ServerAddressesTestJSON-1854570065] [instance: fb35bfe9-6df7-4eee-8e97-1d149632b872] Destroying VDIs _destroy_vdis /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1428 2015-08-07 17:38:41.107 DEBUG nova.virt.xenapi.vm_utils [req-dec129ac-6ab8-4698-95b1-7febe1af00d7 tempest-ServerAddressesTestJSON-1855640315 tempest-ServerAddressesTestJSON-1854570065] VDI 2a319024-f9a8-4bee-9c30-22f8a8f2e280 is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:38:41.116 DEBUG nova.virt.xenapi.vm_utils [req-dec129ac-6ab8-4698-95b1-7febe1af00d7 tempest-ServerAddressesTestJSON-1855640315 tempest-ServerAddressesTestJSON-1854570065] VDI 861d4567-4d1f-431d-86a3-be118d72287e is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:38:41.977 DEBUG nova.virt.xenapi.vmops [req-dec129ac-6ab8-4698-95b1-7febe1af00d7 tempest-ServerAddressesTestJSON-1855640315 tempest-ServerAddressesTestJSON-1854570065] [instance: fb35bfe9-6df7-4eee-8e97-1d149632b872] Using RAW or VHD, skipping kernel and ramdisk deletion _destroy_kernel_ramdisk /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1456 2015-08-07 17:38:41.995 DEBUG nova.virt.xenapi.vm_utils [req-dec129ac-6ab8-4698-95b1-7febe1af00d7 tempest-ServerAddressesTestJSON-1855640315 tempest-ServerAddressesTestJSON-1854570065] [instance: fb35bfe9-6df7-4eee-8e97-1d149632b872] VM destroyed destroy_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:324 2015-08-07 17:38:41.996 DEBUG nova.compute.manager [req-dec129ac-6ab8-4698-95b1-7febe1af00d7 tempest-ServerAddressesTestJSON-1855640315 tempest-ServerAddressesTestJSON-1854570065] [instance: fb35bfe9-6df7-4eee-8e97-1d149632b872] Deallocating network for instance _deallocate_network /opt/stack/new/nova/nova/compute/manager.py:1792 2015-08-07 
17:38:43.521 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:38:43.521 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /opt/stack/new/nova/nova/compute/manager.py:5976 2015-08-07 17:38:43.522 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 3.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:38:44.030 DEBUG nova.compute.manager [req-dec129ac-6ab8-4698-95b1-7febe1af00d7 tempest-ServerAddressesTestJSON-1855640315 tempest-ServerAddressesTestJSON-1854570065] [instance: fb35bfe9-6df7-4eee-8e97-1d149632b872] terminating bdm BlockDeviceMapping(boot_index=0,connection_info=None,created_at=2015-08-07T17:37:47Z,delete_on_termination=True,deleted=False,deleted_at=None,destination_type='local',device_name='/dev/xvda',device_type='disk',disk_bus=None,guest_format=None,id=50,image_id='a69d8d55-1745-492f-be26-cc75d64fc94d',instance=,instance_uuid=fb35bfe9-6df7-4eee-8e97-1d149632b872,no_device=False,snapshot_id=None,source_type='image',updated_at=2015-08-07T17:37:51Z,volume_id=None,volume_size=None) _cleanup_volumes /opt/stack/new/nova/nova/compute/manager.py:2272 2015-08-07 17:38:44.081 DEBUG nova.virt.xenapi.vmops [req-1e8b5ee6-da73-43fe-8a7e-4d5458b48598 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: c1e3b756-49fe-4e6c-9016-c4eb2e06a2cd] Destroying VDIs _destroy_vdis /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1428 2015-08-07 17:38:44.090 DEBUG nova.virt.xenapi.vm_utils [req-1e8b5ee6-da73-43fe-8a7e-4d5458b48598 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] VDI 2584b787-92a9-431e-9da3-83072f3e6e7b is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:38:44.099 DEBUG nova.virt.xenapi.vm_utils [req-1e8b5ee6-da73-43fe-8a7e-4d5458b48598 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] VDI 26bae6f1-7d59-4e22-a8f2-1f26e5372563 is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:38:44.355 DEBUG oslo_concurrency.lockutils [req-dec129ac-6ab8-4698-95b1-7febe1af00d7 tempest-ServerAddressesTestJSON-1855640315 tempest-ServerAddressesTestJSON-1854570065] Lock "compute_resources" acquired by "update_usage" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:38:44.356 DEBUG nova.objects.instance [req-dec129ac-6ab8-4698-95b1-7febe1af00d7 tempest-ServerAddressesTestJSON-1855640315 tempest-ServerAddressesTestJSON-1854570065] Lazy-loading `numa_topology' on Instance uuid fb35bfe9-6df7-4eee-8e97-1d149632b872 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:38:44.501 DEBUG oslo_concurrency.lockutils [req-dec129ac-6ab8-4698-95b1-7febe1af00d7 tempest-ServerAddressesTestJSON-1855640315 tempest-ServerAddressesTestJSON-1854570065] Lock "compute_resources" released by "update_usage" :: held 0.147s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:38:44.950 DEBUG oslo_concurrency.lockutils [req-dec129ac-6ab8-4698-95b1-7febe1af00d7 
tempest-ServerAddressesTestJSON-1855640315 tempest-ServerAddressesTestJSON-1854570065] Lock "fb35bfe9-6df7-4eee-8e97-1d149632b872" released by "do_terminate_instance" :: held 8.082s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:38:45.062 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.88 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:38:45.363 DEBUG nova.virt.xenapi.vmops [req-1e8b5ee6-da73-43fe-8a7e-4d5458b48598 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: c1e3b756-49fe-4e6c-9016-c4eb2e06a2cd] Using RAW or VHD, skipping kernel and ramdisk deletion _destroy_kernel_ramdisk /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1456 2015-08-07 17:38:45.379 DEBUG nova.virt.xenapi.vm_utils [req-1e8b5ee6-da73-43fe-8a7e-4d5458b48598 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: c1e3b756-49fe-4e6c-9016-c4eb2e06a2cd] VM destroyed destroy_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:324 2015-08-07 17:38:45.381 DEBUG nova.compute.manager [req-1e8b5ee6-da73-43fe-8a7e-4d5458b48598 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: c1e3b756-49fe-4e6c-9016-c4eb2e06a2cd] Deallocating network for instance _deallocate_network /opt/stack/new/nova/nova/compute/manager.py:1792 2015-08-07 17:38:45.703 DEBUG oslo_concurrency.lockutils [req-68e1ab7c-9255-4c5b-bb92-033977792a30 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" acquired by "do_scan" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:38:45.704 DEBUG nova.virt.xenapi.vm_utils [req-68e1ab7c-9255-4c5b-bb92-033977792a30 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Scanning SR OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99 do_scan /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1839 2015-08-07 17:38:46.514 DEBUG oslo_concurrency.lockutils [req-68e1ab7c-9255-4c5b-bb92-033977792a30 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" released by "do_scan" :: held 0.811s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:38:46.521 DEBUG nova.virt.xenapi.vm_utils [req-68e1ab7c-9255-4c5b-bb92-033977792a30 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] VHD d096a8bd-3d0c-4ece-91ed-aa9f1f8917cc has parent 8f88e745-a471-41ce-8c19-22928cca8925 _get_vhd_parent_uuid /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2006 2015-08-07 17:38:46.522 DEBUG nova.virt.xenapi.vm_utils [req-68e1ab7c-9255-4c5b-bb92-033977792a30 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Parent 8f88e745-a471-41ce-8c19-22928cca8925 not yet in parent list ['236ca383-573a-4bb3-a64c-2c4f7d42d86c', '4027f457-a9bb-499a-8844-79fc67f11377'], waiting for coalesce... 
_wait_for_vhd_coalesce /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2115 2015-08-07 17:38:46.523 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:38:46.524 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:38:46.524 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 3.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:38:47.542 DEBUG nova.compute.manager [req-1e8b5ee6-da73-43fe-8a7e-4d5458b48598 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: c1e3b756-49fe-4e6c-9016-c4eb2e06a2cd] terminating bdm BlockDeviceMapping(boot_index=0,connection_info=None,created_at=2015-08-07T17:37:36Z,delete_on_termination=True,deleted=False,deleted_at=None,destination_type='local',device_name='/dev/xvda',device_type='disk',disk_bus=None,guest_format=None,id=49,image_id='a69d8d55-1745-492f-be26-cc75d64fc94d',instance=,instance_uuid=c1e3b756-49fe-4e6c-9016-c4eb2e06a2cd,no_device=False,snapshot_id=None,source_type='image',updated_at=2015-08-07T17:37:37Z,volume_id=None,volume_size=None) _cleanup_volumes /opt/stack/new/nova/nova/compute/manager.py:2272 2015-08-07 17:38:47.767 DEBUG oslo_concurrency.lockutils [req-1e8b5ee6-da73-43fe-8a7e-4d5458b48598 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "compute_resources" acquired by "update_usage" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:38:47.768 DEBUG nova.objects.instance [req-1e8b5ee6-da73-43fe-8a7e-4d5458b48598 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lazy-loading `numa_topology' on Instance uuid c1e3b756-49fe-4e6c-9016-c4eb2e06a2cd obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:38:47.878 DEBUG oslo_concurrency.lockutils [req-1e8b5ee6-da73-43fe-8a7e-4d5458b48598 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "compute_resources" released by "update_usage" :: held 0.111s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:38:48.259 DEBUG oslo_concurrency.lockutils [req-1e8b5ee6-da73-43fe-8a7e-4d5458b48598 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "c1e3b756-49fe-4e6c-9016-c4eb2e06a2cd" released by "do_terminate_instance" :: held 14.933s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:38:49.521 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:38:49.557 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Auditing locally available compute resources for node localhost.localdomain 2015-08-07 17:38:49.558 DEBUG nova.virt.xenapi.host [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Updating host stats 
update_status /opt/stack/new/nova/nova/virt/xenapi/host.py:234 2015-08-07 17:38:50.156 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" acquired by "do_scan" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:38:50.156 DEBUG nova.virt.xenapi.vm_utils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Scanning SR OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99 do_scan /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1839 2015-08-07 17:38:50.593 DEBUG oslo_concurrency.lockutils [req-c8abbac4-9689-4c28-8e6e-6270f781eaca tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "7fface03-3fde-4610-ae05-ed86066e44da" acquired by "_locked_do_build_and_run_instance" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:38:50.757 INFO nova.compute.manager [req-c8abbac4-9689-4c28-8e6e-6270f781eaca tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 7fface03-3fde-4610-ae05-ed86066e44da] Starting instance... 2015-08-07 17:38:50.889 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" released by "do_scan" :: held 0.733s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:38:51.512 DEBUG oslo_concurrency.lockutils [req-c8abbac4-9689-4c28-8e6e-6270f781eaca tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "compute_resources" acquired by "instance_claim" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:38:51.513 DEBUG nova.compute.resource_tracker [req-c8abbac4-9689-4c28-8e6e-6270f781eaca tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Memory overhead for 64 MB instance; 5 MB instance_claim /opt/stack/new/nova/nova/compute/resource_tracker.py:126 2015-08-07 17:38:51.521 INFO nova.compute.claims [req-c8abbac4-9689-4c28-8e6e-6270f781eaca tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 7fface03-3fde-4610-ae05-ed86066e44da] Attempting claim: memory 69 MB, disk 0 GB 2015-08-07 17:38:51.522 INFO nova.compute.claims [req-c8abbac4-9689-4c28-8e6e-6270f781eaca tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 7fface03-3fde-4610-ae05-ed86066e44da] Total memory: 8187 MB, used: 581.00 MB 2015-08-07 17:38:51.522 INFO nova.compute.claims [req-c8abbac4-9689-4c28-8e6e-6270f781eaca tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 7fface03-3fde-4610-ae05-ed86066e44da] memory limit: 12280.50 MB, free: 11699.50 MB 2015-08-07 17:38:51.523 INFO nova.compute.claims [req-c8abbac4-9689-4c28-8e6e-6270f781eaca tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 7fface03-3fde-4610-ae05-ed86066e44da] Total disk: 27 GB, used: 0.00 GB 2015-08-07 17:38:51.523 INFO nova.compute.claims [req-c8abbac4-9689-4c28-8e6e-6270f781eaca tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 7fface03-3fde-4610-ae05-ed86066e44da] disk limit not specified, defaulting to unlimited 2015-08-07 17:38:51.525 DEBUG oslo_concurrency.lockutils [req-68e1ab7c-9255-4c5b-bb92-033977792a30 
tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" acquired by "do_scan" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:38:51.526 DEBUG nova.virt.xenapi.vm_utils [req-68e1ab7c-9255-4c5b-bb92-033977792a30 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Scanning SR OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99 do_scan /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1839 2015-08-07 17:38:51.552 DEBUG nova.compute.resources.vcpu [req-c8abbac4-9689-4c28-8e6e-6270f781eaca tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Total CPUs: 8 VCPUs, used: 1.00 VCPUs test /opt/stack/new/nova/nova/compute/resources/vcpu.py:52 2015-08-07 17:38:51.553 DEBUG nova.compute.resources.vcpu [req-c8abbac4-9689-4c28-8e6e-6270f781eaca tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] CPUs limit not specified, defaulting to unlimited test /opt/stack/new/nova/nova/compute/resources/vcpu.py:56 2015-08-07 17:38:51.554 INFO nova.compute.claims [req-c8abbac4-9689-4c28-8e6e-6270f781eaca tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 7fface03-3fde-4610-ae05-ed86066e44da] Claim successful 2015-08-07 17:38:51.898 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor: free VCPUs: 1 _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:524 2015-08-07 17:38:51.899 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor: assignable PCI devices: [] _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:532 2015-08-07 17:38:51.900 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor/Node resource view: name=localhost.localdomain free_ram=922MB free_disk=16GB free_vcpus=1 pci_devices=[] _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:546 2015-08-07 17:38:52.302 DEBUG oslo_concurrency.lockutils [req-c8abbac4-9689-4c28-8e6e-6270f781eaca tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "compute_resources" released by "instance_claim" :: held 0.790s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:38:52.313 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "compute_resources" acquired by "_update_available_resource" :: waited 0.413s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:38:52.487 DEBUG oslo_concurrency.lockutils [req-68e1ab7c-9255-4c5b-bb92-033977792a30 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" released by "do_scan" :: held 0.961s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:38:52.494 DEBUG nova.virt.xenapi.vm_utils [req-68e1ab7c-9255-4c5b-bb92-033977792a30 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] VHD d096a8bd-3d0c-4ece-91ed-aa9f1f8917cc has parent 8f88e745-a471-41ce-8c19-22928cca8925 _get_vhd_parent_uuid /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2006 2015-08-07 17:38:52.495 DEBUG nova.virt.xenapi.vm_utils 
[req-68e1ab7c-9255-4c5b-bb92-033977792a30 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Parent 8f88e745-a471-41ce-8c19-22928cca8925 not yet in parent list ['236ca383-573a-4bb3-a64c-2c4f7d42d86c', '4027f457-a9bb-499a-8844-79fc67f11377'], waiting for coalesce... _wait_for_vhd_coalesce /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2115 2015-08-07 17:38:52.636 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Total usable vcpus: 8, total allocated vcpus: 7 2015-08-07 17:38:52.644 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Final resource view: name=localhost.localdomain phys_ram=8187MB used_ram=650MB phys_disk=27GB used_disk=0GB total_vcpus=8 used_vcpus=7 pci_stats=None 2015-08-07 17:38:52.722 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Compute_service record updated for devstack:localhost.localdomain 2015-08-07 17:38:52.723 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "compute_resources" released by "_update_available_resource" :: held 0.410s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:38:52.723 DEBUG oslo_concurrency.lockutils [req-c8abbac4-9689-4c28-8e6e-6270f781eaca tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "compute_resources" acquired by "update_usage" :: waited 0.038s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:38:52.726 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 6.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:38:52.866 DEBUG oslo_concurrency.lockutils [req-c8abbac4-9689-4c28-8e6e-6270f781eaca tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "compute_resources" released by "update_usage" :: held 0.142s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:38:52.867 DEBUG nova.compute.utils [req-c8abbac4-9689-4c28-8e6e-6270f781eaca tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Using /dev/xvd instead of None get_next_device_name /opt/stack/new/nova/nova/compute/utils.py:163 2015-08-07 17:38:52.874 13318 DEBUG nova.compute.manager [-] [instance: 7fface03-3fde-4610-ae05-ed86066e44da] Allocating IP information in the background. 
_allocate_network_async /opt/stack/new/nova/nova/compute/manager.py:1513 2015-08-07 17:38:52.876 13318 DEBUG oslo_concurrency.lockutils [-] Acquired semaphore "refresh_cache-7fface03-3fde-4610-ae05-ed86066e44da" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:197 2015-08-07 17:38:53.827 DEBUG nova.virt.xenapi.vmops [req-c8abbac4-9689-4c28-8e6e-6270f781eaca tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 7fface03-3fde-4610-ae05-ed86066e44da] Block device information present: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} spawn /opt/stack/new/nova/nova/virt/xenapi/vmops.py:408 2015-08-07 17:38:53.863 DEBUG nova.virt.xenapi.vm_utils [req-c8abbac4-9689-4c28-8e6e-6270f781eaca tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Detected vhd format for image a69d8d55-1745-492f-be26-cc75d64fc94d determine_disk_image_type /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1633 2015-08-07 17:38:53.864 DEBUG nova.virt.xenapi.vmops [req-c8abbac4-9689-4c28-8e6e-6270f781eaca tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 7fface03-3fde-4610-ae05-ed86066e44da] Updating progress to 10 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:38:54.725 DEBUG nova.virt.xenapi.vmops [req-c8abbac4-9689-4c28-8e6e-6270f781eaca tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 7fface03-3fde-4610-ae05-ed86066e44da] block device info: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} _connect_cinder_volumes /opt/stack/new/nova/nova/virt/xenapi/vmops.py:376 2015-08-07 17:38:54.768 DEBUG oslo_concurrency.lockutils [req-c8abbac4-9689-4c28-8e6e-6270f781eaca tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" acquired by "_create_cached_image_impl" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:38:55.152 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.79 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:38:55.911 DEBUG oslo_concurrency.lockutils [req-bcc60a1b-546c-4936-b069-c848b7de8f79 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] Lock "31a2fd34-bbcb-4b50-83e0-dc6c7369b479" acquired by "_locked_do_build_and_run_instance" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:38:56.062 INFO nova.compute.manager [req-bcc60a1b-546c-4936-b069-c848b7de8f79 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] [instance: 31a2fd34-bbcb-4b50-83e0-dc6c7369b479] Starting instance... 
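The memory claim logged above for instance 7fface03-3fde-4610-ae05-ed86066e44da follows straightforward accounting, sketched below. The figures are taken from the claim records themselves; the 1.5 RAM allocation ratio is inferred from 12280.50 / 8187 and is an assumption, not a value read from nova.conf.

    # Illustrative arithmetic only; values come from the claim records above,
    # and ram_allocation_ratio = 1.5 is inferred, not read from configuration.
    flavor_ram_mb = 64
    overhead_mb = 5                              # "Memory overhead for 64 MB instance; 5 MB"
    claim_mb = flavor_ram_mb + overhead_mb       # 69 -> "Attempting claim: memory 69 MB"

    total_mb = 8187.0
    used_mb = 581.0
    ram_allocation_ratio = 1.5                   # assumed from 12280.50 / 8187
    limit_mb = total_mb * ram_allocation_ratio   # 12280.5 -> "memory limit: 12280.50 MB"
    free_mb = limit_mb - used_mb                 # 11699.5 -> "free: 11699.50 MB"
    assert claim_mb <= free_mb                   # claim fits -> "Claim successful"

The next claim below, for instance 31a2fd34-bbcb-4b50-83e0-dc6c7369b479, reports used: 650.00 MB and free: 11630.50 MB, i.e. 581 + 69 and 12280.5 - 650, consistent with the same accounting.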
2015-08-07 17:38:56.523 DEBUG oslo_concurrency.lockutils [req-bcc60a1b-546c-4936-b069-c848b7de8f79 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] Lock "compute_resources" acquired by "instance_claim" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:38:56.523 DEBUG nova.compute.resource_tracker [req-bcc60a1b-546c-4936-b069-c848b7de8f79 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] Memory overhead for 64 MB instance; 5 MB instance_claim /opt/stack/new/nova/nova/compute/resource_tracker.py:126 2015-08-07 17:38:56.531 INFO nova.compute.claims [req-bcc60a1b-546c-4936-b069-c848b7de8f79 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] [instance: 31a2fd34-bbcb-4b50-83e0-dc6c7369b479] Attempting claim: memory 69 MB, disk 0 GB 2015-08-07 17:38:56.532 INFO nova.compute.claims [req-bcc60a1b-546c-4936-b069-c848b7de8f79 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] [instance: 31a2fd34-bbcb-4b50-83e0-dc6c7369b479] Total memory: 8187 MB, used: 650.00 MB 2015-08-07 17:38:56.532 INFO nova.compute.claims [req-bcc60a1b-546c-4936-b069-c848b7de8f79 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] [instance: 31a2fd34-bbcb-4b50-83e0-dc6c7369b479] memory limit: 12280.50 MB, free: 11630.50 MB 2015-08-07 17:38:56.532 INFO nova.compute.claims [req-bcc60a1b-546c-4936-b069-c848b7de8f79 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] [instance: 31a2fd34-bbcb-4b50-83e0-dc6c7369b479] Total disk: 27 GB, used: 0.00 GB 2015-08-07 17:38:56.533 INFO nova.compute.claims [req-bcc60a1b-546c-4936-b069-c848b7de8f79 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] [instance: 31a2fd34-bbcb-4b50-83e0-dc6c7369b479] disk limit not specified, defaulting to unlimited 2015-08-07 17:38:56.568 DEBUG nova.compute.resources.vcpu [req-bcc60a1b-546c-4936-b069-c848b7de8f79 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] Total CPUs: 8 VCPUs, used: 2.00 VCPUs test /opt/stack/new/nova/nova/compute/resources/vcpu.py:52 2015-08-07 17:38:56.569 DEBUG nova.compute.resources.vcpu [req-bcc60a1b-546c-4936-b069-c848b7de8f79 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] CPUs limit not specified, defaulting to unlimited test /opt/stack/new/nova/nova/compute/resources/vcpu.py:56 2015-08-07 17:38:56.570 INFO nova.compute.claims [req-bcc60a1b-546c-4936-b069-c848b7de8f79 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] [instance: 31a2fd34-bbcb-4b50-83e0-dc6c7369b479] Claim successful 2015-08-07 17:38:56.841 DEBUG nova.virt.xenapi.vm_utils [req-c8abbac4-9689-4c28-8e6e-6270f781eaca tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Cloned VDI OpaqueRef:18c25660-ccf1-0b8d-a51c-0eea85b46070 from VDI OpaqueRef:f392c64e-7886-a0cc-d3a0-780674167307 _clone_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:571 2015-08-07 17:38:56.991 13318 DEBUG nova.network.base_api [-] [instance: 7fface03-3fde-4610-ae05-ed86066e44da] Updating instance_info_cache with network_info: [VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.4'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': 
[IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:06:21:3a', 'active': False, 'type': u'bridge', 'id': u'659f91d5-7ed5-46c5-bdef-465c4e42ffe1', 'qbg_params': None})] update_instance_cache_with_nw_info /opt/stack/new/nova/nova/network/base_api.py:43 2015-08-07 17:38:57.009 DEBUG oslo_concurrency.lockutils [req-bcc60a1b-546c-4936-b069-c848b7de8f79 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] Lock "compute_resources" released by "instance_claim" :: held 0.487s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:38:57.029 13318 DEBUG oslo_concurrency.lockutils [-] Releasing semaphore "refresh_cache-7fface03-3fde-4610-ae05-ed86066e44da" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:210 2015-08-07 17:38:57.030 13318 DEBUG nova.compute.manager [-] [instance: 7fface03-3fde-4610-ae05-ed86066e44da] Instance network_info: |[VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.4'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:06:21:3a', 'active': False, 'type': u'bridge', 'id': u'659f91d5-7ed5-46c5-bdef-465c4e42ffe1', 'qbg_params': None})]| _allocate_network_async /opt/stack/new/nova/nova/compute/manager.py:1531 2015-08-07 17:38:57.275 DEBUG oslo_concurrency.lockutils [req-bcc60a1b-546c-4936-b069-c848b7de8f79 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] Lock "compute_resources" acquired by "update_usage" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:38:57.389 DEBUG oslo_concurrency.lockutils [req-bcc60a1b-546c-4936-b069-c848b7de8f79 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] Lock "compute_resources" released by "update_usage" :: held 0.114s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:38:57.390 DEBUG nova.compute.utils [req-bcc60a1b-546c-4936-b069-c848b7de8f79 tempest-ServerRescueTestJSON-507427870 
tempest-ServerRescueTestJSON-897979640] Using /dev/xvd instead of None get_next_device_name /opt/stack/new/nova/nova/compute/utils.py:163 2015-08-07 17:38:57.396 13318 DEBUG nova.compute.manager [-] [instance: 31a2fd34-bbcb-4b50-83e0-dc6c7369b479] Allocating IP information in the background. _allocate_network_async /opt/stack/new/nova/nova/compute/manager.py:1513 2015-08-07 17:38:57.397 13318 DEBUG oslo_concurrency.lockutils [-] Acquired semaphore "refresh_cache-31a2fd34-bbcb-4b50-83e0-dc6c7369b479" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:197 2015-08-07 17:38:57.497 DEBUG oslo_concurrency.lockutils [req-68e1ab7c-9255-4c5b-bb92-033977792a30 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" acquired by "do_scan" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:38:57.498 DEBUG nova.virt.xenapi.vm_utils [req-68e1ab7c-9255-4c5b-bb92-033977792a30 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Scanning SR OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99 do_scan /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1839 2015-08-07 17:38:57.841 DEBUG oslo_concurrency.lockutils [req-c8abbac4-9689-4c28-8e6e-6270f781eaca tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" released by "_create_cached_image_impl" :: held 3.074s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:38:57.842 INFO nova.virt.xenapi.vm_utils [req-c8abbac4-9689-4c28-8e6e-6270f781eaca tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Image creation data, cacheable: True, downloaded: False duration: 3.12 secs for image a69d8d55-1745-492f-be26-cc75d64fc94d 2015-08-07 17:38:58.199 DEBUG nova.virt.xenapi.vmops [req-bcc60a1b-546c-4936-b069-c848b7de8f79 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] [instance: 31a2fd34-bbcb-4b50-83e0-dc6c7369b479] Block device information present: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} spawn /opt/stack/new/nova/nova/virt/xenapi/vmops.py:408 2015-08-07 17:38:58.303 DEBUG nova.virt.xenapi.vm_utils [req-bcc60a1b-546c-4936-b069-c848b7de8f79 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] Detected vhd format for image a69d8d55-1745-492f-be26-cc75d64fc94d determine_disk_image_type /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1633 2015-08-07 17:38:58.304 DEBUG nova.virt.xenapi.vmops [req-bcc60a1b-546c-4936-b069-c848b7de8f79 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] [instance: 31a2fd34-bbcb-4b50-83e0-dc6c7369b479] Updating progress to 10 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:38:58.312 DEBUG oslo_concurrency.lockutils [req-68e1ab7c-9255-4c5b-bb92-033977792a30 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" released by "do_scan" :: held 0.815s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:38:58.320 DEBUG nova.virt.xenapi.vm_utils [req-68e1ab7c-9255-4c5b-bb92-033977792a30 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] VHD 
d096a8bd-3d0c-4ece-91ed-aa9f1f8917cc has parent 236ca383-573a-4bb3-a64c-2c4f7d42d86c _get_vhd_parent_uuid /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2006 2015-08-07 17:38:58.321 DEBUG nova.virt.xenapi.vm_utils [req-68e1ab7c-9255-4c5b-bb92-033977792a30 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Coalesce detected, because parent is: 236ca383-573a-4bb3-a64c-2c4f7d42d86c _wait_for_vhd_coalesce /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2118 2015-08-07 17:38:58.333 DEBUG oslo_concurrency.lockutils [req-68e1ab7c-9255-4c5b-bb92-033977792a30 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" acquired by "do_scan" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:38:58.333 DEBUG nova.virt.xenapi.vm_utils [req-68e1ab7c-9255-4c5b-bb92-033977792a30 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Scanning SR OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99 do_scan /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1839 2015-08-07 17:38:58.732 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:38:58.733 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 38.79 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:38:58.892 DEBUG nova.virt.xenapi.vmops [req-bcc60a1b-546c-4936-b069-c848b7de8f79 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] [instance: 31a2fd34-bbcb-4b50-83e0-dc6c7369b479] block device info: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} _connect_cinder_volumes /opt/stack/new/nova/nova/virt/xenapi/vmops.py:376 2015-08-07 17:38:58.941 DEBUG oslo_concurrency.lockutils [req-bcc60a1b-546c-4936-b069-c848b7de8f79 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" acquired by "_create_cached_image_impl" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:38:59.776 DEBUG oslo_concurrency.lockutils [req-68e1ab7c-9255-4c5b-bb92-033977792a30 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" released by "do_scan" :: held 1.443s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:38:59.787 DEBUG nova.virt.xenapi.vm_utils [req-68e1ab7c-9255-4c5b-bb92-033977792a30 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] VHD a30a4583-4b05-4cce-8f14-fe3f08af0a9a has parent 236ca383-573a-4bb3-a64c-2c4f7d42d86c _get_vhd_parent_uuid /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2006 2015-08-07 17:38:59.796 DEBUG nova.virt.xenapi.vm_utils [req-68e1ab7c-9255-4c5b-bb92-033977792a30 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] VHD 236ca383-573a-4bb3-a64c-2c4f7d42d86c has parent 4027f457-a9bb-499a-8844-79fc67f11377 _get_vhd_parent_uuid /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2006 
2015-08-07 17:38:59.962 DEBUG nova.virt.xenapi.vmops [req-c8abbac4-9689-4c28-8e6e-6270f781eaca tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 7fface03-3fde-4610-ae05-ed86066e44da] Updating progress to 20 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:39:00.116 DEBUG nova.virt.xenapi.client.session [req-68e1ab7c-9255-4c5b-bb92-033977792a30 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] glance.upload_vhd attempt 1/1, callback_result: 192.168.33.1 call_plugin_serialized_with_retry /opt/stack/new/nova/nova/virt/xenapi/client/session.py:245 2015-08-07 17:39:00.315 DEBUG nova.virt.xenapi.vmops [req-c8abbac4-9689-4c28-8e6e-6270f781eaca tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 7fface03-3fde-4610-ae05-ed86066e44da] Updating progress to 30 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:39:00.766 DEBUG nova.virt.xenapi.vmops [req-c8abbac4-9689-4c28-8e6e-6270f781eaca tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 7fface03-3fde-4610-ae05-ed86066e44da] Using PV kernel: True _create_vm_record /opt/stack/new/nova/nova/virt/xenapi/vmops.py:679 2015-08-07 17:39:00.779 DEBUG nova.virt.xenapi.vm_utils [req-c8abbac4-9689-4c28-8e6e-6270f781eaca tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 7fface03-3fde-4610-ae05-ed86066e44da] Created VM create_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:312 2015-08-07 17:39:00.780 DEBUG nova.virt.xenapi.vmops [req-c8abbac4-9689-4c28-8e6e-6270f781eaca tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 7fface03-3fde-4610-ae05-ed86066e44da] Updating progress to 40 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:39:01.238 13318 DEBUG nova.network.base_api [-] [instance: 31a2fd34-bbcb-4b50-83e0-dc6c7369b479] Updating instance_info_cache with network_info: [VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.5'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:24:e9:e2', 'active': False, 'type': u'bridge', 'id': u'dd289fd2-27ae-4c1b-8c34-99f36829faef', 'qbg_params': None})] update_instance_cache_with_nw_info /opt/stack/new/nova/nova/network/base_api.py:43 2015-08-07 17:39:01.262 DEBUG nova.virt.xenapi.vm_utils [req-c8abbac4-9689-4c28-8e6e-6270f781eaca tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Creating disk-type VBD for VM OpaqueRef:8ddb95f7-94dc-5722-2d1e-6f7a94448135, VDI 
OpaqueRef:18c25660-ccf1-0b8d-a51c-0eea85b46070 ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:39:01.270 DEBUG nova.virt.xenapi.vm_utils [req-c8abbac4-9689-4c28-8e6e-6270f781eaca tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Created VBD OpaqueRef:4a6f353e-b93f-f5b9-4dcc-864bdde46e1e for VM OpaqueRef:8ddb95f7-94dc-5722-2d1e-6f7a94448135, VDI OpaqueRef:18c25660-ccf1-0b8d-a51c-0eea85b46070. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:39:01.291 13318 DEBUG oslo_concurrency.lockutils [-] Releasing semaphore "refresh_cache-31a2fd34-bbcb-4b50-83e0-dc6c7369b479" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:210 2015-08-07 17:39:01.292 13318 DEBUG nova.compute.manager [-] [instance: 31a2fd34-bbcb-4b50-83e0-dc6c7369b479] Instance network_info: |[VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.5'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:24:e9:e2', 'active': False, 'type': u'bridge', 'id': u'dd289fd2-27ae-4c1b-8c34-99f36829faef', 'qbg_params': None})]| _allocate_network_async /opt/stack/new/nova/nova/compute/manager.py:1531 2015-08-07 17:39:03.342 DEBUG nova.virt.xenapi.vm_utils [req-c8abbac4-9689-4c28-8e6e-6270f781eaca tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Created VDI OpaqueRef:870b45d3-b404-9617-cd22-9f8252c611a3 (config-2, 67108864, False) on OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99. create_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:509 2015-08-07 17:39:03.350 DEBUG nova.virt.xenapi.vm_utils [req-c8abbac4-9689-4c28-8e6e-6270f781eaca tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Creating disk-type VBD for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:870b45d3-b404-9617-cd22-9f8252c611a3 ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:39:03.392 DEBUG nova.virt.xenapi.vm_utils [req-c8abbac4-9689-4c28-8e6e-6270f781eaca tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Created VBD OpaqueRef:f4541fe7-91a1-bc51-e01f-46b4f73e7c1f for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:870b45d3-b404-9617-cd22-9f8252c611a3. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:39:03.393 DEBUG nova.virt.xenapi.vm_utils [req-c8abbac4-9689-4c28-8e6e-6270f781eaca tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Plugging VBD OpaqueRef:f4541fe7-91a1-bc51-e01f-46b4f73e7c1f ... 
vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2191 2015-08-07 17:39:03.394 DEBUG oslo_concurrency.lockutils [req-c8abbac4-9689-4c28-8e6e-6270f781eaca tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_plug" :: waited 0.001s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:39:06.322 DEBUG oslo_concurrency.lockutils [req-c8abbac4-9689-4c28-8e6e-6270f781eaca tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_plug" :: held 2.928s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:39:06.322 DEBUG nova.virt.xenapi.vm_utils [req-c8abbac4-9689-4c28-8e6e-6270f781eaca tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Plugging VBD OpaqueRef:f4541fe7-91a1-bc51-e01f-46b4f73e7c1f done. vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2194 2015-08-07 17:39:06.327 DEBUG nova.virt.xenapi.vm_utils [req-c8abbac4-9689-4c28-8e6e-6270f781eaca tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] VBD OpaqueRef:f4541fe7-91a1-bc51-e01f-46b4f73e7c1f plugged as xvdc vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2197 2015-08-07 17:39:06.429 WARNING nova.virt.configdrive [req-c8abbac4-9689-4c28-8e6e-6270f781eaca tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] The setting "always" will be deprecated in the Liberty version. Please use "True" instead 2015-08-07 17:39:06.430 DEBUG nova.objects.instance [req-c8abbac4-9689-4c28-8e6e-6270f781eaca tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lazy-loading `ec2_ids' on Instance uuid 7fface03-3fde-4610-ae05-ed86066e44da obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:39:06.472 DEBUG oslo_concurrency.processutils [req-c8abbac4-9689-4c28-8e6e-6270f781eaca tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Running cmd (subprocess): genisoimage -o /tmp/tmpytzpFl/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmpJggJ6v execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:39:06.608 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 8.34 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:39:06.613 DEBUG oslo_concurrency.processutils [req-c8abbac4-9689-4c28-8e6e-6270f781eaca tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] CMD "genisoimage -o /tmp/tmpytzpFl/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmpJggJ6v" returned: 0 in 0.140s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:39:06.619 DEBUG oslo_concurrency.processutils [req-c8abbac4-9689-4c28-8e6e-6270f781eaca tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmpytzpFl/configdrive of=/dev/xvdc oflag=direct,sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 
2015-08-07 17:39:08.838 DEBUG nova.virt.xenapi.vmops [req-68e1ab7c-9255-4c5b-bb92-033977792a30 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Finished snapshot and upload for VM, duration: 31.89 secs for image b5ccfe40-4b8a-4588-be04-edf5d7642898 snapshot /opt/stack/new/nova/nova/virt/xenapi/vmops.py:894 2015-08-07 17:39:09.586 DEBUG nova.compute.manager [req-68e1ab7c-9255-4c5b-bb92-033977792a30 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Found 2 images (rotation: 2) _rotate_backups /opt/stack/new/nova/nova/compute/manager.py:3001 2015-08-07 17:39:11.192 DEBUG nova.compute.manager [req-dd13ae45-6c6f-4280-aef4-ac3436e483d0 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Checking state _get_power_state /opt/stack/new/nova/nova/compute/manager.py:1296 2015-08-07 17:39:11.440 INFO nova.compute.manager [req-dd13ae45-6c6f-4280-aef4-ac3436e483d0 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] instance snapshotting 2015-08-07 17:39:11.447 DEBUG nova.virt.xenapi.vm_utils [req-dd13ae45-6c6f-4280-aef4-ac3436e483d0 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Starting snapshot for VM _snapshot_attached_here_impl /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:714 2015-08-07 17:39:11.483 DEBUG oslo_concurrency.lockutils [req-dd13ae45-6c6f-4280-aef4-ac3436e483d0 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" acquired by "do_scan" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:39:11.483 DEBUG nova.virt.xenapi.vm_utils [req-dd13ae45-6c6f-4280-aef4-ac3436e483d0 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Scanning SR OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99 do_scan /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1839 2015-08-07 17:39:12.236 DEBUG oslo_concurrency.lockutils [req-dd13ae45-6c6f-4280-aef4-ac3436e483d0 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" released by "do_scan" :: held 0.753s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:39:12.249 DEBUG nova.virt.xenapi.vm_utils [req-dd13ae45-6c6f-4280-aef4-ac3436e483d0 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] VHD d096a8bd-3d0c-4ece-91ed-aa9f1f8917cc has parent 236ca383-573a-4bb3-a64c-2c4f7d42d86c _get_vhd_parent_uuid /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2006 2015-08-07 17:39:12.260 DEBUG nova.virt.xenapi.vm_utils [req-dd13ae45-6c6f-4280-aef4-ac3436e483d0 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] VHD 236ca383-573a-4bb3-a64c-2c4f7d42d86c has parent 4027f457-a9bb-499a-8844-79fc67f11377 _get_vhd_parent_uuid /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2006 2015-08-07 17:39:12.316 DEBUG nova.virt.xenapi.vm_utils [req-dd13ae45-6c6f-4280-aef4-ac3436e483d0 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] VHD bcad92ce-8630-4378-8539-8e30b03b6265 has parent 
4027f457-a9bb-499a-8844-79fc67f11377 _get_vhd_parent_uuid /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2006 2015-08-07 17:39:12.349 DEBUG nova.virt.xenapi.vm_utils [req-dd13ae45-6c6f-4280-aef4-ac3436e483d0 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] VHD d096a8bd-3d0c-4ece-91ed-aa9f1f8917cc has parent 236ca383-573a-4bb3-a64c-2c4f7d42d86c _get_vhd_parent_uuid /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2006 2015-08-07 17:39:12.386 DEBUG nova.virt.xenapi.vm_utils [req-dd13ae45-6c6f-4280-aef4-ac3436e483d0 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] VHD adc68f5c-f9ed-4dfe-862f-8738fb519f84 has parent 4027f457-a9bb-499a-8844-79fc67f11377 _get_vhd_parent_uuid /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2006 2015-08-07 17:39:12.398 DEBUG nova.virt.xenapi.vm_utils [req-dd13ae45-6c6f-4280-aef4-ac3436e483d0 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] No snapshots to remove. _delete_snapshots_in_vdi_chain /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:677 2015-08-07 17:39:14.993 DEBUG oslo_concurrency.processutils [req-c8abbac4-9689-4c28-8e6e-6270f781eaca tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmpytzpFl/configdrive of=/dev/xvdc oflag=direct,sync" returned: 0 in 8.374s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:39:14.995 DEBUG oslo_concurrency.processutils [req-c8abbac4-9689-4c28-8e6e-6270f781eaca tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:39:15.078 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.87 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:39:15.384 DEBUG oslo_concurrency.processutils [req-c8abbac4-9689-4c28-8e6e-6270f781eaca tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf sync" returned: 0 in 0.389s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:39:15.386 DEBUG nova.virt.xenapi.vm_utils [req-c8abbac4-9689-4c28-8e6e-6270f781eaca tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Destroying VBD for VDI OpaqueRef:870b45d3-b404-9617-cd22-9f8252c611a3 ... 
vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2207 2015-08-07 17:39:15.388 DEBUG oslo_concurrency.lockutils [req-c8abbac4-9689-4c28-8e6e-6270f781eaca tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_unplug" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:39:16.090 DEBUG oslo_concurrency.lockutils [req-dd13ae45-6c6f-4280-aef4-ac3436e483d0 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" acquired by "do_scan" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:39:16.090 DEBUG nova.virt.xenapi.vm_utils [req-dd13ae45-6c6f-4280-aef4-ac3436e483d0 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Scanning SR OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99 do_scan /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1839 2015-08-07 17:39:16.530 DEBUG oslo_concurrency.lockutils [req-dd13ae45-6c6f-4280-aef4-ac3436e483d0 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" released by "do_scan" :: held 0.440s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:39:16.536 DEBUG nova.virt.xenapi.vm_utils [req-dd13ae45-6c6f-4280-aef4-ac3436e483d0 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] VHD d096a8bd-3d0c-4ece-91ed-aa9f1f8917cc has parent b31756c1-8a59-4573-b13c-c1e2d2a40833 _get_vhd_parent_uuid /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2006 2015-08-07 17:39:16.538 DEBUG nova.virt.xenapi.vm_utils [req-dd13ae45-6c6f-4280-aef4-ac3436e483d0 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Parent b31756c1-8a59-4573-b13c-c1e2d2a40833 not yet in parent list ['236ca383-573a-4bb3-a64c-2c4f7d42d86c', '4027f457-a9bb-499a-8844-79fc67f11377'], waiting for coalesce... _wait_for_vhd_coalesce /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2115 2015-08-07 17:39:16.600 DEBUG oslo_concurrency.lockutils [req-c8abbac4-9689-4c28-8e6e-6270f781eaca tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_unplug" :: held 1.212s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:39:16.608 DEBUG nova.virt.xenapi.vm_utils [req-c8abbac4-9689-4c28-8e6e-6270f781eaca tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Destroying VBD for VDI OpaqueRef:870b45d3-b404-9617-cd22-9f8252c611a3 done. vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2215 2015-08-07 17:39:16.609 DEBUG nova.virt.xenapi.vm_utils [req-c8abbac4-9689-4c28-8e6e-6270f781eaca tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Creating disk-type VBD for VM OpaqueRef:8ddb95f7-94dc-5722-2d1e-6f7a94448135, VDI OpaqueRef:870b45d3-b404-9617-cd22-9f8252c611a3 ... 
create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:39:16.618 DEBUG nova.virt.xenapi.vm_utils [req-c8abbac4-9689-4c28-8e6e-6270f781eaca tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Created VBD OpaqueRef:c0518ab4-82da-38f8-25c6-91ba2dc5a3ff for VM OpaqueRef:8ddb95f7-94dc-5722-2d1e-6f7a94448135, VDI OpaqueRef:870b45d3-b404-9617-cd22-9f8252c611a3. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:39:16.619 DEBUG nova.objects.instance [req-c8abbac4-9689-4c28-8e6e-6270f781eaca tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lazy-loading `pci_devices' on Instance uuid 7fface03-3fde-4610-ae05-ed86066e44da obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:39:16.750 DEBUG nova.virt.xenapi.vmops [req-c8abbac4-9689-4c28-8e6e-6270f781eaca tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 7fface03-3fde-4610-ae05-ed86066e44da] Updating progress to 50 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:39:17.006 DEBUG oslo_concurrency.lockutils [req-c8abbac4-9689-4c28-8e6e-6270f781eaca tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "xenstore-7fface03-3fde-4610-ae05-ed86066e44da" acquired by "store_meta" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:39:17.007 DEBUG oslo_concurrency.lockutils [req-c8abbac4-9689-4c28-8e6e-6270f781eaca tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "xenstore-7fface03-3fde-4610-ae05-ed86066e44da" released by "store_meta" :: held 0.001s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:39:17.008 DEBUG oslo_concurrency.lockutils [req-c8abbac4-9689-4c28-8e6e-6270f781eaca tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "xenstore-7fface03-3fde-4610-ae05-ed86066e44da" acquired by "store_auto_disk_config" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:39:17.017 DEBUG oslo_concurrency.lockutils [req-c8abbac4-9689-4c28-8e6e-6270f781eaca tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "xenstore-7fface03-3fde-4610-ae05-ed86066e44da" released by "store_auto_disk_config" :: held 0.009s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:39:17.018 DEBUG nova.virt.xenapi.vmops [req-c8abbac4-9689-4c28-8e6e-6270f781eaca tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 7fface03-3fde-4610-ae05-ed86066e44da] Injecting hostname (tempest.common.compute-instance-735250200) into xenstore _inject_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1940 2015-08-07 17:39:17.019 DEBUG oslo_concurrency.lockutils [req-c8abbac4-9689-4c28-8e6e-6270f781eaca tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "xenstore-7fface03-3fde-4610-ae05-ed86066e44da" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:39:17.026 DEBUG oslo_concurrency.lockutils [req-c8abbac4-9689-4c28-8e6e-6270f781eaca tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "xenstore-7fface03-3fde-4610-ae05-ed86066e44da" released by "update_hostname" 
:: held 0.008s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:39:17.027 DEBUG nova.virt.xenapi.vmops [req-c8abbac4-9689-4c28-8e6e-6270f781eaca tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 7fface03-3fde-4610-ae05-ed86066e44da] Injecting network info to xenstore inject_network_info /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1869 2015-08-07 17:39:17.028 DEBUG oslo_concurrency.lockutils [req-c8abbac4-9689-4c28-8e6e-6270f781eaca tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "xenstore-7fface03-3fde-4610-ae05-ed86066e44da" acquired by "update_nwinfo" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:39:17.229 DEBUG oslo_concurrency.lockutils [req-c8abbac4-9689-4c28-8e6e-6270f781eaca tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "xenstore-7fface03-3fde-4610-ae05-ed86066e44da" released by "update_nwinfo" :: held 0.202s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:39:17.230 DEBUG nova.virt.xenapi.vmops [req-c8abbac4-9689-4c28-8e6e-6270f781eaca tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 7fface03-3fde-4610-ae05-ed86066e44da] Updating progress to 60 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:39:17.493 DEBUG nova.virt.xenapi.vmops [req-c8abbac4-9689-4c28-8e6e-6270f781eaca tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 7fface03-3fde-4610-ae05-ed86066e44da] Creating vifs _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1892 2015-08-07 17:39:17.502 DEBUG nova.virt.xenapi.vmops [req-c8abbac4-9689-4c28-8e6e-6270f781eaca tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 7fface03-3fde-4610-ae05-ed86066e44da] Creating VIF for network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1902 2015-08-07 17:39:17.510 DEBUG nova.virt.xenapi.vmops [req-c8abbac4-9689-4c28-8e6e-6270f781eaca tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 7fface03-3fde-4610-ae05-ed86066e44da] Created VIF OpaqueRef:0d6ca815-cfac-bce1-83ec-fd28edd2b2fc, network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1906 2015-08-07 17:39:17.511 DEBUG nova.virt.xenapi.vmops [req-c8abbac4-9689-4c28-8e6e-6270f781eaca tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 7fface03-3fde-4610-ae05-ed86066e44da] Updating progress to 70 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:39:17.755 DEBUG nova.virt.xenapi.vmops [req-c8abbac4-9689-4c28-8e6e-6270f781eaca tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 7fface03-3fde-4610-ae05-ed86066e44da] Starting instance _start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:327 2015-08-07 17:39:17.764 DEBUG nova.virt.xenapi.vm_utils [req-bcc60a1b-546c-4936-b069-c848b7de8f79 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] Cloned VDI OpaqueRef:f6b24351-b041-bd93-89de-b6c3e6dd37ea from VDI OpaqueRef:f392c64e-7886-a0cc-d3a0-780674167307 _clone_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:571 2015-08-07 17:39:18.544 DEBUG 
oslo_concurrency.lockutils [req-bcc60a1b-546c-4936-b069-c848b7de8f79 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" released by "_create_cached_image_impl" :: held 19.603s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:39:18.545 INFO nova.virt.xenapi.vm_utils [req-bcc60a1b-546c-4936-b069-c848b7de8f79 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] Image creation data, cacheable: True, downloaded: False duration: 19.65 secs for image a69d8d55-1745-492f-be26-cc75d64fc94d 2015-08-07 17:39:19.497 DEBUG nova.virt.xenapi.vmops [req-bcc60a1b-546c-4936-b069-c848b7de8f79 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] [instance: 31a2fd34-bbcb-4b50-83e0-dc6c7369b479] Updating progress to 20 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:39:19.729 DEBUG nova.virt.xenapi.vmops [req-bcc60a1b-546c-4936-b069-c848b7de8f79 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] [instance: 31a2fd34-bbcb-4b50-83e0-dc6c7369b479] Updating progress to 30 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:39:19.948 DEBUG nova.virt.xenapi.vmops [req-bcc60a1b-546c-4936-b069-c848b7de8f79 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] [instance: 31a2fd34-bbcb-4b50-83e0-dc6c7369b479] Using PV kernel: True _create_vm_record /opt/stack/new/nova/nova/virt/xenapi/vmops.py:679 2015-08-07 17:39:19.962 DEBUG nova.virt.xenapi.vm_utils [req-bcc60a1b-546c-4936-b069-c848b7de8f79 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] [instance: 31a2fd34-bbcb-4b50-83e0-dc6c7369b479] Created VM create_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:312 2015-08-07 17:39:19.963 DEBUG nova.virt.xenapi.vmops [req-bcc60a1b-546c-4936-b069-c848b7de8f79 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] [instance: 31a2fd34-bbcb-4b50-83e0-dc6c7369b479] Updating progress to 40 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:39:20.168 DEBUG nova.virt.xenapi.vm_utils [req-bcc60a1b-546c-4936-b069-c848b7de8f79 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] Creating disk-type VBD for VM OpaqueRef:77080ae9-072d-dceb-43e8-ea8b51092033, VDI OpaqueRef:f6b24351-b041-bd93-89de-b6c3e6dd37ea ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:39:20.175 DEBUG nova.virt.xenapi.vm_utils [req-bcc60a1b-546c-4936-b069-c848b7de8f79 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] Created VBD OpaqueRef:10708b79-0ef1-8af4-a7c7-d7a6016580d4 for VM OpaqueRef:77080ae9-072d-dceb-43e8-ea8b51092033, VDI OpaqueRef:f6b24351-b041-bd93-89de-b6c3e6dd37ea. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:39:20.510 DEBUG nova.virt.xenapi.vm_utils [req-bcc60a1b-546c-4936-b069-c848b7de8f79 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] Created VDI OpaqueRef:294fdec3-6fde-11d6-2cfe-665859b93213 (config-2, 67108864, False) on OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99. 
create_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:509 2015-08-07 17:39:20.513 DEBUG nova.virt.xenapi.vm_utils [req-bcc60a1b-546c-4936-b069-c848b7de8f79 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] Creating disk-type VBD for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:294fdec3-6fde-11d6-2cfe-665859b93213 ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:39:20.525 DEBUG nova.virt.xenapi.vm_utils [req-bcc60a1b-546c-4936-b069-c848b7de8f79 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] Created VBD OpaqueRef:54a208d6-cb09-0105-1c6c-43d52d18380b for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:294fdec3-6fde-11d6-2cfe-665859b93213. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:39:20.525 DEBUG nova.virt.xenapi.vm_utils [req-bcc60a1b-546c-4936-b069-c848b7de8f79 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] Plugging VBD OpaqueRef:54a208d6-cb09-0105-1c6c-43d52d18380b ... vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2191 2015-08-07 17:39:20.527 DEBUG oslo_concurrency.lockutils [req-bcc60a1b-546c-4936-b069-c848b7de8f79 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_plug" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:39:21.539 DEBUG oslo_concurrency.lockutils [req-dd13ae45-6c6f-4280-aef4-ac3436e483d0 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" acquired by "do_scan" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:39:21.539 DEBUG nova.virt.xenapi.vm_utils [req-dd13ae45-6c6f-4280-aef4-ac3436e483d0 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Scanning SR OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99 do_scan /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1839 2015-08-07 17:39:21.920 DEBUG oslo_concurrency.lockutils [req-bcc60a1b-546c-4936-b069-c848b7de8f79 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_plug" :: held 1.394s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:39:21.921 DEBUG nova.virt.xenapi.vm_utils [req-bcc60a1b-546c-4936-b069-c848b7de8f79 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] Plugging VBD OpaqueRef:54a208d6-cb09-0105-1c6c-43d52d18380b done. vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2194 2015-08-07 17:39:21.925 DEBUG nova.virt.xenapi.vm_utils [req-bcc60a1b-546c-4936-b069-c848b7de8f79 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] VBD OpaqueRef:54a208d6-cb09-0105-1c6c-43d52d18380b plugged as xvdc vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2197 2015-08-07 17:39:22.033 WARNING nova.virt.configdrive [req-bcc60a1b-546c-4936-b069-c848b7de8f79 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] The setting "always" will be deprecated in the Liberty version. 
Please use "True" instead 2015-08-07 17:39:22.035 DEBUG nova.objects.instance [req-bcc60a1b-546c-4936-b069-c848b7de8f79 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] Lazy-loading `ec2_ids' on Instance uuid 31a2fd34-bbcb-4b50-83e0-dc6c7369b479 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:39:22.074 DEBUG oslo_concurrency.processutils [req-bcc60a1b-546c-4936-b069-c848b7de8f79 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] Running cmd (subprocess): genisoimage -o /tmp/tmp4oO14Q/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmpqs93ac execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:39:22.251 DEBUG oslo_concurrency.lockutils [req-dd13ae45-6c6f-4280-aef4-ac3436e483d0 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" released by "do_scan" :: held 0.712s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:39:22.264 DEBUG nova.virt.xenapi.vm_utils [req-dd13ae45-6c6f-4280-aef4-ac3436e483d0 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] VHD d096a8bd-3d0c-4ece-91ed-aa9f1f8917cc has parent 236ca383-573a-4bb3-a64c-2c4f7d42d86c _get_vhd_parent_uuid /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2006 2015-08-07 17:39:22.265 DEBUG nova.virt.xenapi.vm_utils [req-dd13ae45-6c6f-4280-aef4-ac3436e483d0 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Coalesce detected, because parent is: 236ca383-573a-4bb3-a64c-2c4f7d42d86c _wait_for_vhd_coalesce /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2118 2015-08-07 17:39:22.279 DEBUG oslo_concurrency.processutils [req-bcc60a1b-546c-4936-b069-c848b7de8f79 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] CMD "genisoimage -o /tmp/tmp4oO14Q/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmpqs93ac" returned: 0 in 0.204s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:39:22.283 DEBUG oslo_concurrency.processutils [req-bcc60a1b-546c-4936-b069-c848b7de8f79 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmp4oO14Q/configdrive of=/dev/xvdc oflag=direct,sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:39:22.391 DEBUG oslo_concurrency.lockutils [req-dd13ae45-6c6f-4280-aef4-ac3436e483d0 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" acquired by "do_scan" :: waited 0.001s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:39:22.394 DEBUG nova.virt.xenapi.vm_utils [req-dd13ae45-6c6f-4280-aef4-ac3436e483d0 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Scanning SR OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99 do_scan /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1839 2015-08-07 17:39:23.216 DEBUG oslo_concurrency.lockutils [req-dd13ae45-6c6f-4280-aef4-ac3436e483d0 tempest-ServerActionsTestJSON-2080221373 
tempest-ServerActionsTestJSON-2017217826] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" released by "do_scan" :: held 0.825s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:39:23.231 DEBUG nova.virt.xenapi.vm_utils [req-dd13ae45-6c6f-4280-aef4-ac3436e483d0 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] VHD f479c23f-9e76-4be9-9e81-aff419e3e8fe has parent 236ca383-573a-4bb3-a64c-2c4f7d42d86c _get_vhd_parent_uuid /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2006 2015-08-07 17:39:23.244 DEBUG nova.virt.xenapi.vm_utils [req-dd13ae45-6c6f-4280-aef4-ac3436e483d0 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] VHD 236ca383-573a-4bb3-a64c-2c4f7d42d86c has parent 4027f457-a9bb-499a-8844-79fc67f11377 _get_vhd_parent_uuid /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2006 2015-08-07 17:39:23.612 DEBUG nova.virt.xenapi.client.session [req-dd13ae45-6c6f-4280-aef4-ac3436e483d0 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] glance.upload_vhd attempt 1/1, callback_result: 192.168.33.1 call_plugin_serialized_with_retry /opt/stack/new/nova/nova/virt/xenapi/client/session.py:245 2015-08-07 17:39:25.047 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.91 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:39:25.345 DEBUG nova.virt.xenapi.vmops [req-c8abbac4-9689-4c28-8e6e-6270f781eaca tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 7fface03-3fde-4610-ae05-ed86066e44da] Waiting for instance state to become running _wait_for_instance_to_start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:777 2015-08-07 17:39:25.379 DEBUG nova.virt.xenapi.vmops [req-c8abbac4-9689-4c28-8e6e-6270f781eaca tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 7fface03-3fde-4610-ae05-ed86066e44da] Updating progress to 80 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:39:25.860 DEBUG nova.virt.xenapi.vmops [req-c8abbac4-9689-4c28-8e6e-6270f781eaca tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 7fface03-3fde-4610-ae05-ed86066e44da] Skip agent setup, not enabled. 
_configure_new_instance_with_agent /opt/stack/new/nova/nova/virt/xenapi/vmops.py:788 2015-08-07 17:39:25.862 DEBUG nova.virt.xenapi.vmops [req-c8abbac4-9689-4c28-8e6e-6270f781eaca tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 7fface03-3fde-4610-ae05-ed86066e44da] Removing hostname from xenstore _remove_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1949 2015-08-07 17:39:25.863 DEBUG oslo_concurrency.lockutils [req-c8abbac4-9689-4c28-8e6e-6270f781eaca tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "xenstore-7fface03-3fde-4610-ae05-ed86066e44da" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:39:25.869 DEBUG oslo_concurrency.lockutils [req-c8abbac4-9689-4c28-8e6e-6270f781eaca tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "xenstore-7fface03-3fde-4610-ae05-ed86066e44da" released by "update_hostname" :: held 0.007s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:39:25.870 DEBUG nova.virt.xenapi.vmops [req-c8abbac4-9689-4c28-8e6e-6270f781eaca tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 7fface03-3fde-4610-ae05-ed86066e44da] Updating progress to 90 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:39:26.274 DEBUG nova.virt.xenapi.vmops [req-c8abbac4-9689-4c28-8e6e-6270f781eaca tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 7fface03-3fde-4610-ae05-ed86066e44da] Updating progress to 100 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:39:26.828 DEBUG nova.compute.manager [req-c8abbac4-9689-4c28-8e6e-6270f781eaca tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 7fface03-3fde-4610-ae05-ed86066e44da] Checking state _get_power_state /opt/stack/new/nova/nova/compute/manager.py:1296 2015-08-07 17:39:28.477 DEBUG oslo_concurrency.lockutils [req-c8abbac4-9689-4c28-8e6e-6270f781eaca tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "7fface03-3fde-4610-ae05-ed86066e44da" released by "_locked_do_build_and_run_instance" :: held 37.884s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:39:29.067 DEBUG nova.virt.xenapi.vmops [req-dd13ae45-6c6f-4280-aef4-ac3436e483d0 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Finished snapshot and upload for VM, duration: 17.62 secs for image 693e9a17-a1bd-49dd-8858-208dc2eb2e4b snapshot /opt/stack/new/nova/nova/virt/xenapi/vmops.py:894 2015-08-07 17:39:29.142 DEBUG oslo_concurrency.processutils [req-bcc60a1b-546c-4936-b069-c848b7de8f79 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmp4oO14Q/configdrive of=/dev/xvdc oflag=direct,sync" returned: 0 in 6.859s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:39:29.144 DEBUG oslo_concurrency.processutils [req-bcc60a1b-546c-4936-b069-c848b7de8f79 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf sync execute 
/usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:39:29.555 DEBUG nova.compute.manager [req-6cd3a012-07e0-42a9-b256-0bac3cbfdcde tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 7fface03-3fde-4610-ae05-ed86066e44da] Stashing vm_state: active _prep_resize /opt/stack/new/nova/nova/compute/manager.py:3514 2015-08-07 17:39:29.615 DEBUG oslo_concurrency.processutils [req-bcc60a1b-546c-4936-b069-c848b7de8f79 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf sync" returned: 0 in 0.471s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:39:29.616 DEBUG nova.virt.xenapi.vm_utils [req-bcc60a1b-546c-4936-b069-c848b7de8f79 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] Destroying VBD for VDI OpaqueRef:294fdec3-6fde-11d6-2cfe-665859b93213 ... vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2207 2015-08-07 17:39:29.617 DEBUG oslo_concurrency.lockutils [req-bcc60a1b-546c-4936-b069-c848b7de8f79 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_unplug" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:39:29.703 DEBUG nova.compute.manager [req-dd13ae45-6c6f-4280-aef4-ac3436e483d0 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Found 3 images (rotation: 2) _rotate_backups /opt/stack/new/nova/nova/compute/manager.py:3001 2015-08-07 17:39:29.703 DEBUG nova.compute.manager [req-dd13ae45-6c6f-4280-aef4-ac3436e483d0 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Rotating out 1 backups _rotate_backups /opt/stack/new/nova/nova/compute/manager.py:3008 2015-08-07 17:39:29.704 DEBUG nova.compute.manager [req-dd13ae45-6c6f-4280-aef4-ac3436e483d0 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Deleting image d7d1d44c-7b2e-4371-9daf-db4e08dc02b3 _rotate_backups /opt/stack/new/nova/nova/compute/manager.py:3013 2015-08-07 17:39:30.015 DEBUG oslo_concurrency.lockutils [req-6cd3a012-07e0-42a9-b256-0bac3cbfdcde tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "compute_resources" acquired by "resize_claim" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:39:30.016 DEBUG nova.compute.resource_tracker [req-6cd3a012-07e0-42a9-b256-0bac3cbfdcde tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Memory overhead for 128 MB instance; 6 MB resize_claim /opt/stack/new/nova/nova/compute/resource_tracker.py:174 2015-08-07 17:39:30.025 INFO nova.compute.claims [req-6cd3a012-07e0-42a9-b256-0bac3cbfdcde tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 7fface03-3fde-4610-ae05-ed86066e44da] Attempting claim: memory 134 MB, disk 0 GB 2015-08-07 17:39:30.026 INFO nova.compute.claims [req-6cd3a012-07e0-42a9-b256-0bac3cbfdcde tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 7fface03-3fde-4610-ae05-ed86066e44da] Total memory: 8187 MB, used: 719.00 MB 
2015-08-07 17:39:30.026 INFO nova.compute.claims [req-6cd3a012-07e0-42a9-b256-0bac3cbfdcde tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 7fface03-3fde-4610-ae05-ed86066e44da] memory limit: 12280.50 MB, free: 11561.50 MB 2015-08-07 17:39:30.027 INFO nova.compute.claims [req-6cd3a012-07e0-42a9-b256-0bac3cbfdcde tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 7fface03-3fde-4610-ae05-ed86066e44da] Total disk: 27 GB, used: 0.00 GB 2015-08-07 17:39:30.027 INFO nova.compute.claims [req-6cd3a012-07e0-42a9-b256-0bac3cbfdcde tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 7fface03-3fde-4610-ae05-ed86066e44da] disk limit not specified, defaulting to unlimited 2015-08-07 17:39:30.054 DEBUG nova.compute.resources.vcpu [req-6cd3a012-07e0-42a9-b256-0bac3cbfdcde tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Total CPUs: 8 VCPUs, used: 3.00 VCPUs test /opt/stack/new/nova/nova/compute/resources/vcpu.py:52 2015-08-07 17:39:30.055 DEBUG nova.compute.resources.vcpu [req-6cd3a012-07e0-42a9-b256-0bac3cbfdcde tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] CPUs limit not specified, defaulting to unlimited test /opt/stack/new/nova/nova/compute/resources/vcpu.py:56 2015-08-07 17:39:30.057 INFO nova.compute.claims [req-6cd3a012-07e0-42a9-b256-0bac3cbfdcde tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 7fface03-3fde-4610-ae05-ed86066e44da] Claim successful 2015-08-07 17:39:30.119 INFO nova.compute.resource_tracker [req-6cd3a012-07e0-42a9-b256-0bac3cbfdcde tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Updating from migration 7fface03-3fde-4610-ae05-ed86066e44da 2015-08-07 17:39:30.219 DEBUG oslo_concurrency.lockutils [req-6cd3a012-07e0-42a9-b256-0bac3cbfdcde tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "compute_resources" released by "resize_claim" :: held 0.204s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:39:30.220 INFO nova.compute.manager [req-6cd3a012-07e0-42a9-b256-0bac3cbfdcde tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 7fface03-3fde-4610-ae05-ed86066e44da] Migrating 2015-08-07 17:39:30.290 DEBUG oslo_concurrency.lockutils [req-6cd3a012-07e0-42a9-b256-0bac3cbfdcde tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Acquired semaphore "refresh_cache-7fface03-3fde-4610-ae05-ed86066e44da" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:197 2015-08-07 17:39:30.477 DEBUG nova.network.base_api [req-6cd3a012-07e0-42a9-b256-0bac3cbfdcde tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 7fface03-3fde-4610-ae05-ed86066e44da] Updating instance_info_cache with network_info: [VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.4'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 
'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:06:21:3a', 'active': False, 'type': u'bridge', 'id': u'659f91d5-7ed5-46c5-bdef-465c4e42ffe1', 'qbg_params': None})] update_instance_cache_with_nw_info /opt/stack/new/nova/nova/network/base_api.py:43 2015-08-07 17:39:30.492 DEBUG oslo_concurrency.lockutils [req-bcc60a1b-546c-4936-b069-c848b7de8f79 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_unplug" :: held 0.875s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:39:30.499 DEBUG nova.virt.xenapi.vm_utils [req-bcc60a1b-546c-4936-b069-c848b7de8f79 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] Destroying VBD for VDI OpaqueRef:294fdec3-6fde-11d6-2cfe-665859b93213 done. vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2215 2015-08-07 17:39:30.500 DEBUG nova.virt.xenapi.vm_utils [req-bcc60a1b-546c-4936-b069-c848b7de8f79 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] Creating disk-type VBD for VM OpaqueRef:77080ae9-072d-dceb-43e8-ea8b51092033, VDI OpaqueRef:294fdec3-6fde-11d6-2cfe-665859b93213 ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:39:30.510 DEBUG nova.virt.xenapi.vm_utils [req-bcc60a1b-546c-4936-b069-c848b7de8f79 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] Created VBD OpaqueRef:baec3870-b5c4-ce20-b396-17af69e6ee72 for VM OpaqueRef:77080ae9-072d-dceb-43e8-ea8b51092033, VDI OpaqueRef:294fdec3-6fde-11d6-2cfe-665859b93213. 
create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:39:30.510 DEBUG nova.objects.instance [req-bcc60a1b-546c-4936-b069-c848b7de8f79 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] Lazy-loading `pci_devices' on Instance uuid 31a2fd34-bbcb-4b50-83e0-dc6c7369b479 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:39:30.543 DEBUG oslo_concurrency.lockutils [req-6cd3a012-07e0-42a9-b256-0bac3cbfdcde tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Releasing semaphore "refresh_cache-7fface03-3fde-4610-ae05-ed86066e44da" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:210 2015-08-07 17:39:30.664 DEBUG nova.virt.xenapi.vmops [req-bcc60a1b-546c-4936-b069-c848b7de8f79 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] [instance: 31a2fd34-bbcb-4b50-83e0-dc6c7369b479] Updating progress to 50 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:39:30.895 DEBUG oslo_concurrency.lockutils [req-bcc60a1b-546c-4936-b069-c848b7de8f79 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] Lock "xenstore-31a2fd34-bbcb-4b50-83e0-dc6c7369b479" acquired by "store_meta" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:39:30.896 DEBUG oslo_concurrency.lockutils [req-bcc60a1b-546c-4936-b069-c848b7de8f79 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] Lock "xenstore-31a2fd34-bbcb-4b50-83e0-dc6c7369b479" released by "store_meta" :: held 0.001s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:39:30.898 DEBUG oslo_concurrency.lockutils [req-bcc60a1b-546c-4936-b069-c848b7de8f79 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] Lock "xenstore-31a2fd34-bbcb-4b50-83e0-dc6c7369b479" acquired by "store_auto_disk_config" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:39:30.906 DEBUG oslo_concurrency.lockutils [req-bcc60a1b-546c-4936-b069-c848b7de8f79 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] Lock "xenstore-31a2fd34-bbcb-4b50-83e0-dc6c7369b479" released by "store_auto_disk_config" :: held 0.008s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:39:30.907 DEBUG nova.virt.xenapi.vmops [req-bcc60a1b-546c-4936-b069-c848b7de8f79 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] [instance: 31a2fd34-bbcb-4b50-83e0-dc6c7369b479] Injecting hostname (tempest.common.compute-instance-1670852847) into xenstore _inject_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1940 2015-08-07 17:39:30.908 DEBUG oslo_concurrency.lockutils [req-bcc60a1b-546c-4936-b069-c848b7de8f79 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] Lock "xenstore-31a2fd34-bbcb-4b50-83e0-dc6c7369b479" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:39:30.918 DEBUG nova.virt.xenapi.vmops [req-6cd3a012-07e0-42a9-b256-0bac3cbfdcde tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 7fface03-3fde-4610-ae05-ed86066e44da] Updating progress to 0 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:39:30.927 DEBUG 
oslo_concurrency.lockutils [req-bcc60a1b-546c-4936-b069-c848b7de8f79 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] Lock "xenstore-31a2fd34-bbcb-4b50-83e0-dc6c7369b479" released by "update_hostname" :: held 0.020s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:39:30.928 DEBUG nova.virt.xenapi.vmops [req-bcc60a1b-546c-4936-b069-c848b7de8f79 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] [instance: 31a2fd34-bbcb-4b50-83e0-dc6c7369b479] Injecting network info to xenstore inject_network_info /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1869 2015-08-07 17:39:30.929 DEBUG oslo_concurrency.lockutils [req-bcc60a1b-546c-4936-b069-c848b7de8f79 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] Lock "xenstore-31a2fd34-bbcb-4b50-83e0-dc6c7369b479" acquired by "update_nwinfo" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:39:31.108 DEBUG oslo_concurrency.lockutils [req-bcc60a1b-546c-4936-b069-c848b7de8f79 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] Lock "xenstore-31a2fd34-bbcb-4b50-83e0-dc6c7369b479" released by "update_nwinfo" :: held 0.180s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:39:31.109 DEBUG nova.virt.xenapi.vmops [req-bcc60a1b-546c-4936-b069-c848b7de8f79 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] [instance: 31a2fd34-bbcb-4b50-83e0-dc6c7369b479] Updating progress to 60 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:39:31.131 DEBUG nova.virt.xenapi.vm_utils [req-6cd3a012-07e0-42a9-b256-0bac3cbfdcde tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 7fface03-3fde-4610-ae05-ed86066e44da] Starting snapshot for VM _snapshot_attached_here_impl /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:714 2015-08-07 17:39:31.147 DEBUG oslo_concurrency.lockutils [req-6cd3a012-07e0-42a9-b256-0bac3cbfdcde tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" acquired by "do_scan" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:39:31.148 DEBUG nova.virt.xenapi.vm_utils [req-6cd3a012-07e0-42a9-b256-0bac3cbfdcde tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Scanning SR OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99 do_scan /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1839 2015-08-07 17:39:31.353 DEBUG nova.virt.xenapi.vmops [req-bcc60a1b-546c-4936-b069-c848b7de8f79 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] [instance: 31a2fd34-bbcb-4b50-83e0-dc6c7369b479] Creating vifs _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1892 2015-08-07 17:39:31.363 DEBUG nova.virt.xenapi.vmops [req-bcc60a1b-546c-4936-b069-c848b7de8f79 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] [instance: 31a2fd34-bbcb-4b50-83e0-dc6c7369b479] Creating VIF for network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1902 2015-08-07 17:39:31.371 DEBUG nova.virt.xenapi.vmops [req-bcc60a1b-546c-4936-b069-c848b7de8f79 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] [instance: 
31a2fd34-bbcb-4b50-83e0-dc6c7369b479] Created VIF OpaqueRef:d815f081-270a-341b-92a7-d743c978151c, network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1906 2015-08-07 17:39:31.372 DEBUG nova.virt.xenapi.vmops [req-bcc60a1b-546c-4936-b069-c848b7de8f79 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] [instance: 31a2fd34-bbcb-4b50-83e0-dc6c7369b479] Updating progress to 70 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:39:31.567 DEBUG oslo_concurrency.lockutils [req-6cd3a012-07e0-42a9-b256-0bac3cbfdcde tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" released by "do_scan" :: held 0.420s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:39:31.576 DEBUG nova.virt.xenapi.vm_utils [req-6cd3a012-07e0-42a9-b256-0bac3cbfdcde tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] VHD bcad92ce-8630-4378-8539-8e30b03b6265 has parent 4027f457-a9bb-499a-8844-79fc67f11377 _get_vhd_parent_uuid /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2006 2015-08-07 17:39:31.582 DEBUG nova.virt.xenapi.vmops [req-bcc60a1b-546c-4936-b069-c848b7de8f79 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] [instance: 31a2fd34-bbcb-4b50-83e0-dc6c7369b479] Starting instance _start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:327 2015-08-07 17:39:31.647 DEBUG nova.virt.xenapi.vm_utils [req-6cd3a012-07e0-42a9-b256-0bac3cbfdcde tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] VHD f2a8528d-921a-476d-9e2a-7625f31d91f7 has parent 4027f457-a9bb-499a-8844-79fc67f11377 _get_vhd_parent_uuid /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2006 2015-08-07 17:39:31.666 DEBUG nova.virt.xenapi.vm_utils [req-6cd3a012-07e0-42a9-b256-0bac3cbfdcde tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] VHD bcad92ce-8630-4378-8539-8e30b03b6265 has parent 4027f457-a9bb-499a-8844-79fc67f11377 _get_vhd_parent_uuid /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2006 2015-08-07 17:39:31.675 DEBUG nova.virt.xenapi.vm_utils [req-6cd3a012-07e0-42a9-b256-0bac3cbfdcde tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] VHD 236ca383-573a-4bb3-a64c-2c4f7d42d86c has parent 4027f457-a9bb-499a-8844-79fc67f11377 _get_vhd_parent_uuid /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2006 2015-08-07 17:39:31.694 DEBUG nova.virt.xenapi.vm_utils [req-6cd3a012-07e0-42a9-b256-0bac3cbfdcde tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] VHD d096a8bd-3d0c-4ece-91ed-aa9f1f8917cc has parent 236ca383-573a-4bb3-a64c-2c4f7d42d86c _get_vhd_parent_uuid /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2006 2015-08-07 17:39:31.714 DEBUG nova.virt.xenapi.vm_utils [req-6cd3a012-07e0-42a9-b256-0bac3cbfdcde tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] VHD adc68f5c-f9ed-4dfe-862f-8738fb519f84 has parent 4027f457-a9bb-499a-8844-79fc67f11377 _get_vhd_parent_uuid /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2006 2015-08-07 17:39:31.725 DEBUG nova.virt.xenapi.vm_utils [req-6cd3a012-07e0-42a9-b256-0bac3cbfdcde tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 7fface03-3fde-4610-ae05-ed86066e44da] No snapshots to remove. 
_delete_snapshots_in_vdi_chain /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:677 2015-08-07 17:39:33.277 DEBUG nova.virt.xenapi.vm_utils [req-6cd3a012-07e0-42a9-b256-0bac3cbfdcde tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 7fface03-3fde-4610-ae05-ed86066e44da] Parent has other children, coalesce is unlikely. _wait_for_vhd_coalesce /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2092 2015-08-07 17:39:33.285 DEBUG oslo_concurrency.lockutils [req-6cd3a012-07e0-42a9-b256-0bac3cbfdcde tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" acquired by "do_scan" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:39:33.286 DEBUG nova.virt.xenapi.vm_utils [req-6cd3a012-07e0-42a9-b256-0bac3cbfdcde tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Scanning SR OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99 do_scan /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1839 2015-08-07 17:39:33.802 DEBUG oslo_concurrency.lockutils [req-6cd3a012-07e0-42a9-b256-0bac3cbfdcde tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" released by "do_scan" :: held 0.517s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:39:33.818 DEBUG nova.virt.xenapi.vm_utils [req-6cd3a012-07e0-42a9-b256-0bac3cbfdcde tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] VHD 1ac01379-f7ff-4307-8c65-769672d54caf has parent 78bbc129-53d2-4dfe-964c-f182c80e7259 _get_vhd_parent_uuid /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2006 2015-08-07 17:39:33.828 DEBUG nova.virt.xenapi.vm_utils [req-6cd3a012-07e0-42a9-b256-0bac3cbfdcde tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] VHD 78bbc129-53d2-4dfe-964c-f182c80e7259 has parent 4027f457-a9bb-499a-8844-79fc67f11377 _get_vhd_parent_uuid /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2006 2015-08-07 17:39:33.840 DEBUG nova.virt.xenapi.vmops [req-6cd3a012-07e0-42a9-b256-0bac3cbfdcde tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 7fface03-3fde-4610-ae05-ed86066e44da] Updating progress to 20 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:39:34.096 DEBUG nova.virt.xenapi.vm_utils [req-6cd3a012-07e0-42a9-b256-0bac3cbfdcde tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 7fface03-3fde-4610-ae05-ed86066e44da] Migrating VHD '78bbc129-53d2-4dfe-964c-f182c80e7259' with seq_num 1 migrate_vhd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2559 2015-08-07 17:39:34.806 INFO nova.compute.manager [req-52fdcd3e-d0b9-4d91-b0fe-7a94a76db5c4 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Rebooting instance 2015-08-07 17:39:34.844 DEBUG oslo_concurrency.lockutils [req-52fdcd3e-d0b9-4d91-b0fe-7a94a76db5c4 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Acquired semaphore "refresh_cache-aa0c0819-08a6-4a79-93eb-f58885697f5b" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:197 2015-08-07 17:39:35.016 DEBUG nova.network.base_api [req-52fdcd3e-d0b9-4d91-b0fe-7a94a76db5c4 tempest-ServerActionsTestJSON-2080221373 
tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Updating instance_info_cache with network_info: [VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.2'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:d0:30:ae', 'active': False, 'type': u'bridge', 'id': u'f50b3fec-3100-4377-bded-e2cd8fa38482', 'qbg_params': None})] update_instance_cache_with_nw_info /opt/stack/new/nova/nova/network/base_api.py:43 2015-08-07 17:39:35.049 DEBUG oslo_concurrency.lockutils [req-52fdcd3e-d0b9-4d91-b0fe-7a94a76db5c4 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Releasing semaphore "refresh_cache-aa0c0819-08a6-4a79-93eb-f58885697f5b" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:210 2015-08-07 17:39:35.051 DEBUG nova.compute.manager [req-52fdcd3e-d0b9-4d91-b0fe-7a94a76db5c4 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Checking state _get_power_state /opt/stack/new/nova/nova/compute/manager.py:1296 2015-08-07 17:39:35.094 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.86 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:39:35.382 DEBUG nova.virt.xenapi.vm_utils [req-6cd3a012-07e0-42a9-b256-0bac3cbfdcde tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 7fface03-3fde-4610-ae05-ed86066e44da] Migrating VHD '4027f457-a9bb-499a-8844-79fc67f11377' with seq_num 2 migrate_vhd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2559 2015-08-07 17:39:36.801 DEBUG nova.virt.xenapi.vmops [req-6cd3a012-07e0-42a9-b256-0bac3cbfdcde tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 7fface03-3fde-4610-ae05-ed86066e44da] Migrated root base vhds transfer_immutable_vhds /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1036 2015-08-07 17:39:36.802 DEBUG nova.virt.xenapi.vmops [req-6cd3a012-07e0-42a9-b256-0bac3cbfdcde tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 7fface03-3fde-4610-ae05-ed86066e44da] Updating progress to 40 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:39:37.310 DEBUG nova.virt.xenapi.vmops [req-6cd3a012-07e0-42a9-b256-0bac3cbfdcde tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 7fface03-3fde-4610-ae05-ed86066e44da] Migrated all base vhds. 
_process_ephemeral_chain_recursive /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1051 2015-08-07 17:39:37.317 DEBUG nova.virt.xenapi.vm_utils [req-6cd3a012-07e0-42a9-b256-0bac3cbfdcde tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 7fface03-3fde-4610-ae05-ed86066e44da] Shutting down VM (cleanly) clean_shutdown_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:333 2015-08-07 17:39:37.520 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:39:37.521 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 1.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:39:38.521 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:39:38.522 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 1.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:39:39.520 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:39:39.525 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Starting heal instance info cache _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5315 2015-08-07 17:39:39.526 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Rebuilding the list of instances to heal _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5319 2015-08-07 17:39:39.602 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: 31a2fd34-bbcb-4b50-83e0-dc6c7369b479] Skipping network cache update for instance because it is Building. 
_heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5329 2015-08-07 17:39:39.603 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Acquired semaphore "refresh_cache-aa0c0819-08a6-4a79-93eb-f58885697f5b" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:197 2015-08-07 17:39:39.603 DEBUG nova.objects.instance [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lazy-loading `flavor' on Instance uuid aa0c0819-08a6-4a79-93eb-f58885697f5b obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:39:39.977 DEBUG nova.network.base_api [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Updating instance_info_cache with network_info: [VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.2'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:d0:30:ae', 'active': False, 'type': u'bridge', 'id': u'f50b3fec-3100-4377-bded-e2cd8fa38482', 'qbg_params': None})] update_instance_cache_with_nw_info /opt/stack/new/nova/nova/network/base_api.py:43 2015-08-07 17:39:40.023 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Releasing semaphore "refresh_cache-aa0c0819-08a6-4a79-93eb-f58885697f5b" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:210 2015-08-07 17:39:40.024 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Updated the network info_cache for instance _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5376 2015-08-07 17:39:40.024 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 1.99 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:39:40.302 DEBUG nova.virt.xenapi.vmops [req-bcc60a1b-546c-4936-b069-c848b7de8f79 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] [instance: 31a2fd34-bbcb-4b50-83e0-dc6c7369b479] Waiting for instance state to become running _wait_for_instance_to_start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:777 2015-08-07 17:39:40.336 DEBUG nova.virt.xenapi.vmops [req-bcc60a1b-546c-4936-b069-c848b7de8f79 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] [instance: 31a2fd34-bbcb-4b50-83e0-dc6c7369b479] Updating progress to 80 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:39:40.718 DEBUG nova.virt.xenapi.vmops [req-bcc60a1b-546c-4936-b069-c848b7de8f79 
tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] [instance: 31a2fd34-bbcb-4b50-83e0-dc6c7369b479] Skip agent setup, not enabled. _configure_new_instance_with_agent /opt/stack/new/nova/nova/virt/xenapi/vmops.py:788 2015-08-07 17:39:40.719 DEBUG nova.virt.xenapi.vmops [req-bcc60a1b-546c-4936-b069-c848b7de8f79 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] [instance: 31a2fd34-bbcb-4b50-83e0-dc6c7369b479] Removing hostname from xenstore _remove_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1949 2015-08-07 17:39:40.720 DEBUG oslo_concurrency.lockutils [req-bcc60a1b-546c-4936-b069-c848b7de8f79 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] Lock "xenstore-31a2fd34-bbcb-4b50-83e0-dc6c7369b479" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:39:40.728 DEBUG oslo_concurrency.lockutils [req-bcc60a1b-546c-4936-b069-c848b7de8f79 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] Lock "xenstore-31a2fd34-bbcb-4b50-83e0-dc6c7369b479" released by "update_hostname" :: held 0.009s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:39:40.729 DEBUG nova.virt.xenapi.vmops [req-bcc60a1b-546c-4936-b069-c848b7de8f79 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] [instance: 31a2fd34-bbcb-4b50-83e0-dc6c7369b479] Updating progress to 90 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:39:41.024 DEBUG nova.virt.xenapi.vmops [req-bcc60a1b-546c-4936-b069-c848b7de8f79 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] [instance: 31a2fd34-bbcb-4b50-83e0-dc6c7369b479] Updating progress to 100 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:39:41.277 DEBUG nova.compute.manager [req-bcc60a1b-546c-4936-b069-c848b7de8f79 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] [instance: 31a2fd34-bbcb-4b50-83e0-dc6c7369b479] Checking state _get_power_state /opt/stack/new/nova/nova/compute/manager.py:1296 2015-08-07 17:39:41.628 DEBUG oslo_concurrency.lockutils [req-bcc60a1b-546c-4936-b069-c848b7de8f79 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] Lock "31a2fd34-bbcb-4b50-83e0-dc6c7369b479" released by "_locked_do_build_and_run_instance" :: held 45.717s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:39:42.016 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:39:42.017 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 1.49 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:39:42.785 INFO nova.compute.manager [req-05d55435-ff4d-44a9-a344-1d4e5f4011cd tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] [instance: 31a2fd34-bbcb-4b50-83e0-dc6c7369b479] Rescuing 2015-08-07 17:39:42.786 DEBUG oslo_concurrency.lockutils [req-05d55435-ff4d-44a9-a344-1d4e5f4011cd tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] Acquired semaphore 
"refresh_cache-31a2fd34-bbcb-4b50-83e0-dc6c7369b479" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:197 2015-08-07 17:39:42.992 DEBUG nova.network.base_api [req-05d55435-ff4d-44a9-a344-1d4e5f4011cd tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] [instance: 31a2fd34-bbcb-4b50-83e0-dc6c7369b479] Updating instance_info_cache with network_info: [VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.5'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:24:e9:e2', 'active': False, 'type': u'bridge', 'id': u'dd289fd2-27ae-4c1b-8c34-99f36829faef', 'qbg_params': None})] update_instance_cache_with_nw_info /opt/stack/new/nova/nova/network/base_api.py:43 2015-08-07 17:39:43.020 DEBUG oslo_concurrency.lockutils [req-05d55435-ff4d-44a9-a344-1d4e5f4011cd tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] Releasing semaphore "refresh_cache-31a2fd34-bbcb-4b50-83e0-dc6c7369b479" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:210 2015-08-07 17:39:43.417 DEBUG nova.virt.xenapi.vm_utils [req-6cd3a012-07e0-42a9-b256-0bac3cbfdcde tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 7fface03-3fde-4610-ae05-ed86066e44da] Migrating VHD 'bcad92ce-8630-4378-8539-8e30b03b6265' with seq_num 0 migrate_vhd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2559 2015-08-07 17:39:43.512 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:39:43.600 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 0.01 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:39:43.612 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:39:43.615 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /opt/stack/new/nova/nova/compute/manager.py:5976 2015-08-07 17:39:43.618 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 3.90 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:39:44.126 DEBUG nova.virt.xenapi.vm_utils [req-05d55435-ff4d-44a9-a344-1d4e5f4011cd tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] [instance: 31a2fd34-bbcb-4b50-83e0-dc6c7369b479] Shutting down VM (hard) hard_shutdown_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:348 2015-08-07 17:39:44.611 DEBUG nova.virt.xenapi.vmops [req-6cd3a012-07e0-42a9-b256-0bac3cbfdcde tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 7fface03-3fde-4610-ae05-ed86066e44da] Updating progress to 60 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:39:44.855 DEBUG nova.virt.xenapi.vmops [req-6cd3a012-07e0-42a9-b256-0bac3cbfdcde tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 7fface03-3fde-4610-ae05-ed86066e44da] Updating progress to 80 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:39:45.011 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.94 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:39:45.871 DEBUG oslo_concurrency.lockutils [req-6cd3a012-07e0-42a9-b256-0bac3cbfdcde tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "7fface03-3fde-4610-ae05-ed86066e44da-events" acquired by "_clear_events" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:39:45.872 DEBUG oslo_concurrency.lockutils [req-6cd3a012-07e0-42a9-b256-0bac3cbfdcde tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "7fface03-3fde-4610-ae05-ed86066e44da-events" released by "_clear_events" :: held 0.001s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:39:47.053 DEBUG nova.compute.manager [req-52fdcd3e-d0b9-4d91-b0fe-7a94a76db5c4 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Checking state _get_power_state /opt/stack/new/nova/nova/compute/manager.py:1296 2015-08-07 17:39:47.520 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:39:47.521 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 1.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:39:47.742 WARNING nova.virt.xenapi.vm_utils [req-05d55435-ff4d-44a9-a344-1d4e5f4011cd tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] [instance: 31a2fd34-bbcb-4b50-83e0-dc6c7369b479] VM already halted, skipping shutdown... 
2015-08-07 17:39:47.772 DEBUG nova.virt.xenapi.vm_utils [req-05d55435-ff4d-44a9-a344-1d4e5f4011cd tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] Detected vhd format for image a69d8d55-1745-492f-be26-cc75d64fc94d determine_disk_image_type /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1633 2015-08-07 17:39:47.773 DEBUG nova.virt.xenapi.vmops [req-05d55435-ff4d-44a9-a344-1d4e5f4011cd tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] [instance: 31a2fd34-bbcb-4b50-83e0-dc6c7369b479] Updating progress to 9 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:39:48.031 DEBUG oslo_concurrency.lockutils [req-05d55435-ff4d-44a9-a344-1d4e5f4011cd tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" acquired by "_create_cached_image_impl" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:39:48.052 INFO nova.compute.manager [req-cec81d48-a976-426c-bd6b-9c1a52b3f27c tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Get console output 2015-08-07 17:39:48.081 DEBUG oslo_concurrency.lockutils [req-6cd3a012-07e0-42a9-b256-0bac3cbfdcde tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Acquired semaphore "refresh_cache-7fface03-3fde-4610-ae05-ed86066e44da" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:197 2015-08-07 17:39:48.263 DEBUG nova.network.base_api [req-6cd3a012-07e0-42a9-b256-0bac3cbfdcde tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 7fface03-3fde-4610-ae05-ed86066e44da] Updating instance_info_cache with network_info: [VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.4'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:06:21:3a', 'active': False, 'type': u'bridge', 'id': u'659f91d5-7ed5-46c5-bdef-465c4e42ffe1', 'qbg_params': None})] update_instance_cache_with_nw_info /opt/stack/new/nova/nova/network/base_api.py:43 2015-08-07 17:39:48.293 DEBUG oslo_concurrency.lockutils [req-6cd3a012-07e0-42a9-b256-0bac3cbfdcde tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Releasing semaphore "refresh_cache-7fface03-3fde-4610-ae05-ed86066e44da" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:210 2015-08-07 17:39:48.520 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task 
ComputeManager._poll_rescued_instances run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:39:48.526 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 1.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:39:48.646 DEBUG nova.virt.xenapi.vm_utils [req-6cd3a012-07e0-42a9-b256-0bac3cbfdcde tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Detected vhd format for image None determine_disk_image_type /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1633 2015-08-07 17:39:48.647 DEBUG nova.virt.xenapi.vmops [req-6cd3a012-07e0-42a9-b256-0bac3cbfdcde tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 7fface03-3fde-4610-ae05-ed86066e44da] Block device information present: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} create_disks_step /opt/stack/new/nova/nova/virt/xenapi/vmops.py:278 2015-08-07 17:39:49.311 DEBUG nova.virt.xenapi.vm_utils [req-05d55435-ff4d-44a9-a344-1d4e5f4011cd tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] Cloned VDI OpaqueRef:1748ce76-4caf-e097-c1c5-50891d5651d5 from VDI OpaqueRef:f392c64e-7886-a0cc-d3a0-780674167307 _clone_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:571 2015-08-07 17:39:49.527 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:39:49.559 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Auditing locally available compute resources for node localhost.localdomain 2015-08-07 17:39:49.560 DEBUG nova.virt.xenapi.host [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Updating host stats update_status /opt/stack/new/nova/nova/virt/xenapi/host.py:234 2015-08-07 17:39:49.916 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" acquired by "do_scan" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:39:49.929 DEBUG nova.virt.xenapi.vm_utils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Scanning SR OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99 do_scan /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1839 2015-08-07 17:39:50.317 DEBUG oslo_concurrency.lockutils [req-05d55435-ff4d-44a9-a344-1d4e5f4011cd tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" released by "_create_cached_image_impl" :: held 2.286s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:39:50.318 INFO nova.virt.xenapi.vm_utils [req-05d55435-ff4d-44a9-a344-1d4e5f4011cd tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] Image creation data, cacheable: True, downloaded: False duration: 2.29 secs for image a69d8d55-1745-492f-be26-cc75d64fc94d 2015-08-07 17:39:50.411 DEBUG oslo_concurrency.lockutils [req-4adb720f-a6ad-4c12-b44e-323d87e008dd tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "69bd2eeb-96e0-42ba-9643-c7c085279a18" acquired by 
"_locked_do_build_and_run_instance" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:39:50.468 INFO nova.compute.manager [req-4adb720f-a6ad-4c12-b44e-323d87e008dd tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 69bd2eeb-96e0-42ba-9643-c7c085279a18] Starting instance... 2015-08-07 17:39:50.639 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" released by "do_scan" :: held 0.723s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:39:50.641 DEBUG oslo_concurrency.lockutils [req-6cd3a012-07e0-42a9-b256-0bac3cbfdcde tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" acquired by "do_scan" :: waited 0.687s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:39:50.642 DEBUG nova.virt.xenapi.vm_utils [req-6cd3a012-07e0-42a9-b256-0bac3cbfdcde tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Scanning SR OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99 do_scan /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1839 2015-08-07 17:39:50.689 DEBUG oslo_concurrency.lockutils [req-4adb720f-a6ad-4c12-b44e-323d87e008dd tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "compute_resources" acquired by "instance_claim" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:39:50.690 DEBUG nova.compute.resource_tracker [req-4adb720f-a6ad-4c12-b44e-323d87e008dd tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Memory overhead for 64 MB instance; 5 MB instance_claim /opt/stack/new/nova/nova/compute/resource_tracker.py:126 2015-08-07 17:39:50.695 INFO nova.compute.claims [req-4adb720f-a6ad-4c12-b44e-323d87e008dd tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 69bd2eeb-96e0-42ba-9643-c7c085279a18] Attempting claim: memory 69 MB, disk 0 GB 2015-08-07 17:39:50.695 INFO nova.compute.claims [req-4adb720f-a6ad-4c12-b44e-323d87e008dd tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 69bd2eeb-96e0-42ba-9643-c7c085279a18] Total memory: 8187 MB, used: 853.00 MB 2015-08-07 17:39:50.696 INFO nova.compute.claims [req-4adb720f-a6ad-4c12-b44e-323d87e008dd tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 69bd2eeb-96e0-42ba-9643-c7c085279a18] memory limit: 12280.50 MB, free: 11427.50 MB 2015-08-07 17:39:50.696 INFO nova.compute.claims [req-4adb720f-a6ad-4c12-b44e-323d87e008dd tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 69bd2eeb-96e0-42ba-9643-c7c085279a18] Total disk: 27 GB, used: 0.00 GB 2015-08-07 17:39:50.697 INFO nova.compute.claims [req-4adb720f-a6ad-4c12-b44e-323d87e008dd tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 69bd2eeb-96e0-42ba-9643-c7c085279a18] disk limit not specified, defaulting to unlimited 2015-08-07 17:39:50.727 DEBUG nova.compute.resources.vcpu [req-4adb720f-a6ad-4c12-b44e-323d87e008dd tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Total CPUs: 8 VCPUs, used: 4.00 VCPUs test 
/opt/stack/new/nova/nova/compute/resources/vcpu.py:52 2015-08-07 17:39:50.728 DEBUG nova.compute.resources.vcpu [req-4adb720f-a6ad-4c12-b44e-323d87e008dd tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] CPUs limit not specified, defaulting to unlimited test /opt/stack/new/nova/nova/compute/resources/vcpu.py:56 2015-08-07 17:39:50.728 INFO nova.compute.claims [req-4adb720f-a6ad-4c12-b44e-323d87e008dd tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 69bd2eeb-96e0-42ba-9643-c7c085279a18] Claim successful 2015-08-07 17:39:50.962 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor: free VCPUs: 1 _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:524 2015-08-07 17:39:50.962 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor: assignable PCI devices: [] _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:532 2015-08-07 17:39:50.963 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor/Node resource view: name=localhost.localdomain free_ram=922MB free_disk=16GB free_vcpus=1 pci_devices=[] _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:546 2015-08-07 17:39:51.307 DEBUG oslo_concurrency.lockutils [req-4adb720f-a6ad-4c12-b44e-323d87e008dd tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "compute_resources" released by "instance_claim" :: held 0.618s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:39:51.315 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "compute_resources" acquired by "_update_available_resource" :: waited 0.352s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:39:51.371 DEBUG oslo_concurrency.lockutils [req-6cd3a012-07e0-42a9-b256-0bac3cbfdcde tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" released by "do_scan" :: held 0.730s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:39:51.474 DEBUG nova.virt.xenapi.vmops [req-05d55435-ff4d-44a9-a344-1d4e5f4011cd tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] [instance: 31a2fd34-bbcb-4b50-83e0-dc6c7369b479] Updating progress to 18 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:39:51.664 DEBUG nova.virt.xenapi.vmops [req-05d55435-ff4d-44a9-a344-1d4e5f4011cd tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] [instance: 31a2fd34-bbcb-4b50-83e0-dc6c7369b479] Updating progress to 27 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:39:51.879 DEBUG nova.virt.xenapi.vmops [req-05d55435-ff4d-44a9-a344-1d4e5f4011cd tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] [instance: 31a2fd34-bbcb-4b50-83e0-dc6c7369b479] Using PV kernel: True _create_vm_record /opt/stack/new/nova/nova/virt/xenapi/vmops.py:679 2015-08-07 17:39:51.892 DEBUG nova.virt.xenapi.vm_utils [req-05d55435-ff4d-44a9-a344-1d4e5f4011cd tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] [instance: 31a2fd34-bbcb-4b50-83e0-dc6c7369b479] Created VM create_vm 
/opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:312 2015-08-07 17:39:51.893 DEBUG nova.virt.xenapi.vmops [req-05d55435-ff4d-44a9-a344-1d4e5f4011cd tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] [instance: 31a2fd34-bbcb-4b50-83e0-dc6c7369b479] Updating progress to 36 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:39:51.915 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Updating from migration 7fface03-3fde-4610-ae05-ed86066e44da 2015-08-07 17:39:51.916 DEBUG nova.objects.instance [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lazy-loading `old_flavor' on Instance uuid 7fface03-3fde-4610-ae05-ed86066e44da obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:39:52.099 DEBUG nova.virt.xenapi.vmops [req-6cd3a012-07e0-42a9-b256-0bac3cbfdcde tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 7fface03-3fde-4610-ae05-ed86066e44da] Using PV kernel: True _create_vm_record /opt/stack/new/nova/nova/virt/xenapi/vmops.py:679 2015-08-07 17:39:52.110 DEBUG nova.virt.xenapi.vm_utils [req-6cd3a012-07e0-42a9-b256-0bac3cbfdcde tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 7fface03-3fde-4610-ae05-ed86066e44da] Created VM create_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:312 2015-08-07 17:39:52.111 DEBUG nova.virt.xenapi.vm_utils [req-6cd3a012-07e0-42a9-b256-0bac3cbfdcde tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Creating disk-type VBD for VM OpaqueRef:410815ca-40b9-f936-2336-fd14d6a65182, VDI OpaqueRef:94e588b1-f80a-3ad3-988c-b3418cbeae34 ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:39:52.140 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Total usable vcpus: 8, total allocated vcpus: 7 2015-08-07 17:39:52.141 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Final resource view: name=localhost.localdomain phys_ram=8187MB used_ram=922MB phys_disk=27GB used_disk=0GB total_vcpus=8 used_vcpus=7 pci_stats=None 2015-08-07 17:39:52.153 DEBUG nova.virt.xenapi.vm_utils [req-6cd3a012-07e0-42a9-b256-0bac3cbfdcde tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Created VBD OpaqueRef:0d06d38e-b90a-2b8c-46e8-e6c50cf8694d for VM OpaqueRef:410815ca-40b9-f936-2336-fd14d6a65182, VDI OpaqueRef:94e588b1-f80a-3ad3-988c-b3418cbeae34. 
create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:39:52.246 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Compute_service record updated for devstack:localhost.localdomain 2015-08-07 17:39:52.246 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "compute_resources" released by "_update_available_resource" :: held 0.931s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:39:52.247 DEBUG oslo_concurrency.lockutils [req-4adb720f-a6ad-4c12-b44e-323d87e008dd tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "compute_resources" acquired by "update_usage" :: waited 0.625s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:39:52.250 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 5.99 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:39:52.370 DEBUG oslo_concurrency.lockutils [req-4adb720f-a6ad-4c12-b44e-323d87e008dd tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "compute_resources" released by "update_usage" :: held 0.122s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:39:52.371 DEBUG nova.compute.utils [req-4adb720f-a6ad-4c12-b44e-323d87e008dd tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Using /dev/xvd instead of None get_next_device_name /opt/stack/new/nova/nova/compute/utils.py:163 2015-08-07 17:39:52.376 13318 DEBUG nova.compute.manager [-] [instance: 69bd2eeb-96e0-42ba-9643-c7c085279a18] Allocating IP information in the background. _allocate_network_async /opt/stack/new/nova/nova/compute/manager.py:1513 2015-08-07 17:39:52.377 13318 DEBUG oslo_concurrency.lockutils [-] Acquired semaphore "refresh_cache-69bd2eeb-96e0-42ba-9643-c7c085279a18" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:197 2015-08-07 17:39:52.528 DEBUG nova.virt.xenapi.vmops [req-05d55435-ff4d-44a9-a344-1d4e5f4011cd tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] [instance: 31a2fd34-bbcb-4b50-83e0-dc6c7369b479] auto_disk_config value not found inrescue image_properties. Setting value to False _attach_disks /opt/stack/new/nova/nova/virt/xenapi/vmops.py:714 2015-08-07 17:39:52.529 DEBUG nova.virt.xenapi.vm_utils [req-05d55435-ff4d-44a9-a344-1d4e5f4011cd tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] Creating disk-type VBD for VM OpaqueRef:e2a65f3c-bd58-2237-1dfc-89b1914dcf64, VDI OpaqueRef:1748ce76-4caf-e097-c1c5-50891d5651d5 ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:39:52.540 DEBUG nova.virt.xenapi.vm_utils [req-05d55435-ff4d-44a9-a344-1d4e5f4011cd tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] Created VBD OpaqueRef:62b518d8-4d99-0131-265a-6dcf1ccc5d17 for VM OpaqueRef:e2a65f3c-bd58-2237-1dfc-89b1914dcf64, VDI OpaqueRef:1748ce76-4caf-e097-c1c5-50891d5651d5. 
create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:39:52.579 DEBUG nova.virt.xenapi.vm_utils [req-6cd3a012-07e0-42a9-b256-0bac3cbfdcde tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Created VDI OpaqueRef:0a86fb6d-80b0-4b1c-0bf3-55c739594476 (config-2, 67108864, False) on OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99. create_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:509 2015-08-07 17:39:52.583 DEBUG nova.virt.xenapi.vm_utils [req-6cd3a012-07e0-42a9-b256-0bac3cbfdcde tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Creating disk-type VBD for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:0a86fb6d-80b0-4b1c-0bf3-55c739594476 ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:39:52.596 DEBUG nova.virt.xenapi.vm_utils [req-6cd3a012-07e0-42a9-b256-0bac3cbfdcde tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Created VBD OpaqueRef:a1e4212f-a956-9c87-5f6b-6a40539eb603 for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:0a86fb6d-80b0-4b1c-0bf3-55c739594476. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:39:52.596 DEBUG nova.virt.xenapi.vm_utils [req-6cd3a012-07e0-42a9-b256-0bac3cbfdcde tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Plugging VBD OpaqueRef:a1e4212f-a956-9c87-5f6b-6a40539eb603 ... vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2191 2015-08-07 17:39:52.597 DEBUG oslo_concurrency.lockutils [req-6cd3a012-07e0-42a9-b256-0bac3cbfdcde tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_plug" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:39:53.108 DEBUG nova.virt.xenapi.vmops [req-4adb720f-a6ad-4c12-b44e-323d87e008dd tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 69bd2eeb-96e0-42ba-9643-c7c085279a18] Block device information present: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} spawn /opt/stack/new/nova/nova/virt/xenapi/vmops.py:408 2015-08-07 17:39:53.129 DEBUG nova.virt.xenapi.vm_utils [req-4adb720f-a6ad-4c12-b44e-323d87e008dd tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Detected vhd format for image a69d8d55-1745-492f-be26-cc75d64fc94d determine_disk_image_type /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1633 2015-08-07 17:39:53.131 DEBUG nova.virt.xenapi.vmops [req-4adb720f-a6ad-4c12-b44e-323d87e008dd tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 69bd2eeb-96e0-42ba-9643-c7c085279a18] Updating progress to 10 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:39:53.374 DEBUG nova.virt.xenapi.vmops [req-4adb720f-a6ad-4c12-b44e-323d87e008dd tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 69bd2eeb-96e0-42ba-9643-c7c085279a18] block device info: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} _connect_cinder_volumes /opt/stack/new/nova/nova/virt/xenapi/vmops.py:376 2015-08-07 17:39:53.385 DEBUG oslo_concurrency.lockutils [req-4adb720f-a6ad-4c12-b44e-323d87e008dd tempest-ServerActionsTestJSON-2080221373 
tempest-ServerActionsTestJSON-2017217826] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" acquired by "_create_cached_image_impl" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:39:54.375 DEBUG oslo_concurrency.lockutils [req-6cd3a012-07e0-42a9-b256-0bac3cbfdcde tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_plug" :: held 1.778s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:39:54.376 DEBUG nova.virt.xenapi.vm_utils [req-6cd3a012-07e0-42a9-b256-0bac3cbfdcde tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Plugging VBD OpaqueRef:a1e4212f-a956-9c87-5f6b-6a40539eb603 done. vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2194 2015-08-07 17:39:54.380 DEBUG nova.virt.xenapi.vm_utils [req-6cd3a012-07e0-42a9-b256-0bac3cbfdcde tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] VBD OpaqueRef:a1e4212f-a956-9c87-5f6b-6a40539eb603 plugged as xvdc vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2197 2015-08-07 17:39:54.501 WARNING nova.virt.configdrive [req-6cd3a012-07e0-42a9-b256-0bac3cbfdcde tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] The setting "always" will be deprecated in the Liberty version. Please use "True" instead 2015-08-07 17:39:54.502 DEBUG nova.objects.instance [req-6cd3a012-07e0-42a9-b256-0bac3cbfdcde tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lazy-loading `ec2_ids' on Instance uuid 7fface03-3fde-4610-ae05-ed86066e44da obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:39:54.569 DEBUG oslo_concurrency.processutils [req-6cd3a012-07e0-42a9-b256-0bac3cbfdcde tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Running cmd (subprocess): genisoimage -o /tmp/tmpoUfOHv/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmp7PwdWa execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:39:54.713 DEBUG oslo_concurrency.processutils [req-6cd3a012-07e0-42a9-b256-0bac3cbfdcde tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] CMD "genisoimage -o /tmp/tmpoUfOHv/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmp7PwdWa" returned: 0 in 0.144s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:39:54.719 DEBUG oslo_concurrency.processutils [req-6cd3a012-07e0-42a9-b256-0bac3cbfdcde tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmpoUfOHv/configdrive of=/dev/xvdc oflag=direct,sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:39:55.058 DEBUG nova.virt.xenapi.vm_utils [req-4adb720f-a6ad-4c12-b44e-323d87e008dd tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Cloned VDI OpaqueRef:9908e1c4-6006-27c6-b826-319af65bd4f4 from VDI OpaqueRef:f392c64e-7886-a0cc-d3a0-780674167307 _clone_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:571 2015-08-07 17:39:55.112 13318 DEBUG 
oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.84 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:39:55.140 13318 DEBUG nova.network.base_api [-] [instance: 69bd2eeb-96e0-42ba-9643-c7c085279a18] Updating instance_info_cache with network_info: [VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.6'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:5a:40:04', 'active': False, 'type': u'bridge', 'id': u'05bc5db0-7a1d-4a7a-afed-5f08489f238b', 'qbg_params': None})] update_instance_cache_with_nw_info /opt/stack/new/nova/nova/network/base_api.py:43 2015-08-07 17:39:55.174 13318 DEBUG oslo_concurrency.lockutils [-] Releasing semaphore "refresh_cache-69bd2eeb-96e0-42ba-9643-c7c085279a18" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:210 2015-08-07 17:39:55.175 13318 DEBUG nova.compute.manager [-] [instance: 69bd2eeb-96e0-42ba-9643-c7c085279a18] Instance network_info: |[VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.6'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:5a:40:04', 'active': False, 'type': u'bridge', 'id': u'05bc5db0-7a1d-4a7a-afed-5f08489f238b', 'qbg_params': None})]| _allocate_network_async /opt/stack/new/nova/nova/compute/manager.py:1531 2015-08-07 17:39:56.231 DEBUG oslo_concurrency.lockutils [req-4adb720f-a6ad-4c12-b44e-323d87e008dd tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" released by "_create_cached_image_impl" :: held 2.845s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:39:56.232 INFO nova.virt.xenapi.vm_utils [req-4adb720f-a6ad-4c12-b44e-323d87e008dd 
tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Image creation data, cacheable: True, downloaded: False duration: 2.86 secs for image a69d8d55-1745-492f-be26-cc75d64fc94d 2015-08-07 17:39:57.115 DEBUG nova.virt.xenapi.vmops [req-4adb720f-a6ad-4c12-b44e-323d87e008dd tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 69bd2eeb-96e0-42ba-9643-c7c085279a18] Updating progress to 20 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:39:57.621 DEBUG nova.virt.xenapi.vmops [req-4adb720f-a6ad-4c12-b44e-323d87e008dd tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 69bd2eeb-96e0-42ba-9643-c7c085279a18] Updating progress to 30 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:39:58.153 DEBUG nova.virt.xenapi.vmops [req-4adb720f-a6ad-4c12-b44e-323d87e008dd tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 69bd2eeb-96e0-42ba-9643-c7c085279a18] Using PV kernel: True _create_vm_record /opt/stack/new/nova/nova/virt/xenapi/vmops.py:679 2015-08-07 17:39:58.166 DEBUG nova.virt.xenapi.vm_utils [req-4adb720f-a6ad-4c12-b44e-323d87e008dd tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 69bd2eeb-96e0-42ba-9643-c7c085279a18] Created VM create_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:312 2015-08-07 17:39:58.167 DEBUG nova.virt.xenapi.vmops [req-4adb720f-a6ad-4c12-b44e-323d87e008dd tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 69bd2eeb-96e0-42ba-9643-c7c085279a18] Updating progress to 40 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:39:58.243 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:39:58.244 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 40.27 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:39:58.510 DEBUG nova.virt.xenapi.vm_utils [req-4adb720f-a6ad-4c12-b44e-323d87e008dd tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Creating disk-type VBD for VM OpaqueRef:92b5e4a0-351f-5e90-a444-bc14c4e139cb, VDI OpaqueRef:9908e1c4-6006-27c6-b826-319af65bd4f4 ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:39:58.530 DEBUG nova.virt.xenapi.vm_utils [req-4adb720f-a6ad-4c12-b44e-323d87e008dd tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Created VBD OpaqueRef:6750001d-9b6d-9f4d-f2aa-2e6f13f65f9e for VM OpaqueRef:92b5e4a0-351f-5e90-a444-bc14c4e139cb, VDI OpaqueRef:9908e1c4-6006-27c6-b826-319af65bd4f4. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:39:59.180 DEBUG nova.virt.xenapi.vm_utils [req-4adb720f-a6ad-4c12-b44e-323d87e008dd tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Created VDI OpaqueRef:6fc97646-fc3c-be86-ba78-291aae025ddd (config-2, 67108864, False) on OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99. 
create_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:509 2015-08-07 17:39:59.185 DEBUG nova.virt.xenapi.vm_utils [req-4adb720f-a6ad-4c12-b44e-323d87e008dd tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Creating disk-type VBD for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:6fc97646-fc3c-be86-ba78-291aae025ddd ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:39:59.196 DEBUG nova.virt.xenapi.vm_utils [req-4adb720f-a6ad-4c12-b44e-323d87e008dd tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Created VBD OpaqueRef:6744b40c-b36c-f2ca-586a-f78b6d606041 for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:6fc97646-fc3c-be86-ba78-291aae025ddd. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:39:59.197 DEBUG nova.virt.xenapi.vm_utils [req-4adb720f-a6ad-4c12-b44e-323d87e008dd tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Plugging VBD OpaqueRef:6744b40c-b36c-f2ca-586a-f78b6d606041 ... vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2191 2015-08-07 17:39:59.197 DEBUG oslo_concurrency.lockutils [req-4adb720f-a6ad-4c12-b44e-323d87e008dd tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_plug" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:39:59.962 DEBUG oslo_concurrency.processutils [req-6cd3a012-07e0-42a9-b256-0bac3cbfdcde tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmpoUfOHv/configdrive of=/dev/xvdc oflag=direct,sync" returned: 0 in 5.243s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:39:59.963 DEBUG oslo_concurrency.processutils [req-6cd3a012-07e0-42a9-b256-0bac3cbfdcde tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:40:00.109 DEBUG nova.virt.xenapi.vm_utils [req-05d55435-ff4d-44a9-a344-1d4e5f4011cd tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] Created VDI OpaqueRef:53f22481-3742-2ed7-a618-19e19ca98159 (config-2, 67108864, False) on OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99. create_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:509 2015-08-07 17:40:00.116 DEBUG nova.virt.xenapi.vm_utils [req-05d55435-ff4d-44a9-a344-1d4e5f4011cd tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] Creating disk-type VBD for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:53f22481-3742-2ed7-a618-19e19ca98159 ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:40:00.129 DEBUG nova.virt.xenapi.vm_utils [req-05d55435-ff4d-44a9-a344-1d4e5f4011cd tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] Created VBD OpaqueRef:3b014c31-acf3-21dc-3a11-0601bd7ca63a for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:53f22481-3742-2ed7-a618-19e19ca98159. 
create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:40:00.130 DEBUG nova.virt.xenapi.vm_utils [req-05d55435-ff4d-44a9-a344-1d4e5f4011cd tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] Plugging VBD OpaqueRef:3b014c31-acf3-21dc-3a11-0601bd7ca63a ... vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2191 2015-08-07 17:40:00.431 DEBUG oslo_concurrency.processutils [req-6cd3a012-07e0-42a9-b256-0bac3cbfdcde tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf sync" returned: 0 in 0.467s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:40:00.432 DEBUG nova.virt.xenapi.vm_utils [req-6cd3a012-07e0-42a9-b256-0bac3cbfdcde tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Destroying VBD for VDI OpaqueRef:0a86fb6d-80b0-4b1c-0bf3-55c739594476 ... vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2207 2015-08-07 17:40:01.747 DEBUG oslo_concurrency.lockutils [req-4adb720f-a6ad-4c12-b44e-323d87e008dd tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_plug" :: held 2.549s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:40:01.747 DEBUG nova.virt.xenapi.vm_utils [req-4adb720f-a6ad-4c12-b44e-323d87e008dd tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Plugging VBD OpaqueRef:6744b40c-b36c-f2ca-586a-f78b6d606041 done. vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2194 2015-08-07 17:40:01.749 DEBUG oslo_concurrency.lockutils [req-05d55435-ff4d-44a9-a344-1d4e5f4011cd tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_plug" :: waited 1.618s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:40:01.751 DEBUG nova.virt.xenapi.vm_utils [req-4adb720f-a6ad-4c12-b44e-323d87e008dd tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] VBD OpaqueRef:6744b40c-b36c-f2ca-586a-f78b6d606041 plugged as xvdd vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2197 2015-08-07 17:40:01.883 WARNING nova.virt.configdrive [req-4adb720f-a6ad-4c12-b44e-323d87e008dd tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] The setting "always" will be deprecated in the Liberty version. 
Please use "True" instead 2015-08-07 17:40:01.884 DEBUG nova.objects.instance [req-4adb720f-a6ad-4c12-b44e-323d87e008dd tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lazy-loading `ec2_ids' on Instance uuid 69bd2eeb-96e0-42ba-9643-c7c085279a18 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:40:01.922 DEBUG oslo_concurrency.processutils [req-4adb720f-a6ad-4c12-b44e-323d87e008dd tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Running cmd (subprocess): genisoimage -o /tmp/tmphhoKG1/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmp30tDmw execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:40:02.089 DEBUG oslo_concurrency.processutils [req-4adb720f-a6ad-4c12-b44e-323d87e008dd tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] CMD "genisoimage -o /tmp/tmphhoKG1/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmp30tDmw" returned: 0 in 0.167s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:40:02.096 DEBUG oslo_concurrency.processutils [req-4adb720f-a6ad-4c12-b44e-323d87e008dd tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmphhoKG1/configdrive of=/dev/xvdd oflag=direct,sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:40:03.672 DEBUG oslo_concurrency.lockutils [req-05d55435-ff4d-44a9-a344-1d4e5f4011cd tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_plug" :: held 1.923s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:40:03.676 DEBUG nova.virt.xenapi.vm_utils [req-05d55435-ff4d-44a9-a344-1d4e5f4011cd tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] Plugging VBD OpaqueRef:3b014c31-acf3-21dc-3a11-0601bd7ca63a done. vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2194 2015-08-07 17:40:03.679 DEBUG oslo_concurrency.lockutils [req-6cd3a012-07e0-42a9-b256-0bac3cbfdcde tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_unplug" :: waited 3.246s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:40:03.681 DEBUG nova.virt.xenapi.vm_utils [req-05d55435-ff4d-44a9-a344-1d4e5f4011cd tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] VBD OpaqueRef:3b014c31-acf3-21dc-3a11-0601bd7ca63a plugged as xvde vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2197 2015-08-07 17:40:03.801 WARNING nova.virt.configdrive [req-05d55435-ff4d-44a9-a344-1d4e5f4011cd tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] The setting "always" will be deprecated in the Liberty version. 
Please use "True" instead 2015-08-07 17:40:03.803 DEBUG nova.objects.instance [req-05d55435-ff4d-44a9-a344-1d4e5f4011cd tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] Lazy-loading `ec2_ids' on Instance uuid 31a2fd34-bbcb-4b50-83e0-dc6c7369b479 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:40:03.855 DEBUG oslo_concurrency.processutils [req-05d55435-ff4d-44a9-a344-1d4e5f4011cd tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] Running cmd (subprocess): genisoimage -o /tmp/tmp1Xtr5o/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmpB8YzwH execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:40:03.994 DEBUG oslo_concurrency.processutils [req-05d55435-ff4d-44a9-a344-1d4e5f4011cd tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] CMD "genisoimage -o /tmp/tmp1Xtr5o/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmpB8YzwH" returned: 0 in 0.139s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:40:04.001 DEBUG oslo_concurrency.processutils [req-05d55435-ff4d-44a9-a344-1d4e5f4011cd tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmp1Xtr5o/configdrive of=/dev/xvde oflag=direct,sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:40:05.045 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.91 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:40:05.627 DEBUG oslo_concurrency.lockutils [req-6cd3a012-07e0-42a9-b256-0bac3cbfdcde tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_unplug" :: held 1.948s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:40:05.639 DEBUG nova.virt.xenapi.vm_utils [req-6cd3a012-07e0-42a9-b256-0bac3cbfdcde tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Destroying VBD for VDI OpaqueRef:0a86fb6d-80b0-4b1c-0bf3-55c739594476 done. vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2215 2015-08-07 17:40:05.640 DEBUG nova.virt.xenapi.vm_utils [req-6cd3a012-07e0-42a9-b256-0bac3cbfdcde tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Creating disk-type VBD for VM OpaqueRef:410815ca-40b9-f936-2336-fd14d6a65182, VDI OpaqueRef:0a86fb6d-80b0-4b1c-0bf3-55c739594476 ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:40:05.651 DEBUG nova.virt.xenapi.vm_utils [req-6cd3a012-07e0-42a9-b256-0bac3cbfdcde tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Created VBD OpaqueRef:1fb25325-6513-4f85-5e10-5184215dd9c4 for VM OpaqueRef:410815ca-40b9-f936-2336-fd14d6a65182, VDI OpaqueRef:0a86fb6d-80b0-4b1c-0bf3-55c739594476. 
create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:40:05.652 DEBUG nova.objects.instance [req-6cd3a012-07e0-42a9-b256-0bac3cbfdcde tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lazy-loading `pci_devices' on Instance uuid 7fface03-3fde-4610-ae05-ed86066e44da obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:40:05.800 DEBUG oslo_concurrency.lockutils [req-6cd3a012-07e0-42a9-b256-0bac3cbfdcde tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "xenstore-7fface03-3fde-4610-ae05-ed86066e44da" acquired by "store_meta" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:40:05.801 DEBUG oslo_concurrency.lockutils [req-6cd3a012-07e0-42a9-b256-0bac3cbfdcde tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "xenstore-7fface03-3fde-4610-ae05-ed86066e44da" released by "store_meta" :: held 0.001s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:40:05.801 DEBUG oslo_concurrency.lockutils [req-6cd3a012-07e0-42a9-b256-0bac3cbfdcde tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "xenstore-7fface03-3fde-4610-ae05-ed86066e44da" acquired by "store_auto_disk_config" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:40:05.811 DEBUG oslo_concurrency.lockutils [req-6cd3a012-07e0-42a9-b256-0bac3cbfdcde tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "xenstore-7fface03-3fde-4610-ae05-ed86066e44da" released by "store_auto_disk_config" :: held 0.009s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:40:05.811 DEBUG nova.virt.xenapi.vmops [req-6cd3a012-07e0-42a9-b256-0bac3cbfdcde tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 7fface03-3fde-4610-ae05-ed86066e44da] Injecting network info to xenstore inject_network_info /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1869 2015-08-07 17:40:05.812 DEBUG oslo_concurrency.lockutils [req-6cd3a012-07e0-42a9-b256-0bac3cbfdcde tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "xenstore-7fface03-3fde-4610-ae05-ed86066e44da" acquired by "update_nwinfo" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:40:06.061 DEBUG oslo_concurrency.lockutils [req-6cd3a012-07e0-42a9-b256-0bac3cbfdcde tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "xenstore-7fface03-3fde-4610-ae05-ed86066e44da" released by "update_nwinfo" :: held 0.249s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:40:06.062 DEBUG nova.virt.xenapi.vmops [req-6cd3a012-07e0-42a9-b256-0bac3cbfdcde tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 7fface03-3fde-4610-ae05-ed86066e44da] Creating vifs _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1892 2015-08-07 17:40:06.071 DEBUG nova.virt.xenapi.vmops [req-6cd3a012-07e0-42a9-b256-0bac3cbfdcde tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 7fface03-3fde-4610-ae05-ed86066e44da] Creating VIF for network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1902 2015-08-07 
17:40:06.081 DEBUG nova.virt.xenapi.vmops [req-6cd3a012-07e0-42a9-b256-0bac3cbfdcde tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 7fface03-3fde-4610-ae05-ed86066e44da] Created VIF OpaqueRef:c9128f6d-7908-0e4e-1934-834bc2e2c0bf, network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1906 2015-08-07 17:40:06.081 DEBUG nova.virt.xenapi.vmops [req-6cd3a012-07e0-42a9-b256-0bac3cbfdcde tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 7fface03-3fde-4610-ae05-ed86066e44da] Starting instance _start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:327 2015-08-07 17:40:12.289 DEBUG oslo_concurrency.processutils [req-4adb720f-a6ad-4c12-b44e-323d87e008dd tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmphhoKG1/configdrive of=/dev/xvdd oflag=direct,sync" returned: 0 in 10.194s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:40:12.291 DEBUG oslo_concurrency.processutils [req-4adb720f-a6ad-4c12-b44e-323d87e008dd tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:40:12.609 DEBUG oslo_concurrency.processutils [req-4adb720f-a6ad-4c12-b44e-323d87e008dd tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf sync" returned: 0 in 0.318s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:40:12.612 DEBUG nova.virt.xenapi.vm_utils [req-4adb720f-a6ad-4c12-b44e-323d87e008dd tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Destroying VBD for VDI OpaqueRef:6fc97646-fc3c-be86-ba78-291aae025ddd ... 
vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2207 2015-08-07 17:40:12.614 DEBUG oslo_concurrency.lockutils [req-4adb720f-a6ad-4c12-b44e-323d87e008dd tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_unplug" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:40:12.740 DEBUG oslo_concurrency.lockutils [req-4adb720f-a6ad-4c12-b44e-323d87e008dd tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_unplug" :: held 0.126s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:40:12.741 INFO nova.virt.xenapi.vm_utils [req-4adb720f-a6ad-4c12-b44e-323d87e008dd tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] VBD OpaqueRef:6744b40c-b36c-f2ca-586a-f78b6d606041 uplug failed with "DEVICE_DETACH_REJECTED", attempt 1/11 2015-08-07 17:40:13.743 DEBUG oslo_concurrency.lockutils [req-4adb720f-a6ad-4c12-b44e-323d87e008dd tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_unplug" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:40:13.967 DEBUG oslo_concurrency.processutils [req-05d55435-ff4d-44a9-a344-1d4e5f4011cd tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmp1Xtr5o/configdrive of=/dev/xvde oflag=direct,sync" returned: 0 in 9.967s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:40:13.970 DEBUG oslo_concurrency.processutils [req-05d55435-ff4d-44a9-a344-1d4e5f4011cd tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:40:14.387 DEBUG oslo_concurrency.processutils [req-05d55435-ff4d-44a9-a344-1d4e5f4011cd tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf sync" returned: 0 in 0.417s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:40:14.390 DEBUG nova.virt.xenapi.vm_utils [req-05d55435-ff4d-44a9-a344-1d4e5f4011cd tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] Destroying VBD for VDI OpaqueRef:53f22481-3742-2ed7-a618-19e19ca98159 ... 
vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2207 2015-08-07 17:40:14.747 DEBUG oslo_concurrency.lockutils [req-4adb720f-a6ad-4c12-b44e-323d87e008dd tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_unplug" :: held 1.004s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:40:14.749 DEBUG oslo_concurrency.lockutils [req-05d55435-ff4d-44a9-a344-1d4e5f4011cd tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_unplug" :: waited 0.358s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:40:14.775 DEBUG nova.virt.xenapi.vm_utils [req-4adb720f-a6ad-4c12-b44e-323d87e008dd tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Destroying VBD for VDI OpaqueRef:6fc97646-fc3c-be86-ba78-291aae025ddd done. vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2215 2015-08-07 17:40:14.787 DEBUG nova.virt.xenapi.vm_utils [req-4adb720f-a6ad-4c12-b44e-323d87e008dd tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Creating disk-type VBD for VM OpaqueRef:92b5e4a0-351f-5e90-a444-bc14c4e139cb, VDI OpaqueRef:6fc97646-fc3c-be86-ba78-291aae025ddd ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:40:14.800 DEBUG nova.virt.xenapi.vm_utils [req-4adb720f-a6ad-4c12-b44e-323d87e008dd tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Created VBD OpaqueRef:c7d953db-d373-e50d-cf07-438b4a719bc6 for VM OpaqueRef:92b5e4a0-351f-5e90-a444-bc14c4e139cb, VDI OpaqueRef:6fc97646-fc3c-be86-ba78-291aae025ddd. 
create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:40:14.801 DEBUG nova.objects.instance [req-4adb720f-a6ad-4c12-b44e-323d87e008dd tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lazy-loading `pci_devices' on Instance uuid 69bd2eeb-96e0-42ba-9643-c7c085279a18 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:40:14.933 DEBUG nova.virt.xenapi.vmops [req-4adb720f-a6ad-4c12-b44e-323d87e008dd tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 69bd2eeb-96e0-42ba-9643-c7c085279a18] Updating progress to 50 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:40:15.015 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.94 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:40:15.180 DEBUG oslo_concurrency.lockutils [req-4adb720f-a6ad-4c12-b44e-323d87e008dd tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "xenstore-69bd2eeb-96e0-42ba-9643-c7c085279a18" acquired by "store_meta" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:40:15.181 DEBUG oslo_concurrency.lockutils [req-4adb720f-a6ad-4c12-b44e-323d87e008dd tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "xenstore-69bd2eeb-96e0-42ba-9643-c7c085279a18" released by "store_meta" :: held 0.001s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:40:15.182 DEBUG oslo_concurrency.lockutils [req-4adb720f-a6ad-4c12-b44e-323d87e008dd tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "xenstore-69bd2eeb-96e0-42ba-9643-c7c085279a18" acquired by "store_auto_disk_config" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:40:15.190 DEBUG oslo_concurrency.lockutils [req-4adb720f-a6ad-4c12-b44e-323d87e008dd tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "xenstore-69bd2eeb-96e0-42ba-9643-c7c085279a18" released by "store_auto_disk_config" :: held 0.009s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:40:15.191 DEBUG nova.virt.xenapi.vmops [req-4adb720f-a6ad-4c12-b44e-323d87e008dd tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 69bd2eeb-96e0-42ba-9643-c7c085279a18] Injecting hostname (tempest.common.compute-instance-1108268372) into xenstore _inject_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1940 2015-08-07 17:40:15.192 DEBUG oslo_concurrency.lockutils [req-4adb720f-a6ad-4c12-b44e-323d87e008dd tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "xenstore-69bd2eeb-96e0-42ba-9643-c7c085279a18" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:40:15.201 DEBUG oslo_concurrency.lockutils [req-4adb720f-a6ad-4c12-b44e-323d87e008dd tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "xenstore-69bd2eeb-96e0-42ba-9643-c7c085279a18" released by "update_hostname" :: held 0.010s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:40:15.202 DEBUG nova.virt.xenapi.vmops [req-4adb720f-a6ad-4c12-b44e-323d87e008dd 
tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 69bd2eeb-96e0-42ba-9643-c7c085279a18] Injecting network info to xenstore inject_network_info /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1869 2015-08-07 17:40:15.203 DEBUG oslo_concurrency.lockutils [req-4adb720f-a6ad-4c12-b44e-323d87e008dd tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "xenstore-69bd2eeb-96e0-42ba-9643-c7c085279a18" acquired by "update_nwinfo" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:40:15.410 DEBUG oslo_concurrency.lockutils [req-4adb720f-a6ad-4c12-b44e-323d87e008dd tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "xenstore-69bd2eeb-96e0-42ba-9643-c7c085279a18" released by "update_nwinfo" :: held 0.207s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:40:15.411 DEBUG nova.virt.xenapi.vmops [req-4adb720f-a6ad-4c12-b44e-323d87e008dd tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 69bd2eeb-96e0-42ba-9643-c7c085279a18] Updating progress to 60 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:40:15.632 DEBUG nova.virt.xenapi.vmops [req-4adb720f-a6ad-4c12-b44e-323d87e008dd tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 69bd2eeb-96e0-42ba-9643-c7c085279a18] Creating vifs _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1892 2015-08-07 17:40:15.641 DEBUG nova.virt.xenapi.vmops [req-4adb720f-a6ad-4c12-b44e-323d87e008dd tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 69bd2eeb-96e0-42ba-9643-c7c085279a18] Creating VIF for network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1902 2015-08-07 17:40:15.653 DEBUG nova.virt.xenapi.vmops [req-4adb720f-a6ad-4c12-b44e-323d87e008dd tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 69bd2eeb-96e0-42ba-9643-c7c085279a18] Created VIF OpaqueRef:2423fcf1-9d59-4319-e9a3-2abf4a647f15, network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1906 2015-08-07 17:40:15.654 DEBUG nova.virt.xenapi.vmops [req-4adb720f-a6ad-4c12-b44e-323d87e008dd tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 69bd2eeb-96e0-42ba-9643-c7c085279a18] Updating progress to 70 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:40:15.664 DEBUG oslo_concurrency.lockutils [req-05d55435-ff4d-44a9-a344-1d4e5f4011cd tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_unplug" :: held 0.915s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:40:15.672 DEBUG nova.virt.xenapi.vm_utils [req-05d55435-ff4d-44a9-a344-1d4e5f4011cd tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] Destroying VBD for VDI OpaqueRef:53f22481-3742-2ed7-a618-19e19ca98159 done. 
vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2215 2015-08-07 17:40:15.673 DEBUG nova.virt.xenapi.vm_utils [req-05d55435-ff4d-44a9-a344-1d4e5f4011cd tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] Creating disk-type VBD for VM OpaqueRef:e2a65f3c-bd58-2237-1dfc-89b1914dcf64, VDI OpaqueRef:53f22481-3742-2ed7-a618-19e19ca98159 ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:40:15.685 DEBUG nova.virt.xenapi.vm_utils [req-05d55435-ff4d-44a9-a344-1d4e5f4011cd tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] Created VBD OpaqueRef:2cf133b3-3ee3-37c4-6d1f-5a83eea16d1a for VM OpaqueRef:e2a65f3c-bd58-2237-1dfc-89b1914dcf64, VDI OpaqueRef:53f22481-3742-2ed7-a618-19e19ca98159. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:40:15.686 DEBUG nova.objects.instance [req-05d55435-ff4d-44a9-a344-1d4e5f4011cd tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] Lazy-loading `pci_devices' on Instance uuid 31a2fd34-bbcb-4b50-83e0-dc6c7369b479 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:40:15.809 DEBUG nova.virt.xenapi.vmops [req-05d55435-ff4d-44a9-a344-1d4e5f4011cd tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] [instance: 31a2fd34-bbcb-4b50-83e0-dc6c7369b479] Updating progress to 45 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:40:15.914 DEBUG nova.virt.xenapi.vmops [req-4adb720f-a6ad-4c12-b44e-323d87e008dd tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 69bd2eeb-96e0-42ba-9643-c7c085279a18] Starting instance _start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:327 2015-08-07 17:40:15.999 DEBUG oslo_concurrency.lockutils [req-05d55435-ff4d-44a9-a344-1d4e5f4011cd tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] Lock "xenstore-31a2fd34-bbcb-4b50-83e0-dc6c7369b479" acquired by "store_meta" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:40:16.000 DEBUG oslo_concurrency.lockutils [req-05d55435-ff4d-44a9-a344-1d4e5f4011cd tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] Lock "xenstore-31a2fd34-bbcb-4b50-83e0-dc6c7369b479" released by "store_meta" :: held 0.001s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:40:16.001 DEBUG oslo_concurrency.lockutils [req-05d55435-ff4d-44a9-a344-1d4e5f4011cd tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] Lock "xenstore-31a2fd34-bbcb-4b50-83e0-dc6c7369b479" acquired by "store_auto_disk_config" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:40:16.010 DEBUG oslo_concurrency.lockutils [req-05d55435-ff4d-44a9-a344-1d4e5f4011cd tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] Lock "xenstore-31a2fd34-bbcb-4b50-83e0-dc6c7369b479" released by "store_auto_disk_config" :: held 0.009s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:40:16.011 DEBUG nova.virt.xenapi.vmops [req-05d55435-ff4d-44a9-a344-1d4e5f4011cd tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] [instance: 31a2fd34-bbcb-4b50-83e0-dc6c7369b479] Injecting hostname (RESCUE-tempest.common.compute-instance-1670852847) into xenstore _inject_hostname 
/opt/stack/new/nova/nova/virt/xenapi/vmops.py:1940 2015-08-07 17:40:16.011 DEBUG oslo_concurrency.lockutils [req-05d55435-ff4d-44a9-a344-1d4e5f4011cd tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] Lock "xenstore-31a2fd34-bbcb-4b50-83e0-dc6c7369b479" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:40:16.020 DEBUG oslo_concurrency.lockutils [req-05d55435-ff4d-44a9-a344-1d4e5f4011cd tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] Lock "xenstore-31a2fd34-bbcb-4b50-83e0-dc6c7369b479" released by "update_hostname" :: held 0.008s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:40:16.020 DEBUG nova.virt.xenapi.vmops [req-05d55435-ff4d-44a9-a344-1d4e5f4011cd tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] [instance: 31a2fd34-bbcb-4b50-83e0-dc6c7369b479] Injecting network info to xenstore inject_network_info /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1869 2015-08-07 17:40:16.021 DEBUG oslo_concurrency.lockutils [req-05d55435-ff4d-44a9-a344-1d4e5f4011cd tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] Lock "xenstore-31a2fd34-bbcb-4b50-83e0-dc6c7369b479" acquired by "update_nwinfo" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:40:16.247 DEBUG oslo_concurrency.lockutils [req-05d55435-ff4d-44a9-a344-1d4e5f4011cd tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] Lock "xenstore-31a2fd34-bbcb-4b50-83e0-dc6c7369b479" released by "update_nwinfo" :: held 0.226s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:40:16.248 DEBUG nova.virt.xenapi.vmops [req-05d55435-ff4d-44a9-a344-1d4e5f4011cd tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] [instance: 31a2fd34-bbcb-4b50-83e0-dc6c7369b479] Updating progress to 55 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:40:16.510 DEBUG nova.virt.xenapi.vmops [req-05d55435-ff4d-44a9-a344-1d4e5f4011cd tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] [instance: 31a2fd34-bbcb-4b50-83e0-dc6c7369b479] Creating vifs _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1892 2015-08-07 17:40:16.518 DEBUG nova.virt.xenapi.vmops [req-05d55435-ff4d-44a9-a344-1d4e5f4011cd tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] [instance: 31a2fd34-bbcb-4b50-83e0-dc6c7369b479] Creating VIF for network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1902 2015-08-07 17:40:16.526 DEBUG nova.virt.xenapi.vmops [req-05d55435-ff4d-44a9-a344-1d4e5f4011cd tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] [instance: 31a2fd34-bbcb-4b50-83e0-dc6c7369b479] Created VIF OpaqueRef:693f68aa-c490-d8e8-9b17-4e4bcaba8e4a, network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1906 2015-08-07 17:40:16.527 DEBUG nova.virt.xenapi.vmops [req-05d55435-ff4d-44a9-a344-1d4e5f4011cd tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] [instance: 31a2fd34-bbcb-4b50-83e0-dc6c7369b479] Updating progress to 64 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:40:17.010 DEBUG 
nova.virt.xenapi.vm_utils [req-05d55435-ff4d-44a9-a344-1d4e5f4011cd tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] Creating disk-type VBD for VM OpaqueRef:e2a65f3c-bd58-2237-1dfc-89b1914dcf64, VDI OpaqueRef:f6b24351-b041-bd93-89de-b6c3e6dd37ea ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:40:17.023 DEBUG nova.virt.xenapi.vm_utils [req-05d55435-ff4d-44a9-a344-1d4e5f4011cd tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] Created VBD OpaqueRef:23636ad4-1bcb-6dd8-f1d2-2dbce38cbd92 for VM OpaqueRef:e2a65f3c-bd58-2237-1dfc-89b1914dcf64, VDI OpaqueRef:f6b24351-b041-bd93-89de-b6c3e6dd37ea. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:40:17.024 DEBUG nova.virt.xenapi.vmops [req-05d55435-ff4d-44a9-a344-1d4e5f4011cd tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] [instance: 31a2fd34-bbcb-4b50-83e0-dc6c7369b479] Updating progress to 73 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:40:17.280 DEBUG nova.virt.xenapi.vmops [req-05d55435-ff4d-44a9-a344-1d4e5f4011cd tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] [instance: 31a2fd34-bbcb-4b50-83e0-dc6c7369b479] Starting instance _start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:327 2015-08-07 17:40:17.643 DEBUG nova.virt.xenapi.vmops [req-6cd3a012-07e0-42a9-b256-0bac3cbfdcde tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 7fface03-3fde-4610-ae05-ed86066e44da] Waiting for instance state to become running _wait_for_instance_to_start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:777 2015-08-07 17:40:17.663 DEBUG nova.virt.xenapi.vmops [req-6cd3a012-07e0-42a9-b256-0bac3cbfdcde tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 7fface03-3fde-4610-ae05-ed86066e44da] Updating progress to 100 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:40:19.693 DEBUG oslo_concurrency.lockutils [req-dd51020d-20a9-448b-9a1d-f9fcd0701e65 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "7fface03-3fde-4610-ae05-ed86066e44da" acquired by "do_confirm_resize" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:40:19.693 DEBUG nova.compute.manager [req-dd51020d-20a9-448b-9a1d-f9fcd0701e65 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 7fface03-3fde-4610-ae05-ed86066e44da] Going to confirm migration 4 do_confirm_resize /opt/stack/new/nova/nova/compute/manager.py:3243 2015-08-07 17:40:20.928 DEBUG oslo_concurrency.lockutils [req-dd51020d-20a9-448b-9a1d-f9fcd0701e65 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Acquired semaphore "refresh_cache-7fface03-3fde-4610-ae05-ed86066e44da" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:197 2015-08-07 17:40:21.090 DEBUG nova.network.base_api [req-dd51020d-20a9-448b-9a1d-f9fcd0701e65 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 7fface03-3fde-4610-ae05-ed86066e44da] Updating instance_info_cache with network_info: [VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': 
u'10.1.0.4'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:06:21:3a', 'active': False, 'type': u'bridge', 'id': u'659f91d5-7ed5-46c5-bdef-465c4e42ffe1', 'qbg_params': None})] update_instance_cache_with_nw_info /opt/stack/new/nova/nova/network/base_api.py:43 2015-08-07 17:40:21.135 DEBUG oslo_concurrency.lockutils [req-dd51020d-20a9-448b-9a1d-f9fcd0701e65 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Releasing semaphore "refresh_cache-7fface03-3fde-4610-ae05-ed86066e44da" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:210 2015-08-07 17:40:21.142 WARNING nova.virt.xenapi.vm_utils [req-dd51020d-20a9-448b-9a1d-f9fcd0701e65 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 7fface03-3fde-4610-ae05-ed86066e44da] VM already halted, skipping shutdown... 2015-08-07 17:40:21.153 DEBUG nova.virt.xenapi.vmops [req-dd51020d-20a9-448b-9a1d-f9fcd0701e65 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 7fface03-3fde-4610-ae05-ed86066e44da] Destroying VDIs _destroy_vdis /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1428 2015-08-07 17:40:21.162 DEBUG nova.virt.xenapi.vm_utils [req-dd51020d-20a9-448b-9a1d-f9fcd0701e65 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] VDI bcad92ce-8630-4378-8539-8e30b03b6265 is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:40:21.170 DEBUG nova.virt.xenapi.vm_utils [req-dd51020d-20a9-448b-9a1d-f9fcd0701e65 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] VDI 9aef3b5e-464e-4503-abb4-0942cde5fa5e is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:40:22.080 DEBUG nova.virt.xenapi.vmops [req-dd51020d-20a9-448b-9a1d-f9fcd0701e65 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 7fface03-3fde-4610-ae05-ed86066e44da] Using RAW or VHD, skipping kernel and ramdisk deletion _destroy_kernel_ramdisk /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1456 2015-08-07 17:40:22.094 DEBUG nova.virt.xenapi.vm_utils [req-dd51020d-20a9-448b-9a1d-f9fcd0701e65 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 7fface03-3fde-4610-ae05-ed86066e44da] VM destroyed destroy_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:324 2015-08-07 17:40:22.182 DEBUG oslo_concurrency.lockutils [req-dd51020d-20a9-448b-9a1d-f9fcd0701e65 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "compute_resources" acquired by "drop_move_claim" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:40:22.253 DEBUG oslo_concurrency.lockutils 
[req-dd51020d-20a9-448b-9a1d-f9fcd0701e65 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "compute_resources" released by "drop_move_claim" :: held 0.071s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:40:22.463 DEBUG oslo_concurrency.lockutils [req-dd51020d-20a9-448b-9a1d-f9fcd0701e65 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "7fface03-3fde-4610-ae05-ed86066e44da" released by "do_confirm_resize" :: held 2.771s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:40:22.824 DEBUG oslo_concurrency.lockutils [req-dd51020d-20a9-448b-9a1d-f9fcd0701e65 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "7fface03-3fde-4610-ae05-ed86066e44da" acquired by "do_terminate_instance" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:40:22.825 DEBUG oslo_concurrency.lockutils [req-dd51020d-20a9-448b-9a1d-f9fcd0701e65 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "7fface03-3fde-4610-ae05-ed86066e44da-events" acquired by "_clear_events" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:40:22.825 DEBUG oslo_concurrency.lockutils [req-dd51020d-20a9-448b-9a1d-f9fcd0701e65 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "7fface03-3fde-4610-ae05-ed86066e44da-events" released by "_clear_events" :: held 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:40:22.827 INFO nova.compute.manager [req-dd51020d-20a9-448b-9a1d-f9fcd0701e65 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 7fface03-3fde-4610-ae05-ed86066e44da] Terminating instance 2015-08-07 17:40:22.829 INFO nova.virt.xenapi.vmops [req-dd51020d-20a9-448b-9a1d-f9fcd0701e65 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 7fface03-3fde-4610-ae05-ed86066e44da] Destroying VM 2015-08-07 17:40:22.839 DEBUG nova.virt.xenapi.vm_utils [req-dd51020d-20a9-448b-9a1d-f9fcd0701e65 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 7fface03-3fde-4610-ae05-ed86066e44da] Shutting down VM (hard) hard_shutdown_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:348 2015-08-07 17:40:23.711 DEBUG nova.virt.xenapi.vmops [req-4adb720f-a6ad-4c12-b44e-323d87e008dd tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 69bd2eeb-96e0-42ba-9643-c7c085279a18] Waiting for instance state to become running _wait_for_instance_to_start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:777 2015-08-07 17:40:23.745 DEBUG nova.virt.xenapi.vmops [req-4adb720f-a6ad-4c12-b44e-323d87e008dd tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 69bd2eeb-96e0-42ba-9643-c7c085279a18] Updating progress to 80 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:40:24.254 DEBUG nova.virt.xenapi.vmops [req-4adb720f-a6ad-4c12-b44e-323d87e008dd tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 69bd2eeb-96e0-42ba-9643-c7c085279a18] Skip agent setup, not enabled. 
_configure_new_instance_with_agent /opt/stack/new/nova/nova/virt/xenapi/vmops.py:788 2015-08-07 17:40:24.255 DEBUG nova.virt.xenapi.vmops [req-4adb720f-a6ad-4c12-b44e-323d87e008dd tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 69bd2eeb-96e0-42ba-9643-c7c085279a18] Removing hostname from xenstore _remove_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1949 2015-08-07 17:40:24.255 DEBUG oslo_concurrency.lockutils [req-4adb720f-a6ad-4c12-b44e-323d87e008dd tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "xenstore-69bd2eeb-96e0-42ba-9643-c7c085279a18" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:40:24.261 DEBUG oslo_concurrency.lockutils [req-4adb720f-a6ad-4c12-b44e-323d87e008dd tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "xenstore-69bd2eeb-96e0-42ba-9643-c7c085279a18" released by "update_hostname" :: held 0.005s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:40:24.261 DEBUG nova.virt.xenapi.vmops [req-4adb720f-a6ad-4c12-b44e-323d87e008dd tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 69bd2eeb-96e0-42ba-9643-c7c085279a18] Updating progress to 90 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:40:24.508 DEBUG nova.virt.xenapi.vmops [req-4adb720f-a6ad-4c12-b44e-323d87e008dd tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 69bd2eeb-96e0-42ba-9643-c7c085279a18] Updating progress to 100 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:40:24.743 DEBUG nova.compute.manager [req-4adb720f-a6ad-4c12-b44e-323d87e008dd tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 69bd2eeb-96e0-42ba-9643-c7c085279a18] Checking state _get_power_state /opt/stack/new/nova/nova/compute/manager.py:1296 2015-08-07 17:40:25.069 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.89 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:40:25.200 DEBUG oslo_concurrency.lockutils [req-4adb720f-a6ad-4c12-b44e-323d87e008dd tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "69bd2eeb-96e0-42ba-9643-c7c085279a18" released by "_locked_do_build_and_run_instance" :: held 34.789s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:40:26.107 DEBUG oslo_concurrency.lockutils [req-4eea403e-e9d0-4a71-b4aa-589f295f5eb8 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "69bd2eeb-96e0-42ba-9643-c7c085279a18" acquired by "do_stop_instance" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:40:26.108 DEBUG nova.compute.manager [req-4eea403e-e9d0-4a71-b4aa-589f295f5eb8 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 69bd2eeb-96e0-42ba-9643-c7c085279a18] Checking state _get_power_state /opt/stack/new/nova/nova/compute/manager.py:1296 2015-08-07 17:40:26.124 DEBUG nova.compute.manager [req-4eea403e-e9d0-4a71-b4aa-589f295f5eb8 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 69bd2eeb-96e0-42ba-9643-c7c085279a18] 
Stopping instance; current vm_state: active, current task_state: powering-off, current DB power_state: 1, current VM power_state: 1 do_stop_instance /opt/stack/new/nova/nova/compute/manager.py:2404 2015-08-07 17:40:26.132 DEBUG nova.virt.xenapi.vm_utils [req-4eea403e-e9d0-4a71-b4aa-589f295f5eb8 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 69bd2eeb-96e0-42ba-9643-c7c085279a18] Shutting down VM (hard) hard_shutdown_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:348 2015-08-07 17:40:26.495 DEBUG nova.virt.xenapi.vmops [req-dd51020d-20a9-448b-9a1d-f9fcd0701e65 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 7fface03-3fde-4610-ae05-ed86066e44da] Destroying VDIs _destroy_vdis /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1428 2015-08-07 17:40:26.504 DEBUG nova.virt.xenapi.vm_utils [req-dd51020d-20a9-448b-9a1d-f9fcd0701e65 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] VDI d5f5c013-eb57-43e2-a280-6c70ef2e92aa is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:40:26.516 DEBUG nova.virt.xenapi.vm_utils [req-dd51020d-20a9-448b-9a1d-f9fcd0701e65 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] VDI 749a271b-f638-481d-8d4c-83baffb1b397 is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:40:27.530 DEBUG nova.virt.xenapi.vmops [req-dd51020d-20a9-448b-9a1d-f9fcd0701e65 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 7fface03-3fde-4610-ae05-ed86066e44da] Using RAW or VHD, skipping kernel and ramdisk deletion _destroy_kernel_ramdisk /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1456 2015-08-07 17:40:27.552 DEBUG nova.virt.xenapi.vm_utils [req-dd51020d-20a9-448b-9a1d-f9fcd0701e65 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 7fface03-3fde-4610-ae05-ed86066e44da] VM destroyed destroy_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:324 2015-08-07 17:40:27.552 DEBUG nova.compute.manager [req-dd51020d-20a9-448b-9a1d-f9fcd0701e65 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 7fface03-3fde-4610-ae05-ed86066e44da] Deallocating network for instance _deallocate_network /opt/stack/new/nova/nova/compute/manager.py:1792 2015-08-07 17:40:27.778 DEBUG nova.virt.xenapi.vmops [req-05d55435-ff4d-44a9-a344-1d4e5f4011cd tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] [instance: 31a2fd34-bbcb-4b50-83e0-dc6c7369b479] Waiting for instance state to become running _wait_for_instance_to_start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:777 2015-08-07 17:40:27.795 DEBUG nova.virt.xenapi.vmops [req-05d55435-ff4d-44a9-a344-1d4e5f4011cd tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] [instance: 31a2fd34-bbcb-4b50-83e0-dc6c7369b479] Updating progress to 82 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:40:28.024 DEBUG nova.virt.xenapi.vmops [req-05d55435-ff4d-44a9-a344-1d4e5f4011cd tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] [instance: 31a2fd34-bbcb-4b50-83e0-dc6c7369b479] Skip agent setup, not enabled. 
_configure_new_instance_with_agent /opt/stack/new/nova/nova/virt/xenapi/vmops.py:788 2015-08-07 17:40:28.025 DEBUG nova.virt.xenapi.vmops [req-05d55435-ff4d-44a9-a344-1d4e5f4011cd tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] [instance: 31a2fd34-bbcb-4b50-83e0-dc6c7369b479] Removing hostname from xenstore _remove_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1949 2015-08-07 17:40:28.025 DEBUG oslo_concurrency.lockutils [req-05d55435-ff4d-44a9-a344-1d4e5f4011cd tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] Lock "xenstore-31a2fd34-bbcb-4b50-83e0-dc6c7369b479" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:40:28.033 DEBUG oslo_concurrency.lockutils [req-05d55435-ff4d-44a9-a344-1d4e5f4011cd tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] Lock "xenstore-31a2fd34-bbcb-4b50-83e0-dc6c7369b479" released by "update_hostname" :: held 0.007s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:40:28.035 DEBUG nova.virt.xenapi.vmops [req-05d55435-ff4d-44a9-a344-1d4e5f4011cd tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] [instance: 31a2fd34-bbcb-4b50-83e0-dc6c7369b479] Updating progress to 91 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:40:28.320 DEBUG nova.virt.xenapi.vmops [req-05d55435-ff4d-44a9-a344-1d4e5f4011cd tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] [instance: 31a2fd34-bbcb-4b50-83e0-dc6c7369b479] Updating progress to 100 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:40:28.577 DEBUG nova.compute.manager [req-05d55435-ff4d-44a9-a344-1d4e5f4011cd tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] [instance: 31a2fd34-bbcb-4b50-83e0-dc6c7369b479] Checking state _get_power_state /opt/stack/new/nova/nova/compute/manager.py:1296 2015-08-07 17:40:29.006 DEBUG nova.compute.manager [req-4eea403e-e9d0-4a71-b4aa-589f295f5eb8 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 69bd2eeb-96e0-42ba-9643-c7c085279a18] Checking state _get_power_state /opt/stack/new/nova/nova/compute/manager.py:1296 2015-08-07 17:40:29.248 DEBUG oslo_concurrency.lockutils [req-4eea403e-e9d0-4a71-b4aa-589f295f5eb8 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "69bd2eeb-96e0-42ba-9643-c7c085279a18" released by "do_stop_instance" :: held 3.141s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:40:29.313 DEBUG nova.compute.manager [req-dd51020d-20a9-448b-9a1d-f9fcd0701e65 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] [instance: 7fface03-3fde-4610-ae05-ed86066e44da] terminating bdm BlockDeviceMapping(boot_index=0,connection_info=None,created_at=2015-08-07T17:38:50Z,delete_on_termination=True,deleted=False,deleted_at=None,destination_type='local',device_name='/dev/xvda',device_type='disk',disk_bus=None,guest_format=None,id=51,image_id='a69d8d55-1745-492f-be26-cc75d64fc94d',instance=,instance_uuid=7fface03-3fde-4610-ae05-ed86066e44da,no_device=False,snapshot_id=None,source_type='image',updated_at=2015-08-07T17:38:52Z,volume_id=None,volume_size=None) _cleanup_volumes /opt/stack/new/nova/nova/compute/manager.py:2272 2015-08-07 17:40:29.538 DEBUG 
oslo_concurrency.lockutils [req-dd51020d-20a9-448b-9a1d-f9fcd0701e65 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "compute_resources" acquired by "update_usage" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:40:29.539 DEBUG nova.objects.instance [req-dd51020d-20a9-448b-9a1d-f9fcd0701e65 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lazy-loading `numa_topology' on Instance uuid 7fface03-3fde-4610-ae05-ed86066e44da obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:40:29.615 DEBUG oslo_concurrency.lockutils [req-dd51020d-20a9-448b-9a1d-f9fcd0701e65 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "compute_resources" released by "update_usage" :: held 0.078s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:40:29.713 INFO nova.compute.manager [req-3a210da2-ca88-4bb1-895b-d83de9978f1f tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] [instance: 31a2fd34-bbcb-4b50-83e0-dc6c7369b479] Unrescuing 2015-08-07 17:40:29.714 DEBUG oslo_concurrency.lockutils [req-3a210da2-ca88-4bb1-895b-d83de9978f1f tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] Acquired semaphore "refresh_cache-31a2fd34-bbcb-4b50-83e0-dc6c7369b479" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:197 2015-08-07 17:40:29.893 INFO nova.compute.manager [req-c7a2ac41-7597-404c-bf8d-aaf67e9e98f4 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Get console output 2015-08-07 17:40:29.911 DEBUG nova.network.base_api [req-3a210da2-ca88-4bb1-895b-d83de9978f1f tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] [instance: 31a2fd34-bbcb-4b50-83e0-dc6c7369b479] Updating instance_info_cache with network_info: [VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.5'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:24:e9:e2', 'active': False, 'type': u'bridge', 'id': u'dd289fd2-27ae-4c1b-8c34-99f36829faef', 'qbg_params': None})] update_instance_cache_with_nw_info /opt/stack/new/nova/nova/network/base_api.py:43 2015-08-07 17:40:29.951 DEBUG oslo_concurrency.lockutils [req-3a210da2-ca88-4bb1-895b-d83de9978f1f tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] Releasing semaphore "refresh_cache-31a2fd34-bbcb-4b50-83e0-dc6c7369b479" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:210 
2015-08-07 17:40:30.040 DEBUG oslo_concurrency.lockutils [req-dd51020d-20a9-448b-9a1d-f9fcd0701e65 tempest-DeleteServersTestJSON-1440721679 tempest-DeleteServersTestJSON-285208709] Lock "7fface03-3fde-4610-ae05-ed86066e44da" released by "do_terminate_instance" :: held 7.216s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:40:32.168 DEBUG oslo_concurrency.lockutils [req-8f207af7-579d-48eb-ba25-aa3ab0f99dcc tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "af2ef72d-4895-4de0-bd40-aaa2ac498091" acquired by "_locked_do_build_and_run_instance" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:40:32.252 INFO nova.compute.manager [req-8f207af7-579d-48eb-ba25-aa3ab0f99dcc tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: af2ef72d-4895-4de0-bd40-aaa2ac498091] Starting instance... 2015-08-07 17:40:32.461 DEBUG oslo_concurrency.lockutils [req-8f207af7-579d-48eb-ba25-aa3ab0f99dcc tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "compute_resources" acquired by "instance_claim" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:40:32.462 DEBUG nova.compute.resource_tracker [req-8f207af7-579d-48eb-ba25-aa3ab0f99dcc tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Memory overhead for 64 MB instance; 5 MB instance_claim /opt/stack/new/nova/nova/compute/resource_tracker.py:126 2015-08-07 17:40:32.472 INFO nova.compute.claims [req-8f207af7-579d-48eb-ba25-aa3ab0f99dcc tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: af2ef72d-4895-4de0-bd40-aaa2ac498091] Attempting claim: memory 69 MB, disk 0 GB 2015-08-07 17:40:32.473 INFO nova.compute.claims [req-8f207af7-579d-48eb-ba25-aa3ab0f99dcc tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: af2ef72d-4895-4de0-bd40-aaa2ac498091] Total memory: 8187 MB, used: 719.00 MB 2015-08-07 17:40:32.473 INFO nova.compute.claims [req-8f207af7-579d-48eb-ba25-aa3ab0f99dcc tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: af2ef72d-4895-4de0-bd40-aaa2ac498091] memory limit: 12280.50 MB, free: 11561.50 MB 2015-08-07 17:40:32.475 INFO nova.compute.claims [req-8f207af7-579d-48eb-ba25-aa3ab0f99dcc tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: af2ef72d-4895-4de0-bd40-aaa2ac498091] Total disk: 27 GB, used: 0.00 GB 2015-08-07 17:40:32.475 INFO nova.compute.claims [req-8f207af7-579d-48eb-ba25-aa3ab0f99dcc tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: af2ef72d-4895-4de0-bd40-aaa2ac498091] disk limit not specified, defaulting to unlimited 2015-08-07 17:40:32.505 DEBUG nova.compute.resources.vcpu [req-8f207af7-579d-48eb-ba25-aa3ab0f99dcc tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Total CPUs: 8 VCPUs, used: 3.00 VCPUs test /opt/stack/new/nova/nova/compute/resources/vcpu.py:52 2015-08-07 17:40:32.506 DEBUG nova.compute.resources.vcpu [req-8f207af7-579d-48eb-ba25-aa3ab0f99dcc tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] CPUs limit not specified, defaulting to unlimited test /opt/stack/new/nova/nova/compute/resources/vcpu.py:56 2015-08-07 17:40:32.506 INFO nova.compute.claims 
[req-8f207af7-579d-48eb-ba25-aa3ab0f99dcc tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: af2ef72d-4895-4de0-bd40-aaa2ac498091] Claim successful 2015-08-07 17:40:33.027 DEBUG oslo_concurrency.lockutils [req-8f207af7-579d-48eb-ba25-aa3ab0f99dcc tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "compute_resources" released by "instance_claim" :: held 0.566s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:40:33.349 DEBUG oslo_concurrency.lockutils [req-8f207af7-579d-48eb-ba25-aa3ab0f99dcc tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "compute_resources" acquired by "update_usage" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:40:33.446 DEBUG oslo_concurrency.lockutils [req-8f207af7-579d-48eb-ba25-aa3ab0f99dcc tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "compute_resources" released by "update_usage" :: held 0.097s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:40:33.447 DEBUG nova.compute.utils [req-8f207af7-579d-48eb-ba25-aa3ab0f99dcc tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Using /dev/xvd instead of None get_next_device_name /opt/stack/new/nova/nova/compute/utils.py:163 2015-08-07 17:40:33.452 13318 DEBUG nova.compute.manager [-] [instance: af2ef72d-4895-4de0-bd40-aaa2ac498091] Allocating IP information in the background. _allocate_network_async /opt/stack/new/nova/nova/compute/manager.py:1513 2015-08-07 17:40:33.453 13318 DEBUG oslo_concurrency.lockutils [-] Acquired semaphore "refresh_cache-af2ef72d-4895-4de0-bd40-aaa2ac498091" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:197 2015-08-07 17:40:33.548 DEBUG nova.virt.xenapi.vm_utils [req-3a210da2-ca88-4bb1-895b-d83de9978f1f tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] VDI 384e4127-f69c-4a65-8ade-646a5772520c is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:40:33.561 DEBUG nova.virt.xenapi.vm_utils [req-3a210da2-ca88-4bb1-895b-d83de9978f1f tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] VDI a6f7a7fd-f57b-4615-9c10-77b629ef0de9 is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:40:33.575 DEBUG nova.virt.xenapi.vm_utils [req-3a210da2-ca88-4bb1-895b-d83de9978f1f tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] VDI f2a8528d-921a-476d-9e2a-7625f31d91f7 is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:40:34.023 DEBUG nova.virt.xenapi.vmops [req-8f207af7-579d-48eb-ba25-aa3ab0f99dcc tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: af2ef72d-4895-4de0-bd40-aaa2ac498091] Block device information present: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} spawn /opt/stack/new/nova/nova/virt/xenapi/vmops.py:408 2015-08-07 17:40:34.038 DEBUG nova.virt.xenapi.vm_utils [req-8f207af7-579d-48eb-ba25-aa3ab0f99dcc tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Detected vhd format for image a69d8d55-1745-492f-be26-cc75d64fc94d determine_disk_image_type /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1633 
2015-08-07 17:40:34.039 DEBUG nova.virt.xenapi.vmops [req-8f207af7-579d-48eb-ba25-aa3ab0f99dcc tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: af2ef72d-4895-4de0-bd40-aaa2ac498091] Updating progress to 10 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:40:34.479 DEBUG nova.virt.xenapi.vmops [req-3a210da2-ca88-4bb1-895b-d83de9978f1f tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] [instance: 31a2fd34-bbcb-4b50-83e0-dc6c7369b479] Starting instance _start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:327 2015-08-07 17:40:34.532 DEBUG nova.virt.xenapi.vmops [req-8f207af7-579d-48eb-ba25-aa3ab0f99dcc tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: af2ef72d-4895-4de0-bd40-aaa2ac498091] block device info: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} _connect_cinder_volumes /opt/stack/new/nova/nova/virt/xenapi/vmops.py:376 2015-08-07 17:40:34.542 DEBUG oslo_concurrency.lockutils [req-8f207af7-579d-48eb-ba25-aa3ab0f99dcc tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" acquired by "_create_cached_image_impl" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:40:35.043 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.92 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:40:35.733 DEBUG nova.virt.xenapi.vm_utils [req-8f207af7-579d-48eb-ba25-aa3ab0f99dcc tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Cloned VDI OpaqueRef:0929a371-f19e-2faa-33ce-47ffd91bcc27 from VDI OpaqueRef:f392c64e-7886-a0cc-d3a0-780674167307 _clone_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:571 2015-08-07 17:40:36.094 13318 DEBUG nova.network.base_api [-] [instance: af2ef72d-4895-4de0-bd40-aaa2ac498091] Updating instance_info_cache with network_info: [VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.4'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:14:49:68', 'active': False, 'type': u'bridge', 'id': u'e63eea3a-8bbb-438c-b0a8-394f201769a5', 'qbg_params': None})] update_instance_cache_with_nw_info /opt/stack/new/nova/nova/network/base_api.py:43 2015-08-07 17:40:36.129 13318 DEBUG oslo_concurrency.lockutils [-] Releasing semaphore "refresh_cache-af2ef72d-4895-4de0-bd40-aaa2ac498091" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:210 
2015-08-07 17:40:36.130 13318 DEBUG nova.compute.manager [-] [instance: af2ef72d-4895-4de0-bd40-aaa2ac498091] Instance network_info: |[VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.4'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:14:49:68', 'active': False, 'type': u'bridge', 'id': u'e63eea3a-8bbb-438c-b0a8-394f201769a5', 'qbg_params': None})]| _allocate_network_async /opt/stack/new/nova/nova/compute/manager.py:1531 2015-08-07 17:40:36.608 DEBUG oslo_concurrency.lockutils [req-8f207af7-579d-48eb-ba25-aa3ab0f99dcc tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" released by "_create_cached_image_impl" :: held 2.065s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:40:36.609 INFO nova.virt.xenapi.vm_utils [req-8f207af7-579d-48eb-ba25-aa3ab0f99dcc tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Image creation data, cacheable: True, downloaded: False duration: 2.08 secs for image a69d8d55-1745-492f-be26-cc75d64fc94d 2015-08-07 17:40:37.426 DEBUG nova.virt.xenapi.vmops [req-8f207af7-579d-48eb-ba25-aa3ab0f99dcc tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: af2ef72d-4895-4de0-bd40-aaa2ac498091] Updating progress to 20 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:40:37.618 DEBUG nova.virt.xenapi.vmops [req-8f207af7-579d-48eb-ba25-aa3ab0f99dcc tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: af2ef72d-4895-4de0-bd40-aaa2ac498091] Updating progress to 30 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:40:37.855 DEBUG nova.virt.xenapi.vmops [req-8f207af7-579d-48eb-ba25-aa3ab0f99dcc tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: af2ef72d-4895-4de0-bd40-aaa2ac498091] Using PV kernel: True _create_vm_record /opt/stack/new/nova/nova/virt/xenapi/vmops.py:679 2015-08-07 17:40:37.867 DEBUG nova.virt.xenapi.vm_utils [req-8f207af7-579d-48eb-ba25-aa3ab0f99dcc tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: af2ef72d-4895-4de0-bd40-aaa2ac498091] Created VM create_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:312 2015-08-07 17:40:37.868 DEBUG nova.virt.xenapi.vmops [req-8f207af7-579d-48eb-ba25-aa3ab0f99dcc tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: af2ef72d-4895-4de0-bd40-aaa2ac498091] Updating progress to 40 _update_instance_progress 
/opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:40:38.113 DEBUG nova.virt.xenapi.vm_utils [req-8f207af7-579d-48eb-ba25-aa3ab0f99dcc tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Creating disk-type VBD for VM OpaqueRef:951c2b01-351c-9d1a-ce6c-e17842d2a84f, VDI OpaqueRef:0929a371-f19e-2faa-33ce-47ffd91bcc27 ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:40:38.123 DEBUG nova.virt.xenapi.vm_utils [req-8f207af7-579d-48eb-ba25-aa3ab0f99dcc tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Created VBD OpaqueRef:d3687d44-1d26-247f-476c-a0295f105e91 for VM OpaqueRef:951c2b01-351c-9d1a-ce6c-e17842d2a84f, VDI OpaqueRef:0929a371-f19e-2faa-33ce-47ffd91bcc27. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:40:38.520 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:40:38.521 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:40:38.522 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 1.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:40:38.640 DEBUG nova.virt.xenapi.vm_utils [req-8f207af7-579d-48eb-ba25-aa3ab0f99dcc tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Created VDI OpaqueRef:8f974e47-5816-d8c6-d551-d6246a03e881 (config-2, 67108864, False) on OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99. create_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:509 2015-08-07 17:40:38.644 DEBUG nova.virt.xenapi.vm_utils [req-8f207af7-579d-48eb-ba25-aa3ab0f99dcc tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Creating disk-type VBD for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:8f974e47-5816-d8c6-d551-d6246a03e881 ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:40:38.655 DEBUG nova.virt.xenapi.vm_utils [req-8f207af7-579d-48eb-ba25-aa3ab0f99dcc tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Created VBD OpaqueRef:3f647931-29d7-6432-cc6d-4af94c162395 for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:8f974e47-5816-d8c6-d551-d6246a03e881. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:40:38.656 DEBUG nova.virt.xenapi.vm_utils [req-8f207af7-579d-48eb-ba25-aa3ab0f99dcc tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Plugging VBD OpaqueRef:3f647931-29d7-6432-cc6d-4af94c162395 ... 
vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2191 2015-08-07 17:40:38.656 DEBUG oslo_concurrency.lockutils [req-8f207af7-579d-48eb-ba25-aa3ab0f99dcc tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_plug" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:40:39.520 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:40:39.520 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Starting heal instance info cache _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5315 2015-08-07 17:40:39.577 DEBUG oslo_concurrency.lockutils [req-dd55a0a7-8100-4d29-b040-313179e8137b tempest-InstanceActionsTestJSON-1666957028 tempest-InstanceActionsTestJSON-1255250657] Lock "c1501aed-5580-4a88-bf3b-0761bd0b186e" acquired by "_locked_do_build_and_run_instance" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:40:39.657 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Didn't find any instances for network info cache update. _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5386 2015-08-07 17:40:39.682 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 1.99 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:40:39.708 INFO nova.compute.manager [req-dd55a0a7-8100-4d29-b040-313179e8137b tempest-InstanceActionsTestJSON-1666957028 tempest-InstanceActionsTestJSON-1255250657] [instance: c1501aed-5580-4a88-bf3b-0761bd0b186e] Starting instance... 
2015-08-07 17:40:39.993 DEBUG oslo_concurrency.lockutils [req-dd55a0a7-8100-4d29-b040-313179e8137b tempest-InstanceActionsTestJSON-1666957028 tempest-InstanceActionsTestJSON-1255250657] Lock "compute_resources" acquired by "instance_claim" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:40:39.994 DEBUG nova.compute.resource_tracker [req-dd55a0a7-8100-4d29-b040-313179e8137b tempest-InstanceActionsTestJSON-1666957028 tempest-InstanceActionsTestJSON-1255250657] Memory overhead for 64 MB instance; 5 MB instance_claim /opt/stack/new/nova/nova/compute/resource_tracker.py:126 2015-08-07 17:40:40.003 INFO nova.compute.claims [req-dd55a0a7-8100-4d29-b040-313179e8137b tempest-InstanceActionsTestJSON-1666957028 tempest-InstanceActionsTestJSON-1255250657] [instance: c1501aed-5580-4a88-bf3b-0761bd0b186e] Attempting claim: memory 69 MB, disk 0 GB 2015-08-07 17:40:40.003 INFO nova.compute.claims [req-dd55a0a7-8100-4d29-b040-313179e8137b tempest-InstanceActionsTestJSON-1666957028 tempest-InstanceActionsTestJSON-1255250657] [instance: c1501aed-5580-4a88-bf3b-0761bd0b186e] Total memory: 8187 MB, used: 788.00 MB 2015-08-07 17:40:40.004 INFO nova.compute.claims [req-dd55a0a7-8100-4d29-b040-313179e8137b tempest-InstanceActionsTestJSON-1666957028 tempest-InstanceActionsTestJSON-1255250657] [instance: c1501aed-5580-4a88-bf3b-0761bd0b186e] memory limit: 12280.50 MB, free: 11492.50 MB 2015-08-07 17:40:40.004 INFO nova.compute.claims [req-dd55a0a7-8100-4d29-b040-313179e8137b tempest-InstanceActionsTestJSON-1666957028 tempest-InstanceActionsTestJSON-1255250657] [instance: c1501aed-5580-4a88-bf3b-0761bd0b186e] Total disk: 27 GB, used: 0.00 GB 2015-08-07 17:40:40.005 INFO nova.compute.claims [req-dd55a0a7-8100-4d29-b040-313179e8137b tempest-InstanceActionsTestJSON-1666957028 tempest-InstanceActionsTestJSON-1255250657] [instance: c1501aed-5580-4a88-bf3b-0761bd0b186e] disk limit not specified, defaulting to unlimited 2015-08-07 17:40:40.031 DEBUG nova.compute.resources.vcpu [req-dd55a0a7-8100-4d29-b040-313179e8137b tempest-InstanceActionsTestJSON-1666957028 tempest-InstanceActionsTestJSON-1255250657] Total CPUs: 8 VCPUs, used: 4.00 VCPUs test /opt/stack/new/nova/nova/compute/resources/vcpu.py:52 2015-08-07 17:40:40.032 DEBUG nova.compute.resources.vcpu [req-dd55a0a7-8100-4d29-b040-313179e8137b tempest-InstanceActionsTestJSON-1666957028 tempest-InstanceActionsTestJSON-1255250657] CPUs limit not specified, defaulting to unlimited test /opt/stack/new/nova/nova/compute/resources/vcpu.py:56 2015-08-07 17:40:40.032 INFO nova.compute.claims [req-dd55a0a7-8100-4d29-b040-313179e8137b tempest-InstanceActionsTestJSON-1666957028 tempest-InstanceActionsTestJSON-1255250657] [instance: c1501aed-5580-4a88-bf3b-0761bd0b186e] Claim successful 2015-08-07 17:40:40.340 DEBUG oslo_concurrency.lockutils [req-8f207af7-579d-48eb-ba25-aa3ab0f99dcc tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_plug" :: held 1.683s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:40:40.341 DEBUG nova.virt.xenapi.vm_utils [req-8f207af7-579d-48eb-ba25-aa3ab0f99dcc tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Plugging VBD OpaqueRef:3f647931-29d7-6432-cc6d-4af94c162395 done. 
vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2194 2015-08-07 17:40:40.346 DEBUG nova.virt.xenapi.vm_utils [req-8f207af7-579d-48eb-ba25-aa3ab0f99dcc tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] VBD OpaqueRef:3f647931-29d7-6432-cc6d-4af94c162395 plugged as xvdc vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2197 2015-08-07 17:40:40.388 DEBUG oslo_concurrency.lockutils [req-dd55a0a7-8100-4d29-b040-313179e8137b tempest-InstanceActionsTestJSON-1666957028 tempest-InstanceActionsTestJSON-1255250657] Lock "compute_resources" released by "instance_claim" :: held 0.395s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:40:40.429 WARNING nova.virt.configdrive [req-8f207af7-579d-48eb-ba25-aa3ab0f99dcc tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] The setting "always" will be deprecated in the Liberty version. Please use "True" instead 2015-08-07 17:40:40.435 DEBUG nova.objects.instance [req-8f207af7-579d-48eb-ba25-aa3ab0f99dcc tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lazy-loading `ec2_ids' on Instance uuid af2ef72d-4895-4de0-bd40-aaa2ac498091 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:40:40.469 DEBUG oslo_concurrency.processutils [req-8f207af7-579d-48eb-ba25-aa3ab0f99dcc tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Running cmd (subprocess): genisoimage -o /tmp/tmpil4SFu/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmpEbkwIC execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:40:40.575 DEBUG oslo_concurrency.processutils [req-8f207af7-579d-48eb-ba25-aa3ab0f99dcc tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] CMD "genisoimage -o /tmp/tmpil4SFu/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmpEbkwIC" returned: 0 in 0.106s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:40:40.608 DEBUG oslo_concurrency.processutils [req-8f207af7-579d-48eb-ba25-aa3ab0f99dcc tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmpil4SFu/configdrive of=/dev/xvdc oflag=direct,sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:40:40.705 DEBUG oslo_concurrency.lockutils [req-dd55a0a7-8100-4d29-b040-313179e8137b tempest-InstanceActionsTestJSON-1666957028 tempest-InstanceActionsTestJSON-1255250657] Lock "compute_resources" acquired by "update_usage" :: waited 0.001s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:40:40.829 DEBUG oslo_concurrency.lockutils [req-dd55a0a7-8100-4d29-b040-313179e8137b tempest-InstanceActionsTestJSON-1666957028 tempest-InstanceActionsTestJSON-1255250657] Lock "compute_resources" released by "update_usage" :: held 0.124s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:40:40.830 DEBUG nova.compute.utils [req-dd55a0a7-8100-4d29-b040-313179e8137b tempest-InstanceActionsTestJSON-1666957028 tempest-InstanceActionsTestJSON-1255250657] Using /dev/xvd instead of None get_next_device_name 
/opt/stack/new/nova/nova/compute/utils.py:163 2015-08-07 17:40:40.839 13318 DEBUG nova.compute.manager [-] [instance: c1501aed-5580-4a88-bf3b-0761bd0b186e] Allocating IP information in the background. _allocate_network_async /opt/stack/new/nova/nova/compute/manager.py:1513 2015-08-07 17:40:40.841 13318 DEBUG oslo_concurrency.lockutils [-] Acquired semaphore "refresh_cache-c1501aed-5580-4a88-bf3b-0761bd0b186e" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:197 2015-08-07 17:40:41.591 DEBUG nova.virt.xenapi.vmops [req-dd55a0a7-8100-4d29-b040-313179e8137b tempest-InstanceActionsTestJSON-1666957028 tempest-InstanceActionsTestJSON-1255250657] [instance: c1501aed-5580-4a88-bf3b-0761bd0b186e] Block device information present: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} spawn /opt/stack/new/nova/nova/virt/xenapi/vmops.py:408 2015-08-07 17:40:41.620 DEBUG nova.virt.xenapi.vm_utils [req-dd55a0a7-8100-4d29-b040-313179e8137b tempest-InstanceActionsTestJSON-1666957028 tempest-InstanceActionsTestJSON-1255250657] Detected vhd format for image a69d8d55-1745-492f-be26-cc75d64fc94d determine_disk_image_type /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1633 2015-08-07 17:40:41.621 DEBUG nova.virt.xenapi.vmops [req-dd55a0a7-8100-4d29-b040-313179e8137b tempest-InstanceActionsTestJSON-1666957028 tempest-InstanceActionsTestJSON-1255250657] [instance: c1501aed-5580-4a88-bf3b-0761bd0b186e] Updating progress to 10 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:40:41.675 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:40:41.702 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 1.84 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:40:41.983 DEBUG nova.virt.xenapi.vmops [req-dd55a0a7-8100-4d29-b040-313179e8137b tempest-InstanceActionsTestJSON-1666957028 tempest-InstanceActionsTestJSON-1255250657] [instance: c1501aed-5580-4a88-bf3b-0761bd0b186e] block device info: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} _connect_cinder_volumes /opt/stack/new/nova/nova/virt/xenapi/vmops.py:376 2015-08-07 17:40:41.994 DEBUG oslo_concurrency.lockutils [req-dd55a0a7-8100-4d29-b040-313179e8137b tempest-InstanceActionsTestJSON-1666957028 tempest-InstanceActionsTestJSON-1255250657] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" acquired by "_create_cached_image_impl" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:40:42.406 DEBUG nova.compute.manager [req-3a210da2-ca88-4bb1-895b-d83de9978f1f tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] [instance: 31a2fd34-bbcb-4b50-83e0-dc6c7369b479] Checking state _get_power_state /opt/stack/new/nova/nova/compute/manager.py:1296 2015-08-07 17:40:43.548 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:40:43.549 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] 
CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /opt/stack/new/nova/nova/compute/manager.py:5976 2015-08-07 17:40:43.550 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 4.97 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:40:43.836 13318 DEBUG nova.network.base_api [-] [instance: c1501aed-5580-4a88-bf3b-0761bd0b186e] Updating instance_info_cache with network_info: [VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.7'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:a1:63:53', 'active': False, 'type': u'bridge', 'id': u'913ea4c4-3eb3-44a6-957b-6cea306846b4', 'qbg_params': None})] update_instance_cache_with_nw_info /opt/stack/new/nova/nova/network/base_api.py:43 2015-08-07 17:40:43.865 13318 DEBUG oslo_concurrency.lockutils [-] Releasing semaphore "refresh_cache-c1501aed-5580-4a88-bf3b-0761bd0b186e" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:210 2015-08-07 17:40:43.877 13318 DEBUG nova.compute.manager [-] [instance: c1501aed-5580-4a88-bf3b-0761bd0b186e] Instance network_info: |[VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.7'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:a1:63:53', 'active': False, 'type': u'bridge', 'id': u'913ea4c4-3eb3-44a6-957b-6cea306846b4', 'qbg_params': None})]| _allocate_network_async /opt/stack/new/nova/nova/compute/manager.py:1531 2015-08-07 17:40:44.254 INFO nova.compute.manager [req-45dc25ee-efb7-4330-b978-4f985e9e8ac2 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] [instance: 31a2fd34-bbcb-4b50-83e0-dc6c7369b479] Rescuing 2015-08-07 17:40:44.256 DEBUG oslo_concurrency.lockutils 
[req-45dc25ee-efb7-4330-b978-4f985e9e8ac2 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] Acquired semaphore "refresh_cache-31a2fd34-bbcb-4b50-83e0-dc6c7369b479" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:197 2015-08-07 17:40:44.461 DEBUG nova.network.base_api [req-45dc25ee-efb7-4330-b978-4f985e9e8ac2 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] [instance: 31a2fd34-bbcb-4b50-83e0-dc6c7369b479] Updating instance_info_cache with network_info: [VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.5'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:24:e9:e2', 'active': False, 'type': u'bridge', 'id': u'dd289fd2-27ae-4c1b-8c34-99f36829faef', 'qbg_params': None})] update_instance_cache_with_nw_info /opt/stack/new/nova/nova/network/base_api.py:43 2015-08-07 17:40:44.498 DEBUG oslo_concurrency.lockutils [req-45dc25ee-efb7-4330-b978-4f985e9e8ac2 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] Releasing semaphore "refresh_cache-31a2fd34-bbcb-4b50-83e0-dc6c7369b479" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:210 2015-08-07 17:40:44.993 DEBUG nova.virt.xenapi.vm_utils [req-45dc25ee-efb7-4330-b978-4f985e9e8ac2 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] [instance: 31a2fd34-bbcb-4b50-83e0-dc6c7369b479] Shutting down VM (hard) hard_shutdown_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:348 2015-08-07 17:40:45.049 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.91 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:40:45.228 DEBUG nova.virt.xenapi.vm_utils [req-dd55a0a7-8100-4d29-b040-313179e8137b tempest-InstanceActionsTestJSON-1666957028 tempest-InstanceActionsTestJSON-1255250657] Cloned VDI OpaqueRef:8bfb5aac-addf-51d8-0c5b-87b5f52afb3d from VDI OpaqueRef:f392c64e-7886-a0cc-d3a0-780674167307 _clone_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:571 2015-08-07 17:40:46.647 DEBUG oslo_concurrency.lockutils [req-dd55a0a7-8100-4d29-b040-313179e8137b tempest-InstanceActionsTestJSON-1666957028 tempest-InstanceActionsTestJSON-1255250657] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" released by "_create_cached_image_impl" :: held 4.653s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:40:46.648 INFO nova.virt.xenapi.vm_utils [req-dd55a0a7-8100-4d29-b040-313179e8137b tempest-InstanceActionsTestJSON-1666957028 tempest-InstanceActionsTestJSON-1255250657] Image creation data, cacheable: True, 
downloaded: False duration: 4.66 secs for image a69d8d55-1745-492f-be26-cc75d64fc94d 2015-08-07 17:40:48.051 DEBUG nova.virt.xenapi.vmops [req-dd55a0a7-8100-4d29-b040-313179e8137b tempest-InstanceActionsTestJSON-1666957028 tempest-InstanceActionsTestJSON-1255250657] [instance: c1501aed-5580-4a88-bf3b-0761bd0b186e] Updating progress to 20 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:40:48.313 DEBUG nova.virt.xenapi.vmops [req-dd55a0a7-8100-4d29-b040-313179e8137b tempest-InstanceActionsTestJSON-1666957028 tempest-InstanceActionsTestJSON-1255250657] [instance: c1501aed-5580-4a88-bf3b-0761bd0b186e] Updating progress to 30 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:40:48.464 DEBUG oslo_concurrency.processutils [req-8f207af7-579d-48eb-ba25-aa3ab0f99dcc tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmpil4SFu/configdrive of=/dev/xvdc oflag=direct,sync" returned: 0 in 7.856s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:40:48.466 DEBUG oslo_concurrency.processutils [req-8f207af7-579d-48eb-ba25-aa3ab0f99dcc tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:40:48.581 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:40:48.590 DEBUG nova.virt.xenapi.vmops [req-dd55a0a7-8100-4d29-b040-313179e8137b tempest-InstanceActionsTestJSON-1666957028 tempest-InstanceActionsTestJSON-1255250657] [instance: c1501aed-5580-4a88-bf3b-0761bd0b186e] Using PV kernel: True _create_vm_record /opt/stack/new/nova/nova/virt/xenapi/vmops.py:679 2015-08-07 17:40:48.593 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 0.94 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:40:48.607 DEBUG nova.virt.xenapi.vm_utils [req-dd55a0a7-8100-4d29-b040-313179e8137b tempest-InstanceActionsTestJSON-1666957028 tempest-InstanceActionsTestJSON-1255250657] [instance: c1501aed-5580-4a88-bf3b-0761bd0b186e] Created VM create_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:312 2015-08-07 17:40:48.608 DEBUG nova.virt.xenapi.vmops [req-dd55a0a7-8100-4d29-b040-313179e8137b tempest-InstanceActionsTestJSON-1666957028 tempest-InstanceActionsTestJSON-1255250657] [instance: c1501aed-5580-4a88-bf3b-0761bd0b186e] Updating progress to 40 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:40:48.842 DEBUG oslo_concurrency.processutils [req-8f207af7-579d-48eb-ba25-aa3ab0f99dcc tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf sync" returned: 0 in 0.377s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:40:48.843 DEBUG nova.virt.xenapi.vm_utils [req-8f207af7-579d-48eb-ba25-aa3ab0f99dcc tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Destroying VBD for VDI 
OpaqueRef:8f974e47-5816-d8c6-d551-d6246a03e881 ... vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2207 2015-08-07 17:40:48.844 DEBUG oslo_concurrency.lockutils [req-8f207af7-579d-48eb-ba25-aa3ab0f99dcc tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_unplug" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:40:48.895 DEBUG nova.virt.xenapi.vm_utils [req-dd55a0a7-8100-4d29-b040-313179e8137b tempest-InstanceActionsTestJSON-1666957028 tempest-InstanceActionsTestJSON-1255250657] Creating disk-type VBD for VM OpaqueRef:543f10d6-fbe8-c557-4beb-acfc17335e90, VDI OpaqueRef:8bfb5aac-addf-51d8-0c5b-87b5f52afb3d ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:40:48.914 DEBUG nova.virt.xenapi.vm_utils [req-dd55a0a7-8100-4d29-b040-313179e8137b tempest-InstanceActionsTestJSON-1666957028 tempest-InstanceActionsTestJSON-1255250657] Created VBD OpaqueRef:18b072a0-0246-5d2f-2e06-42ce20bea3ab for VM OpaqueRef:543f10d6-fbe8-c557-4beb-acfc17335e90, VDI OpaqueRef:8bfb5aac-addf-51d8-0c5b-87b5f52afb3d. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:40:49.047 WARNING nova.virt.xenapi.vm_utils [req-45dc25ee-efb7-4330-b978-4f985e9e8ac2 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] [instance: 31a2fd34-bbcb-4b50-83e0-dc6c7369b479] VM already halted, skipping shutdown... 2015-08-07 17:40:49.077 DEBUG nova.virt.xenapi.vm_utils [req-45dc25ee-efb7-4330-b978-4f985e9e8ac2 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] Detected vhd format for image a69d8d55-1745-492f-be26-cc75d64fc94d determine_disk_image_type /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1633 2015-08-07 17:40:49.078 DEBUG nova.virt.xenapi.vmops [req-45dc25ee-efb7-4330-b978-4f985e9e8ac2 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] [instance: 31a2fd34-bbcb-4b50-83e0-dc6c7369b479] Updating progress to 9 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:40:49.321 DEBUG oslo_concurrency.lockutils [req-45dc25ee-efb7-4330-b978-4f985e9e8ac2 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" acquired by "_create_cached_image_impl" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:40:49.392 DEBUG nova.virt.xenapi.vm_utils [req-dd55a0a7-8100-4d29-b040-313179e8137b tempest-InstanceActionsTestJSON-1666957028 tempest-InstanceActionsTestJSON-1255250657] Created VDI OpaqueRef:7a2f6801-0640-a839-038a-79b4dc9415a7 (config-2, 67108864, False) on OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99. create_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:509 2015-08-07 17:40:49.397 DEBUG nova.virt.xenapi.vm_utils [req-dd55a0a7-8100-4d29-b040-313179e8137b tempest-InstanceActionsTestJSON-1666957028 tempest-InstanceActionsTestJSON-1255250657] Creating disk-type VBD for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:7a2f6801-0640-a839-038a-79b4dc9415a7 ... 
create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:40:49.409 DEBUG nova.virt.xenapi.vm_utils [req-dd55a0a7-8100-4d29-b040-313179e8137b tempest-InstanceActionsTestJSON-1666957028 tempest-InstanceActionsTestJSON-1255250657] Created VBD OpaqueRef:54910449-1c59-dc49-b43c-f35a919eb23a for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:7a2f6801-0640-a839-038a-79b4dc9415a7. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:40:49.410 DEBUG nova.virt.xenapi.vm_utils [req-dd55a0a7-8100-4d29-b040-313179e8137b tempest-InstanceActionsTestJSON-1666957028 tempest-InstanceActionsTestJSON-1255250657] Plugging VBD OpaqueRef:54910449-1c59-dc49-b43c-f35a919eb23a ... vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2191 2015-08-07 17:40:49.532 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:40:49.533 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 0.99 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:40:49.773 DEBUG oslo_concurrency.lockutils [req-8f207af7-579d-48eb-ba25-aa3ab0f99dcc tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_unplug" :: held 0.929s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:40:49.775 DEBUG oslo_concurrency.lockutils [req-dd55a0a7-8100-4d29-b040-313179e8137b tempest-InstanceActionsTestJSON-1666957028 tempest-InstanceActionsTestJSON-1255250657] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_plug" :: waited 0.364s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:40:49.789 DEBUG nova.virt.xenapi.vm_utils [req-8f207af7-579d-48eb-ba25-aa3ab0f99dcc tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Destroying VBD for VDI OpaqueRef:8f974e47-5816-d8c6-d551-d6246a03e881 done. vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2215 2015-08-07 17:40:49.790 DEBUG nova.virt.xenapi.vm_utils [req-8f207af7-579d-48eb-ba25-aa3ab0f99dcc tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Creating disk-type VBD for VM OpaqueRef:951c2b01-351c-9d1a-ce6c-e17842d2a84f, VDI OpaqueRef:8f974e47-5816-d8c6-d551-d6246a03e881 ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:40:49.799 DEBUG nova.virt.xenapi.vm_utils [req-8f207af7-579d-48eb-ba25-aa3ab0f99dcc tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Created VBD OpaqueRef:5db59655-587f-9956-75b0-44ec9dd61c55 for VM OpaqueRef:951c2b01-351c-9d1a-ce6c-e17842d2a84f, VDI OpaqueRef:8f974e47-5816-d8c6-d551-d6246a03e881. 
create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:40:49.800 DEBUG nova.objects.instance [req-8f207af7-579d-48eb-ba25-aa3ab0f99dcc tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lazy-loading `pci_devices' on Instance uuid af2ef72d-4895-4de0-bd40-aaa2ac498091 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:40:49.925 DEBUG nova.virt.xenapi.vmops [req-8f207af7-579d-48eb-ba25-aa3ab0f99dcc tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: af2ef72d-4895-4de0-bd40-aaa2ac498091] Updating progress to 50 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:40:50.180 DEBUG oslo_concurrency.lockutils [req-8f207af7-579d-48eb-ba25-aa3ab0f99dcc tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "xenstore-af2ef72d-4895-4de0-bd40-aaa2ac498091" acquired by "store_meta" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:40:50.181 DEBUG oslo_concurrency.lockutils [req-8f207af7-579d-48eb-ba25-aa3ab0f99dcc tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "xenstore-af2ef72d-4895-4de0-bd40-aaa2ac498091" released by "store_meta" :: held 0.001s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:40:50.181 DEBUG oslo_concurrency.lockutils [req-8f207af7-579d-48eb-ba25-aa3ab0f99dcc tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "xenstore-af2ef72d-4895-4de0-bd40-aaa2ac498091" acquired by "store_auto_disk_config" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:40:50.189 DEBUG oslo_concurrency.lockutils [req-8f207af7-579d-48eb-ba25-aa3ab0f99dcc tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "xenstore-af2ef72d-4895-4de0-bd40-aaa2ac498091" released by "store_auto_disk_config" :: held 0.008s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:40:50.190 DEBUG nova.virt.xenapi.vmops [req-8f207af7-579d-48eb-ba25-aa3ab0f99dcc tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: af2ef72d-4895-4de0-bd40-aaa2ac498091] Injecting hostname (tempest.common.compute-instance-685432625) into xenstore _inject_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1940 2015-08-07 17:40:50.191 DEBUG oslo_concurrency.lockutils [req-8f207af7-579d-48eb-ba25-aa3ab0f99dcc tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "xenstore-af2ef72d-4895-4de0-bd40-aaa2ac498091" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:40:50.199 DEBUG oslo_concurrency.lockutils [req-8f207af7-579d-48eb-ba25-aa3ab0f99dcc tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "xenstore-af2ef72d-4895-4de0-bd40-aaa2ac498091" released by "update_hostname" :: held 0.008s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:40:50.199 DEBUG nova.virt.xenapi.vmops [req-8f207af7-579d-48eb-ba25-aa3ab0f99dcc tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: af2ef72d-4895-4de0-bd40-aaa2ac498091] Injecting network info to xenstore inject_network_info 
/opt/stack/new/nova/nova/virt/xenapi/vmops.py:1869 2015-08-07 17:40:50.200 DEBUG oslo_concurrency.lockutils [req-8f207af7-579d-48eb-ba25-aa3ab0f99dcc tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "xenstore-af2ef72d-4895-4de0-bd40-aaa2ac498091" acquired by "update_nwinfo" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:40:50.376 DEBUG oslo_concurrency.lockutils [req-8f207af7-579d-48eb-ba25-aa3ab0f99dcc tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "xenstore-af2ef72d-4895-4de0-bd40-aaa2ac498091" released by "update_nwinfo" :: held 0.177s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:40:50.377 DEBUG nova.virt.xenapi.vmops [req-8f207af7-579d-48eb-ba25-aa3ab0f99dcc tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: af2ef72d-4895-4de0-bd40-aaa2ac498091] Updating progress to 60 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:40:50.521 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:40:50.555 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Auditing locally available compute resources for node localhost.localdomain 2015-08-07 17:40:50.555 DEBUG nova.virt.xenapi.host [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Updating host stats update_status /opt/stack/new/nova/nova/virt/xenapi/host.py:234 2015-08-07 17:40:50.601 DEBUG nova.virt.xenapi.vmops [req-8f207af7-579d-48eb-ba25-aa3ab0f99dcc tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: af2ef72d-4895-4de0-bd40-aaa2ac498091] Creating vifs _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1892 2015-08-07 17:40:50.610 DEBUG nova.virt.xenapi.vmops [req-8f207af7-579d-48eb-ba25-aa3ab0f99dcc tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: af2ef72d-4895-4de0-bd40-aaa2ac498091] Creating VIF for network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1902 2015-08-07 17:40:50.617 DEBUG nova.virt.xenapi.vmops [req-8f207af7-579d-48eb-ba25-aa3ab0f99dcc tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: af2ef72d-4895-4de0-bd40-aaa2ac498091] Created VIF OpaqueRef:5f0c5737-7766-bcaf-9ba2-befe11054f0d, network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1906 2015-08-07 17:40:50.618 DEBUG nova.virt.xenapi.vmops [req-8f207af7-579d-48eb-ba25-aa3ab0f99dcc tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: af2ef72d-4895-4de0-bd40-aaa2ac498091] Updating progress to 70 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:40:50.812 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" acquired by "do_scan" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:40:50.813 DEBUG nova.virt.xenapi.vm_utils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Scanning SR 
OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99 do_scan /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1839 2015-08-07 17:40:50.839 DEBUG nova.virt.xenapi.vmops [req-8f207af7-579d-48eb-ba25-aa3ab0f99dcc tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: af2ef72d-4895-4de0-bd40-aaa2ac498091] Starting instance _start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:327 2015-08-07 17:40:51.111 DEBUG oslo_concurrency.lockutils [req-dd55a0a7-8100-4d29-b040-313179e8137b tempest-InstanceActionsTestJSON-1666957028 tempest-InstanceActionsTestJSON-1255250657] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_plug" :: held 1.336s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:40:51.111 DEBUG nova.virt.xenapi.vm_utils [req-dd55a0a7-8100-4d29-b040-313179e8137b tempest-InstanceActionsTestJSON-1666957028 tempest-InstanceActionsTestJSON-1255250657] Plugging VBD OpaqueRef:54910449-1c59-dc49-b43c-f35a919eb23a done. vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2194 2015-08-07 17:40:51.115 DEBUG nova.virt.xenapi.vm_utils [req-dd55a0a7-8100-4d29-b040-313179e8137b tempest-InstanceActionsTestJSON-1666957028 tempest-InstanceActionsTestJSON-1255250657] VBD OpaqueRef:54910449-1c59-dc49-b43c-f35a919eb23a plugged as xvdd vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2197 2015-08-07 17:40:51.208 WARNING nova.virt.configdrive [req-dd55a0a7-8100-4d29-b040-313179e8137b tempest-InstanceActionsTestJSON-1666957028 tempest-InstanceActionsTestJSON-1255250657] The setting "always" will be deprecated in the Liberty version. Please use "True" instead 2015-08-07 17:40:51.209 DEBUG nova.objects.instance [req-dd55a0a7-8100-4d29-b040-313179e8137b tempest-InstanceActionsTestJSON-1666957028 tempest-InstanceActionsTestJSON-1255250657] Lazy-loading `ec2_ids' on Instance uuid c1501aed-5580-4a88-bf3b-0761bd0b186e obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:40:51.247 DEBUG oslo_concurrency.processutils [req-dd55a0a7-8100-4d29-b040-313179e8137b tempest-InstanceActionsTestJSON-1666957028 tempest-InstanceActionsTestJSON-1255250657] Running cmd (subprocess): genisoimage -o /tmp/tmp4NIopI/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmpcZnFBm execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:40:51.343 DEBUG oslo_concurrency.processutils [req-dd55a0a7-8100-4d29-b040-313179e8137b tempest-InstanceActionsTestJSON-1666957028 tempest-InstanceActionsTestJSON-1255250657] CMD "genisoimage -o /tmp/tmp4NIopI/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmpcZnFBm" returned: 0 in 0.096s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:40:51.349 DEBUG oslo_concurrency.processutils [req-dd55a0a7-8100-4d29-b040-313179e8137b tempest-InstanceActionsTestJSON-1666957028 tempest-InstanceActionsTestJSON-1255250657] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmp4NIopI/configdrive of=/dev/xvdd oflag=direct,sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:40:51.435 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" released by "do_scan" :: held 
0.623s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:40:51.750 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor: free VCPUs: 1 _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:524 2015-08-07 17:40:51.751 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor: assignable PCI devices: [] _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:532 2015-08-07 17:40:51.752 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor/Node resource view: name=localhost.localdomain free_ram=922MB free_disk=16GB free_vcpus=1 pci_devices=[] _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:546 2015-08-07 17:40:51.752 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "compute_resources" acquired by "_update_available_resource" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:40:52.156 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Total usable vcpus: 8, total allocated vcpus: 7 2015-08-07 17:40:52.156 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Final resource view: name=localhost.localdomain phys_ram=8187MB used_ram=857MB phys_disk=27GB used_disk=0GB total_vcpus=8 used_vcpus=7 pci_stats=None 2015-08-07 17:40:52.449 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Compute_service record updated for devstack:localhost.localdomain 2015-08-07 17:40:52.450 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "compute_resources" released by "_update_available_resource" :: held 0.697s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:40:52.451 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 7.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:40:55.130 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.83 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:40:57.113 DEBUG oslo_concurrency.processutils [req-dd55a0a7-8100-4d29-b040-313179e8137b tempest-InstanceActionsTestJSON-1666957028 tempest-InstanceActionsTestJSON-1255250657] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmp4NIopI/configdrive of=/dev/xvdd oflag=direct,sync" returned: 0 in 5.764s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:40:57.115 DEBUG oslo_concurrency.processutils [req-dd55a0a7-8100-4d29-b040-313179e8137b tempest-InstanceActionsTestJSON-1666957028 tempest-InstanceActionsTestJSON-1255250657] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:40:57.450 DEBUG oslo_concurrency.processutils [req-dd55a0a7-8100-4d29-b040-313179e8137b tempest-InstanceActionsTestJSON-1666957028 tempest-InstanceActionsTestJSON-1255250657] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf sync" returned: 0 in 0.335s execute 
/usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:40:57.454 DEBUG nova.virt.xenapi.vm_utils [req-dd55a0a7-8100-4d29-b040-313179e8137b tempest-InstanceActionsTestJSON-1666957028 tempest-InstanceActionsTestJSON-1255250657] Destroying VBD for VDI OpaqueRef:7a2f6801-0640-a839-038a-79b4dc9415a7 ... vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2207 2015-08-07 17:40:57.455 DEBUG oslo_concurrency.lockutils [req-dd55a0a7-8100-4d29-b040-313179e8137b tempest-InstanceActionsTestJSON-1666957028 tempest-InstanceActionsTestJSON-1255250657] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_unplug" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:40:58.460 DEBUG oslo_concurrency.lockutils [req-dd55a0a7-8100-4d29-b040-313179e8137b tempest-InstanceActionsTestJSON-1666957028 tempest-InstanceActionsTestJSON-1255250657] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_unplug" :: held 1.005s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:40:58.468 DEBUG nova.virt.xenapi.vm_utils [req-dd55a0a7-8100-4d29-b040-313179e8137b tempest-InstanceActionsTestJSON-1666957028 tempest-InstanceActionsTestJSON-1255250657] Destroying VBD for VDI OpaqueRef:7a2f6801-0640-a839-038a-79b4dc9415a7 done. vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2215 2015-08-07 17:40:58.469 DEBUG nova.virt.xenapi.vm_utils [req-dd55a0a7-8100-4d29-b040-313179e8137b tempest-InstanceActionsTestJSON-1666957028 tempest-InstanceActionsTestJSON-1255250657] Creating disk-type VBD for VM OpaqueRef:543f10d6-fbe8-c557-4beb-acfc17335e90, VDI OpaqueRef:7a2f6801-0640-a839-038a-79b4dc9415a7 ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:40:58.479 DEBUG nova.virt.xenapi.vm_utils [req-dd55a0a7-8100-4d29-b040-313179e8137b tempest-InstanceActionsTestJSON-1666957028 tempest-InstanceActionsTestJSON-1255250657] Created VBD OpaqueRef:dc1775a0-fa60-a7da-4d12-8fdb161be887 for VM OpaqueRef:543f10d6-fbe8-c557-4beb-acfc17335e90, VDI OpaqueRef:7a2f6801-0640-a839-038a-79b4dc9415a7. 
create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:40:58.480 DEBUG nova.objects.instance [req-dd55a0a7-8100-4d29-b040-313179e8137b tempest-InstanceActionsTestJSON-1666957028 tempest-InstanceActionsTestJSON-1255250657] Lazy-loading `pci_devices' on Instance uuid c1501aed-5580-4a88-bf3b-0761bd0b186e obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:40:58.600 DEBUG nova.virt.xenapi.vmops [req-dd55a0a7-8100-4d29-b040-313179e8137b tempest-InstanceActionsTestJSON-1666957028 tempest-InstanceActionsTestJSON-1255250657] [instance: c1501aed-5580-4a88-bf3b-0761bd0b186e] Updating progress to 50 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:40:58.898 DEBUG oslo_concurrency.lockutils [req-dd55a0a7-8100-4d29-b040-313179e8137b tempest-InstanceActionsTestJSON-1666957028 tempest-InstanceActionsTestJSON-1255250657] Lock "xenstore-c1501aed-5580-4a88-bf3b-0761bd0b186e" acquired by "store_meta" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:40:58.900 DEBUG oslo_concurrency.lockutils [req-dd55a0a7-8100-4d29-b040-313179e8137b tempest-InstanceActionsTestJSON-1666957028 tempest-InstanceActionsTestJSON-1255250657] Lock "xenstore-c1501aed-5580-4a88-bf3b-0761bd0b186e" released by "store_meta" :: held 0.002s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:40:58.901 DEBUG oslo_concurrency.lockutils [req-dd55a0a7-8100-4d29-b040-313179e8137b tempest-InstanceActionsTestJSON-1666957028 tempest-InstanceActionsTestJSON-1255250657] Lock "xenstore-c1501aed-5580-4a88-bf3b-0761bd0b186e" acquired by "store_auto_disk_config" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:40:58.950 DEBUG oslo_concurrency.lockutils [req-dd55a0a7-8100-4d29-b040-313179e8137b tempest-InstanceActionsTestJSON-1666957028 tempest-InstanceActionsTestJSON-1255250657] Lock "xenstore-c1501aed-5580-4a88-bf3b-0761bd0b186e" released by "store_auto_disk_config" :: held 0.049s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:40:58.951 DEBUG nova.virt.xenapi.vmops [req-dd55a0a7-8100-4d29-b040-313179e8137b tempest-InstanceActionsTestJSON-1666957028 tempest-InstanceActionsTestJSON-1255250657] [instance: c1501aed-5580-4a88-bf3b-0761bd0b186e] Injecting hostname (tempest.common.compute-instance-1904216403) into xenstore _inject_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1940 2015-08-07 17:40:58.951 DEBUG oslo_concurrency.lockutils [req-dd55a0a7-8100-4d29-b040-313179e8137b tempest-InstanceActionsTestJSON-1666957028 tempest-InstanceActionsTestJSON-1255250657] Lock "xenstore-c1501aed-5580-4a88-bf3b-0761bd0b186e" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:40:58.971 DEBUG oslo_concurrency.lockutils [req-dd55a0a7-8100-4d29-b040-313179e8137b tempest-InstanceActionsTestJSON-1666957028 tempest-InstanceActionsTestJSON-1255250657] Lock "xenstore-c1501aed-5580-4a88-bf3b-0761bd0b186e" released by "update_hostname" :: held 0.020s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:40:58.972 DEBUG nova.virt.xenapi.vmops [req-dd55a0a7-8100-4d29-b040-313179e8137b tempest-InstanceActionsTestJSON-1666957028 tempest-InstanceActionsTestJSON-1255250657] [instance: c1501aed-5580-4a88-bf3b-0761bd0b186e] Injecting network info to xenstore 
inject_network_info /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1869 2015-08-07 17:40:58.972 DEBUG oslo_concurrency.lockutils [req-dd55a0a7-8100-4d29-b040-313179e8137b tempest-InstanceActionsTestJSON-1666957028 tempest-InstanceActionsTestJSON-1255250657] Lock "xenstore-c1501aed-5580-4a88-bf3b-0761bd0b186e" acquired by "update_nwinfo" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:40:58.975 DEBUG nova.virt.xenapi.vmops [req-8f207af7-579d-48eb-ba25-aa3ab0f99dcc tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: af2ef72d-4895-4de0-bd40-aaa2ac498091] Waiting for instance state to become running _wait_for_instance_to_start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:777 2015-08-07 17:40:59.013 DEBUG nova.virt.xenapi.vmops [req-8f207af7-579d-48eb-ba25-aa3ab0f99dcc tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: af2ef72d-4895-4de0-bd40-aaa2ac498091] Updating progress to 80 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:40:59.180 DEBUG oslo_concurrency.lockutils [req-dd55a0a7-8100-4d29-b040-313179e8137b tempest-InstanceActionsTestJSON-1666957028 tempest-InstanceActionsTestJSON-1255250657] Lock "xenstore-c1501aed-5580-4a88-bf3b-0761bd0b186e" released by "update_nwinfo" :: held 0.208s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:40:59.181 DEBUG nova.virt.xenapi.vmops [req-dd55a0a7-8100-4d29-b040-313179e8137b tempest-InstanceActionsTestJSON-1666957028 tempest-InstanceActionsTestJSON-1255250657] [instance: c1501aed-5580-4a88-bf3b-0761bd0b186e] Updating progress to 60 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:40:59.227 DEBUG nova.virt.xenapi.vmops [req-8f207af7-579d-48eb-ba25-aa3ab0f99dcc tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: af2ef72d-4895-4de0-bd40-aaa2ac498091] Skip agent setup, not enabled. 
_configure_new_instance_with_agent /opt/stack/new/nova/nova/virt/xenapi/vmops.py:788 2015-08-07 17:40:59.227 DEBUG nova.virt.xenapi.vmops [req-8f207af7-579d-48eb-ba25-aa3ab0f99dcc tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: af2ef72d-4895-4de0-bd40-aaa2ac498091] Removing hostname from xenstore _remove_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1949 2015-08-07 17:40:59.228 DEBUG oslo_concurrency.lockutils [req-8f207af7-579d-48eb-ba25-aa3ab0f99dcc tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "xenstore-af2ef72d-4895-4de0-bd40-aaa2ac498091" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:40:59.234 DEBUG oslo_concurrency.lockutils [req-8f207af7-579d-48eb-ba25-aa3ab0f99dcc tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "xenstore-af2ef72d-4895-4de0-bd40-aaa2ac498091" released by "update_hostname" :: held 0.006s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:40:59.234 DEBUG nova.virt.xenapi.vmops [req-8f207af7-579d-48eb-ba25-aa3ab0f99dcc tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: af2ef72d-4895-4de0-bd40-aaa2ac498091] Updating progress to 90 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:40:59.379 DEBUG nova.virt.xenapi.vmops [req-dd55a0a7-8100-4d29-b040-313179e8137b tempest-InstanceActionsTestJSON-1666957028 tempest-InstanceActionsTestJSON-1255250657] [instance: c1501aed-5580-4a88-bf3b-0761bd0b186e] Creating vifs _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1892 2015-08-07 17:40:59.387 DEBUG nova.virt.xenapi.vmops [req-dd55a0a7-8100-4d29-b040-313179e8137b tempest-InstanceActionsTestJSON-1666957028 tempest-InstanceActionsTestJSON-1255250657] [instance: c1501aed-5580-4a88-bf3b-0761bd0b186e] Creating VIF for network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1902 2015-08-07 17:40:59.395 DEBUG nova.virt.xenapi.vmops [req-dd55a0a7-8100-4d29-b040-313179e8137b tempest-InstanceActionsTestJSON-1666957028 tempest-InstanceActionsTestJSON-1255250657] [instance: c1501aed-5580-4a88-bf3b-0761bd0b186e] Created VIF OpaqueRef:520dd95f-68c8-b35c-21e1-f68b0a069ccc, network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1906 2015-08-07 17:40:59.396 DEBUG nova.virt.xenapi.vmops [req-dd55a0a7-8100-4d29-b040-313179e8137b tempest-InstanceActionsTestJSON-1666957028 tempest-InstanceActionsTestJSON-1255250657] [instance: c1501aed-5580-4a88-bf3b-0761bd0b186e] Updating progress to 70 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:40:59.448 DEBUG nova.virt.xenapi.vmops [req-8f207af7-579d-48eb-ba25-aa3ab0f99dcc tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: af2ef72d-4895-4de0-bd40-aaa2ac498091] Updating progress to 100 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:40:59.457 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:40:59.459 DEBUG oslo_service.loopingcall 
[req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 9.06 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:40:59.664 DEBUG nova.compute.manager [req-8f207af7-579d-48eb-ba25-aa3ab0f99dcc tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: af2ef72d-4895-4de0-bd40-aaa2ac498091] Checking state _get_power_state /opt/stack/new/nova/nova/compute/manager.py:1296 2015-08-07 17:40:59.856 DEBUG nova.virt.xenapi.vmops [req-dd55a0a7-8100-4d29-b040-313179e8137b tempest-InstanceActionsTestJSON-1666957028 tempest-InstanceActionsTestJSON-1255250657] [instance: c1501aed-5580-4a88-bf3b-0761bd0b186e] Starting instance _start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:327 2015-08-07 17:40:59.981 DEBUG oslo_concurrency.lockutils [req-8f207af7-579d-48eb-ba25-aa3ab0f99dcc tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "af2ef72d-4895-4de0-bd40-aaa2ac498091" released by "_locked_do_build_and_run_instance" :: held 27.813s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:41:00.487 INFO nova.compute.manager [req-b39c21c5-9e94-4759-ad87-ed41b7aba1aa tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: af2ef72d-4895-4de0-bd40-aaa2ac498091] Get console output 2015-08-07 17:41:02.364 DEBUG oslo_concurrency.lockutils [req-e6cca1b1-4e64-4e8b-8567-9c37d2f5774e tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "aa0c0819-08a6-4a79-93eb-f58885697f5b" acquired by "do_stop_instance" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:41:02.365 DEBUG nova.compute.manager [req-e6cca1b1-4e64-4e8b-8567-9c37d2f5774e tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Checking state _get_power_state /opt/stack/new/nova/nova/compute/manager.py:1296 2015-08-07 17:41:02.383 DEBUG nova.compute.manager [req-e6cca1b1-4e64-4e8b-8567-9c37d2f5774e tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Stopping instance; current vm_state: active, current task_state: powering-off, current DB power_state: 1, current VM power_state: 1 do_stop_instance /opt/stack/new/nova/nova/compute/manager.py:2404 2015-08-07 17:41:02.391 DEBUG nova.virt.xenapi.vm_utils [req-e6cca1b1-4e64-4e8b-8567-9c37d2f5774e tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Shutting down VM (hard) hard_shutdown_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:348 2015-08-07 17:41:04.761 DEBUG nova.compute.manager [req-e6cca1b1-4e64-4e8b-8567-9c37d2f5774e tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Checking state _get_power_state /opt/stack/new/nova/nova/compute/manager.py:1296 2015-08-07 17:41:05.003 DEBUG oslo_concurrency.lockutils [req-e6cca1b1-4e64-4e8b-8567-9c37d2f5774e tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "aa0c0819-08a6-4a79-93eb-f58885697f5b" released by "do_stop_instance" :: held 2.640s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:41:05.009 13318 DEBUG oslo_service.loopingcall [-] Fixed 
interval looping call > sleeping for 9.95 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:41:06.403 DEBUG oslo_concurrency.lockutils [req-850ec07f-4ba8-45e5-82e4-fff4920081ca tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Acquired semaphore "refresh_cache-aa0c0819-08a6-4a79-93eb-f58885697f5b" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:197 2015-08-07 17:41:06.565 DEBUG nova.network.base_api [req-850ec07f-4ba8-45e5-82e4-fff4920081ca tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Updating instance_info_cache with network_info: [VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.2'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:d0:30:ae', 'active': False, 'type': u'bridge', 'id': u'f50b3fec-3100-4377-bded-e2cd8fa38482', 'qbg_params': None})] update_instance_cache_with_nw_info /opt/stack/new/nova/nova/network/base_api.py:43 2015-08-07 17:41:06.595 DEBUG oslo_concurrency.lockutils [req-850ec07f-4ba8-45e5-82e4-fff4920081ca tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Releasing semaphore "refresh_cache-aa0c0819-08a6-4a79-93eb-f58885697f5b" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:210 2015-08-07 17:41:06.637 DEBUG nova.virt.xenapi.vmops [req-850ec07f-4ba8-45e5-82e4-fff4920081ca tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Starting instance _start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:327 2015-08-07 17:41:06.951 DEBUG nova.virt.xenapi.vmops [req-dd55a0a7-8100-4d29-b040-313179e8137b tempest-InstanceActionsTestJSON-1666957028 tempest-InstanceActionsTestJSON-1255250657] [instance: c1501aed-5580-4a88-bf3b-0761bd0b186e] Waiting for instance state to become running _wait_for_instance_to_start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:777 2015-08-07 17:41:06.974 DEBUG nova.virt.xenapi.vmops [req-dd55a0a7-8100-4d29-b040-313179e8137b tempest-InstanceActionsTestJSON-1666957028 tempest-InstanceActionsTestJSON-1255250657] [instance: c1501aed-5580-4a88-bf3b-0761bd0b186e] Updating progress to 80 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:41:07.326 DEBUG nova.virt.xenapi.vmops [req-dd55a0a7-8100-4d29-b040-313179e8137b tempest-InstanceActionsTestJSON-1666957028 tempest-InstanceActionsTestJSON-1255250657] [instance: c1501aed-5580-4a88-bf3b-0761bd0b186e] Skip agent setup, not enabled. 
_configure_new_instance_with_agent /opt/stack/new/nova/nova/virt/xenapi/vmops.py:788 2015-08-07 17:41:07.326 DEBUG nova.virt.xenapi.vmops [req-dd55a0a7-8100-4d29-b040-313179e8137b tempest-InstanceActionsTestJSON-1666957028 tempest-InstanceActionsTestJSON-1255250657] [instance: c1501aed-5580-4a88-bf3b-0761bd0b186e] Removing hostname from xenstore _remove_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1949 2015-08-07 17:41:07.327 DEBUG oslo_concurrency.lockutils [req-dd55a0a7-8100-4d29-b040-313179e8137b tempest-InstanceActionsTestJSON-1666957028 tempest-InstanceActionsTestJSON-1255250657] Lock "xenstore-c1501aed-5580-4a88-bf3b-0761bd0b186e" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:41:07.333 DEBUG oslo_concurrency.lockutils [req-dd55a0a7-8100-4d29-b040-313179e8137b tempest-InstanceActionsTestJSON-1666957028 tempest-InstanceActionsTestJSON-1255250657] Lock "xenstore-c1501aed-5580-4a88-bf3b-0761bd0b186e" released by "update_hostname" :: held 0.006s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:41:07.334 DEBUG nova.virt.xenapi.vmops [req-dd55a0a7-8100-4d29-b040-313179e8137b tempest-InstanceActionsTestJSON-1666957028 tempest-InstanceActionsTestJSON-1255250657] [instance: c1501aed-5580-4a88-bf3b-0761bd0b186e] Updating progress to 90 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:41:07.551 DEBUG nova.virt.xenapi.vmops [req-dd55a0a7-8100-4d29-b040-313179e8137b tempest-InstanceActionsTestJSON-1666957028 tempest-InstanceActionsTestJSON-1255250657] [instance: c1501aed-5580-4a88-bf3b-0761bd0b186e] Updating progress to 100 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:41:07.766 DEBUG nova.virt.xenapi.vm_utils [req-45dc25ee-efb7-4330-b978-4f985e9e8ac2 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] Cloned VDI OpaqueRef:70192f66-c7ec-887f-4800-d11fd88ea524 from VDI OpaqueRef:f392c64e-7886-a0cc-d3a0-780674167307 _clone_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:571 2015-08-07 17:41:07.817 DEBUG nova.compute.manager [req-dd55a0a7-8100-4d29-b040-313179e8137b tempest-InstanceActionsTestJSON-1666957028 tempest-InstanceActionsTestJSON-1255250657] [instance: c1501aed-5580-4a88-bf3b-0761bd0b186e] Checking state _get_power_state /opt/stack/new/nova/nova/compute/manager.py:1296 2015-08-07 17:41:08.255 DEBUG oslo_concurrency.lockutils [req-dd55a0a7-8100-4d29-b040-313179e8137b tempest-InstanceActionsTestJSON-1666957028 tempest-InstanceActionsTestJSON-1255250657] Lock "c1501aed-5580-4a88-bf3b-0761bd0b186e" released by "_locked_do_build_and_run_instance" :: held 28.678s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:41:08.522 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:41:08.523 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Cleaning up deleted instances _run_pending_deletes /opt/stack/new/nova/nova/compute/manager.py:6276 2015-08-07 17:41:08.612 DEBUG oslo_concurrency.lockutils [req-45dc25ee-efb7-4330-b978-4f985e9e8ac2 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" 
released by "_create_cached_image_impl" :: held 19.291s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:41:08.613 INFO nova.virt.xenapi.vm_utils [req-45dc25ee-efb7-4330-b978-4f985e9e8ac2 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] Image creation data, cacheable: True, downloaded: False duration: 19.30 secs for image a69d8d55-1745-492f-be26-cc75d64fc94d 2015-08-07 17:41:08.772 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] There are 11 instances to clean _run_pending_deletes /opt/stack/new/nova/nova/compute/manager.py:6285 2015-08-07 17:41:08.772 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: 7fface03-3fde-4610-ae05-ed86066e44da] Instance has had 0 of 5 cleanup attempts _run_pending_deletes /opt/stack/new/nova/nova/compute/manager.py:6293 2015-08-07 17:41:08.987 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: fb35bfe9-6df7-4eee-8e97-1d149632b872] Instance has had 0 of 5 cleanup attempts _run_pending_deletes /opt/stack/new/nova/nova/compute/manager.py:6293 2015-08-07 17:41:09.180 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: c1e3b756-49fe-4e6c-9016-c4eb2e06a2cd] Instance has had 0 of 5 cleanup attempts _run_pending_deletes /opt/stack/new/nova/nova/compute/manager.py:6293 2015-08-07 17:41:09.418 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: 6b6cb831-76dd-4428-ab66-c020997bc153] Instance has had 0 of 5 cleanup attempts _run_pending_deletes /opt/stack/new/nova/nova/compute/manager.py:6293 2015-08-07 17:41:09.569 DEBUG nova.virt.xenapi.vmops [req-45dc25ee-efb7-4330-b978-4f985e9e8ac2 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] [instance: 31a2fd34-bbcb-4b50-83e0-dc6c7369b479] Updating progress to 18 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:41:09.647 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: 717eb8d1-816f-4ae1-9e4e-fecf5e9205ae] Instance has had 0 of 5 cleanup attempts _run_pending_deletes /opt/stack/new/nova/nova/compute/manager.py:6293 2015-08-07 17:41:09.776 DEBUG nova.virt.xenapi.vmops [req-45dc25ee-efb7-4330-b978-4f985e9e8ac2 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] [instance: 31a2fd34-bbcb-4b50-83e0-dc6c7369b479] Updating progress to 27 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:41:09.906 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: 5ac604dd-d88a-4813-8a3b-bd37a907eee7] Instance has had 0 of 5 cleanup attempts _run_pending_deletes /opt/stack/new/nova/nova/compute/manager.py:6293 2015-08-07 17:41:10.069 DEBUG nova.virt.xenapi.vmops [req-45dc25ee-efb7-4330-b978-4f985e9e8ac2 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] [instance: 31a2fd34-bbcb-4b50-83e0-dc6c7369b479] Using PV kernel: True _create_vm_record /opt/stack/new/nova/nova/virt/xenapi/vmops.py:679 2015-08-07 17:41:10.099 DEBUG nova.virt.xenapi.vm_utils [req-45dc25ee-efb7-4330-b978-4f985e9e8ac2 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] [instance: 31a2fd34-bbcb-4b50-83e0-dc6c7369b479] Created VM create_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:312 2015-08-07 17:41:10.100 DEBUG nova.virt.xenapi.vmops 
[req-45dc25ee-efb7-4330-b978-4f985e9e8ac2 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] [instance: 31a2fd34-bbcb-4b50-83e0-dc6c7369b479] Updating progress to 36 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:41:10.110 INFO nova.compute.manager [req-5bd9d43b-5785-493f-b58d-635a92ded948 tempest-InstanceActionsTestJSON-1666957028 tempest-InstanceActionsTestJSON-1255250657] [instance: c1501aed-5580-4a88-bf3b-0761bd0b186e] Rebooting instance 2015-08-07 17:41:10.145 DEBUG oslo_concurrency.lockutils [req-5bd9d43b-5785-493f-b58d-635a92ded948 tempest-InstanceActionsTestJSON-1666957028 tempest-InstanceActionsTestJSON-1255250657] Acquired semaphore "refresh_cache-c1501aed-5580-4a88-bf3b-0761bd0b186e" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:197 2015-08-07 17:41:10.164 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: c864db01-8fad-4d62-9c8d-652d30dbaf6e] Instance has had 0 of 5 cleanup attempts _run_pending_deletes /opt/stack/new/nova/nova/compute/manager.py:6293 2015-08-07 17:41:10.288 DEBUG nova.network.base_api [req-5bd9d43b-5785-493f-b58d-635a92ded948 tempest-InstanceActionsTestJSON-1666957028 tempest-InstanceActionsTestJSON-1255250657] [instance: c1501aed-5580-4a88-bf3b-0761bd0b186e] Updating instance_info_cache with network_info: [VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.7'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:a1:63:53', 'active': False, 'type': u'bridge', 'id': u'913ea4c4-3eb3-44a6-957b-6cea306846b4', 'qbg_params': None})] update_instance_cache_with_nw_info /opt/stack/new/nova/nova/network/base_api.py:43 2015-08-07 17:41:10.332 DEBUG nova.virt.xenapi.vmops [req-45dc25ee-efb7-4330-b978-4f985e9e8ac2 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] [instance: 31a2fd34-bbcb-4b50-83e0-dc6c7369b479] auto_disk_config value not found inrescue image_properties. Setting value to False _attach_disks /opt/stack/new/nova/nova/virt/xenapi/vmops.py:714 2015-08-07 17:41:10.333 DEBUG nova.virt.xenapi.vm_utils [req-45dc25ee-efb7-4330-b978-4f985e9e8ac2 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] Creating disk-type VBD for VM OpaqueRef:d08d400b-bcc3-e8d0-1fd4-eb6bf77451ce, VDI OpaqueRef:70192f66-c7ec-887f-4800-d11fd88ea524 ... 
create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:41:10.337 DEBUG oslo_concurrency.lockutils [req-5bd9d43b-5785-493f-b58d-635a92ded948 tempest-InstanceActionsTestJSON-1666957028 tempest-InstanceActionsTestJSON-1255250657] Releasing semaphore "refresh_cache-c1501aed-5580-4a88-bf3b-0761bd0b186e" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:210 2015-08-07 17:41:10.339 DEBUG nova.compute.manager [req-5bd9d43b-5785-493f-b58d-635a92ded948 tempest-InstanceActionsTestJSON-1666957028 tempest-InstanceActionsTestJSON-1255250657] [instance: c1501aed-5580-4a88-bf3b-0761bd0b186e] Checking state _get_power_state /opt/stack/new/nova/nova/compute/manager.py:1296 2015-08-07 17:41:10.344 DEBUG nova.virt.xenapi.vm_utils [req-45dc25ee-efb7-4330-b978-4f985e9e8ac2 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] Created VBD OpaqueRef:11995d5f-43fa-a3ad-4ff9-a58357f1fa3f for VM OpaqueRef:d08d400b-bcc3-e8d0-1fd4-eb6bf77451ce, VDI OpaqueRef:70192f66-c7ec-887f-4800-d11fd88ea524. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:41:10.364 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: 962e3fc3-68c5-4019-beb9-2b9f939eb511] Instance has had 0 of 5 cleanup attempts _run_pending_deletes /opt/stack/new/nova/nova/compute/manager.py:6293 2015-08-07 17:41:10.607 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: e9775bc6-34f1-465c-9ea5-54d4b3d5a076] Instance has had 0 of 5 cleanup attempts _run_pending_deletes /opt/stack/new/nova/nova/compute/manager.py:6293 2015-08-07 17:41:10.735 DEBUG nova.virt.xenapi.vm_utils [req-45dc25ee-efb7-4330-b978-4f985e9e8ac2 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] Created VDI OpaqueRef:7bba7487-2d31-7e05-848e-123673e9f665 (config-2, 67108864, False) on OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99. create_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:509 2015-08-07 17:41:10.739 DEBUG nova.virt.xenapi.vm_utils [req-45dc25ee-efb7-4330-b978-4f985e9e8ac2 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] Creating disk-type VBD for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:7bba7487-2d31-7e05-848e-123673e9f665 ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:41:10.758 DEBUG nova.virt.xenapi.vm_utils [req-45dc25ee-efb7-4330-b978-4f985e9e8ac2 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] Created VBD OpaqueRef:82e2e69d-bd47-1a4c-4294-be1328a6254a for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:7bba7487-2d31-7e05-848e-123673e9f665. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:41:10.762 DEBUG nova.virt.xenapi.vm_utils [req-45dc25ee-efb7-4330-b978-4f985e9e8ac2 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] Plugging VBD OpaqueRef:82e2e69d-bd47-1a4c-4294-be1328a6254a ... 
vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2191 2015-08-07 17:41:10.763 DEBUG oslo_concurrency.lockutils [req-45dc25ee-efb7-4330-b978-4f985e9e8ac2 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_plug" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:41:10.801 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: 5722ce9b-957b-4a66-bb52-a0f639736797] Instance has had 0 of 5 cleanup attempts _run_pending_deletes /opt/stack/new/nova/nova/compute/manager.py:6293 2015-08-07 17:41:10.982 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: f5ce5302-4adb-4ac6-be2f-b4371f1d3b62] Instance has had 0 of 5 cleanup attempts _run_pending_deletes /opt/stack/new/nova/nova/compute/manager.py:6293 2015-08-07 17:41:11.184 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 30.33 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:41:12.347 DEBUG oslo_concurrency.lockutils [req-45dc25ee-efb7-4330-b978-4f985e9e8ac2 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_plug" :: held 1.584s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:41:12.348 DEBUG nova.virt.xenapi.vm_utils [req-45dc25ee-efb7-4330-b978-4f985e9e8ac2 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] Plugging VBD OpaqueRef:82e2e69d-bd47-1a4c-4294-be1328a6254a done. vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2194 2015-08-07 17:41:12.351 DEBUG nova.virt.xenapi.vm_utils [req-45dc25ee-efb7-4330-b978-4f985e9e8ac2 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] VBD OpaqueRef:82e2e69d-bd47-1a4c-4294-be1328a6254a plugged as xvdc vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2197 2015-08-07 17:41:12.435 WARNING nova.virt.configdrive [req-45dc25ee-efb7-4330-b978-4f985e9e8ac2 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] The setting "always" will be deprecated in the Liberty version. 
Please use "True" instead 2015-08-07 17:41:12.437 DEBUG nova.objects.instance [req-45dc25ee-efb7-4330-b978-4f985e9e8ac2 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] Lazy-loading `ec2_ids' on Instance uuid 31a2fd34-bbcb-4b50-83e0-dc6c7369b479 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:41:12.473 DEBUG oslo_concurrency.processutils [req-45dc25ee-efb7-4330-b978-4f985e9e8ac2 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] Running cmd (subprocess): genisoimage -o /tmp/tmpYrjcRD/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmpp6K8Sg execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:41:12.569 DEBUG oslo_concurrency.processutils [req-45dc25ee-efb7-4330-b978-4f985e9e8ac2 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] CMD "genisoimage -o /tmp/tmpYrjcRD/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmpp6K8Sg" returned: 0 in 0.096s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:41:12.575 DEBUG oslo_concurrency.processutils [req-45dc25ee-efb7-4330-b978-4f985e9e8ac2 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmpYrjcRD/configdrive of=/dev/xvdc oflag=direct,sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:41:14.845 DEBUG nova.compute.manager [req-850ec07f-4ba8-45e5-82e4-fff4920081ca tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Checking state _get_power_state /opt/stack/new/nova/nova/compute/manager.py:1296 2015-08-07 17:41:15.039 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.92 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:41:17.254 INFO nova.compute.manager [req-37320d9c-8dc7-49e7-a2da-29b20629b234 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Pausing 2015-08-07 17:41:18.288 DEBUG oslo_concurrency.processutils [req-45dc25ee-efb7-4330-b978-4f985e9e8ac2 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmpYrjcRD/configdrive of=/dev/xvdc oflag=direct,sync" returned: 0 in 5.714s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:41:18.290 DEBUG oslo_concurrency.processutils [req-45dc25ee-efb7-4330-b978-4f985e9e8ac2 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:41:18.369 DEBUG nova.compute.manager [req-37320d9c-8dc7-49e7-a2da-29b20629b234 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Checking state _get_power_state /opt/stack/new/nova/nova/compute/manager.py:1296 2015-08-07 17:41:18.708 DEBUG oslo_concurrency.processutils [req-45dc25ee-efb7-4330-b978-4f985e9e8ac2 
tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf sync" returned: 0 in 0.418s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:41:18.709 DEBUG nova.virt.xenapi.vm_utils [req-45dc25ee-efb7-4330-b978-4f985e9e8ac2 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] Destroying VBD for VDI OpaqueRef:7bba7487-2d31-7e05-848e-123673e9f665 ... vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2207 2015-08-07 17:41:18.710 DEBUG oslo_concurrency.lockutils [req-45dc25ee-efb7-4330-b978-4f985e9e8ac2 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_unplug" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:41:19.638 DEBUG oslo_concurrency.lockutils [req-45dc25ee-efb7-4330-b978-4f985e9e8ac2 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_unplug" :: held 0.928s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:41:19.646 DEBUG nova.virt.xenapi.vm_utils [req-45dc25ee-efb7-4330-b978-4f985e9e8ac2 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] Destroying VBD for VDI OpaqueRef:7bba7487-2d31-7e05-848e-123673e9f665 done. vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2215 2015-08-07 17:41:19.646 DEBUG nova.virt.xenapi.vm_utils [req-45dc25ee-efb7-4330-b978-4f985e9e8ac2 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] Creating disk-type VBD for VM OpaqueRef:d08d400b-bcc3-e8d0-1fd4-eb6bf77451ce, VDI OpaqueRef:7bba7487-2d31-7e05-848e-123673e9f665 ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:41:19.655 DEBUG nova.virt.xenapi.vm_utils [req-45dc25ee-efb7-4330-b978-4f985e9e8ac2 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] Created VBD OpaqueRef:6a45534e-d232-0f80-000e-a904c4ff4057 for VM OpaqueRef:d08d400b-bcc3-e8d0-1fd4-eb6bf77451ce, VDI OpaqueRef:7bba7487-2d31-7e05-848e-123673e9f665. 
create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:41:19.656 DEBUG nova.objects.instance [req-45dc25ee-efb7-4330-b978-4f985e9e8ac2 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] Lazy-loading `pci_devices' on Instance uuid 31a2fd34-bbcb-4b50-83e0-dc6c7369b479 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:41:19.785 DEBUG nova.virt.xenapi.vmops [req-45dc25ee-efb7-4330-b978-4f985e9e8ac2 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] [instance: 31a2fd34-bbcb-4b50-83e0-dc6c7369b479] Updating progress to 45 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:41:20.008 DEBUG oslo_concurrency.lockutils [req-45dc25ee-efb7-4330-b978-4f985e9e8ac2 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] Lock "xenstore-31a2fd34-bbcb-4b50-83e0-dc6c7369b479" acquired by "store_meta" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:41:20.010 DEBUG oslo_concurrency.lockutils [req-45dc25ee-efb7-4330-b978-4f985e9e8ac2 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] Lock "xenstore-31a2fd34-bbcb-4b50-83e0-dc6c7369b479" released by "store_meta" :: held 0.001s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:41:20.010 DEBUG oslo_concurrency.lockutils [req-45dc25ee-efb7-4330-b978-4f985e9e8ac2 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] Lock "xenstore-31a2fd34-bbcb-4b50-83e0-dc6c7369b479" acquired by "store_auto_disk_config" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:41:20.021 DEBUG oslo_concurrency.lockutils [req-45dc25ee-efb7-4330-b978-4f985e9e8ac2 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] Lock "xenstore-31a2fd34-bbcb-4b50-83e0-dc6c7369b479" released by "store_auto_disk_config" :: held 0.011s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:41:20.022 DEBUG nova.virt.xenapi.vmops [req-45dc25ee-efb7-4330-b978-4f985e9e8ac2 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] [instance: 31a2fd34-bbcb-4b50-83e0-dc6c7369b479] Injecting hostname (RESCUE-tempest.common.compute-instance-1670852847) into xenstore _inject_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1940 2015-08-07 17:41:20.023 DEBUG oslo_concurrency.lockutils [req-45dc25ee-efb7-4330-b978-4f985e9e8ac2 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] Lock "xenstore-31a2fd34-bbcb-4b50-83e0-dc6c7369b479" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:41:20.034 DEBUG oslo_concurrency.lockutils [req-45dc25ee-efb7-4330-b978-4f985e9e8ac2 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] Lock "xenstore-31a2fd34-bbcb-4b50-83e0-dc6c7369b479" released by "update_hostname" :: held 0.011s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:41:20.035 DEBUG nova.virt.xenapi.vmops [req-45dc25ee-efb7-4330-b978-4f985e9e8ac2 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] [instance: 31a2fd34-bbcb-4b50-83e0-dc6c7369b479] Injecting network info to xenstore inject_network_info /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1869 
2015-08-07 17:41:20.036 DEBUG oslo_concurrency.lockutils [req-45dc25ee-efb7-4330-b978-4f985e9e8ac2 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] Lock "xenstore-31a2fd34-bbcb-4b50-83e0-dc6c7369b479" acquired by "update_nwinfo" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:41:20.115 INFO nova.compute.manager [req-0a21358c-18d1-4c52-96bf-5967f897b606 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Unpausing 2015-08-07 17:41:20.230 DEBUG oslo_concurrency.lockutils [req-45dc25ee-efb7-4330-b978-4f985e9e8ac2 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] Lock "xenstore-31a2fd34-bbcb-4b50-83e0-dc6c7369b479" released by "update_nwinfo" :: held 0.194s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:41:20.231 DEBUG nova.virt.xenapi.vmops [req-45dc25ee-efb7-4330-b978-4f985e9e8ac2 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] [instance: 31a2fd34-bbcb-4b50-83e0-dc6c7369b479] Updating progress to 55 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:41:20.269 DEBUG nova.compute.manager [req-0a21358c-18d1-4c52-96bf-5967f897b606 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Checking state _get_power_state /opt/stack/new/nova/nova/compute/manager.py:1296 2015-08-07 17:41:20.412 DEBUG nova.virt.xenapi.vmops [req-45dc25ee-efb7-4330-b978-4f985e9e8ac2 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] [instance: 31a2fd34-bbcb-4b50-83e0-dc6c7369b479] Creating vifs _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1892 2015-08-07 17:41:20.421 DEBUG nova.virt.xenapi.vmops [req-45dc25ee-efb7-4330-b978-4f985e9e8ac2 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] [instance: 31a2fd34-bbcb-4b50-83e0-dc6c7369b479] Creating VIF for network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1902 2015-08-07 17:41:20.431 DEBUG nova.virt.xenapi.vmops [req-45dc25ee-efb7-4330-b978-4f985e9e8ac2 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] [instance: 31a2fd34-bbcb-4b50-83e0-dc6c7369b479] Created VIF OpaqueRef:076ee6c3-05e4-4446-7c9b-d202b72ebcd2, network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1906 2015-08-07 17:41:20.432 DEBUG nova.virt.xenapi.vmops [req-45dc25ee-efb7-4330-b978-4f985e9e8ac2 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] [instance: 31a2fd34-bbcb-4b50-83e0-dc6c7369b479] Updating progress to 64 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:41:20.614 DEBUG nova.virt.xenapi.vm_utils [req-45dc25ee-efb7-4330-b978-4f985e9e8ac2 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] Creating disk-type VBD for VM OpaqueRef:d08d400b-bcc3-e8d0-1fd4-eb6bf77451ce, VDI OpaqueRef:f6b24351-b041-bd93-89de-b6c3e6dd37ea ... 
create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:41:20.623 DEBUG nova.virt.xenapi.vm_utils [req-45dc25ee-efb7-4330-b978-4f985e9e8ac2 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] Created VBD OpaqueRef:9b46d911-96b7-38fb-8000-f01537aff434 for VM OpaqueRef:d08d400b-bcc3-e8d0-1fd4-eb6bf77451ce, VDI OpaqueRef:f6b24351-b041-bd93-89de-b6c3e6dd37ea. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:41:20.624 DEBUG nova.virt.xenapi.vmops [req-45dc25ee-efb7-4330-b978-4f985e9e8ac2 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] [instance: 31a2fd34-bbcb-4b50-83e0-dc6c7369b479] Updating progress to 73 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:41:20.795 DEBUG nova.virt.xenapi.vmops [req-45dc25ee-efb7-4330-b978-4f985e9e8ac2 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] [instance: 31a2fd34-bbcb-4b50-83e0-dc6c7369b479] Starting instance _start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:327 2015-08-07 17:41:22.151 INFO nova.compute.manager [req-a36b9c2e-d956-4cc7-b216-003f52d4a135 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Rebooting instance 2015-08-07 17:41:22.183 DEBUG oslo_concurrency.lockutils [req-a36b9c2e-d956-4cc7-b216-003f52d4a135 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Acquired semaphore "refresh_cache-aa0c0819-08a6-4a79-93eb-f58885697f5b" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:197 2015-08-07 17:41:22.541 DEBUG nova.network.base_api [req-a36b9c2e-d956-4cc7-b216-003f52d4a135 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Updating instance_info_cache with network_info: [VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.2'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:d0:30:ae', 'active': False, 'type': u'bridge', 'id': u'f50b3fec-3100-4377-bded-e2cd8fa38482', 'qbg_params': None})] update_instance_cache_with_nw_info /opt/stack/new/nova/nova/network/base_api.py:43 2015-08-07 17:41:22.570 DEBUG oslo_concurrency.lockutils [req-a36b9c2e-d956-4cc7-b216-003f52d4a135 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Releasing semaphore "refresh_cache-aa0c0819-08a6-4a79-93eb-f58885697f5b" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:210 2015-08-07 17:41:22.571 DEBUG nova.compute.manager 
[req-a36b9c2e-d956-4cc7-b216-003f52d4a135 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Checking state _get_power_state /opt/stack/new/nova/nova/compute/manager.py:1296 2015-08-07 17:41:23.416 DEBUG nova.compute.manager [req-5bd9d43b-5785-493f-b58d-635a92ded948 tempest-InstanceActionsTestJSON-1666957028 tempest-InstanceActionsTestJSON-1255250657] [instance: c1501aed-5580-4a88-bf3b-0761bd0b186e] Checking state _get_power_state /opt/stack/new/nova/nova/compute/manager.py:1296 2015-08-07 17:41:25.094 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.87 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:41:25.274 DEBUG oslo_concurrency.lockutils [req-ffede9dc-7366-4b96-9d6f-7e200a076c10 tempest-InstanceActionsTestJSON-1666957028 tempest-InstanceActionsTestJSON-1255250657] Lock "c1501aed-5580-4a88-bf3b-0761bd0b186e" acquired by "do_terminate_instance" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:41:25.275 DEBUG oslo_concurrency.lockutils [req-ffede9dc-7366-4b96-9d6f-7e200a076c10 tempest-InstanceActionsTestJSON-1666957028 tempest-InstanceActionsTestJSON-1255250657] Lock "c1501aed-5580-4a88-bf3b-0761bd0b186e-events" acquired by "_clear_events" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:41:25.275 DEBUG oslo_concurrency.lockutils [req-ffede9dc-7366-4b96-9d6f-7e200a076c10 tempest-InstanceActionsTestJSON-1666957028 tempest-InstanceActionsTestJSON-1255250657] Lock "c1501aed-5580-4a88-bf3b-0761bd0b186e-events" released by "_clear_events" :: held 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:41:25.277 INFO nova.compute.manager [req-ffede9dc-7366-4b96-9d6f-7e200a076c10 tempest-InstanceActionsTestJSON-1666957028 tempest-InstanceActionsTestJSON-1255250657] [instance: c1501aed-5580-4a88-bf3b-0761bd0b186e] Terminating instance 2015-08-07 17:41:25.280 INFO nova.virt.xenapi.vmops [req-ffede9dc-7366-4b96-9d6f-7e200a076c10 tempest-InstanceActionsTestJSON-1666957028 tempest-InstanceActionsTestJSON-1255250657] [instance: c1501aed-5580-4a88-bf3b-0761bd0b186e] Destroying VM 2015-08-07 17:41:25.293 DEBUG nova.virt.xenapi.vm_utils [req-ffede9dc-7366-4b96-9d6f-7e200a076c10 tempest-InstanceActionsTestJSON-1666957028 tempest-InstanceActionsTestJSON-1255250657] [instance: c1501aed-5580-4a88-bf3b-0761bd0b186e] Shutting down VM (hard) hard_shutdown_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:348 2015-08-07 17:41:28.490 DEBUG nova.virt.xenapi.vmops [req-ffede9dc-7366-4b96-9d6f-7e200a076c10 tempest-InstanceActionsTestJSON-1666957028 tempest-InstanceActionsTestJSON-1255250657] [instance: c1501aed-5580-4a88-bf3b-0761bd0b186e] Destroying VDIs _destroy_vdis /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1428 2015-08-07 17:41:28.497 DEBUG nova.virt.xenapi.vm_utils [req-ffede9dc-7366-4b96-9d6f-7e200a076c10 tempest-InstanceActionsTestJSON-1666957028 tempest-InstanceActionsTestJSON-1255250657] VDI 0613339c-d5dc-404f-8abd-819535c87ba7 is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:41:28.512 DEBUG nova.virt.xenapi.vm_utils [req-ffede9dc-7366-4b96-9d6f-7e200a076c10 tempest-InstanceActionsTestJSON-1666957028 tempest-InstanceActionsTestJSON-1255250657] VDI c68c393c-4183-4641-b4cc-2f9712c15a0b is still available 
lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:41:29.324 DEBUG nova.virt.xenapi.vmops [req-45dc25ee-efb7-4330-b978-4f985e9e8ac2 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] [instance: 31a2fd34-bbcb-4b50-83e0-dc6c7369b479] Waiting for instance state to become running _wait_for_instance_to_start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:777 2015-08-07 17:41:29.351 DEBUG nova.virt.xenapi.vmops [req-45dc25ee-efb7-4330-b978-4f985e9e8ac2 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] [instance: 31a2fd34-bbcb-4b50-83e0-dc6c7369b479] Updating progress to 82 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:41:29.518 DEBUG nova.virt.xenapi.vmops [req-45dc25ee-efb7-4330-b978-4f985e9e8ac2 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] [instance: 31a2fd34-bbcb-4b50-83e0-dc6c7369b479] Skip agent setup, not enabled. _configure_new_instance_with_agent /opt/stack/new/nova/nova/virt/xenapi/vmops.py:788 2015-08-07 17:41:29.518 DEBUG nova.virt.xenapi.vmops [req-45dc25ee-efb7-4330-b978-4f985e9e8ac2 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] [instance: 31a2fd34-bbcb-4b50-83e0-dc6c7369b479] Removing hostname from xenstore _remove_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1949 2015-08-07 17:41:29.519 DEBUG oslo_concurrency.lockutils [req-45dc25ee-efb7-4330-b978-4f985e9e8ac2 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] Lock "xenstore-31a2fd34-bbcb-4b50-83e0-dc6c7369b479" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:41:29.525 DEBUG oslo_concurrency.lockutils [req-45dc25ee-efb7-4330-b978-4f985e9e8ac2 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] Lock "xenstore-31a2fd34-bbcb-4b50-83e0-dc6c7369b479" released by "update_hostname" :: held 0.006s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:41:29.526 DEBUG nova.virt.xenapi.vmops [req-45dc25ee-efb7-4330-b978-4f985e9e8ac2 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] [instance: 31a2fd34-bbcb-4b50-83e0-dc6c7369b479] Updating progress to 91 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:41:29.598 DEBUG nova.virt.xenapi.vmops [req-ffede9dc-7366-4b96-9d6f-7e200a076c10 tempest-InstanceActionsTestJSON-1666957028 tempest-InstanceActionsTestJSON-1255250657] [instance: c1501aed-5580-4a88-bf3b-0761bd0b186e] Using RAW or VHD, skipping kernel and ramdisk deletion _destroy_kernel_ramdisk /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1456 2015-08-07 17:41:29.615 DEBUG nova.virt.xenapi.vm_utils [req-ffede9dc-7366-4b96-9d6f-7e200a076c10 tempest-InstanceActionsTestJSON-1666957028 tempest-InstanceActionsTestJSON-1255250657] [instance: c1501aed-5580-4a88-bf3b-0761bd0b186e] VM destroyed destroy_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:324 2015-08-07 17:41:29.616 DEBUG nova.compute.manager [req-ffede9dc-7366-4b96-9d6f-7e200a076c10 tempest-InstanceActionsTestJSON-1666957028 tempest-InstanceActionsTestJSON-1255250657] [instance: c1501aed-5580-4a88-bf3b-0761bd0b186e] Deallocating network for instance _deallocate_network /opt/stack/new/nova/nova/compute/manager.py:1792 2015-08-07 17:41:29.700 DEBUG nova.virt.xenapi.vmops [req-45dc25ee-efb7-4330-b978-4f985e9e8ac2 
tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] [instance: 31a2fd34-bbcb-4b50-83e0-dc6c7369b479] Updating progress to 100 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:41:29.948 DEBUG nova.compute.manager [req-45dc25ee-efb7-4330-b978-4f985e9e8ac2 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] [instance: 31a2fd34-bbcb-4b50-83e0-dc6c7369b479] Checking state _get_power_state /opt/stack/new/nova/nova/compute/manager.py:1296 2015-08-07 17:41:31.234 DEBUG nova.compute.manager [req-ffede9dc-7366-4b96-9d6f-7e200a076c10 tempest-InstanceActionsTestJSON-1666957028 tempest-InstanceActionsTestJSON-1255250657] [instance: c1501aed-5580-4a88-bf3b-0761bd0b186e] terminating bdm BlockDeviceMapping(boot_index=0,connection_info=None,created_at=2015-08-07T17:40:39Z,delete_on_termination=True,deleted=False,deleted_at=None,destination_type='local',device_name='/dev/xvda',device_type='disk',disk_bus=None,guest_format=None,id=55,image_id='a69d8d55-1745-492f-be26-cc75d64fc94d',instance=,instance_uuid=c1501aed-5580-4a88-bf3b-0761bd0b186e,no_device=False,snapshot_id=None,source_type='image',updated_at=2015-08-07T17:40:40Z,volume_id=None,volume_size=None) _cleanup_volumes /opt/stack/new/nova/nova/compute/manager.py:2272 2015-08-07 17:41:31.386 INFO nova.compute.manager [req-222b0163-2b17-4f0b-8c71-0b8bf95e92e0 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] [instance: 31a2fd34-bbcb-4b50-83e0-dc6c7369b479] Unrescuing 2015-08-07 17:41:31.387 DEBUG oslo_concurrency.lockutils [req-222b0163-2b17-4f0b-8c71-0b8bf95e92e0 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] Acquired semaphore "refresh_cache-31a2fd34-bbcb-4b50-83e0-dc6c7369b479" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:197 2015-08-07 17:41:31.457 DEBUG oslo_concurrency.lockutils [req-ffede9dc-7366-4b96-9d6f-7e200a076c10 tempest-InstanceActionsTestJSON-1666957028 tempest-InstanceActionsTestJSON-1255250657] Lock "compute_resources" acquired by "update_usage" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:41:31.458 DEBUG nova.objects.instance [req-ffede9dc-7366-4b96-9d6f-7e200a076c10 tempest-InstanceActionsTestJSON-1666957028 tempest-InstanceActionsTestJSON-1255250657] Lazy-loading `numa_topology' on Instance uuid c1501aed-5580-4a88-bf3b-0761bd0b186e obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:41:31.552 DEBUG nova.network.base_api [req-222b0163-2b17-4f0b-8c71-0b8bf95e92e0 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] [instance: 31a2fd34-bbcb-4b50-83e0-dc6c7369b479] Updating instance_info_cache with network_info: [VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.5'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, 
u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:24:e9:e2', 'active': False, 'type': u'bridge', 'id': u'dd289fd2-27ae-4c1b-8c34-99f36829faef', 'qbg_params': None})] update_instance_cache_with_nw_info /opt/stack/new/nova/nova/network/base_api.py:43 2015-08-07 17:41:31.557 DEBUG oslo_concurrency.lockutils [req-ffede9dc-7366-4b96-9d6f-7e200a076c10 tempest-InstanceActionsTestJSON-1666957028 tempest-InstanceActionsTestJSON-1255250657] Lock "compute_resources" released by "update_usage" :: held 0.100s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:41:31.587 DEBUG oslo_concurrency.lockutils [req-222b0163-2b17-4f0b-8c71-0b8bf95e92e0 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] Releasing semaphore "refresh_cache-31a2fd34-bbcb-4b50-83e0-dc6c7369b479" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:210 2015-08-07 17:41:31.878 DEBUG nova.compute.manager [req-a36b9c2e-d956-4cc7-b216-003f52d4a135 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Checking state _get_power_state /opt/stack/new/nova/nova/compute/manager.py:1296 2015-08-07 17:41:31.900 DEBUG oslo_concurrency.lockutils [req-ffede9dc-7366-4b96-9d6f-7e200a076c10 tempest-InstanceActionsTestJSON-1666957028 tempest-InstanceActionsTestJSON-1255250657] Lock "c1501aed-5580-4a88-bf3b-0761bd0b186e" released by "do_terminate_instance" :: held 6.626s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:41:34.512 INFO nova.compute.manager [req-741f75d6-a6f5-4153-8ef2-5641f0b9225d tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Rebuilding instance 2015-08-07 17:41:34.592 DEBUG nova.compute.manager [req-741f75d6-a6f5-4153-8ef2-5641f0b9225d tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Checking state _get_power_state /opt/stack/new/nova/nova/compute/manager.py:1296 2015-08-07 17:41:34.778 INFO nova.virt.xenapi.vmops [req-741f75d6-a6f5-4153-8ef2-5641f0b9225d tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Destroying VM 2015-08-07 17:41:34.798 DEBUG nova.virt.xenapi.vm_utils [req-741f75d6-a6f5-4153-8ef2-5641f0b9225d tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Shutting down VM (hard) hard_shutdown_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:348 2015-08-07 17:41:34.907 DEBUG nova.virt.xenapi.vm_utils [req-222b0163-2b17-4f0b-8c71-0b8bf95e92e0 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] VDI 1853433f-b4c3-461f-8a25-f2fb4140fafd is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:41:34.919 DEBUG nova.virt.xenapi.vm_utils [req-222b0163-2b17-4f0b-8c71-0b8bf95e92e0 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] VDI 07a09f9f-e1bc-4573-ae1a-4a3ef812dc3e is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:41:34.928 DEBUG 
nova.virt.xenapi.vm_utils [req-222b0163-2b17-4f0b-8c71-0b8bf95e92e0 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] VDI f2a8528d-921a-476d-9e2a-7625f31d91f7 is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:41:35.031 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.93 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:41:35.714 DEBUG nova.virt.xenapi.vmops [req-222b0163-2b17-4f0b-8c71-0b8bf95e92e0 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] [instance: 31a2fd34-bbcb-4b50-83e0-dc6c7369b479] Starting instance _start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:327 2015-08-07 17:41:37.288 DEBUG nova.virt.xenapi.vmops [req-741f75d6-a6f5-4153-8ef2-5641f0b9225d tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Destroying VDIs _destroy_vdis /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1428 2015-08-07 17:41:37.298 DEBUG nova.virt.xenapi.vm_utils [req-741f75d6-a6f5-4153-8ef2-5641f0b9225d tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] VDI c08c7d45-312b-456b-a179-b719bc1f0fe6 is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:41:37.306 DEBUG nova.virt.xenapi.vm_utils [req-741f75d6-a6f5-4153-8ef2-5641f0b9225d tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] VDI d096a8bd-3d0c-4ece-91ed-aa9f1f8917cc is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:41:37.999 DEBUG nova.virt.xenapi.vmops [req-741f75d6-a6f5-4153-8ef2-5641f0b9225d tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Using RAW or VHD, skipping kernel and ramdisk deletion _destroy_kernel_ramdisk /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1456 2015-08-07 17:41:38.014 DEBUG nova.virt.xenapi.vm_utils [req-741f75d6-a6f5-4153-8ef2-5641f0b9225d tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] VM destroyed destroy_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:324 2015-08-07 17:41:38.037 DEBUG oslo_concurrency.lockutils [req-eb0b6ce2-d9b3-4338-b96b-2e1b69b42496 tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] Lock "393149d8-bc0d-4f72-afbc-954d8344f5e5" acquired by "_locked_do_build_and_run_instance" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:41:38.101 INFO nova.compute.manager [req-eb0b6ce2-d9b3-4338-b96b-2e1b69b42496 tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] [instance: 393149d8-bc0d-4f72-afbc-954d8344f5e5] Starting instance... 
2015-08-07 17:41:38.315 DEBUG oslo_concurrency.lockutils [req-eb0b6ce2-d9b3-4338-b96b-2e1b69b42496 tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] Lock "compute_resources" acquired by "instance_claim" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:41:38.316 DEBUG nova.compute.resource_tracker [req-eb0b6ce2-d9b3-4338-b96b-2e1b69b42496 tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] Memory overhead for 64 MB instance; 5 MB instance_claim /opt/stack/new/nova/nova/compute/resource_tracker.py:126 2015-08-07 17:41:38.324 INFO nova.compute.claims [req-eb0b6ce2-d9b3-4338-b96b-2e1b69b42496 tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] [instance: 393149d8-bc0d-4f72-afbc-954d8344f5e5] Attempting claim: memory 69 MB, disk 0 GB 2015-08-07 17:41:38.324 INFO nova.compute.claims [req-eb0b6ce2-d9b3-4338-b96b-2e1b69b42496 tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] [instance: 393149d8-bc0d-4f72-afbc-954d8344f5e5] Total memory: 8187 MB, used: 788.00 MB 2015-08-07 17:41:38.325 INFO nova.compute.claims [req-eb0b6ce2-d9b3-4338-b96b-2e1b69b42496 tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] [instance: 393149d8-bc0d-4f72-afbc-954d8344f5e5] memory limit: 12280.50 MB, free: 11492.50 MB 2015-08-07 17:41:38.325 INFO nova.compute.claims [req-eb0b6ce2-d9b3-4338-b96b-2e1b69b42496 tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] [instance: 393149d8-bc0d-4f72-afbc-954d8344f5e5] Total disk: 27 GB, used: 0.00 GB 2015-08-07 17:41:38.326 INFO nova.compute.claims [req-eb0b6ce2-d9b3-4338-b96b-2e1b69b42496 tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] [instance: 393149d8-bc0d-4f72-afbc-954d8344f5e5] disk limit not specified, defaulting to unlimited 2015-08-07 17:41:38.357 DEBUG nova.compute.resources.vcpu [req-eb0b6ce2-d9b3-4338-b96b-2e1b69b42496 tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] Total CPUs: 8 VCPUs, used: 4.00 VCPUs test /opt/stack/new/nova/nova/compute/resources/vcpu.py:52 2015-08-07 17:41:38.358 DEBUG nova.compute.resources.vcpu [req-eb0b6ce2-d9b3-4338-b96b-2e1b69b42496 tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] CPUs limit not specified, defaulting to unlimited test /opt/stack/new/nova/nova/compute/resources/vcpu.py:56 2015-08-07 17:41:38.359 INFO nova.compute.claims [req-eb0b6ce2-d9b3-4338-b96b-2e1b69b42496 tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] [instance: 393149d8-bc0d-4f72-afbc-954d8344f5e5] Claim successful 2015-08-07 17:41:38.375 DEBUG nova.virt.xenapi.vmops [req-741f75d6-a6f5-4153-8ef2-5641f0b9225d tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Block device information present: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} spawn /opt/stack/new/nova/nova/virt/xenapi/vmops.py:408 2015-08-07 17:41:38.395 DEBUG nova.virt.xenapi.vm_utils [req-741f75d6-a6f5-4153-8ef2-5641f0b9225d tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Detected vhd format for image a69d8d55-1745-492f-be26-cc75d64fc94d 
determine_disk_image_type /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1633 2015-08-07 17:41:38.397 DEBUG nova.virt.xenapi.vmops [req-741f75d6-a6f5-4153-8ef2-5641f0b9225d tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Updating progress to 10 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:41:38.563 DEBUG nova.virt.xenapi.vmops [req-741f75d6-a6f5-4153-8ef2-5641f0b9225d tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] block device info: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} _connect_cinder_volumes /opt/stack/new/nova/nova/virt/xenapi/vmops.py:376 2015-08-07 17:41:38.571 DEBUG oslo_concurrency.lockutils [req-741f75d6-a6f5-4153-8ef2-5641f0b9225d tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" acquired by "_create_cached_image_impl" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:41:38.628 DEBUG oslo_concurrency.lockutils [req-eb0b6ce2-d9b3-4338-b96b-2e1b69b42496 tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] Lock "compute_resources" released by "instance_claim" :: held 0.313s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:41:38.813 DEBUG oslo_concurrency.lockutils [req-eb0b6ce2-d9b3-4338-b96b-2e1b69b42496 tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] Lock "compute_resources" acquired by "update_usage" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:41:38.896 DEBUG oslo_concurrency.lockutils [req-eb0b6ce2-d9b3-4338-b96b-2e1b69b42496 tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] Lock "compute_resources" released by "update_usage" :: held 0.083s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:41:38.897 DEBUG nova.compute.utils [req-eb0b6ce2-d9b3-4338-b96b-2e1b69b42496 tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] Using /dev/xvd instead of None get_next_device_name /opt/stack/new/nova/nova/compute/utils.py:163 2015-08-07 17:41:38.901 13318 DEBUG nova.compute.manager [-] [instance: 393149d8-bc0d-4f72-afbc-954d8344f5e5] Allocating IP information in the background. 
_allocate_network_async /opt/stack/new/nova/nova/compute/manager.py:1513 2015-08-07 17:41:38.903 13318 DEBUG oslo_concurrency.lockutils [-] Acquired semaphore "refresh_cache-393149d8-bc0d-4f72-afbc-954d8344f5e5" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:197 2015-08-07 17:41:39.379 DEBUG nova.virt.xenapi.vmops [req-eb0b6ce2-d9b3-4338-b96b-2e1b69b42496 tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] [instance: 393149d8-bc0d-4f72-afbc-954d8344f5e5] Block device information present: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} spawn /opt/stack/new/nova/nova/virt/xenapi/vmops.py:408 2015-08-07 17:41:39.398 DEBUG nova.virt.xenapi.vm_utils [req-eb0b6ce2-d9b3-4338-b96b-2e1b69b42496 tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] Detected vhd format for image a69d8d55-1745-492f-be26-cc75d64fc94d determine_disk_image_type /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1633 2015-08-07 17:41:39.398 DEBUG nova.virt.xenapi.vmops [req-eb0b6ce2-d9b3-4338-b96b-2e1b69b42496 tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] [instance: 393149d8-bc0d-4f72-afbc-954d8344f5e5] Updating progress to 10 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:41:39.485 DEBUG nova.virt.xenapi.vm_utils [req-741f75d6-a6f5-4153-8ef2-5641f0b9225d tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Cloned VDI OpaqueRef:0e81a814-9fb4-289b-6aec-d3c718679c02 from VDI OpaqueRef:f392c64e-7886-a0cc-d3a0-780674167307 _clone_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:571 2015-08-07 17:41:39.625 DEBUG nova.virt.xenapi.vmops [req-eb0b6ce2-d9b3-4338-b96b-2e1b69b42496 tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] [instance: 393149d8-bc0d-4f72-afbc-954d8344f5e5] block device info: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} _connect_cinder_volumes /opt/stack/new/nova/nova/virt/xenapi/vmops.py:376 2015-08-07 17:41:40.103 DEBUG oslo_concurrency.lockutils [req-741f75d6-a6f5-4153-8ef2-5641f0b9225d tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" released by "_create_cached_image_impl" :: held 1.533s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:41:40.104 INFO nova.virt.xenapi.vm_utils [req-741f75d6-a6f5-4153-8ef2-5641f0b9225d tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Image creation data, cacheable: True, downloaded: False duration: 1.54 secs for image a69d8d55-1745-492f-be26-cc75d64fc94d 2015-08-07 17:41:40.105 DEBUG oslo_concurrency.lockutils [req-eb0b6ce2-d9b3-4338-b96b-2e1b69b42496 tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" acquired by "_create_cached_image_impl" :: waited 0.473s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:41:40.795 13318 DEBUG nova.network.base_api [-] [instance: 393149d8-bc0d-4f72-afbc-954d8344f5e5] Updating instance_info_cache with network_info: [VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': 
[FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.7'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:a7:5b:8a', 'active': False, 'type': u'bridge', 'id': u'a8a5d25d-2d93-4664-bfc6-cf87b8377436', 'qbg_params': None})] update_instance_cache_with_nw_info /opt/stack/new/nova/nova/network/base_api.py:43 2015-08-07 17:41:40.829 13318 DEBUG oslo_concurrency.lockutils [-] Releasing semaphore "refresh_cache-393149d8-bc0d-4f72-afbc-954d8344f5e5" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:210 2015-08-07 17:41:40.830 13318 DEBUG nova.compute.manager [-] [instance: 393149d8-bc0d-4f72-afbc-954d8344f5e5] Instance network_info: |[VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.7'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:a7:5b:8a', 'active': False, 'type': u'bridge', 'id': u'a8a5d25d-2d93-4664-bfc6-cf87b8377436', 'qbg_params': None})]| _allocate_network_async /opt/stack/new/nova/nova/compute/manager.py:1531 2015-08-07 17:41:41.030 DEBUG nova.virt.xenapi.vm_utils [req-eb0b6ce2-d9b3-4338-b96b-2e1b69b42496 tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] Cloned VDI OpaqueRef:c50eef0a-4591-cf45-d571-4a605af04055 from VDI OpaqueRef:f392c64e-7886-a0cc-d3a0-780674167307 _clone_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:571 2015-08-07 17:41:41.163 DEBUG nova.virt.xenapi.vmops [req-741f75d6-a6f5-4153-8ef2-5641f0b9225d tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Updating progress to 20 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:41:41.339 DEBUG nova.virt.xenapi.vmops [req-741f75d6-a6f5-4153-8ef2-5641f0b9225d tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Updating progress to 30 
_update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:41:41.513 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:41:41.513 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:41:41.514 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:41:41.515 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 0.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:41:41.519 DEBUG nova.virt.xenapi.vmops [req-741f75d6-a6f5-4153-8ef2-5641f0b9225d tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Using PV kernel: True _create_vm_record /opt/stack/new/nova/nova/virt/xenapi/vmops.py:679 2015-08-07 17:41:41.521 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:41:41.521 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Starting heal instance info cache _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5315 2015-08-07 17:41:41.522 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Rebuilding the list of instances to heal _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5319 2015-08-07 17:41:41.530 DEBUG nova.virt.xenapi.vm_utils [req-741f75d6-a6f5-4153-8ef2-5641f0b9225d tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Created VM create_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:312 2015-08-07 17:41:41.530 DEBUG nova.virt.xenapi.vmops [req-741f75d6-a6f5-4153-8ef2-5641f0b9225d tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Updating progress to 40 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:41:41.604 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: 393149d8-bc0d-4f72-afbc-954d8344f5e5] Skipping network cache update for instance because it is Building. 
_heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5329 2015-08-07 17:41:41.606 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Acquired semaphore "refresh_cache-aa0c0819-08a6-4a79-93eb-f58885697f5b" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:197 2015-08-07 17:41:41.606 DEBUG nova.objects.instance [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lazy-loading `flavor' on Instance uuid aa0c0819-08a6-4a79-93eb-f58885697f5b obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:41:41.662 DEBUG oslo_concurrency.lockutils [req-eb0b6ce2-d9b3-4338-b96b-2e1b69b42496 tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" released by "_create_cached_image_impl" :: held 1.557s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:41:41.663 INFO nova.virt.xenapi.vm_utils [req-eb0b6ce2-d9b3-4338-b96b-2e1b69b42496 tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] Image creation data, cacheable: True, downloaded: False duration: 2.04 secs for image a69d8d55-1745-492f-be26-cc75d64fc94d 2015-08-07 17:41:41.709 DEBUG nova.virt.xenapi.vm_utils [req-741f75d6-a6f5-4153-8ef2-5641f0b9225d tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Creating disk-type VBD for VM OpaqueRef:8fa8d412-a03a-1966-25b4-eac438d86b19, VDI OpaqueRef:0e81a814-9fb4-289b-6aec-d3c718679c02 ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:41:41.717 DEBUG nova.virt.xenapi.vm_utils [req-741f75d6-a6f5-4153-8ef2-5641f0b9225d tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Created VBD OpaqueRef:e19947a6-140b-577f-ebf3-c7bbb2d4b074 for VM OpaqueRef:8fa8d412-a03a-1966-25b4-eac438d86b19, VDI OpaqueRef:0e81a814-9fb4-289b-6aec-d3c718679c02. 
create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:41:41.930 DEBUG nova.network.base_api [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Updating instance_info_cache with network_info: [VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.2'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:d0:30:ae', 'active': False, 'type': u'bridge', 'id': u'f50b3fec-3100-4377-bded-e2cd8fa38482', 'qbg_params': None})] update_instance_cache_with_nw_info /opt/stack/new/nova/nova/network/base_api.py:43 2015-08-07 17:41:41.961 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Releasing semaphore "refresh_cache-aa0c0819-08a6-4a79-93eb-f58885697f5b" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:210 2015-08-07 17:41:41.963 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Updated the network info_cache for instance _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5376 2015-08-07 17:41:41.964 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 3.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:41:42.109 DEBUG nova.virt.xenapi.vm_utils [req-741f75d6-a6f5-4153-8ef2-5641f0b9225d tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Created VDI OpaqueRef:20b6b503-e500-e37a-3066-895ed505dc6e (config-2, 67108864, False) on OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99. create_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:509 2015-08-07 17:41:42.116 DEBUG nova.virt.xenapi.vm_utils [req-741f75d6-a6f5-4153-8ef2-5641f0b9225d tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Creating disk-type VBD for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:20b6b503-e500-e37a-3066-895ed505dc6e ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:41:42.140 DEBUG nova.virt.xenapi.vm_utils [req-741f75d6-a6f5-4153-8ef2-5641f0b9225d tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Created VBD OpaqueRef:8741bc67-5c5f-a045-ec81-e83e9d7f5f21 for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:20b6b503-e500-e37a-3066-895ed505dc6e. 
create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:41:42.141 DEBUG nova.virt.xenapi.vm_utils [req-741f75d6-a6f5-4153-8ef2-5641f0b9225d tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Plugging VBD OpaqueRef:8741bc67-5c5f-a045-ec81-e83e9d7f5f21 ... vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2191 2015-08-07 17:41:42.141 DEBUG oslo_concurrency.lockutils [req-741f75d6-a6f5-4153-8ef2-5641f0b9225d tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_plug" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:41:42.409 DEBUG nova.compute.manager [req-222b0163-2b17-4f0b-8c71-0b8bf95e92e0 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] [instance: 31a2fd34-bbcb-4b50-83e0-dc6c7369b479] Checking state _get_power_state /opt/stack/new/nova/nova/compute/manager.py:1296 2015-08-07 17:41:42.482 DEBUG nova.virt.xenapi.vmops [req-eb0b6ce2-d9b3-4338-b96b-2e1b69b42496 tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] [instance: 393149d8-bc0d-4f72-afbc-954d8344f5e5] Updating progress to 20 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:41:42.688 DEBUG nova.virt.xenapi.vmops [req-eb0b6ce2-d9b3-4338-b96b-2e1b69b42496 tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] [instance: 393149d8-bc0d-4f72-afbc-954d8344f5e5] Updating progress to 30 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:41:42.907 DEBUG nova.virt.xenapi.vmops [req-eb0b6ce2-d9b3-4338-b96b-2e1b69b42496 tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] [instance: 393149d8-bc0d-4f72-afbc-954d8344f5e5] Using PV kernel: True _create_vm_record /opt/stack/new/nova/nova/virt/xenapi/vmops.py:679 2015-08-07 17:41:42.917 DEBUG nova.virt.xenapi.vm_utils [req-eb0b6ce2-d9b3-4338-b96b-2e1b69b42496 tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] [instance: 393149d8-bc0d-4f72-afbc-954d8344f5e5] Created VM create_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:312 2015-08-07 17:41:42.918 DEBUG nova.virt.xenapi.vmops [req-eb0b6ce2-d9b3-4338-b96b-2e1b69b42496 tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] [instance: 393149d8-bc0d-4f72-afbc-954d8344f5e5] Updating progress to 40 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:41:43.162 DEBUG nova.virt.xenapi.vm_utils [req-eb0b6ce2-d9b3-4338-b96b-2e1b69b42496 tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] Creating disk-type VBD for VM OpaqueRef:cf0f3138-7c0a-9ae0-6def-51f97c15a8d1, VDI OpaqueRef:c50eef0a-4591-cf45-d571-4a605af04055 ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:41:43.172 DEBUG nova.virt.xenapi.vm_utils [req-eb0b6ce2-d9b3-4338-b96b-2e1b69b42496 tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] Created VBD OpaqueRef:a42843e8-44ef-1fb3-4a65-2e971a9250f9 for VM OpaqueRef:cf0f3138-7c0a-9ae0-6def-51f97c15a8d1, VDI OpaqueRef:c50eef0a-4591-cf45-d571-4a605af04055. 
create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:41:43.449 DEBUG oslo_concurrency.lockutils [req-741f75d6-a6f5-4153-8ef2-5641f0b9225d tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_plug" :: held 1.307s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:41:43.449 DEBUG nova.virt.xenapi.vm_utils [req-741f75d6-a6f5-4153-8ef2-5641f0b9225d tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Plugging VBD OpaqueRef:8741bc67-5c5f-a045-ec81-e83e9d7f5f21 done. vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2194 2015-08-07 17:41:43.453 DEBUG nova.virt.xenapi.vm_utils [req-741f75d6-a6f5-4153-8ef2-5641f0b9225d tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] VBD OpaqueRef:8741bc67-5c5f-a045-ec81-e83e9d7f5f21 plugged as xvdc vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2197 2015-08-07 17:41:43.462 INFO nova.compute.manager [req-f81aa024-321e-4f7d-b859-f01359aa8db4 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] [instance: 31a2fd34-bbcb-4b50-83e0-dc6c7369b479] Rescuing 2015-08-07 17:41:43.463 DEBUG oslo_concurrency.lockutils [req-f81aa024-321e-4f7d-b859-f01359aa8db4 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] Acquired semaphore "refresh_cache-31a2fd34-bbcb-4b50-83e0-dc6c7369b479" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:197 2015-08-07 17:41:43.547 WARNING nova.virt.configdrive [req-741f75d6-a6f5-4153-8ef2-5641f0b9225d tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] The setting "always" will be deprecated in the Liberty version. Please use "True" instead 2015-08-07 17:41:43.547 DEBUG nova.objects.instance [req-741f75d6-a6f5-4153-8ef2-5641f0b9225d tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lazy-loading `ec2_ids' on Instance uuid aa0c0819-08a6-4a79-93eb-f58885697f5b obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:41:43.555 DEBUG nova.virt.xenapi.vm_utils [req-eb0b6ce2-d9b3-4338-b96b-2e1b69b42496 tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] Created VDI OpaqueRef:54bb28cf-8a19-cda5-b8f6-217e05e726d8 (config-2, 67108864, False) on OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99. create_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:509 2015-08-07 17:41:43.559 DEBUG nova.virt.xenapi.vm_utils [req-eb0b6ce2-d9b3-4338-b96b-2e1b69b42496 tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] Creating disk-type VBD for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:54bb28cf-8a19-cda5-b8f6-217e05e726d8 ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:41:43.570 DEBUG nova.virt.xenapi.vm_utils [req-eb0b6ce2-d9b3-4338-b96b-2e1b69b42496 tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] Created VBD OpaqueRef:f14caafd-f78d-323a-9e2e-3338ee0b0579 for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:54bb28cf-8a19-cda5-b8f6-217e05e726d8. 
create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:41:43.570 DEBUG nova.virt.xenapi.vm_utils [req-eb0b6ce2-d9b3-4338-b96b-2e1b69b42496 tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] Plugging VBD OpaqueRef:f14caafd-f78d-323a-9e2e-3338ee0b0579 ... vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2191 2015-08-07 17:41:43.571 DEBUG oslo_concurrency.lockutils [req-eb0b6ce2-d9b3-4338-b96b-2e1b69b42496 tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_plug" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:41:43.595 DEBUG oslo_concurrency.processutils [req-741f75d6-a6f5-4153-8ef2-5641f0b9225d tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Running cmd (subprocess): genisoimage -o /tmp/tmpSmdXD8/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmp331TCX execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:41:43.681 DEBUG nova.network.base_api [req-f81aa024-321e-4f7d-b859-f01359aa8db4 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] [instance: 31a2fd34-bbcb-4b50-83e0-dc6c7369b479] Updating instance_info_cache with network_info: [VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.5'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:24:e9:e2', 'active': False, 'type': u'bridge', 'id': u'dd289fd2-27ae-4c1b-8c34-99f36829faef', 'qbg_params': None})] update_instance_cache_with_nw_info /opt/stack/new/nova/nova/network/base_api.py:43 2015-08-07 17:41:43.692 DEBUG oslo_concurrency.processutils [req-741f75d6-a6f5-4153-8ef2-5641f0b9225d tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] CMD "genisoimage -o /tmp/tmpSmdXD8/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmp331TCX" returned: 0 in 0.097s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:41:43.696 DEBUG oslo_concurrency.processutils [req-741f75d6-a6f5-4153-8ef2-5641f0b9225d tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmpSmdXD8/configdrive of=/dev/xvdc oflag=direct,sync execute 
/usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:41:43.779 DEBUG oslo_concurrency.lockutils [req-f81aa024-321e-4f7d-b859-f01359aa8db4 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] Releasing semaphore "refresh_cache-31a2fd34-bbcb-4b50-83e0-dc6c7369b479" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:210 2015-08-07 17:41:43.854 DEBUG nova.virt.xenapi.vm_utils [req-f81aa024-321e-4f7d-b859-f01359aa8db4 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] [instance: 31a2fd34-bbcb-4b50-83e0-dc6c7369b479] Shutting down VM (hard) hard_shutdown_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:348 2015-08-07 17:41:44.964 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:41:44.965 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /opt/stack/new/nova/nova/compute/manager.py:5976 2015-08-07 17:41:44.970 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 1.54 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:41:45.276 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.69 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:41:45.841 DEBUG oslo_concurrency.lockutils [req-eb0b6ce2-d9b3-4338-b96b-2e1b69b42496 tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_plug" :: held 2.270s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:41:45.842 DEBUG nova.virt.xenapi.vm_utils [req-eb0b6ce2-d9b3-4338-b96b-2e1b69b42496 tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] Plugging VBD OpaqueRef:f14caafd-f78d-323a-9e2e-3338ee0b0579 done. vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2194 2015-08-07 17:41:45.845 DEBUG nova.virt.xenapi.vm_utils [req-eb0b6ce2-d9b3-4338-b96b-2e1b69b42496 tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] VBD OpaqueRef:f14caafd-f78d-323a-9e2e-3338ee0b0579 plugged as xvdd vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2197 2015-08-07 17:41:45.933 WARNING nova.virt.configdrive [req-eb0b6ce2-d9b3-4338-b96b-2e1b69b42496 tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] The setting "always" will be deprecated in the Liberty version. 
Please use "True" instead 2015-08-07 17:41:45.934 DEBUG nova.objects.instance [req-eb0b6ce2-d9b3-4338-b96b-2e1b69b42496 tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] Lazy-loading `ec2_ids' on Instance uuid 393149d8-bc0d-4f72-afbc-954d8344f5e5 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:41:45.971 DEBUG oslo_concurrency.processutils [req-eb0b6ce2-d9b3-4338-b96b-2e1b69b42496 tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] Running cmd (subprocess): genisoimage -o /tmp/tmpQKN8yu/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmpWAzqVs execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:41:46.072 DEBUG oslo_concurrency.processutils [req-eb0b6ce2-d9b3-4338-b96b-2e1b69b42496 tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] CMD "genisoimage -o /tmp/tmpQKN8yu/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmpWAzqVs" returned: 0 in 0.101s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:41:46.079 DEBUG oslo_concurrency.processutils [req-eb0b6ce2-d9b3-4338-b96b-2e1b69b42496 tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmpQKN8yu/configdrive of=/dev/xvdd oflag=direct,sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:41:46.515 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:41:46.598 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 3.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:41:47.848 WARNING nova.virt.xenapi.vm_utils [req-f81aa024-321e-4f7d-b859-f01359aa8db4 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] [instance: 31a2fd34-bbcb-4b50-83e0-dc6c7369b479] VM already halted, skipping shutdown... 
2015-08-07 17:41:47.868 DEBUG nova.virt.xenapi.vm_utils [req-f81aa024-321e-4f7d-b859-f01359aa8db4 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] Detected vhd format for image a69d8d55-1745-492f-be26-cc75d64fc94d determine_disk_image_type /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1633 2015-08-07 17:41:47.869 DEBUG nova.virt.xenapi.vmops [req-f81aa024-321e-4f7d-b859-f01359aa8db4 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] [instance: 31a2fd34-bbcb-4b50-83e0-dc6c7369b479] Updating progress to 9 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:41:49.136 DEBUG oslo_concurrency.lockutils [req-f81aa024-321e-4f7d-b859-f01359aa8db4 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" acquired by "_create_cached_image_impl" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:41:49.604 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:41:49.605 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 0.91 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:41:50.015 DEBUG oslo_concurrency.processutils [req-741f75d6-a6f5-4153-8ef2-5641f0b9225d tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmpSmdXD8/configdrive of=/dev/xvdc oflag=direct,sync" returned: 0 in 6.319s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:41:50.017 DEBUG oslo_concurrency.processutils [req-741f75d6-a6f5-4153-8ef2-5641f0b9225d tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:41:50.521 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:41:50.524 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 1.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:41:50.654 DEBUG oslo_concurrency.processutils [req-741f75d6-a6f5-4153-8ef2-5641f0b9225d tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf sync" returned: 0 in 0.637s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:41:50.656 DEBUG nova.virt.xenapi.vm_utils [req-741f75d6-a6f5-4153-8ef2-5641f0b9225d tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Destroying VBD for VDI OpaqueRef:20b6b503-e500-e37a-3066-895ed505dc6e ... 
vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2207 2015-08-07 17:41:50.657 DEBUG oslo_concurrency.lockutils [req-741f75d6-a6f5-4153-8ef2-5641f0b9225d tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_unplug" :: waited 0.001s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:41:51.523 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:41:51.560 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Auditing locally available compute resources for node localhost.localdomain 2015-08-07 17:41:51.561 DEBUG nova.virt.xenapi.host [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Updating host stats update_status /opt/stack/new/nova/nova/virt/xenapi/host.py:234 2015-08-07 17:41:51.750 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" acquired by "do_scan" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:41:51.751 DEBUG nova.virt.xenapi.vm_utils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Scanning SR OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99 do_scan /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1839 2015-08-07 17:41:51.802 DEBUG nova.virt.xenapi.vm_utils [req-f81aa024-321e-4f7d-b859-f01359aa8db4 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] Cloned VDI OpaqueRef:73ebe016-3c86-65e6-5431-982d5f79bf88 from VDI OpaqueRef:f392c64e-7886-a0cc-d3a0-780674167307 _clone_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:571 2015-08-07 17:41:51.979 DEBUG oslo_concurrency.processutils [req-eb0b6ce2-d9b3-4338-b96b-2e1b69b42496 tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmpQKN8yu/configdrive of=/dev/xvdd oflag=direct,sync" returned: 0 in 5.900s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:41:51.981 DEBUG oslo_concurrency.processutils [req-eb0b6ce2-d9b3-4338-b96b-2e1b69b42496 tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:41:52.246 DEBUG oslo_concurrency.lockutils [req-741f75d6-a6f5-4153-8ef2-5641f0b9225d tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_unplug" :: held 1.588s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:41:52.268 DEBUG nova.virt.xenapi.vm_utils [req-741f75d6-a6f5-4153-8ef2-5641f0b9225d tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Destroying VBD for VDI OpaqueRef:20b6b503-e500-e37a-3066-895ed505dc6e done. 
vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2215 2015-08-07 17:41:52.269 DEBUG nova.virt.xenapi.vm_utils [req-741f75d6-a6f5-4153-8ef2-5641f0b9225d tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Creating disk-type VBD for VM OpaqueRef:8fa8d412-a03a-1966-25b4-eac438d86b19, VDI OpaqueRef:20b6b503-e500-e37a-3066-895ed505dc6e ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:41:52.277 DEBUG nova.virt.xenapi.vm_utils [req-741f75d6-a6f5-4153-8ef2-5641f0b9225d tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Created VBD OpaqueRef:b4240919-3c28-c4e6-bb0f-ade9832abcee for VM OpaqueRef:8fa8d412-a03a-1966-25b4-eac438d86b19, VDI OpaqueRef:20b6b503-e500-e37a-3066-895ed505dc6e. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:41:52.280 DEBUG nova.objects.instance [req-741f75d6-a6f5-4153-8ef2-5641f0b9225d tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lazy-loading `pci_devices' on Instance uuid aa0c0819-08a6-4a79-93eb-f58885697f5b obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:41:52.290 DEBUG oslo_concurrency.processutils [req-eb0b6ce2-d9b3-4338-b96b-2e1b69b42496 tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf sync" returned: 0 in 0.309s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:41:52.291 DEBUG nova.virt.xenapi.vm_utils [req-eb0b6ce2-d9b3-4338-b96b-2e1b69b42496 tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] Destroying VBD for VDI OpaqueRef:54bb28cf-8a19-cda5-b8f6-217e05e726d8 ... 
vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2207 2015-08-07 17:41:52.292 DEBUG oslo_concurrency.lockutils [req-eb0b6ce2-d9b3-4338-b96b-2e1b69b42496 tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_unplug" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:41:52.403 DEBUG nova.virt.xenapi.vmops [req-741f75d6-a6f5-4153-8ef2-5641f0b9225d tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Updating progress to 50 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:41:52.478 DEBUG oslo_concurrency.lockutils [req-f81aa024-321e-4f7d-b859-f01359aa8db4 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" released by "_create_cached_image_impl" :: held 3.342s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:41:52.479 INFO nova.virt.xenapi.vm_utils [req-f81aa024-321e-4f7d-b859-f01359aa8db4 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] Image creation data, cacheable: True, downloaded: False duration: 3.35 secs for image a69d8d55-1745-492f-be26-cc75d64fc94d 2015-08-07 17:41:52.610 DEBUG oslo_concurrency.lockutils [req-741f75d6-a6f5-4153-8ef2-5641f0b9225d tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "xenstore-aa0c0819-08a6-4a79-93eb-f58885697f5b" acquired by "store_meta" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:41:52.620 DEBUG oslo_concurrency.lockutils [req-741f75d6-a6f5-4153-8ef2-5641f0b9225d tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "xenstore-aa0c0819-08a6-4a79-93eb-f58885697f5b" released by "store_meta" :: held 0.010s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:41:52.621 DEBUG oslo_concurrency.lockutils [req-741f75d6-a6f5-4153-8ef2-5641f0b9225d tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "xenstore-aa0c0819-08a6-4a79-93eb-f58885697f5b" acquired by "store_auto_disk_config" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:41:52.628 DEBUG oslo_concurrency.lockutils [req-741f75d6-a6f5-4153-8ef2-5641f0b9225d tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "xenstore-aa0c0819-08a6-4a79-93eb-f58885697f5b" released by "store_auto_disk_config" :: held 0.008s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:41:52.629 DEBUG nova.virt.xenapi.vmops [req-741f75d6-a6f5-4153-8ef2-5641f0b9225d tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Injecting hostname (tempest.common.compute-instance-822491913) into xenstore _inject_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1940 2015-08-07 17:41:52.630 DEBUG oslo_concurrency.lockutils [req-741f75d6-a6f5-4153-8ef2-5641f0b9225d tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "xenstore-aa0c0819-08a6-4a79-93eb-f58885697f5b" acquired by "update_hostname" :: waited 
0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:41:52.638 DEBUG oslo_concurrency.lockutils [req-741f75d6-a6f5-4153-8ef2-5641f0b9225d tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "xenstore-aa0c0819-08a6-4a79-93eb-f58885697f5b" released by "update_hostname" :: held 0.008s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:41:52.638 DEBUG nova.virt.xenapi.vmops [req-741f75d6-a6f5-4153-8ef2-5641f0b9225d tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Injecting network info to xenstore inject_network_info /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1869 2015-08-07 17:41:52.639 DEBUG oslo_concurrency.lockutils [req-741f75d6-a6f5-4153-8ef2-5641f0b9225d tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "xenstore-aa0c0819-08a6-4a79-93eb-f58885697f5b" acquired by "update_nwinfo" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:41:52.814 DEBUG oslo_concurrency.lockutils [req-741f75d6-a6f5-4153-8ef2-5641f0b9225d tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "xenstore-aa0c0819-08a6-4a79-93eb-f58885697f5b" released by "update_nwinfo" :: held 0.175s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:41:52.815 DEBUG nova.virt.xenapi.vmops [req-741f75d6-a6f5-4153-8ef2-5641f0b9225d tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Updating progress to 60 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:41:53.051 DEBUG nova.virt.xenapi.vmops [req-741f75d6-a6f5-4153-8ef2-5641f0b9225d tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Creating vifs _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1892 2015-08-07 17:41:53.062 DEBUG nova.virt.xenapi.vmops [req-741f75d6-a6f5-4153-8ef2-5641f0b9225d tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Creating VIF for network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1902 2015-08-07 17:41:53.072 DEBUG oslo_concurrency.lockutils [req-eb0b6ce2-d9b3-4338-b96b-2e1b69b42496 tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_unplug" :: held 0.780s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:41:53.080 DEBUG nova.virt.xenapi.vmops [req-741f75d6-a6f5-4153-8ef2-5641f0b9225d tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Created VIF OpaqueRef:4975003f-fc26-33be-4a65-b760d35d5f99, network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1906 2015-08-07 17:41:53.081 DEBUG nova.virt.xenapi.vmops [req-741f75d6-a6f5-4153-8ef2-5641f0b9225d tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Updating progress to 70 
_update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:41:53.094 DEBUG nova.virt.xenapi.vm_utils [req-eb0b6ce2-d9b3-4338-b96b-2e1b69b42496 tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] Destroying VBD for VDI OpaqueRef:54bb28cf-8a19-cda5-b8f6-217e05e726d8 done. vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2215 2015-08-07 17:41:53.095 DEBUG nova.virt.xenapi.vm_utils [req-eb0b6ce2-d9b3-4338-b96b-2e1b69b42496 tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] Creating disk-type VBD for VM OpaqueRef:cf0f3138-7c0a-9ae0-6def-51f97c15a8d1, VDI OpaqueRef:54bb28cf-8a19-cda5-b8f6-217e05e726d8 ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:41:53.104 DEBUG nova.virt.xenapi.vm_utils [req-eb0b6ce2-d9b3-4338-b96b-2e1b69b42496 tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] Created VBD OpaqueRef:bf560256-f4a3-8dc6-81d0-c233e36f0ea3 for VM OpaqueRef:cf0f3138-7c0a-9ae0-6def-51f97c15a8d1, VDI OpaqueRef:54bb28cf-8a19-cda5-b8f6-217e05e726d8. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:41:53.105 DEBUG nova.objects.instance [req-eb0b6ce2-d9b3-4338-b96b-2e1b69b42496 tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] Lazy-loading `pci_devices' on Instance uuid 393149d8-bc0d-4f72-afbc-954d8344f5e5 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:41:53.166 DEBUG nova.virt.xenapi.vmops [req-f81aa024-321e-4f7d-b859-f01359aa8db4 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] [instance: 31a2fd34-bbcb-4b50-83e0-dc6c7369b479] Updating progress to 18 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:41:53.203 DEBUG nova.virt.xenapi.vmops [req-eb0b6ce2-d9b3-4338-b96b-2e1b69b42496 tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] [instance: 393149d8-bc0d-4f72-afbc-954d8344f5e5] Updating progress to 50 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:41:53.277 DEBUG nova.virt.xenapi.vmops [req-741f75d6-a6f5-4153-8ef2-5641f0b9225d tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Starting instance _start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:327 2015-08-07 17:41:53.352 DEBUG nova.virt.xenapi.vmops [req-f81aa024-321e-4f7d-b859-f01359aa8db4 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] [instance: 31a2fd34-bbcb-4b50-83e0-dc6c7369b479] Updating progress to 27 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:41:53.417 DEBUG oslo_concurrency.lockutils [req-eb0b6ce2-d9b3-4338-b96b-2e1b69b42496 tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] Lock "xenstore-393149d8-bc0d-4f72-afbc-954d8344f5e5" acquired by "store_meta" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:41:53.418 DEBUG oslo_concurrency.lockutils [req-eb0b6ce2-d9b3-4338-b96b-2e1b69b42496 tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] Lock "xenstore-393149d8-bc0d-4f72-afbc-954d8344f5e5" released by "store_meta" :: held 0.001s inner 
/usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:41:53.419 DEBUG oslo_concurrency.lockutils [req-eb0b6ce2-d9b3-4338-b96b-2e1b69b42496 tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] Lock "xenstore-393149d8-bc0d-4f72-afbc-954d8344f5e5" acquired by "store_auto_disk_config" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:41:53.430 DEBUG oslo_concurrency.lockutils [req-eb0b6ce2-d9b3-4338-b96b-2e1b69b42496 tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] Lock "xenstore-393149d8-bc0d-4f72-afbc-954d8344f5e5" released by "store_auto_disk_config" :: held 0.012s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:41:53.431 DEBUG nova.virt.xenapi.vmops [req-eb0b6ce2-d9b3-4338-b96b-2e1b69b42496 tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] [instance: 393149d8-bc0d-4f72-afbc-954d8344f5e5] Injecting hostname (tempest.common.compute-instance-449059911) into xenstore _inject_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1940 2015-08-07 17:41:53.432 DEBUG oslo_concurrency.lockutils [req-eb0b6ce2-d9b3-4338-b96b-2e1b69b42496 tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] Lock "xenstore-393149d8-bc0d-4f72-afbc-954d8344f5e5" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:41:53.439 DEBUG oslo_concurrency.lockutils [req-eb0b6ce2-d9b3-4338-b96b-2e1b69b42496 tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] Lock "xenstore-393149d8-bc0d-4f72-afbc-954d8344f5e5" released by "update_hostname" :: held 0.007s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:41:53.439 DEBUG nova.virt.xenapi.vmops [req-eb0b6ce2-d9b3-4338-b96b-2e1b69b42496 tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] [instance: 393149d8-bc0d-4f72-afbc-954d8344f5e5] Injecting network info to xenstore inject_network_info /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1869 2015-08-07 17:41:53.440 DEBUG oslo_concurrency.lockutils [req-eb0b6ce2-d9b3-4338-b96b-2e1b69b42496 tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] Lock "xenstore-393149d8-bc0d-4f72-afbc-954d8344f5e5" acquired by "update_nwinfo" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:41:53.551 DEBUG nova.virt.xenapi.vmops [req-f81aa024-321e-4f7d-b859-f01359aa8db4 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] [instance: 31a2fd34-bbcb-4b50-83e0-dc6c7369b479] Using PV kernel: True _create_vm_record /opt/stack/new/nova/nova/virt/xenapi/vmops.py:679 2015-08-07 17:41:53.563 DEBUG nova.virt.xenapi.vm_utils [req-f81aa024-321e-4f7d-b859-f01359aa8db4 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] [instance: 31a2fd34-bbcb-4b50-83e0-dc6c7369b479] Created VM create_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:312 2015-08-07 17:41:53.564 DEBUG nova.virt.xenapi.vmops [req-f81aa024-321e-4f7d-b859-f01359aa8db4 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] [instance: 31a2fd34-bbcb-4b50-83e0-dc6c7369b479] Updating progress to 36 
_update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:41:53.609 DEBUG oslo_concurrency.lockutils [req-eb0b6ce2-d9b3-4338-b96b-2e1b69b42496 tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] Lock "xenstore-393149d8-bc0d-4f72-afbc-954d8344f5e5" released by "update_nwinfo" :: held 0.169s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:41:53.610 DEBUG nova.virt.xenapi.vmops [req-eb0b6ce2-d9b3-4338-b96b-2e1b69b42496 tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] [instance: 393149d8-bc0d-4f72-afbc-954d8344f5e5] Updating progress to 60 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:41:53.743 DEBUG nova.virt.xenapi.vmops [req-f81aa024-321e-4f7d-b859-f01359aa8db4 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] [instance: 31a2fd34-bbcb-4b50-83e0-dc6c7369b479] auto_disk_config value not found inrescue image_properties. Setting value to False _attach_disks /opt/stack/new/nova/nova/virt/xenapi/vmops.py:714 2015-08-07 17:41:53.744 DEBUG nova.virt.xenapi.vm_utils [req-f81aa024-321e-4f7d-b859-f01359aa8db4 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] Creating disk-type VBD for VM OpaqueRef:7a8f7dde-f8dc-4e4b-e5ab-d0038a58b987, VDI OpaqueRef:73ebe016-3c86-65e6-5431-982d5f79bf88 ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:41:53.753 DEBUG nova.virt.xenapi.vm_utils [req-f81aa024-321e-4f7d-b859-f01359aa8db4 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] Created VBD OpaqueRef:34b8998a-59fa-2c74-771d-da7cdc138ce3 for VM OpaqueRef:7a8f7dde-f8dc-4e4b-e5ab-d0038a58b987, VDI OpaqueRef:73ebe016-3c86-65e6-5431-982d5f79bf88. 
create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:41:53.799 DEBUG nova.virt.xenapi.vmops [req-eb0b6ce2-d9b3-4338-b96b-2e1b69b42496 tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] [instance: 393149d8-bc0d-4f72-afbc-954d8344f5e5] Creating vifs _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1892 2015-08-07 17:41:53.807 DEBUG nova.virt.xenapi.vmops [req-eb0b6ce2-d9b3-4338-b96b-2e1b69b42496 tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] [instance: 393149d8-bc0d-4f72-afbc-954d8344f5e5] Creating VIF for network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1902 2015-08-07 17:41:53.816 DEBUG nova.virt.xenapi.vmops [req-eb0b6ce2-d9b3-4338-b96b-2e1b69b42496 tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] [instance: 393149d8-bc0d-4f72-afbc-954d8344f5e5] Created VIF OpaqueRef:53e34617-950e-6b73-86b5-0574372e20d3, network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1906 2015-08-07 17:41:53.817 DEBUG nova.virt.xenapi.vmops [req-eb0b6ce2-d9b3-4338-b96b-2e1b69b42496 tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] [instance: 393149d8-bc0d-4f72-afbc-954d8344f5e5] Updating progress to 70 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:41:54.033 DEBUG nova.virt.xenapi.vmops [req-eb0b6ce2-d9b3-4338-b96b-2e1b69b42496 tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] [instance: 393149d8-bc0d-4f72-afbc-954d8344f5e5] Starting instance _start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:327 2015-08-07 17:41:54.139 DEBUG nova.virt.xenapi.vm_utils [req-f81aa024-321e-4f7d-b859-f01359aa8db4 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] Created VDI OpaqueRef:667169ad-23d0-6579-1bd3-d10e1efbd3e4 (config-2, 67108864, False) on OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99. create_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:509 2015-08-07 17:41:54.143 DEBUG nova.virt.xenapi.vm_utils [req-f81aa024-321e-4f7d-b859-f01359aa8db4 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] Creating disk-type VBD for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:667169ad-23d0-6579-1bd3-d10e1efbd3e4 ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:41:54.159 DEBUG nova.virt.xenapi.vm_utils [req-f81aa024-321e-4f7d-b859-f01359aa8db4 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] Created VBD OpaqueRef:fe3462fc-f9ef-1634-6c20-063d44639170 for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:667169ad-23d0-6579-1bd3-d10e1efbd3e4. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:41:54.160 DEBUG nova.virt.xenapi.vm_utils [req-f81aa024-321e-4f7d-b859-f01359aa8db4 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] Plugging VBD OpaqueRef:fe3462fc-f9ef-1634-6c20-063d44639170 ... 
vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2191 2015-08-07 17:41:54.161 DEBUG oslo_concurrency.lockutils [req-f81aa024-321e-4f7d-b859-f01359aa8db4 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_plug" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:41:55.010 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.96 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:41:55.706 DEBUG oslo_concurrency.lockutils [req-f81aa024-321e-4f7d-b859-f01359aa8db4 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_plug" :: held 1.545s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:41:55.707 DEBUG nova.virt.xenapi.vm_utils [req-f81aa024-321e-4f7d-b859-f01359aa8db4 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] Plugging VBD OpaqueRef:fe3462fc-f9ef-1634-6c20-063d44639170 done. vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2194 2015-08-07 17:41:55.710 DEBUG nova.virt.xenapi.vm_utils [req-f81aa024-321e-4f7d-b859-f01359aa8db4 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] VBD OpaqueRef:fe3462fc-f9ef-1634-6c20-063d44639170 plugged as xvdc vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2197 2015-08-07 17:41:55.782 WARNING nova.virt.configdrive [req-f81aa024-321e-4f7d-b859-f01359aa8db4 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] The setting "always" will be deprecated in the Liberty version. 
Please use "True" instead 2015-08-07 17:41:55.783 DEBUG nova.objects.instance [req-f81aa024-321e-4f7d-b859-f01359aa8db4 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] Lazy-loading `ec2_ids' on Instance uuid 31a2fd34-bbcb-4b50-83e0-dc6c7369b479 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:41:55.823 DEBUG oslo_concurrency.processutils [req-f81aa024-321e-4f7d-b859-f01359aa8db4 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] Running cmd (subprocess): genisoimage -o /tmp/tmpff3I33/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmpMoznYi execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:41:55.921 DEBUG oslo_concurrency.processutils [req-f81aa024-321e-4f7d-b859-f01359aa8db4 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] CMD "genisoimage -o /tmp/tmpff3I33/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmpMoznYi" returned: 0 in 0.098s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:41:55.928 DEBUG oslo_concurrency.processutils [req-f81aa024-321e-4f7d-b859-f01359aa8db4 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmpff3I33/configdrive of=/dev/xvdc oflag=direct,sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:41:59.528 DEBUG oslo_concurrency.processutils [req-f81aa024-321e-4f7d-b859-f01359aa8db4 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmpff3I33/configdrive of=/dev/xvdc oflag=direct,sync" returned: 0 in 3.601s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:41:59.532 DEBUG oslo_concurrency.processutils [req-f81aa024-321e-4f7d-b859-f01359aa8db4 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:41:59.854 DEBUG oslo_concurrency.processutils [req-f81aa024-321e-4f7d-b859-f01359aa8db4 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf sync" returned: 0 in 0.322s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:41:59.856 DEBUG nova.virt.xenapi.vm_utils [req-f81aa024-321e-4f7d-b859-f01359aa8db4 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] Destroying VBD for VDI OpaqueRef:667169ad-23d0-6579-1bd3-d10e1efbd3e4 ... 
vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2207 2015-08-07 17:41:59.859 DEBUG oslo_concurrency.lockutils [req-f81aa024-321e-4f7d-b859-f01359aa8db4 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_unplug" :: waited 0.001s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:42:00.119 DEBUG nova.virt.xenapi.vmops [req-741f75d6-a6f5-4153-8ef2-5641f0b9225d tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Waiting for instance state to become running _wait_for_instance_to_start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:777 2015-08-07 17:42:00.140 DEBUG nova.virt.xenapi.vmops [req-741f75d6-a6f5-4153-8ef2-5641f0b9225d tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Updating progress to 80 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:42:00.327 DEBUG nova.virt.xenapi.vmops [req-741f75d6-a6f5-4153-8ef2-5641f0b9225d tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Skip agent setup, not enabled. _configure_new_instance_with_agent /opt/stack/new/nova/nova/virt/xenapi/vmops.py:788 2015-08-07 17:42:00.327 DEBUG nova.virt.xenapi.vmops [req-741f75d6-a6f5-4153-8ef2-5641f0b9225d tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Removing hostname from xenstore _remove_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1949 2015-08-07 17:42:00.328 DEBUG oslo_concurrency.lockutils [req-741f75d6-a6f5-4153-8ef2-5641f0b9225d tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "xenstore-aa0c0819-08a6-4a79-93eb-f58885697f5b" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:42:00.333 DEBUG oslo_concurrency.lockutils [req-741f75d6-a6f5-4153-8ef2-5641f0b9225d tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "xenstore-aa0c0819-08a6-4a79-93eb-f58885697f5b" released by "update_hostname" :: held 0.005s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:42:00.334 DEBUG nova.virt.xenapi.vmops [req-741f75d6-a6f5-4153-8ef2-5641f0b9225d tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Updating progress to 90 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:42:00.517 DEBUG nova.virt.xenapi.vmops [req-741f75d6-a6f5-4153-8ef2-5641f0b9225d tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Updating progress to 100 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:42:00.634 DEBUG oslo_concurrency.lockutils [req-f81aa024-321e-4f7d-b859-f01359aa8db4 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_unplug" :: held 0.775s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 
17:42:00.641 DEBUG nova.virt.xenapi.vm_utils [req-f81aa024-321e-4f7d-b859-f01359aa8db4 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] Destroying VBD for VDI OpaqueRef:667169ad-23d0-6579-1bd3-d10e1efbd3e4 done. vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2215 2015-08-07 17:42:00.642 DEBUG nova.virt.xenapi.vm_utils [req-f81aa024-321e-4f7d-b859-f01359aa8db4 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] Creating disk-type VBD for VM OpaqueRef:7a8f7dde-f8dc-4e4b-e5ab-d0038a58b987, VDI OpaqueRef:667169ad-23d0-6579-1bd3-d10e1efbd3e4 ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:42:00.650 DEBUG nova.virt.xenapi.vm_utils [req-f81aa024-321e-4f7d-b859-f01359aa8db4 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] Created VBD OpaqueRef:15c132af-4d77-c7ef-b934-abdf5a064f59 for VM OpaqueRef:7a8f7dde-f8dc-4e4b-e5ab-d0038a58b987, VDI OpaqueRef:667169ad-23d0-6579-1bd3-d10e1efbd3e4. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:42:00.652 DEBUG nova.objects.instance [req-f81aa024-321e-4f7d-b859-f01359aa8db4 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] Lazy-loading `pci_devices' on Instance uuid 31a2fd34-bbcb-4b50-83e0-dc6c7369b479 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:42:00.704 DEBUG nova.compute.manager [req-741f75d6-a6f5-4153-8ef2-5641f0b9225d tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Checking state _get_power_state /opt/stack/new/nova/nova/compute/manager.py:1296 2015-08-07 17:42:00.778 DEBUG nova.virt.xenapi.vmops [req-f81aa024-321e-4f7d-b859-f01359aa8db4 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] [instance: 31a2fd34-bbcb-4b50-83e0-dc6c7369b479] Updating progress to 45 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:42:00.981 DEBUG oslo_concurrency.lockutils [req-f81aa024-321e-4f7d-b859-f01359aa8db4 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] Lock "xenstore-31a2fd34-bbcb-4b50-83e0-dc6c7369b479" acquired by "store_meta" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:42:00.982 DEBUG oslo_concurrency.lockutils [req-f81aa024-321e-4f7d-b859-f01359aa8db4 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] Lock "xenstore-31a2fd34-bbcb-4b50-83e0-dc6c7369b479" released by "store_meta" :: held 0.001s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:42:00.982 DEBUG oslo_concurrency.lockutils [req-f81aa024-321e-4f7d-b859-f01359aa8db4 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] Lock "xenstore-31a2fd34-bbcb-4b50-83e0-dc6c7369b479" acquired by "store_auto_disk_config" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:42:00.990 DEBUG oslo_concurrency.lockutils [req-f81aa024-321e-4f7d-b859-f01359aa8db4 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] Lock "xenstore-31a2fd34-bbcb-4b50-83e0-dc6c7369b479" released by "store_auto_disk_config" :: held 0.008s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:42:00.991 DEBUG nova.virt.xenapi.vmops 
[req-f81aa024-321e-4f7d-b859-f01359aa8db4 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] [instance: 31a2fd34-bbcb-4b50-83e0-dc6c7369b479] Injecting hostname (RESCUE-tempest.common.compute-instance-1670852847) into xenstore _inject_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1940 2015-08-07 17:42:00.991 DEBUG oslo_concurrency.lockutils [req-f81aa024-321e-4f7d-b859-f01359aa8db4 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] Lock "xenstore-31a2fd34-bbcb-4b50-83e0-dc6c7369b479" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:42:00.999 DEBUG oslo_concurrency.lockutils [req-f81aa024-321e-4f7d-b859-f01359aa8db4 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] Lock "xenstore-31a2fd34-bbcb-4b50-83e0-dc6c7369b479" released by "update_hostname" :: held 0.008s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:42:01.000 DEBUG nova.virt.xenapi.vmops [req-f81aa024-321e-4f7d-b859-f01359aa8db4 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] [instance: 31a2fd34-bbcb-4b50-83e0-dc6c7369b479] Injecting network info to xenstore inject_network_info /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1869 2015-08-07 17:42:01.001 DEBUG oslo_concurrency.lockutils [req-f81aa024-321e-4f7d-b859-f01359aa8db4 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] Lock "xenstore-31a2fd34-bbcb-4b50-83e0-dc6c7369b479" acquired by "update_nwinfo" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:42:01.085 DEBUG nova.virt.xenapi.vmops [req-eb0b6ce2-d9b3-4338-b96b-2e1b69b42496 tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] [instance: 393149d8-bc0d-4f72-afbc-954d8344f5e5] Waiting for instance state to become running _wait_for_instance_to_start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:777 2015-08-07 17:42:01.110 DEBUG nova.virt.xenapi.vmops [req-eb0b6ce2-d9b3-4338-b96b-2e1b69b42496 tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] [instance: 393149d8-bc0d-4f72-afbc-954d8344f5e5] Updating progress to 80 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:42:01.229 DEBUG oslo_concurrency.lockutils [req-f81aa024-321e-4f7d-b859-f01359aa8db4 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] Lock "xenstore-31a2fd34-bbcb-4b50-83e0-dc6c7369b479" released by "update_nwinfo" :: held 0.229s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:42:01.230 DEBUG nova.virt.xenapi.vmops [req-f81aa024-321e-4f7d-b859-f01359aa8db4 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] [instance: 31a2fd34-bbcb-4b50-83e0-dc6c7369b479] Updating progress to 55 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:42:01.330 DEBUG nova.virt.xenapi.vmops [req-eb0b6ce2-d9b3-4338-b96b-2e1b69b42496 tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] [instance: 393149d8-bc0d-4f72-afbc-954d8344f5e5] Skip agent setup, not enabled. 
_configure_new_instance_with_agent /opt/stack/new/nova/nova/virt/xenapi/vmops.py:788 2015-08-07 17:42:01.331 DEBUG nova.virt.xenapi.vmops [req-eb0b6ce2-d9b3-4338-b96b-2e1b69b42496 tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] [instance: 393149d8-bc0d-4f72-afbc-954d8344f5e5] Removing hostname from xenstore _remove_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1949 2015-08-07 17:42:01.331 DEBUG oslo_concurrency.lockutils [req-eb0b6ce2-d9b3-4338-b96b-2e1b69b42496 tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] Lock "xenstore-393149d8-bc0d-4f72-afbc-954d8344f5e5" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:42:01.348 DEBUG oslo_concurrency.lockutils [req-eb0b6ce2-d9b3-4338-b96b-2e1b69b42496 tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] Lock "xenstore-393149d8-bc0d-4f72-afbc-954d8344f5e5" released by "update_hostname" :: held 0.016s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:42:01.348 DEBUG nova.virt.xenapi.vmops [req-eb0b6ce2-d9b3-4338-b96b-2e1b69b42496 tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] [instance: 393149d8-bc0d-4f72-afbc-954d8344f5e5] Updating progress to 90 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:42:01.435 DEBUG nova.virt.xenapi.vmops [req-f81aa024-321e-4f7d-b859-f01359aa8db4 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] [instance: 31a2fd34-bbcb-4b50-83e0-dc6c7369b479] Creating vifs _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1892 2015-08-07 17:42:01.444 DEBUG nova.virt.xenapi.vmops [req-f81aa024-321e-4f7d-b859-f01359aa8db4 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] [instance: 31a2fd34-bbcb-4b50-83e0-dc6c7369b479] Creating VIF for network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1902 2015-08-07 17:42:01.454 DEBUG nova.virt.xenapi.vmops [req-f81aa024-321e-4f7d-b859-f01359aa8db4 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] [instance: 31a2fd34-bbcb-4b50-83e0-dc6c7369b479] Created VIF OpaqueRef:cb712a3e-10e6-4f97-c50f-92dd6aa4fed8, network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1906 2015-08-07 17:42:01.455 DEBUG nova.virt.xenapi.vmops [req-f81aa024-321e-4f7d-b859-f01359aa8db4 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] [instance: 31a2fd34-bbcb-4b50-83e0-dc6c7369b479] Updating progress to 64 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:42:01.527 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" released by "do_scan" :: held 9.777s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:42:01.575 DEBUG nova.virt.xenapi.vmops [req-eb0b6ce2-d9b3-4338-b96b-2e1b69b42496 tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] [instance: 393149d8-bc0d-4f72-afbc-954d8344f5e5] Updating progress to 100 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:42:01.658 DEBUG 
nova.virt.xenapi.vm_utils [req-f81aa024-321e-4f7d-b859-f01359aa8db4 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] Creating disk-type VBD for VM OpaqueRef:7a8f7dde-f8dc-4e4b-e5ab-d0038a58b987, VDI OpaqueRef:f6b24351-b041-bd93-89de-b6c3e6dd37ea ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:42:01.667 DEBUG nova.virt.xenapi.vm_utils [req-f81aa024-321e-4f7d-b859-f01359aa8db4 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] Created VBD OpaqueRef:db7aa99f-346f-6d5e-539a-5f0348f17fda for VM OpaqueRef:7a8f7dde-f8dc-4e4b-e5ab-d0038a58b987, VDI OpaqueRef:f6b24351-b041-bd93-89de-b6c3e6dd37ea. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:42:01.668 DEBUG nova.virt.xenapi.vmops [req-f81aa024-321e-4f7d-b859-f01359aa8db4 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] [instance: 31a2fd34-bbcb-4b50-83e0-dc6c7369b479] Updating progress to 73 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:42:01.759 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor: free VCPUs: -1 _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:524 2015-08-07 17:42:01.759 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor: assignable PCI devices: [] _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:532 2015-08-07 17:42:01.760 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor/Node resource view: name=localhost.localdomain free_ram=922MB free_disk=16GB free_vcpus=-1 pci_devices=[] _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:546 2015-08-07 17:42:01.760 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "compute_resources" acquired by "_update_available_resource" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:42:01.836 DEBUG nova.compute.manager [req-eb0b6ce2-d9b3-4338-b96b-2e1b69b42496 tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] [instance: 393149d8-bc0d-4f72-afbc-954d8344f5e5] Checking state _get_power_state /opt/stack/new/nova/nova/compute/manager.py:1296 2015-08-07 17:42:01.880 DEBUG nova.virt.xenapi.vmops [req-f81aa024-321e-4f7d-b859-f01359aa8db4 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] [instance: 31a2fd34-bbcb-4b50-83e0-dc6c7369b479] Starting instance _start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:327 2015-08-07 17:42:02.274 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Total usable vcpus: 8, total allocated vcpus: 9 2015-08-07 17:42:02.275 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Final resource view: name=localhost.localdomain phys_ram=8187MB used_ram=857MB phys_disk=27GB used_disk=0GB total_vcpus=8 used_vcpus=9 pci_stats=None 2015-08-07 17:42:02.332 DEBUG oslo_concurrency.lockutils [req-eb0b6ce2-d9b3-4338-b96b-2e1b69b42496 tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] Lock "393149d8-bc0d-4f72-afbc-954d8344f5e5" released by "_locked_do_build_and_run_instance" :: held 24.295s inner 
/usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:42:02.335 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Compute_service record updated for devstack:localhost.localdomain 2015-08-07 17:42:02.335 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "compute_resources" released by "_update_available_resource" :: held 0.575s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:42:02.336 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 8.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:42:02.417 DEBUG oslo_concurrency.lockutils [req-f0e01d83-d325-4037-9420-dbbc1483fcc4 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "aa0c0819-08a6-4a79-93eb-f58885697f5b" acquired by "do_stop_instance" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:42:02.418 DEBUG nova.compute.manager [req-f0e01d83-d325-4037-9420-dbbc1483fcc4 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Checking state _get_power_state /opt/stack/new/nova/nova/compute/manager.py:1296 2015-08-07 17:42:02.444 DEBUG nova.compute.manager [req-f0e01d83-d325-4037-9420-dbbc1483fcc4 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Stopping instance; current vm_state: active, current task_state: powering-off, current DB power_state: 1, current VM power_state: 1 do_stop_instance /opt/stack/new/nova/nova/compute/manager.py:2404 2015-08-07 17:42:02.451 DEBUG nova.virt.xenapi.vm_utils [req-f0e01d83-d325-4037-9420-dbbc1483fcc4 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Shutting down VM (hard) hard_shutdown_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:348 2015-08-07 17:42:03.802 DEBUG oslo_concurrency.lockutils [req-8d00a32a-9532-4621-a0a9-4cb87d0c98e3 tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] Lock "7aa9beb6-cabe-4ae3-a172-162c757bd718" acquired by "_locked_do_build_and_run_instance" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:42:03.845 INFO nova.compute.manager [req-8d00a32a-9532-4621-a0a9-4cb87d0c98e3 tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] [instance: 7aa9beb6-cabe-4ae3-a172-162c757bd718] Starting instance... 
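Most of the DEBUG traffic in this log is the acquire/release bookkeeping from oslo.concurrency: every 'Lock "..." acquired by "..." :: waited' / 'released by "..." :: held' pair at lockutils.py:251/262 comes from a function wrapped in the synchronized decorator, such as _locked_do_build_and_run_instance and _update_available_resource above. A minimal sketch of that pattern, assuming oslo.concurrency and DEBUG-level logging; the lock name and function body are illustrative, not Nova's actual implementation (Nova reaches the decorator through its own utils wrapper):

    import logging
    from oslo_concurrency import lockutils

    logging.basicConfig(level=logging.DEBUG)

    @lockutils.synchronized('compute_resources')
    def instance_claim(instance_uuid):
        # Runs with the in-process "compute_resources" semaphore held; the
        # decorator itself emits the acquired/waited and released/held lines
        # seen throughout this log.
        return 'claimed %s' % instance_uuid

    instance_claim('7aa9beb6-cabe-4ae3-a172-162c757bd718')

Because the default lock is an in-process semaphore (external=False), the waited/held times in these messages measure contention between workers inside this single nova-compute process, not across hosts.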
2015-08-07 17:42:04.024 DEBUG oslo_concurrency.lockutils [req-8d00a32a-9532-4621-a0a9-4cb87d0c98e3 tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] Lock "compute_resources" acquired by "instance_claim" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:42:04.024 DEBUG nova.compute.resource_tracker [req-8d00a32a-9532-4621-a0a9-4cb87d0c98e3 tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] Memory overhead for 64 MB instance; 5 MB instance_claim /opt/stack/new/nova/nova/compute/resource_tracker.py:126 2015-08-07 17:42:04.029 INFO nova.compute.claims [req-8d00a32a-9532-4621-a0a9-4cb87d0c98e3 tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] [instance: 7aa9beb6-cabe-4ae3-a172-162c757bd718] Attempting claim: memory 69 MB, disk 0 GB 2015-08-07 17:42:04.030 INFO nova.compute.claims [req-8d00a32a-9532-4621-a0a9-4cb87d0c98e3 tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] [instance: 7aa9beb6-cabe-4ae3-a172-162c757bd718] Total memory: 8187 MB, used: 857.00 MB 2015-08-07 17:42:04.030 INFO nova.compute.claims [req-8d00a32a-9532-4621-a0a9-4cb87d0c98e3 tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] [instance: 7aa9beb6-cabe-4ae3-a172-162c757bd718] memory limit: 12280.50 MB, free: 11423.50 MB 2015-08-07 17:42:04.030 INFO nova.compute.claims [req-8d00a32a-9532-4621-a0a9-4cb87d0c98e3 tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] [instance: 7aa9beb6-cabe-4ae3-a172-162c757bd718] Total disk: 27 GB, used: 0.00 GB 2015-08-07 17:42:04.031 INFO nova.compute.claims [req-8d00a32a-9532-4621-a0a9-4cb87d0c98e3 tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] [instance: 7aa9beb6-cabe-4ae3-a172-162c757bd718] disk limit not specified, defaulting to unlimited 2015-08-07 17:42:04.052 DEBUG nova.compute.resources.vcpu [req-8d00a32a-9532-4621-a0a9-4cb87d0c98e3 tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] Total CPUs: 8 VCPUs, used: 5.00 VCPUs test /opt/stack/new/nova/nova/compute/resources/vcpu.py:52 2015-08-07 17:42:04.053 DEBUG nova.compute.resources.vcpu [req-8d00a32a-9532-4621-a0a9-4cb87d0c98e3 tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] CPUs limit not specified, defaulting to unlimited test /opt/stack/new/nova/nova/compute/resources/vcpu.py:56 2015-08-07 17:42:04.053 INFO nova.compute.claims [req-8d00a32a-9532-4621-a0a9-4cb87d0c98e3 tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] [instance: 7aa9beb6-cabe-4ae3-a172-162c757bd718] Claim successful 2015-08-07 17:42:04.549 DEBUG oslo_concurrency.lockutils [req-8d00a32a-9532-4621-a0a9-4cb87d0c98e3 tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] Lock "compute_resources" released by "instance_claim" :: held 0.526s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:42:04.970 DEBUG nova.compute.manager [req-f0e01d83-d325-4037-9420-dbbc1483fcc4 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Checking state _get_power_state /opt/stack/new/nova/nova/compute/manager.py:1296 2015-08-07 
17:42:05.018 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.95 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:42:05.164 DEBUG oslo_concurrency.lockutils [req-f0e01d83-d325-4037-9420-dbbc1483fcc4 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "aa0c0819-08a6-4a79-93eb-f58885697f5b" released by "do_stop_instance" :: held 2.747s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:42:05.206 DEBUG oslo_concurrency.lockutils [req-8d00a32a-9532-4621-a0a9-4cb87d0c98e3 tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] Lock "compute_resources" acquired by "update_usage" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:42:05.272 DEBUG oslo_concurrency.lockutils [req-8d00a32a-9532-4621-a0a9-4cb87d0c98e3 tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] Lock "compute_resources" released by "update_usage" :: held 0.066s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:42:05.273 DEBUG nova.compute.utils [req-8d00a32a-9532-4621-a0a9-4cb87d0c98e3 tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] Using /dev/xvd instead of None get_next_device_name /opt/stack/new/nova/nova/compute/utils.py:163 2015-08-07 17:42:05.280 13318 DEBUG nova.compute.manager [-] [instance: 7aa9beb6-cabe-4ae3-a172-162c757bd718] Allocating IP information in the background. _allocate_network_async /opt/stack/new/nova/nova/compute/manager.py:1513 2015-08-07 17:42:05.281 13318 DEBUG oslo_concurrency.lockutils [-] Acquired semaphore "refresh_cache-7aa9beb6-cabe-4ae3-a172-162c757bd718" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:197 2015-08-07 17:42:05.706 DEBUG nova.virt.xenapi.vmops [req-8d00a32a-9532-4621-a0a9-4cb87d0c98e3 tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] [instance: 7aa9beb6-cabe-4ae3-a172-162c757bd718] Block device information present: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} spawn /opt/stack/new/nova/nova/virt/xenapi/vmops.py:408 2015-08-07 17:42:05.723 DEBUG nova.virt.xenapi.vm_utils [req-8d00a32a-9532-4621-a0a9-4cb87d0c98e3 tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] Detected vhd format for image a69d8d55-1745-492f-be26-cc75d64fc94d determine_disk_image_type /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1633 2015-08-07 17:42:05.724 DEBUG nova.virt.xenapi.vmops [req-8d00a32a-9532-4621-a0a9-4cb87d0c98e3 tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] [instance: 7aa9beb6-cabe-4ae3-a172-162c757bd718] Updating progress to 10 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:42:05.907 DEBUG nova.virt.xenapi.vmops [req-8d00a32a-9532-4621-a0a9-4cb87d0c98e3 tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] [instance: 7aa9beb6-cabe-4ae3-a172-162c757bd718] block device info: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} _connect_cinder_volumes /opt/stack/new/nova/nova/virt/xenapi/vmops.py:376 2015-08-07 17:42:05.918 DEBUG oslo_concurrency.lockutils 
[req-8d00a32a-9532-4621-a0a9-4cb87d0c98e3 tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" acquired by "_create_cached_image_impl" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:42:06.810 INFO nova.compute.manager [req-6b2e613c-9663-4b25-b87b-95e6932d950a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Rebuilding instance 2015-08-07 17:42:06.823 DEBUG nova.virt.xenapi.vm_utils [req-8d00a32a-9532-4621-a0a9-4cb87d0c98e3 tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] Cloned VDI OpaqueRef:c2cf3205-9bba-d402-f4b0-58665b7a66ab from VDI OpaqueRef:f392c64e-7886-a0cc-d3a0-780674167307 _clone_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:571 2015-08-07 17:42:06.885 DEBUG nova.compute.manager [req-6b2e613c-9663-4b25-b87b-95e6932d950a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Checking state _get_power_state /opt/stack/new/nova/nova/compute/manager.py:1296 2015-08-07 17:42:07.069 INFO nova.virt.xenapi.vmops [req-6b2e613c-9663-4b25-b87b-95e6932d950a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Destroying VM 2015-08-07 17:42:07.080 WARNING nova.virt.xenapi.vm_utils [req-6b2e613c-9663-4b25-b87b-95e6932d950a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] VM already halted, skipping shutdown... 2015-08-07 17:42:07.094 DEBUG nova.virt.xenapi.vmops [req-6b2e613c-9663-4b25-b87b-95e6932d950a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Destroying VDIs _destroy_vdis /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1428 2015-08-07 17:42:07.110 DEBUG nova.virt.xenapi.vm_utils [req-6b2e613c-9663-4b25-b87b-95e6932d950a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] VDI 333a4dac-64c1-4b58-8e5c-f4d87f85fa42 is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:42:07.119 DEBUG nova.virt.xenapi.vm_utils [req-6b2e613c-9663-4b25-b87b-95e6932d950a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] VDI a1b2cdf8-88e8-4dcf-af69-ff470412174e is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:42:07.170 13318 DEBUG nova.network.base_api [-] [instance: 7aa9beb6-cabe-4ae3-a172-162c757bd718] Updating instance_info_cache with network_info: [VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.8'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': 
None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:32:24:14', 'active': False, 'type': u'bridge', 'id': u'a4d0cab5-7166-47d3-95ce-8f90dde7ad9e', 'qbg_params': None})] update_instance_cache_with_nw_info /opt/stack/new/nova/nova/network/base_api.py:43 2015-08-07 17:42:07.204 13318 DEBUG oslo_concurrency.lockutils [-] Releasing semaphore "refresh_cache-7aa9beb6-cabe-4ae3-a172-162c757bd718" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:210 2015-08-07 17:42:07.205 13318 DEBUG nova.compute.manager [-] [instance: 7aa9beb6-cabe-4ae3-a172-162c757bd718] Instance network_info: |[VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.8'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:32:24:14', 'active': False, 'type': u'bridge', 'id': u'a4d0cab5-7166-47d3-95ce-8f90dde7ad9e', 'qbg_params': None})]| _allocate_network_async /opt/stack/new/nova/nova/compute/manager.py:1531 2015-08-07 17:42:07.610 DEBUG oslo_concurrency.lockutils [req-8d00a32a-9532-4621-a0a9-4cb87d0c98e3 tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" released by "_create_cached_image_impl" :: held 1.692s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:42:07.611 INFO nova.virt.xenapi.vm_utils [req-8d00a32a-9532-4621-a0a9-4cb87d0c98e3 tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] Image creation data, cacheable: True, downloaded: False duration: 1.70 secs for image a69d8d55-1745-492f-be26-cc75d64fc94d 2015-08-07 17:42:07.823 DEBUG nova.virt.xenapi.vmops [req-6b2e613c-9663-4b25-b87b-95e6932d950a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Using RAW or VHD, skipping kernel and ramdisk deletion _destroy_kernel_ramdisk /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1456 2015-08-07 17:42:07.837 DEBUG nova.virt.xenapi.vm_utils [req-6b2e613c-9663-4b25-b87b-95e6932d950a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] VM destroyed destroy_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:324 2015-08-07 17:42:08.197 DEBUG nova.virt.xenapi.vmops [req-6b2e613c-9663-4b25-b87b-95e6932d950a tempest-ServerActionsTestJSON-2080221373 
tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Block device information present: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} spawn /opt/stack/new/nova/nova/virt/xenapi/vmops.py:408 2015-08-07 17:42:08.226 DEBUG nova.virt.xenapi.vm_utils [req-6b2e613c-9663-4b25-b87b-95e6932d950a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Detected vhd format for image a69d8d55-1745-492f-be26-cc75d64fc94d determine_disk_image_type /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1633 2015-08-07 17:42:08.227 DEBUG nova.virt.xenapi.vmops [req-6b2e613c-9663-4b25-b87b-95e6932d950a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Updating progress to 10 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:42:08.358 DEBUG nova.virt.xenapi.vmops [req-8d00a32a-9532-4621-a0a9-4cb87d0c98e3 tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] [instance: 7aa9beb6-cabe-4ae3-a172-162c757bd718] Updating progress to 20 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:42:08.426 DEBUG nova.virt.xenapi.vmops [req-6b2e613c-9663-4b25-b87b-95e6932d950a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] block device info: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} _connect_cinder_volumes /opt/stack/new/nova/nova/virt/xenapi/vmops.py:376 2015-08-07 17:42:08.438 DEBUG oslo_concurrency.lockutils [req-6b2e613c-9663-4b25-b87b-95e6932d950a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" acquired by "_create_cached_image_impl" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:42:08.564 DEBUG nova.virt.xenapi.vmops [req-8d00a32a-9532-4621-a0a9-4cb87d0c98e3 tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] [instance: 7aa9beb6-cabe-4ae3-a172-162c757bd718] Updating progress to 30 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:42:08.753 DEBUG nova.virt.xenapi.vmops [req-8d00a32a-9532-4621-a0a9-4cb87d0c98e3 tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] [instance: 7aa9beb6-cabe-4ae3-a172-162c757bd718] Using PV kernel: True _create_vm_record /opt/stack/new/nova/nova/virt/xenapi/vmops.py:679 2015-08-07 17:42:08.766 DEBUG nova.virt.xenapi.vm_utils [req-8d00a32a-9532-4621-a0a9-4cb87d0c98e3 tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] [instance: 7aa9beb6-cabe-4ae3-a172-162c757bd718] Created VM create_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:312 2015-08-07 17:42:08.766 DEBUG nova.virt.xenapi.vmops [req-8d00a32a-9532-4621-a0a9-4cb87d0c98e3 tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] [instance: 7aa9beb6-cabe-4ae3-a172-162c757bd718] Updating progress to 40 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:42:08.980 DEBUG nova.virt.xenapi.vm_utils [req-8d00a32a-9532-4621-a0a9-4cb87d0c98e3 tempest-ListServersNegativeTestJSON-1486407346 
tempest-ListServersNegativeTestJSON-2125822721] Creating disk-type VBD for VM OpaqueRef:c1a355b0-ba48-a0dd-c995-7b7f0cdf79a9, VDI OpaqueRef:c2cf3205-9bba-d402-f4b0-58665b7a66ab ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:42:08.987 DEBUG nova.virt.xenapi.vm_utils [req-8d00a32a-9532-4621-a0a9-4cb87d0c98e3 tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] Created VBD OpaqueRef:aac1a30c-82e4-346f-054e-703b042bf946 for VM OpaqueRef:c1a355b0-ba48-a0dd-c995-7b7f0cdf79a9, VDI OpaqueRef:c2cf3205-9bba-d402-f4b0-58665b7a66ab. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:42:09.461 DEBUG nova.virt.xenapi.vm_utils [req-6b2e613c-9663-4b25-b87b-95e6932d950a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Cloned VDI OpaqueRef:7106ad5e-039f-0608-5206-bd5d5371ba95 from VDI OpaqueRef:f392c64e-7886-a0cc-d3a0-780674167307 _clone_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:571 2015-08-07 17:42:09.477 DEBUG nova.virt.xenapi.vmops [req-f81aa024-321e-4f7d-b859-f01359aa8db4 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] [instance: 31a2fd34-bbcb-4b50-83e0-dc6c7369b479] Waiting for instance state to become running _wait_for_instance_to_start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:777 2015-08-07 17:42:09.503 DEBUG nova.virt.xenapi.vmops [req-f81aa024-321e-4f7d-b859-f01359aa8db4 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] [instance: 31a2fd34-bbcb-4b50-83e0-dc6c7369b479] Updating progress to 82 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:42:09.667 DEBUG nova.virt.xenapi.vmops [req-f81aa024-321e-4f7d-b859-f01359aa8db4 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] [instance: 31a2fd34-bbcb-4b50-83e0-dc6c7369b479] Skip agent setup, not enabled. 
_configure_new_instance_with_agent /opt/stack/new/nova/nova/virt/xenapi/vmops.py:788 2015-08-07 17:42:09.667 DEBUG nova.virt.xenapi.vmops [req-f81aa024-321e-4f7d-b859-f01359aa8db4 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] [instance: 31a2fd34-bbcb-4b50-83e0-dc6c7369b479] Removing hostname from xenstore _remove_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1949 2015-08-07 17:42:09.668 DEBUG oslo_concurrency.lockutils [req-f81aa024-321e-4f7d-b859-f01359aa8db4 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] Lock "xenstore-31a2fd34-bbcb-4b50-83e0-dc6c7369b479" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:42:09.673 DEBUG oslo_concurrency.lockutils [req-f81aa024-321e-4f7d-b859-f01359aa8db4 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] Lock "xenstore-31a2fd34-bbcb-4b50-83e0-dc6c7369b479" released by "update_hostname" :: held 0.005s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:42:09.674 DEBUG nova.virt.xenapi.vmops [req-f81aa024-321e-4f7d-b859-f01359aa8db4 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] [instance: 31a2fd34-bbcb-4b50-83e0-dc6c7369b479] Updating progress to 91 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:42:09.836 DEBUG nova.virt.xenapi.vmops [req-f81aa024-321e-4f7d-b859-f01359aa8db4 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] [instance: 31a2fd34-bbcb-4b50-83e0-dc6c7369b479] Updating progress to 100 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:42:10.078 DEBUG nova.compute.manager [req-f81aa024-321e-4f7d-b859-f01359aa8db4 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] [instance: 31a2fd34-bbcb-4b50-83e0-dc6c7369b479] Checking state _get_power_state /opt/stack/new/nova/nova/compute/manager.py:1296 2015-08-07 17:42:10.166 DEBUG oslo_concurrency.lockutils [req-6b2e613c-9663-4b25-b87b-95e6932d950a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" released by "_create_cached_image_impl" :: held 1.728s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:42:10.167 INFO nova.virt.xenapi.vm_utils [req-6b2e613c-9663-4b25-b87b-95e6932d950a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Image creation data, cacheable: True, downloaded: False duration: 1.74 secs for image a69d8d55-1745-492f-be26-cc75d64fc94d 2015-08-07 17:42:10.333 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:42:10.335 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 30.18 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:42:10.726 DEBUG nova.virt.xenapi.vmops [req-6b2e613c-9663-4b25-b87b-95e6932d950a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Updating progress to 20 _update_instance_progress 
/opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:42:10.871 DEBUG nova.virt.xenapi.vmops [req-6b2e613c-9663-4b25-b87b-95e6932d950a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Updating progress to 30 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:42:11.019 DEBUG nova.virt.xenapi.vmops [req-6b2e613c-9663-4b25-b87b-95e6932d950a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Using PV kernel: True _create_vm_record /opt/stack/new/nova/nova/virt/xenapi/vmops.py:679 2015-08-07 17:42:11.030 DEBUG nova.virt.xenapi.vm_utils [req-6b2e613c-9663-4b25-b87b-95e6932d950a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Created VM create_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:312 2015-08-07 17:42:11.031 DEBUG nova.virt.xenapi.vmops [req-6b2e613c-9663-4b25-b87b-95e6932d950a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Updating progress to 40 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:42:11.201 DEBUG nova.virt.xenapi.vm_utils [req-6b2e613c-9663-4b25-b87b-95e6932d950a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Creating disk-type VBD for VM OpaqueRef:155ea89d-d45a-2717-1641-ba07c22f14b5, VDI OpaqueRef:7106ad5e-039f-0608-5206-bd5d5371ba95 ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:42:11.211 DEBUG nova.virt.xenapi.vm_utils [req-6b2e613c-9663-4b25-b87b-95e6932d950a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Created VBD OpaqueRef:270b3d5c-6034-7468-e8aa-61fb6a7ffe21 for VM OpaqueRef:155ea89d-d45a-2717-1641-ba07c22f14b5, VDI OpaqueRef:7106ad5e-039f-0608-5206-bd5d5371ba95. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:42:11.472 DEBUG nova.virt.xenapi.vm_utils [req-6b2e613c-9663-4b25-b87b-95e6932d950a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Created VDI OpaqueRef:a3b37504-bc0f-7606-0712-24e90b26be40 (config-2, 67108864, False) on OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99. create_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:509 2015-08-07 17:42:11.475 DEBUG nova.virt.xenapi.vm_utils [req-6b2e613c-9663-4b25-b87b-95e6932d950a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Creating disk-type VBD for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:a3b37504-bc0f-7606-0712-24e90b26be40 ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:42:11.483 DEBUG nova.virt.xenapi.vm_utils [req-6b2e613c-9663-4b25-b87b-95e6932d950a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Created VBD OpaqueRef:4cadc2fc-3200-2996-472e-4436593eff07 for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:a3b37504-bc0f-7606-0712-24e90b26be40. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:42:11.483 DEBUG nova.virt.xenapi.vm_utils [req-6b2e613c-9663-4b25-b87b-95e6932d950a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Plugging VBD OpaqueRef:4cadc2fc-3200-2996-472e-4436593eff07 ... 
vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2191 2015-08-07 17:42:11.484 DEBUG oslo_concurrency.lockutils [req-6b2e613c-9663-4b25-b87b-95e6932d950a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_plug" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:42:12.600 DEBUG oslo_concurrency.lockutils [req-6b2e613c-9663-4b25-b87b-95e6932d950a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_plug" :: held 1.116s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:42:12.601 DEBUG nova.virt.xenapi.vm_utils [req-6b2e613c-9663-4b25-b87b-95e6932d950a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Plugging VBD OpaqueRef:4cadc2fc-3200-2996-472e-4436593eff07 done. vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2194 2015-08-07 17:42:12.605 DEBUG nova.virt.xenapi.vm_utils [req-6b2e613c-9663-4b25-b87b-95e6932d950a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] VBD OpaqueRef:4cadc2fc-3200-2996-472e-4436593eff07 plugged as xvdc vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2197 2015-08-07 17:42:12.676 WARNING nova.virt.configdrive [req-6b2e613c-9663-4b25-b87b-95e6932d950a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] The setting "always" will be deprecated in the Liberty version. Please use "True" instead 2015-08-07 17:42:12.677 DEBUG nova.objects.instance [req-6b2e613c-9663-4b25-b87b-95e6932d950a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lazy-loading `ec2_ids' on Instance uuid aa0c0819-08a6-4a79-93eb-f58885697f5b obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:42:12.703 DEBUG oslo_concurrency.processutils [req-6b2e613c-9663-4b25-b87b-95e6932d950a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Running cmd (subprocess): genisoimage -o /tmp/tmp_ya6GQ/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmpRsNI_z execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:42:12.792 DEBUG oslo_concurrency.processutils [req-6b2e613c-9663-4b25-b87b-95e6932d950a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] CMD "genisoimage -o /tmp/tmp_ya6GQ/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmpRsNI_z" returned: 0 in 0.089s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:42:12.799 DEBUG oslo_concurrency.processutils [req-6b2e613c-9663-4b25-b87b-95e6932d950a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmp_ya6GQ/configdrive of=/dev/xvdc oflag=direct,sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:42:14.781 INFO nova.compute.manager [req-b2ffa828-6b3c-434b-bd85-20f1f2dda251 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] [instance: 
31a2fd34-bbcb-4b50-83e0-dc6c7369b479] Unrescuing 2015-08-07 17:42:14.784 DEBUG oslo_concurrency.lockutils [req-b2ffa828-6b3c-434b-bd85-20f1f2dda251 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] Acquired semaphore "refresh_cache-31a2fd34-bbcb-4b50-83e0-dc6c7369b479" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:197 2015-08-07 17:42:15.097 DEBUG nova.network.base_api [req-b2ffa828-6b3c-434b-bd85-20f1f2dda251 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] [instance: 31a2fd34-bbcb-4b50-83e0-dc6c7369b479] Updating instance_info_cache with network_info: [VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.5'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:24:e9:e2', 'active': False, 'type': u'bridge', 'id': u'dd289fd2-27ae-4c1b-8c34-99f36829faef', 'qbg_params': None})] update_instance_cache_with_nw_info /opt/stack/new/nova/nova/network/base_api.py:43 2015-08-07 17:42:15.103 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.87 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:42:15.128 DEBUG oslo_concurrency.lockutils [req-b2ffa828-6b3c-434b-bd85-20f1f2dda251 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] Releasing semaphore "refresh_cache-31a2fd34-bbcb-4b50-83e0-dc6c7369b479" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:210 2015-08-07 17:42:16.270 DEBUG oslo_concurrency.processutils [req-6b2e613c-9663-4b25-b87b-95e6932d950a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmp_ya6GQ/configdrive of=/dev/xvdc oflag=direct,sync" returned: 0 in 3.472s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:42:16.272 DEBUG oslo_concurrency.processutils [req-6b2e613c-9663-4b25-b87b-95e6932d950a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:42:16.641 DEBUG oslo_concurrency.processutils [req-6b2e613c-9663-4b25-b87b-95e6932d950a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf sync" returned: 0 in 0.369s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:42:16.645 DEBUG nova.virt.xenapi.vm_utils 
[req-6b2e613c-9663-4b25-b87b-95e6932d950a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Destroying VBD for VDI OpaqueRef:a3b37504-bc0f-7606-0712-24e90b26be40 ... vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2207 2015-08-07 17:42:16.647 DEBUG oslo_concurrency.lockutils [req-6b2e613c-9663-4b25-b87b-95e6932d950a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_unplug" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:42:17.430 DEBUG oslo_concurrency.lockutils [req-6b2e613c-9663-4b25-b87b-95e6932d950a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_unplug" :: held 0.783s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:42:17.438 DEBUG nova.virt.xenapi.vm_utils [req-6b2e613c-9663-4b25-b87b-95e6932d950a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Destroying VBD for VDI OpaqueRef:a3b37504-bc0f-7606-0712-24e90b26be40 done. vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2215 2015-08-07 17:42:17.439 DEBUG nova.virt.xenapi.vm_utils [req-6b2e613c-9663-4b25-b87b-95e6932d950a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Creating disk-type VBD for VM OpaqueRef:155ea89d-d45a-2717-1641-ba07c22f14b5, VDI OpaqueRef:a3b37504-bc0f-7606-0712-24e90b26be40 ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:42:17.449 DEBUG nova.virt.xenapi.vm_utils [req-6b2e613c-9663-4b25-b87b-95e6932d950a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Created VBD OpaqueRef:322fdd17-0b3d-cc6e-47b5-f917368ee338 for VM OpaqueRef:155ea89d-d45a-2717-1641-ba07c22f14b5, VDI OpaqueRef:a3b37504-bc0f-7606-0712-24e90b26be40. 
create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:42:17.451 DEBUG nova.objects.instance [req-6b2e613c-9663-4b25-b87b-95e6932d950a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lazy-loading `pci_devices' on Instance uuid aa0c0819-08a6-4a79-93eb-f58885697f5b obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:42:17.578 DEBUG nova.virt.xenapi.vmops [req-6b2e613c-9663-4b25-b87b-95e6932d950a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Updating progress to 50 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:42:17.791 DEBUG oslo_concurrency.lockutils [req-6b2e613c-9663-4b25-b87b-95e6932d950a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "xenstore-aa0c0819-08a6-4a79-93eb-f58885697f5b" acquired by "store_meta" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:42:17.802 DEBUG oslo_concurrency.lockutils [req-6b2e613c-9663-4b25-b87b-95e6932d950a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "xenstore-aa0c0819-08a6-4a79-93eb-f58885697f5b" released by "store_meta" :: held 0.010s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:42:17.803 DEBUG oslo_concurrency.lockutils [req-6b2e613c-9663-4b25-b87b-95e6932d950a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "xenstore-aa0c0819-08a6-4a79-93eb-f58885697f5b" acquired by "store_auto_disk_config" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:42:17.825 DEBUG oslo_concurrency.lockutils [req-6b2e613c-9663-4b25-b87b-95e6932d950a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "xenstore-aa0c0819-08a6-4a79-93eb-f58885697f5b" released by "store_auto_disk_config" :: held 0.023s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:42:17.826 DEBUG nova.virt.xenapi.vmops [req-6b2e613c-9663-4b25-b87b-95e6932d950a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Injecting hostname (tempest.common.compute-instance-822491913) into xenstore _inject_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1940 2015-08-07 17:42:17.828 DEBUG oslo_concurrency.lockutils [req-6b2e613c-9663-4b25-b87b-95e6932d950a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "xenstore-aa0c0819-08a6-4a79-93eb-f58885697f5b" acquired by "update_hostname" :: waited 0.001s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:42:17.838 DEBUG oslo_concurrency.lockutils [req-6b2e613c-9663-4b25-b87b-95e6932d950a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "xenstore-aa0c0819-08a6-4a79-93eb-f58885697f5b" released by "update_hostname" :: held 0.010s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:42:17.840 DEBUG nova.virt.xenapi.vmops [req-6b2e613c-9663-4b25-b87b-95e6932d950a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Injecting network info to xenstore inject_network_info 
/opt/stack/new/nova/nova/virt/xenapi/vmops.py:1869 2015-08-07 17:42:17.840 DEBUG oslo_concurrency.lockutils [req-6b2e613c-9663-4b25-b87b-95e6932d950a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "xenstore-aa0c0819-08a6-4a79-93eb-f58885697f5b" acquired by "update_nwinfo" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:42:18.018 DEBUG oslo_concurrency.lockutils [req-6b2e613c-9663-4b25-b87b-95e6932d950a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "xenstore-aa0c0819-08a6-4a79-93eb-f58885697f5b" released by "update_nwinfo" :: held 0.178s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:42:18.019 DEBUG nova.virt.xenapi.vmops [req-6b2e613c-9663-4b25-b87b-95e6932d950a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Updating progress to 60 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:42:18.176 DEBUG nova.virt.xenapi.vmops [req-6b2e613c-9663-4b25-b87b-95e6932d950a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Creating vifs _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1892 2015-08-07 17:42:18.184 DEBUG nova.virt.xenapi.vmops [req-6b2e613c-9663-4b25-b87b-95e6932d950a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Creating VIF for network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1902 2015-08-07 17:42:18.194 DEBUG nova.virt.xenapi.vmops [req-6b2e613c-9663-4b25-b87b-95e6932d950a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Created VIF OpaqueRef:4a057216-60ea-5d66-945c-55b506411343, network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1906 2015-08-07 17:42:18.194 DEBUG nova.virt.xenapi.vmops [req-6b2e613c-9663-4b25-b87b-95e6932d950a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Updating progress to 70 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:42:18.308 DEBUG nova.virt.xenapi.vm_utils [req-b2ffa828-6b3c-434b-bd85-20f1f2dda251 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] VDI 811706a1-c927-4407-85ac-11aeb5cdf957 is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:42:18.318 DEBUG nova.virt.xenapi.vm_utils [req-b2ffa828-6b3c-434b-bd85-20f1f2dda251 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] VDI bd881f8d-1004-4f6d-b618-10266abbaf61 is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:42:18.326 DEBUG nova.virt.xenapi.vm_utils [req-b2ffa828-6b3c-434b-bd85-20f1f2dda251 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] VDI f2a8528d-921a-476d-9e2a-7625f31d91f7 is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:42:18.364 DEBUG nova.virt.xenapi.vmops [req-6b2e613c-9663-4b25-b87b-95e6932d950a tempest-ServerActionsTestJSON-2080221373 
tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Starting instance _start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:327 2015-08-07 17:42:18.978 DEBUG nova.virt.xenapi.vmops [req-b2ffa828-6b3c-434b-bd85-20f1f2dda251 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] [instance: 31a2fd34-bbcb-4b50-83e0-dc6c7369b479] Starting instance _start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:327 2015-08-07 17:42:23.826 DEBUG nova.virt.xenapi.vmops [req-6b2e613c-9663-4b25-b87b-95e6932d950a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Waiting for instance state to become running _wait_for_instance_to_start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:777 2015-08-07 17:42:23.840 DEBUG nova.virt.xenapi.vmops [req-6b2e613c-9663-4b25-b87b-95e6932d950a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Updating progress to 80 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:42:24.007 DEBUG nova.virt.xenapi.vmops [req-6b2e613c-9663-4b25-b87b-95e6932d950a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Skip agent setup, not enabled. _configure_new_instance_with_agent /opt/stack/new/nova/nova/virt/xenapi/vmops.py:788 2015-08-07 17:42:24.008 DEBUG nova.virt.xenapi.vmops [req-6b2e613c-9663-4b25-b87b-95e6932d950a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Removing hostname from xenstore _remove_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1949 2015-08-07 17:42:24.008 DEBUG oslo_concurrency.lockutils [req-6b2e613c-9663-4b25-b87b-95e6932d950a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "xenstore-aa0c0819-08a6-4a79-93eb-f58885697f5b" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:42:24.012 DEBUG oslo_concurrency.lockutils [req-6b2e613c-9663-4b25-b87b-95e6932d950a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "xenstore-aa0c0819-08a6-4a79-93eb-f58885697f5b" released by "update_hostname" :: held 0.004s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:42:24.013 DEBUG nova.virt.xenapi.vmops [req-6b2e613c-9663-4b25-b87b-95e6932d950a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Updating progress to 90 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:42:24.191 DEBUG nova.virt.xenapi.vmops [req-6b2e613c-9663-4b25-b87b-95e6932d950a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Updating progress to 100 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:42:24.372 DEBUG nova.compute.manager [req-6b2e613c-9663-4b25-b87b-95e6932d950a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Checking state _get_power_state /opt/stack/new/nova/nova/compute/manager.py:1296 2015-08-07 17:42:24.583 INFO nova.compute.manager 
[req-6b2e613c-9663-4b25-b87b-95e6932d950a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] bringing vm to original state: 'stopped' 2015-08-07 17:42:24.591 DEBUG nova.compute.manager [req-b2ffa828-6b3c-434b-bd85-20f1f2dda251 tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] [instance: 31a2fd34-bbcb-4b50-83e0-dc6c7369b479] Checking state _get_power_state /opt/stack/new/nova/nova/compute/manager.py:1296 2015-08-07 17:42:24.777 DEBUG oslo_concurrency.lockutils [req-6b2e613c-9663-4b25-b87b-95e6932d950a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "aa0c0819-08a6-4a79-93eb-f58885697f5b" acquired by "do_stop_instance" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:42:24.778 DEBUG nova.compute.manager [req-6b2e613c-9663-4b25-b87b-95e6932d950a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Checking state _get_power_state /opt/stack/new/nova/nova/compute/manager.py:1296 2015-08-07 17:42:24.791 DEBUG nova.compute.manager [req-6b2e613c-9663-4b25-b87b-95e6932d950a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Stopping instance; current vm_state: active, current task_state: powering-off, current DB power_state: 1, current VM power_state: 1 do_stop_instance /opt/stack/new/nova/nova/compute/manager.py:2404 2015-08-07 17:42:24.799 DEBUG nova.virt.xenapi.vm_utils [req-6b2e613c-9663-4b25-b87b-95e6932d950a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Shutting down VM (hard) hard_shutdown_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:348 2015-08-07 17:42:25.008 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.96 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:42:26.601 DEBUG oslo_concurrency.lockutils [req-c968bccc-8b4f-4b25-a4be-e08f23c845fc tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] Lock "31a2fd34-bbcb-4b50-83e0-dc6c7369b479" acquired by "do_terminate_instance" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:42:26.602 DEBUG oslo_concurrency.lockutils [req-c968bccc-8b4f-4b25-a4be-e08f23c845fc tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] Lock "31a2fd34-bbcb-4b50-83e0-dc6c7369b479-events" acquired by "_clear_events" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:42:26.602 DEBUG oslo_concurrency.lockutils [req-c968bccc-8b4f-4b25-a4be-e08f23c845fc tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] Lock "31a2fd34-bbcb-4b50-83e0-dc6c7369b479-events" released by "_clear_events" :: held 0.001s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:42:26.605 INFO nova.compute.manager [req-c968bccc-8b4f-4b25-a4be-e08f23c845fc tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] [instance: 31a2fd34-bbcb-4b50-83e0-dc6c7369b479] Terminating instance 2015-08-07 17:42:26.607 INFO nova.virt.xenapi.vmops [req-c968bccc-8b4f-4b25-a4be-e08f23c845fc 
tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] [instance: 31a2fd34-bbcb-4b50-83e0-dc6c7369b479] Destroying VM 2015-08-07 17:42:26.616 DEBUG nova.virt.xenapi.vm_utils [req-c968bccc-8b4f-4b25-a4be-e08f23c845fc tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] [instance: 31a2fd34-bbcb-4b50-83e0-dc6c7369b479] Shutting down VM (hard) hard_shutdown_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:348 2015-08-07 17:42:26.899 DEBUG nova.compute.manager [req-6b2e613c-9663-4b25-b87b-95e6932d950a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Checking state _get_power_state /opt/stack/new/nova/nova/compute/manager.py:1296 2015-08-07 17:42:27.145 DEBUG oslo_concurrency.lockutils [req-6b2e613c-9663-4b25-b87b-95e6932d950a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "aa0c0819-08a6-4a79-93eb-f58885697f5b" released by "do_stop_instance" :: held 2.367s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:42:28.073 DEBUG oslo_concurrency.lockutils [req-7918f3b8-6292-4637-ab6c-229a3341dd90 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Acquired semaphore "refresh_cache-aa0c0819-08a6-4a79-93eb-f58885697f5b" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:197 2015-08-07 17:42:28.219 DEBUG nova.network.base_api [req-7918f3b8-6292-4637-ab6c-229a3341dd90 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Updating instance_info_cache with network_info: [VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.2'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:d0:30:ae', 'active': False, 'type': u'bridge', 'id': u'f50b3fec-3100-4377-bded-e2cd8fa38482', 'qbg_params': None})] update_instance_cache_with_nw_info /opt/stack/new/nova/nova/network/base_api.py:43 2015-08-07 17:42:28.243 DEBUG oslo_concurrency.lockutils [req-7918f3b8-6292-4637-ab6c-229a3341dd90 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Releasing semaphore "refresh_cache-aa0c0819-08a6-4a79-93eb-f58885697f5b" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:210 2015-08-07 17:42:28.275 DEBUG nova.virt.xenapi.vmops [req-7918f3b8-6292-4637-ab6c-229a3341dd90 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Starting instance _start 
/opt/stack/new/nova/nova/virt/xenapi/vmops.py:327 2015-08-07 17:42:28.484 DEBUG nova.virt.xenapi.vm_utils [req-8d00a32a-9532-4621-a0a9-4cb87d0c98e3 tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] Created VDI OpaqueRef:674bc40f-66b5-c17a-231d-99d8caff05c9 (config-2, 67108864, False) on OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99. create_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:509 2015-08-07 17:42:28.487 DEBUG nova.virt.xenapi.vm_utils [req-8d00a32a-9532-4621-a0a9-4cb87d0c98e3 tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] Creating disk-type VBD for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:674bc40f-66b5-c17a-231d-99d8caff05c9 ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:42:28.497 DEBUG nova.virt.xenapi.vm_utils [req-8d00a32a-9532-4621-a0a9-4cb87d0c98e3 tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] Created VBD OpaqueRef:212cb9bd-083a-cc86-a53b-cc56a7a6ffcc for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:674bc40f-66b5-c17a-231d-99d8caff05c9. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:42:28.498 DEBUG nova.virt.xenapi.vm_utils [req-8d00a32a-9532-4621-a0a9-4cb87d0c98e3 tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] Plugging VBD OpaqueRef:212cb9bd-083a-cc86-a53b-cc56a7a6ffcc ... vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2191 2015-08-07 17:42:28.498 DEBUG oslo_concurrency.lockutils [req-8d00a32a-9532-4621-a0a9-4cb87d0c98e3 tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_plug" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:42:28.710 DEBUG nova.virt.xenapi.vmops [req-c968bccc-8b4f-4b25-a4be-e08f23c845fc tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] [instance: 31a2fd34-bbcb-4b50-83e0-dc6c7369b479] Destroying VDIs _destroy_vdis /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1428 2015-08-07 17:42:28.719 DEBUG nova.virt.xenapi.vm_utils [req-c968bccc-8b4f-4b25-a4be-e08f23c845fc tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] VDI f2a8528d-921a-476d-9e2a-7625f31d91f7 is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:42:28.725 DEBUG nova.virt.xenapi.vm_utils [req-c968bccc-8b4f-4b25-a4be-e08f23c845fc tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] VDI 9de9040a-bacc-47f6-ab32-59126201ee9f is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:42:29.519 DEBUG nova.virt.xenapi.vmops [req-c968bccc-8b4f-4b25-a4be-e08f23c845fc tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] [instance: 31a2fd34-bbcb-4b50-83e0-dc6c7369b479] Using RAW or VHD, skipping kernel and ramdisk deletion _destroy_kernel_ramdisk /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1456 2015-08-07 17:42:29.536 DEBUG nova.virt.xenapi.vm_utils [req-c968bccc-8b4f-4b25-a4be-e08f23c845fc tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] [instance: 31a2fd34-bbcb-4b50-83e0-dc6c7369b479] VM destroyed destroy_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:324 
2015-08-07 17:42:29.537 DEBUG nova.compute.manager [req-c968bccc-8b4f-4b25-a4be-e08f23c845fc tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] [instance: 31a2fd34-bbcb-4b50-83e0-dc6c7369b479] Deallocating network for instance _deallocate_network /opt/stack/new/nova/nova/compute/manager.py:1792 2015-08-07 17:42:29.790 DEBUG oslo_concurrency.lockutils [req-8d00a32a-9532-4621-a0a9-4cb87d0c98e3 tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_plug" :: held 1.292s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:42:29.791 DEBUG nova.virt.xenapi.vm_utils [req-8d00a32a-9532-4621-a0a9-4cb87d0c98e3 tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] Plugging VBD OpaqueRef:212cb9bd-083a-cc86-a53b-cc56a7a6ffcc done. vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2194 2015-08-07 17:42:29.793 DEBUG nova.virt.xenapi.vm_utils [req-8d00a32a-9532-4621-a0a9-4cb87d0c98e3 tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] VBD OpaqueRef:212cb9bd-083a-cc86-a53b-cc56a7a6ffcc plugged as xvdc vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2197 2015-08-07 17:42:29.865 WARNING nova.virt.configdrive [req-8d00a32a-9532-4621-a0a9-4cb87d0c98e3 tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] The setting "always" will be deprecated in the Liberty version. Please use "True" instead 2015-08-07 17:42:29.874 DEBUG nova.objects.instance [req-8d00a32a-9532-4621-a0a9-4cb87d0c98e3 tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] Lazy-loading `ec2_ids' on Instance uuid 7aa9beb6-cabe-4ae3-a172-162c757bd718 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:42:29.906 DEBUG oslo_concurrency.processutils [req-8d00a32a-9532-4621-a0a9-4cb87d0c98e3 tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] Running cmd (subprocess): genisoimage -o /tmp/tmp2TeqT0/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmpapvn68 execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:42:30.002 DEBUG oslo_concurrency.processutils [req-8d00a32a-9532-4621-a0a9-4cb87d0c98e3 tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] CMD "genisoimage -o /tmp/tmp2TeqT0/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmpapvn68" returned: 0 in 0.096s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:42:30.009 DEBUG oslo_concurrency.processutils [req-8d00a32a-9532-4621-a0a9-4cb87d0c98e3 tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmp2TeqT0/configdrive of=/dev/xvdc oflag=direct,sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:42:31.347 DEBUG nova.compute.manager [req-c968bccc-8b4f-4b25-a4be-e08f23c845fc tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] [instance: 31a2fd34-bbcb-4b50-83e0-dc6c7369b479] 
terminating bdm BlockDeviceMapping(boot_index=0,connection_info=None,created_at=2015-08-07T17:38:55Z,delete_on_termination=True,deleted=False,deleted_at=None,destination_type='local',device_name='/dev/xvda',device_type='disk',disk_bus=None,guest_format=None,id=52,image_id='a69d8d55-1745-492f-be26-cc75d64fc94d',instance=,instance_uuid=31a2fd34-bbcb-4b50-83e0-dc6c7369b479,no_device=False,snapshot_id=None,source_type='image',updated_at=2015-08-07T17:38:57Z,volume_id=None,volume_size=None) _cleanup_volumes /opt/stack/new/nova/nova/compute/manager.py:2272 2015-08-07 17:42:31.587 DEBUG oslo_concurrency.lockutils [req-c968bccc-8b4f-4b25-a4be-e08f23c845fc tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] Lock "compute_resources" acquired by "update_usage" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:42:31.588 DEBUG nova.objects.instance [req-c968bccc-8b4f-4b25-a4be-e08f23c845fc tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] Lazy-loading `numa_topology' on Instance uuid 31a2fd34-bbcb-4b50-83e0-dc6c7369b479 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:42:31.799 DEBUG oslo_concurrency.lockutils [req-c968bccc-8b4f-4b25-a4be-e08f23c845fc tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] Lock "compute_resources" released by "update_usage" :: held 0.212s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:42:32.376 DEBUG oslo_concurrency.lockutils [req-c968bccc-8b4f-4b25-a4be-e08f23c845fc tempest-ServerRescueTestJSON-507427870 tempest-ServerRescueTestJSON-897979640] Lock "31a2fd34-bbcb-4b50-83e0-dc6c7369b479" released by "do_terminate_instance" :: held 5.776s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:42:33.925 DEBUG oslo_concurrency.processutils [req-8d00a32a-9532-4621-a0a9-4cb87d0c98e3 tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmp2TeqT0/configdrive of=/dev/xvdc oflag=direct,sync" returned: 0 in 3.917s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:42:33.927 DEBUG oslo_concurrency.processutils [req-8d00a32a-9532-4621-a0a9-4cb87d0c98e3 tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:42:34.200 DEBUG oslo_concurrency.processutils [req-8d00a32a-9532-4621-a0a9-4cb87d0c98e3 tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf sync" returned: 0 in 0.273s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:42:34.204 DEBUG nova.virt.xenapi.vm_utils [req-8d00a32a-9532-4621-a0a9-4cb87d0c98e3 tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] Destroying VBD for VDI OpaqueRef:674bc40f-66b5-c17a-231d-99d8caff05c9 ... 
vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2207 2015-08-07 17:42:34.206 DEBUG oslo_concurrency.lockutils [req-8d00a32a-9532-4621-a0a9-4cb87d0c98e3 tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_unplug" :: waited 0.001s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:42:34.617 DEBUG nova.compute.manager [req-7918f3b8-6292-4637-ab6c-229a3341dd90 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Checking state _get_power_state /opt/stack/new/nova/nova/compute/manager.py:1296 2015-08-07 17:42:35.027 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.94 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:42:35.087 DEBUG oslo_concurrency.lockutils [req-8d00a32a-9532-4621-a0a9-4cb87d0c98e3 tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_unplug" :: held 0.882s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:42:35.100 DEBUG nova.virt.xenapi.vm_utils [req-8d00a32a-9532-4621-a0a9-4cb87d0c98e3 tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] Destroying VBD for VDI OpaqueRef:674bc40f-66b5-c17a-231d-99d8caff05c9 done. vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2215 2015-08-07 17:42:35.100 DEBUG nova.virt.xenapi.vm_utils [req-8d00a32a-9532-4621-a0a9-4cb87d0c98e3 tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] Creating disk-type VBD for VM OpaqueRef:c1a355b0-ba48-a0dd-c995-7b7f0cdf79a9, VDI OpaqueRef:674bc40f-66b5-c17a-231d-99d8caff05c9 ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:42:35.107 DEBUG nova.virt.xenapi.vm_utils [req-8d00a32a-9532-4621-a0a9-4cb87d0c98e3 tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] Created VBD OpaqueRef:7f7f8005-e8bf-5eca-ec6a-aaa2d5713eef for VM OpaqueRef:c1a355b0-ba48-a0dd-c995-7b7f0cdf79a9, VDI OpaqueRef:674bc40f-66b5-c17a-231d-99d8caff05c9. 
create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:42:35.108 DEBUG nova.objects.instance [req-8d00a32a-9532-4621-a0a9-4cb87d0c98e3 tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] Lazy-loading `pci_devices' on Instance uuid 7aa9beb6-cabe-4ae3-a172-162c757bd718 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:42:35.234 DEBUG nova.virt.xenapi.vmops [req-8d00a32a-9532-4621-a0a9-4cb87d0c98e3 tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] [instance: 7aa9beb6-cabe-4ae3-a172-162c757bd718] Updating progress to 50 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:42:35.413 DEBUG oslo_concurrency.lockutils [req-8d00a32a-9532-4621-a0a9-4cb87d0c98e3 tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] Lock "xenstore-7aa9beb6-cabe-4ae3-a172-162c757bd718" acquired by "store_meta" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:42:35.414 DEBUG oslo_concurrency.lockutils [req-8d00a32a-9532-4621-a0a9-4cb87d0c98e3 tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] Lock "xenstore-7aa9beb6-cabe-4ae3-a172-162c757bd718" released by "store_meta" :: held 0.001s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:42:35.414 DEBUG oslo_concurrency.lockutils [req-8d00a32a-9532-4621-a0a9-4cb87d0c98e3 tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] Lock "xenstore-7aa9beb6-cabe-4ae3-a172-162c757bd718" acquired by "store_auto_disk_config" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:42:35.420 DEBUG oslo_concurrency.lockutils [req-8d00a32a-9532-4621-a0a9-4cb87d0c98e3 tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] Lock "xenstore-7aa9beb6-cabe-4ae3-a172-162c757bd718" released by "store_auto_disk_config" :: held 0.006s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:42:35.420 DEBUG nova.virt.xenapi.vmops [req-8d00a32a-9532-4621-a0a9-4cb87d0c98e3 tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] [instance: 7aa9beb6-cabe-4ae3-a172-162c757bd718] Injecting hostname (tempest.common.compute-instance-312754636) into xenstore _inject_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1940 2015-08-07 17:42:35.421 DEBUG oslo_concurrency.lockutils [req-8d00a32a-9532-4621-a0a9-4cb87d0c98e3 tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] Lock "xenstore-7aa9beb6-cabe-4ae3-a172-162c757bd718" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:42:35.426 DEBUG oslo_concurrency.lockutils [req-8d00a32a-9532-4621-a0a9-4cb87d0c98e3 tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] Lock "xenstore-7aa9beb6-cabe-4ae3-a172-162c757bd718" released by "update_hostname" :: held 0.005s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:42:35.427 DEBUG nova.virt.xenapi.vmops [req-8d00a32a-9532-4621-a0a9-4cb87d0c98e3 tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] 
[instance: 7aa9beb6-cabe-4ae3-a172-162c757bd718] Injecting network info to xenstore inject_network_info /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1869 2015-08-07 17:42:35.427 DEBUG oslo_concurrency.lockutils [req-8d00a32a-9532-4621-a0a9-4cb87d0c98e3 tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] Lock "xenstore-7aa9beb6-cabe-4ae3-a172-162c757bd718" acquired by "update_nwinfo" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:42:35.594 DEBUG oslo_concurrency.lockutils [req-8d00a32a-9532-4621-a0a9-4cb87d0c98e3 tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] Lock "xenstore-7aa9beb6-cabe-4ae3-a172-162c757bd718" released by "update_nwinfo" :: held 0.166s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:42:35.595 DEBUG nova.virt.xenapi.vmops [req-8d00a32a-9532-4621-a0a9-4cb87d0c98e3 tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] [instance: 7aa9beb6-cabe-4ae3-a172-162c757bd718] Updating progress to 60 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:42:35.782 DEBUG nova.virt.xenapi.vmops [req-8d00a32a-9532-4621-a0a9-4cb87d0c98e3 tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] [instance: 7aa9beb6-cabe-4ae3-a172-162c757bd718] Creating vifs _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1892 2015-08-07 17:42:35.788 DEBUG nova.virt.xenapi.vmops [req-8d00a32a-9532-4621-a0a9-4cb87d0c98e3 tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] [instance: 7aa9beb6-cabe-4ae3-a172-162c757bd718] Creating VIF for network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1902 2015-08-07 17:42:35.794 DEBUG nova.virt.xenapi.vmops [req-8d00a32a-9532-4621-a0a9-4cb87d0c98e3 tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] [instance: 7aa9beb6-cabe-4ae3-a172-162c757bd718] Created VIF OpaqueRef:6b6337b0-e7d1-af62-bdb2-419f2fa6b9cb, network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1906 2015-08-07 17:42:35.795 DEBUG nova.virt.xenapi.vmops [req-8d00a32a-9532-4621-a0a9-4cb87d0c98e3 tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] [instance: 7aa9beb6-cabe-4ae3-a172-162c757bd718] Updating progress to 70 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:42:36.007 DEBUG nova.virt.xenapi.vmops [req-8d00a32a-9532-4621-a0a9-4cb87d0c98e3 tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] [instance: 7aa9beb6-cabe-4ae3-a172-162c757bd718] Starting instance _start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:327 2015-08-07 17:42:36.497 DEBUG nova.compute.manager [req-0a9ceeae-45f6-411e-b84e-54286c273f21 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Stashing vm_state: active _prep_resize /opt/stack/new/nova/nova/compute/manager.py:3514 2015-08-07 17:42:36.669 DEBUG oslo_concurrency.lockutils [req-0a9ceeae-45f6-411e-b84e-54286c273f21 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "compute_resources" acquired by "resize_claim" :: waited 
0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:42:36.670 DEBUG nova.compute.resource_tracker [req-0a9ceeae-45f6-411e-b84e-54286c273f21 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Memory overhead for 128 MB instance; 6 MB resize_claim /opt/stack/new/nova/nova/compute/resource_tracker.py:174 2015-08-07 17:42:36.678 INFO nova.compute.claims [req-0a9ceeae-45f6-411e-b84e-54286c273f21 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Attempting claim: memory 134 MB, disk 0 GB 2015-08-07 17:42:36.679 INFO nova.compute.claims [req-0a9ceeae-45f6-411e-b84e-54286c273f21 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Total memory: 8187 MB, used: 857.00 MB 2015-08-07 17:42:36.679 INFO nova.compute.claims [req-0a9ceeae-45f6-411e-b84e-54286c273f21 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] memory limit: 12280.50 MB, free: 11423.50 MB 2015-08-07 17:42:36.680 INFO nova.compute.claims [req-0a9ceeae-45f6-411e-b84e-54286c273f21 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Total disk: 27 GB, used: 0.00 GB 2015-08-07 17:42:36.680 INFO nova.compute.claims [req-0a9ceeae-45f6-411e-b84e-54286c273f21 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] disk limit not specified, defaulting to unlimited 2015-08-07 17:42:36.702 DEBUG nova.compute.resources.vcpu [req-0a9ceeae-45f6-411e-b84e-54286c273f21 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Total CPUs: 8 VCPUs, used: 5.00 VCPUs test /opt/stack/new/nova/nova/compute/resources/vcpu.py:52 2015-08-07 17:42:36.702 DEBUG nova.compute.resources.vcpu [req-0a9ceeae-45f6-411e-b84e-54286c273f21 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] CPUs limit not specified, defaulting to unlimited test /opt/stack/new/nova/nova/compute/resources/vcpu.py:56 2015-08-07 17:42:36.703 INFO nova.compute.claims [req-0a9ceeae-45f6-411e-b84e-54286c273f21 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Claim successful 2015-08-07 17:42:36.737 INFO nova.compute.resource_tracker [req-0a9ceeae-45f6-411e-b84e-54286c273f21 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Updating from migration aa0c0819-08a6-4a79-93eb-f58885697f5b 2015-08-07 17:42:36.821 DEBUG oslo_concurrency.lockutils [req-0a9ceeae-45f6-411e-b84e-54286c273f21 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "compute_resources" released by "resize_claim" :: held 0.152s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:42:36.822 INFO nova.compute.manager [req-0a9ceeae-45f6-411e-b84e-54286c273f21 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Migrating 2015-08-07 17:42:36.889 DEBUG oslo_concurrency.lockutils [req-0a9ceeae-45f6-411e-b84e-54286c273f21 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Acquired semaphore 
"refresh_cache-aa0c0819-08a6-4a79-93eb-f58885697f5b" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:197 2015-08-07 17:42:37.021 DEBUG nova.network.base_api [req-0a9ceeae-45f6-411e-b84e-54286c273f21 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Updating instance_info_cache with network_info: [VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.2'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:d0:30:ae', 'active': False, 'type': u'bridge', 'id': u'f50b3fec-3100-4377-bded-e2cd8fa38482', 'qbg_params': None})] update_instance_cache_with_nw_info /opt/stack/new/nova/nova/network/base_api.py:43 2015-08-07 17:42:37.048 DEBUG oslo_concurrency.lockutils [req-0a9ceeae-45f6-411e-b84e-54286c273f21 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Releasing semaphore "refresh_cache-aa0c0819-08a6-4a79-93eb-f58885697f5b" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:210 2015-08-07 17:42:37.318 DEBUG nova.virt.xenapi.vmops [req-0a9ceeae-45f6-411e-b84e-54286c273f21 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Updating progress to 0 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:42:37.508 DEBUG nova.virt.xenapi.vm_utils [req-0a9ceeae-45f6-411e-b84e-54286c273f21 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Starting snapshot for VM _snapshot_attached_here_impl /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:714 2015-08-07 17:42:37.524 DEBUG oslo_concurrency.lockutils [req-0a9ceeae-45f6-411e-b84e-54286c273f21 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" acquired by "do_scan" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:42:37.525 DEBUG nova.virt.xenapi.vm_utils [req-0a9ceeae-45f6-411e-b84e-54286c273f21 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Scanning SR OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99 do_scan /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1839 2015-08-07 17:42:38.035 DEBUG oslo_concurrency.lockutils [req-0a9ceeae-45f6-411e-b84e-54286c273f21 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" released by "do_scan" :: held 0.511s inner 
/usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:42:38.043 DEBUG nova.virt.xenapi.vm_utils [req-0a9ceeae-45f6-411e-b84e-54286c273f21 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] VHD c6f005a8-96a4-4151-b9d5-0836235b06a1 has parent 4027f457-a9bb-499a-8844-79fc67f11377 _get_vhd_parent_uuid /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2006 2015-08-07 17:42:38.072 DEBUG nova.virt.xenapi.vm_utils [req-0a9ceeae-45f6-411e-b84e-54286c273f21 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] VHD c6f005a8-96a4-4151-b9d5-0836235b06a1 has parent 4027f457-a9bb-499a-8844-79fc67f11377 _get_vhd_parent_uuid /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2006 2015-08-07 17:42:38.077 DEBUG nova.virt.xenapi.vm_utils [req-0a9ceeae-45f6-411e-b84e-54286c273f21 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] VHD fbdcd852-cf3f-4b24-a50f-1c9197ccc589 has parent 4027f457-a9bb-499a-8844-79fc67f11377 _get_vhd_parent_uuid /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2006 2015-08-07 17:42:38.090 DEBUG nova.virt.xenapi.vm_utils [req-0a9ceeae-45f6-411e-b84e-54286c273f21 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] VHD 48721e3f-a85f-4341-a0a9-cb4c924dd00c has parent 4027f457-a9bb-499a-8844-79fc67f11377 _get_vhd_parent_uuid /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2006 2015-08-07 17:42:38.103 DEBUG nova.virt.xenapi.vm_utils [req-0a9ceeae-45f6-411e-b84e-54286c273f21 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] VHD 60495381-4db4-4b06-83c7-7b5f778290c3 has parent 4027f457-a9bb-499a-8844-79fc67f11377 _get_vhd_parent_uuid /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2006 2015-08-07 17:42:38.115 DEBUG nova.virt.xenapi.vm_utils [req-0a9ceeae-45f6-411e-b84e-54286c273f21 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] VHD f6006734-242b-4374-b2ac-eac20be70bf3 has parent 4027f457-a9bb-499a-8844-79fc67f11377 _get_vhd_parent_uuid /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2006 2015-08-07 17:42:38.123 DEBUG nova.virt.xenapi.vm_utils [req-0a9ceeae-45f6-411e-b84e-54286c273f21 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] VHD adc68f5c-f9ed-4dfe-862f-8738fb519f84 has parent 4027f457-a9bb-499a-8844-79fc67f11377 _get_vhd_parent_uuid /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2006 2015-08-07 17:42:38.131 DEBUG nova.virt.xenapi.vm_utils [req-0a9ceeae-45f6-411e-b84e-54286c273f21 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] No snapshots to remove. _delete_snapshots_in_vdi_chain /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:677 2015-08-07 17:42:38.367 DEBUG oslo_concurrency.lockutils [req-f5b23244-f66a-4c41-a77a-6a5d5b72b5c1 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "e7e452a5-f0b3-4f79-9f03-57be081501e5" acquired by "_locked_do_build_and_run_instance" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:42:38.417 INFO nova.compute.manager [req-f5b23244-f66a-4c41-a77a-6a5d5b72b5c1 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: e7e452a5-f0b3-4f79-9f03-57be081501e5] Starting instance... 
2015-08-07 17:42:38.621 DEBUG oslo_concurrency.lockutils [req-f5b23244-f66a-4c41-a77a-6a5d5b72b5c1 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "compute_resources" acquired by "instance_claim" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:42:38.622 DEBUG nova.compute.resource_tracker [req-f5b23244-f66a-4c41-a77a-6a5d5b72b5c1 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Memory overhead for 64 MB instance; 5 MB instance_claim /opt/stack/new/nova/nova/compute/resource_tracker.py:126 2015-08-07 17:42:38.630 INFO nova.compute.claims [req-f5b23244-f66a-4c41-a77a-6a5d5b72b5c1 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: e7e452a5-f0b3-4f79-9f03-57be081501e5] Attempting claim: memory 69 MB, disk 0 GB 2015-08-07 17:42:38.631 INFO nova.compute.claims [req-f5b23244-f66a-4c41-a77a-6a5d5b72b5c1 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: e7e452a5-f0b3-4f79-9f03-57be081501e5] Total memory: 8187 MB, used: 991.00 MB 2015-08-07 17:42:38.631 INFO nova.compute.claims [req-f5b23244-f66a-4c41-a77a-6a5d5b72b5c1 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: e7e452a5-f0b3-4f79-9f03-57be081501e5] memory limit: 12280.50 MB, free: 11289.50 MB 2015-08-07 17:42:38.631 INFO nova.compute.claims [req-f5b23244-f66a-4c41-a77a-6a5d5b72b5c1 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: e7e452a5-f0b3-4f79-9f03-57be081501e5] Total disk: 27 GB, used: 0.00 GB 2015-08-07 17:42:38.632 INFO nova.compute.claims [req-f5b23244-f66a-4c41-a77a-6a5d5b72b5c1 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: e7e452a5-f0b3-4f79-9f03-57be081501e5] disk limit not specified, defaulting to unlimited 2015-08-07 17:42:38.655 DEBUG nova.compute.resources.vcpu [req-f5b23244-f66a-4c41-a77a-6a5d5b72b5c1 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Total CPUs: 8 VCPUs, used: 6.00 VCPUs test /opt/stack/new/nova/nova/compute/resources/vcpu.py:52 2015-08-07 17:42:38.656 DEBUG nova.compute.resources.vcpu [req-f5b23244-f66a-4c41-a77a-6a5d5b72b5c1 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] CPUs limit not specified, defaulting to unlimited test /opt/stack/new/nova/nova/compute/resources/vcpu.py:56 2015-08-07 17:42:38.656 INFO nova.compute.claims [req-f5b23244-f66a-4c41-a77a-6a5d5b72b5c1 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: e7e452a5-f0b3-4f79-9f03-57be081501e5] Claim successful 2015-08-07 17:42:38.948 DEBUG oslo_concurrency.lockutils [req-f5b23244-f66a-4c41-a77a-6a5d5b72b5c1 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "compute_resources" released by "instance_claim" :: held 0.326s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:42:39.120 DEBUG nova.compute.claims [req-f5b23244-f66a-4c41-a77a-6a5d5b72b5c1 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: e7e452a5-f0b3-4f79-9f03-57be081501e5] Aborting claim: [Claim: 69 MB memory, 0 GB disk] abort /opt/stack/new/nova/nova/compute/claims.py:130 2015-08-07 17:42:39.121 DEBUG oslo_concurrency.lockutils [req-f5b23244-f66a-4c41-a77a-6a5d5b72b5c1 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "compute_resources" acquired by "abort_instance_claim" :: waited 0.000s inner 
/usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:42:39.245 DEBUG oslo_concurrency.lockutils [req-f5b23244-f66a-4c41-a77a-6a5d5b72b5c1 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "compute_resources" released by "abort_instance_claim" :: held 0.124s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:42:39.246 DEBUG nova.compute.utils [req-f5b23244-f66a-4c41-a77a-6a5d5b72b5c1 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: e7e452a5-f0b3-4f79-9f03-57be081501e5] Unexpected task state: expecting [None] but the actual state is deleting notify_about_instance_usage /opt/stack/new/nova/nova/compute/utils.py:283 2015-08-07 17:42:39.247 DEBUG nova.compute.manager [req-f5b23244-f66a-4c41-a77a-6a5d5b72b5c1 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: e7e452a5-f0b3-4f79-9f03-57be081501e5] Instance disappeared during build. _do_build_and_run_instance /opt/stack/new/nova/nova/compute/manager.py:1929 2015-08-07 17:42:39.249 DEBUG nova.compute.manager [req-f5b23244-f66a-4c41-a77a-6a5d5b72b5c1 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: e7e452a5-f0b3-4f79-9f03-57be081501e5] Deallocating network for instance _deallocate_network /opt/stack/new/nova/nova/compute/manager.py:1792 2015-08-07 17:42:39.633 DEBUG oslo_concurrency.lockutils [req-f5b23244-f66a-4c41-a77a-6a5d5b72b5c1 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "e7e452a5-f0b3-4f79-9f03-57be081501e5" released by "_locked_do_build_and_run_instance" :: held 1.266s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:42:39.635 DEBUG oslo_concurrency.lockutils [req-1d3f34bc-1279-4538-95e4-15e4a3089e92 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "e7e452a5-f0b3-4f79-9f03-57be081501e5" acquired by "do_terminate_instance" :: waited 0.375s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:42:39.636 DEBUG oslo_concurrency.lockutils [req-1d3f34bc-1279-4538-95e4-15e4a3089e92 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "e7e452a5-f0b3-4f79-9f03-57be081501e5-events" acquired by "_clear_events" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:42:39.638 DEBUG oslo_concurrency.lockutils [req-1d3f34bc-1279-4538-95e4-15e4a3089e92 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "e7e452a5-f0b3-4f79-9f03-57be081501e5-events" released by "_clear_events" :: held 0.001s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:42:39.641 INFO nova.compute.manager [req-1d3f34bc-1279-4538-95e4-15e4a3089e92 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: e7e452a5-f0b3-4f79-9f03-57be081501e5] Terminating instance 2015-08-07 17:42:39.642 INFO nova.virt.xenapi.vmops [req-1d3f34bc-1279-4538-95e4-15e4a3089e92 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: e7e452a5-f0b3-4f79-9f03-57be081501e5] Destroying VM 2015-08-07 17:42:39.653 WARNING nova.virt.xenapi.vmops [req-1d3f34bc-1279-4538-95e4-15e4a3089e92 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: e7e452a5-f0b3-4f79-9f03-57be081501e5] VM is not present, skipping destroy... 
2015-08-07 17:42:39.655 DEBUG nova.compute.manager [req-1d3f34bc-1279-4538-95e4-15e4a3089e92 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: e7e452a5-f0b3-4f79-9f03-57be081501e5] Deallocating network for instance _deallocate_network /opt/stack/new/nova/nova/compute/manager.py:1792 2015-08-07 17:42:39.693 DEBUG nova.virt.xenapi.vm_utils [req-0a9ceeae-45f6-411e-b84e-54286c273f21 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Parent has other children, coalesce is unlikely. _wait_for_vhd_coalesce /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2092 2015-08-07 17:42:39.704 DEBUG oslo_concurrency.lockutils [req-0a9ceeae-45f6-411e-b84e-54286c273f21 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" acquired by "do_scan" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:42:39.705 DEBUG nova.virt.xenapi.vm_utils [req-0a9ceeae-45f6-411e-b84e-54286c273f21 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Scanning SR OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99 do_scan /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1839 2015-08-07 17:42:39.921 DEBUG nova.compute.manager [req-1d3f34bc-1279-4538-95e4-15e4a3089e92 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: e7e452a5-f0b3-4f79-9f03-57be081501e5] terminating bdm BlockDeviceMapping(boot_index=0,connection_info=None,created_at=2015-08-07T17:42:38Z,delete_on_termination=True,deleted=False,deleted_at=None,destination_type='local',device_name=None,device_type='disk',disk_bus=None,guest_format=None,id=58,image_id='a69d8d55-1745-492f-be26-cc75d64fc94d',instance=,instance_uuid=e7e452a5-f0b3-4f79-9f03-57be081501e5,no_device=False,snapshot_id=None,source_type='image',updated_at=None,volume_id=None,volume_size=None) _cleanup_volumes /opt/stack/new/nova/nova/compute/manager.py:2272 2015-08-07 17:42:40.131 DEBUG oslo_concurrency.lockutils [req-1d3f34bc-1279-4538-95e4-15e4a3089e92 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "compute_resources" acquired by "update_usage" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:42:40.132 DEBUG oslo_concurrency.lockutils [req-1d3f34bc-1279-4538-95e4-15e4a3089e92 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "compute_resources" released by "update_usage" :: held 0.001s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:42:40.200 DEBUG oslo_concurrency.lockutils [req-0a9ceeae-45f6-411e-b84e-54286c273f21 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" released by "do_scan" :: held 0.496s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:42:40.210 DEBUG nova.virt.xenapi.vm_utils [req-0a9ceeae-45f6-411e-b84e-54286c273f21 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] VHD 72d82594-6310-4a23-a0ba-7b20ddf62cbe has parent 14d44bf2-1b33-4c59-8810-23ba76a476d9 _get_vhd_parent_uuid /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2006 2015-08-07 17:42:40.218 DEBUG nova.virt.xenapi.vm_utils [req-0a9ceeae-45f6-411e-b84e-54286c273f21 
tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] VHD 14d44bf2-1b33-4c59-8810-23ba76a476d9 has parent 4027f457-a9bb-499a-8844-79fc67f11377 _get_vhd_parent_uuid /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2006 2015-08-07 17:42:40.225 DEBUG nova.virt.xenapi.vmops [req-0a9ceeae-45f6-411e-b84e-54286c273f21 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Updating progress to 20 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:42:40.393 DEBUG oslo_concurrency.lockutils [req-1d3f34bc-1279-4538-95e4-15e4a3089e92 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "e7e452a5-f0b3-4f79-9f03-57be081501e5" released by "do_terminate_instance" :: held 0.758s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:42:40.427 DEBUG nova.virt.xenapi.vm_utils [req-0a9ceeae-45f6-411e-b84e-54286c273f21 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Migrating VHD '14d44bf2-1b33-4c59-8810-23ba76a476d9' with seq_num 1 migrate_vhd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2559 2015-08-07 17:42:40.521 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:42:40.522 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 1.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:42:40.681 DEBUG nova.virt.xenapi.vm_utils [req-0a9ceeae-45f6-411e-b84e-54286c273f21 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Migrating VHD '4027f457-a9bb-499a-8844-79fc67f11377' with seq_num 2 migrate_vhd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2559 2015-08-07 17:42:41.521 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:42:41.522 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 1.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:42:42.043 DEBUG nova.virt.xenapi.vmops [req-0a9ceeae-45f6-411e-b84e-54286c273f21 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Migrated root base vhds transfer_immutable_vhds /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1036 2015-08-07 17:42:42.044 DEBUG nova.virt.xenapi.vmops [req-0a9ceeae-45f6-411e-b84e-54286c273f21 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Updating progress to 40 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:42:42.300 DEBUG nova.virt.xenapi.vmops [req-0a9ceeae-45f6-411e-b84e-54286c273f21 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Migrated 
all base vhds. _process_ephemeral_chain_recursive /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1051 2015-08-07 17:42:42.307 DEBUG nova.virt.xenapi.vm_utils [req-0a9ceeae-45f6-411e-b84e-54286c273f21 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Shutting down VM (cleanly) clean_shutdown_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:333 2015-08-07 17:42:42.521 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:42:42.522 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Starting heal instance info cache _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5315 2015-08-07 17:42:42.791 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Acquired semaphore "refresh_cache-69bd2eeb-96e0-42ba-9643-c7c085279a18" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:197 2015-08-07 17:42:42.945 DEBUG nova.network.base_api [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: 69bd2eeb-96e0-42ba-9643-c7c085279a18] Updating instance_info_cache with network_info: [VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.6'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:5a:40:04', 'active': False, 'type': u'bridge', 'id': u'05bc5db0-7a1d-4a7a-afed-5f08489f238b', 'qbg_params': None})] update_instance_cache_with_nw_info /opt/stack/new/nova/nova/network/base_api.py:43 2015-08-07 17:42:42.971 DEBUG nova.virt.xenapi.vmops [req-8d00a32a-9532-4621-a0a9-4cb87d0c98e3 tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] [instance: 7aa9beb6-cabe-4ae3-a172-162c757bd718] Waiting for instance state to become running _wait_for_instance_to_start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:777 2015-08-07 17:42:42.978 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Releasing semaphore "refresh_cache-69bd2eeb-96e0-42ba-9643-c7c085279a18" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:210 2015-08-07 17:42:42.979 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: 69bd2eeb-96e0-42ba-9643-c7c085279a18] Updated the network info_cache for instance _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5376 2015-08-07 17:42:42.980 DEBUG oslo_service.loopingcall 
[req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 0.99 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:42:42.992 DEBUG nova.virt.xenapi.vmops [req-8d00a32a-9532-4621-a0a9-4cb87d0c98e3 tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] [instance: 7aa9beb6-cabe-4ae3-a172-162c757bd718] Updating progress to 80 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:42:43.093 DEBUG oslo_concurrency.lockutils [req-3c332db7-425b-4689-906d-c698a1a9c654 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "65774979-e5fe-4d9f-9f8c-48214aed768d" acquired by "_locked_do_build_and_run_instance" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:42:43.130 INFO nova.compute.manager [req-3c332db7-425b-4689-906d-c698a1a9c654 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 65774979-e5fe-4d9f-9f8c-48214aed768d] Starting instance... 2015-08-07 17:42:43.216 DEBUG nova.virt.xenapi.vmops [req-8d00a32a-9532-4621-a0a9-4cb87d0c98e3 tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] [instance: 7aa9beb6-cabe-4ae3-a172-162c757bd718] Skip agent setup, not enabled. _configure_new_instance_with_agent /opt/stack/new/nova/nova/virt/xenapi/vmops.py:788 2015-08-07 17:42:43.217 DEBUG nova.virt.xenapi.vmops [req-8d00a32a-9532-4621-a0a9-4cb87d0c98e3 tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] [instance: 7aa9beb6-cabe-4ae3-a172-162c757bd718] Removing hostname from xenstore _remove_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1949 2015-08-07 17:42:43.217 DEBUG oslo_concurrency.lockutils [req-8d00a32a-9532-4621-a0a9-4cb87d0c98e3 tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] Lock "xenstore-7aa9beb6-cabe-4ae3-a172-162c757bd718" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:42:43.222 DEBUG oslo_concurrency.lockutils [req-8d00a32a-9532-4621-a0a9-4cb87d0c98e3 tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] Lock "xenstore-7aa9beb6-cabe-4ae3-a172-162c757bd718" released by "update_hostname" :: held 0.005s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:42:43.223 DEBUG nova.virt.xenapi.vmops [req-8d00a32a-9532-4621-a0a9-4cb87d0c98e3 tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] [instance: 7aa9beb6-cabe-4ae3-a172-162c757bd718] Updating progress to 90 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:42:43.337 DEBUG oslo_concurrency.lockutils [req-3c332db7-425b-4689-906d-c698a1a9c654 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "compute_resources" acquired by "instance_claim" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:42:43.338 DEBUG nova.compute.resource_tracker [req-3c332db7-425b-4689-906d-c698a1a9c654 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Memory overhead for 64 MB instance; 5 MB instance_claim /opt/stack/new/nova/nova/compute/resource_tracker.py:126 2015-08-07 17:42:43.343 INFO nova.compute.claims 
[req-3c332db7-425b-4689-906d-c698a1a9c654 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 65774979-e5fe-4d9f-9f8c-48214aed768d] Attempting claim: memory 69 MB, disk 0 GB 2015-08-07 17:42:43.344 INFO nova.compute.claims [req-3c332db7-425b-4689-906d-c698a1a9c654 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 65774979-e5fe-4d9f-9f8c-48214aed768d] Total memory: 8187 MB, used: 991.00 MB 2015-08-07 17:42:43.344 INFO nova.compute.claims [req-3c332db7-425b-4689-906d-c698a1a9c654 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 65774979-e5fe-4d9f-9f8c-48214aed768d] memory limit: 12280.50 MB, free: 11289.50 MB 2015-08-07 17:42:43.345 INFO nova.compute.claims [req-3c332db7-425b-4689-906d-c698a1a9c654 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 65774979-e5fe-4d9f-9f8c-48214aed768d] Total disk: 27 GB, used: 0.00 GB 2015-08-07 17:42:43.345 INFO nova.compute.claims [req-3c332db7-425b-4689-906d-c698a1a9c654 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 65774979-e5fe-4d9f-9f8c-48214aed768d] disk limit not specified, defaulting to unlimited 2015-08-07 17:42:43.370 DEBUG nova.compute.resources.vcpu [req-3c332db7-425b-4689-906d-c698a1a9c654 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Total CPUs: 8 VCPUs, used: 6.00 VCPUs test /opt/stack/new/nova/nova/compute/resources/vcpu.py:52 2015-08-07 17:42:43.371 DEBUG nova.compute.resources.vcpu [req-3c332db7-425b-4689-906d-c698a1a9c654 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] CPUs limit not specified, defaulting to unlimited test /opt/stack/new/nova/nova/compute/resources/vcpu.py:56 2015-08-07 17:42:43.371 INFO nova.compute.claims [req-3c332db7-425b-4689-906d-c698a1a9c654 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 65774979-e5fe-4d9f-9f8c-48214aed768d] Claim successful 2015-08-07 17:42:43.441 DEBUG nova.virt.xenapi.vmops [req-8d00a32a-9532-4621-a0a9-4cb87d0c98e3 tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] [instance: 7aa9beb6-cabe-4ae3-a172-162c757bd718] Updating progress to 100 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:42:43.636 DEBUG nova.compute.manager [req-8d00a32a-9532-4621-a0a9-4cb87d0c98e3 tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] [instance: 7aa9beb6-cabe-4ae3-a172-162c757bd718] Checking state _get_power_state /opt/stack/new/nova/nova/compute/manager.py:1296 2015-08-07 17:42:43.660 DEBUG oslo_concurrency.lockutils [req-3c332db7-425b-4689-906d-c698a1a9c654 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "compute_resources" released by "instance_claim" :: held 0.323s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:42:43.891 DEBUG oslo_concurrency.lockutils [req-3c332db7-425b-4689-906d-c698a1a9c654 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "compute_resources" acquired by "update_usage" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:42:43.972 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 
2015-08-07 17:42:43.973 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 1.55 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:42:43.991 DEBUG oslo_concurrency.lockutils [req-3c332db7-425b-4689-906d-c698a1a9c654 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "compute_resources" released by "update_usage" :: held 0.101s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:42:43.992 DEBUG nova.compute.utils [req-3c332db7-425b-4689-906d-c698a1a9c654 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Using /dev/xvd instead of None get_next_device_name /opt/stack/new/nova/nova/compute/utils.py:163 2015-08-07 17:42:44.001 13318 DEBUG nova.compute.manager [-] [instance: 65774979-e5fe-4d9f-9f8c-48214aed768d] Allocating IP information in the background. _allocate_network_async /opt/stack/new/nova/nova/compute/manager.py:1513 2015-08-07 17:42:44.002 13318 DEBUG oslo_concurrency.lockutils [-] Acquired semaphore "refresh_cache-65774979-e5fe-4d9f-9f8c-48214aed768d" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:197 2015-08-07 17:42:44.005 DEBUG oslo_concurrency.lockutils [req-8d00a32a-9532-4621-a0a9-4cb87d0c98e3 tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] Lock "7aa9beb6-cabe-4ae3-a172-162c757bd718" released by "_locked_do_build_and_run_instance" :: held 40.203s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:42:44.537 DEBUG nova.virt.xenapi.vmops [req-3c332db7-425b-4689-906d-c698a1a9c654 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 65774979-e5fe-4d9f-9f8c-48214aed768d] Block device information present: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} spawn /opt/stack/new/nova/nova/virt/xenapi/vmops.py:408 2015-08-07 17:42:44.559 DEBUG nova.virt.xenapi.vm_utils [req-3c332db7-425b-4689-906d-c698a1a9c654 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Detected vhd format for image a69d8d55-1745-492f-be26-cc75d64fc94d determine_disk_image_type /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1633 2015-08-07 17:42:44.560 DEBUG nova.virt.xenapi.vmops [req-3c332db7-425b-4689-906d-c698a1a9c654 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 65774979-e5fe-4d9f-9f8c-48214aed768d] Updating progress to 10 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:42:44.785 DEBUG nova.virt.xenapi.vmops [req-3c332db7-425b-4689-906d-c698a1a9c654 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 65774979-e5fe-4d9f-9f8c-48214aed768d] block device info: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} _connect_cinder_volumes /opt/stack/new/nova/nova/virt/xenapi/vmops.py:376 2015-08-07 17:42:44.799 DEBUG oslo_concurrency.lockutils [req-3c332db7-425b-4689-906d-c698a1a9c654 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" acquired by "_create_cached_image_impl" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:42:45.043 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping 
for 9.93 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:42:45.522 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:42:45.523 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /opt/stack/new/nova/nova/compute/manager.py:5976 2015-08-07 17:42:45.523 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 4.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:42:45.876 DEBUG nova.virt.xenapi.vm_utils [req-3c332db7-425b-4689-906d-c698a1a9c654 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Cloned VDI OpaqueRef:09e9709a-f641-86d8-d250-8d88592a8517 from VDI OpaqueRef:f392c64e-7886-a0cc-d3a0-780674167307 _clone_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:571 2015-08-07 17:42:46.141 13318 DEBUG nova.network.base_api [-] [instance: 65774979-e5fe-4d9f-9f8c-48214aed768d] Updating instance_info_cache with network_info: [VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.5'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:24:d6:f0', 'active': False, 'type': u'bridge', 'id': u'155832fb-5f99-40d4-96a7-c052273920b5', 'qbg_params': None})] update_instance_cache_with_nw_info /opt/stack/new/nova/nova/network/base_api.py:43 2015-08-07 17:42:46.196 13318 DEBUG oslo_concurrency.lockutils [-] Releasing semaphore "refresh_cache-65774979-e5fe-4d9f-9f8c-48214aed768d" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:210 2015-08-07 17:42:46.197 13318 DEBUG nova.compute.manager [-] [instance: 65774979-e5fe-4d9f-9f8c-48214aed768d] Instance network_info: |[VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.5'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 
'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:24:d6:f0', 'active': False, 'type': u'bridge', 'id': u'155832fb-5f99-40d4-96a7-c052273920b5', 'qbg_params': None})]| _allocate_network_async /opt/stack/new/nova/nova/compute/manager.py:1531 2015-08-07 17:42:46.387 DEBUG oslo_concurrency.lockutils [req-744f1833-b707-4be9-8b59-34cbeeda58d1 tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] Lock "04dcd91a-004b-4803-90db-0a79720939ad" acquired by "_locked_do_build_and_run_instance" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:42:46.467 INFO nova.compute.manager [req-744f1833-b707-4be9-8b59-34cbeeda58d1 tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] [instance: 04dcd91a-004b-4803-90db-0a79720939ad] Starting instance... 2015-08-07 17:42:46.666 DEBUG nova.compute.manager [req-744f1833-b707-4be9-8b59-34cbeeda58d1 tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] [instance: 04dcd91a-004b-4803-90db-0a79720939ad] Unexpected task state: expecting [u'scheduling', None] but the actual state is deleting _do_build_and_run_instance /opt/stack/new/nova/nova/compute/manager.py:1871 2015-08-07 17:42:46.713 DEBUG oslo_concurrency.lockutils [req-3c332db7-425b-4689-906d-c698a1a9c654 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" released by "_create_cached_image_impl" :: held 1.914s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:42:46.714 INFO nova.virt.xenapi.vm_utils [req-3c332db7-425b-4689-906d-c698a1a9c654 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Image creation data, cacheable: True, downloaded: False duration: 1.93 secs for image a69d8d55-1745-492f-be26-cc75d64fc94d 2015-08-07 17:42:46.723 DEBUG oslo_concurrency.lockutils [req-744f1833-b707-4be9-8b59-34cbeeda58d1 tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] Lock "04dcd91a-004b-4803-90db-0a79720939ad" released by "_locked_do_build_and_run_instance" :: held 0.336s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:42:47.377 DEBUG nova.virt.xenapi.vm_utils [req-0a9ceeae-45f6-411e-b84e-54286c273f21 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Migrating VHD 'c6f005a8-96a4-4151-b9d5-0836235b06a1' with seq_num 0 migrate_vhd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2559 2015-08-07 17:42:47.491 DEBUG nova.virt.xenapi.vmops [req-3c332db7-425b-4689-906d-c698a1a9c654 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 65774979-e5fe-4d9f-9f8c-48214aed768d] Updating progress to 20 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:42:47.875 DEBUG nova.virt.xenapi.vmops [req-3c332db7-425b-4689-906d-c698a1a9c654 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 65774979-e5fe-4d9f-9f8c-48214aed768d] Updating progress to 30 _update_instance_progress 
/opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:42:48.118 DEBUG nova.virt.xenapi.vmops [req-3c332db7-425b-4689-906d-c698a1a9c654 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 65774979-e5fe-4d9f-9f8c-48214aed768d] Using PV kernel: True _create_vm_record /opt/stack/new/nova/nova/virt/xenapi/vmops.py:679 2015-08-07 17:42:48.130 DEBUG nova.virt.xenapi.vm_utils [req-3c332db7-425b-4689-906d-c698a1a9c654 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 65774979-e5fe-4d9f-9f8c-48214aed768d] Created VM create_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:312 2015-08-07 17:42:48.130 DEBUG nova.virt.xenapi.vmops [req-3c332db7-425b-4689-906d-c698a1a9c654 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 65774979-e5fe-4d9f-9f8c-48214aed768d] Updating progress to 40 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:42:48.535 DEBUG oslo_concurrency.lockutils [req-81f65334-0669-48c4-a3b8-71ae81e26328 tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] Lock "393149d8-bc0d-4f72-afbc-954d8344f5e5" acquired by "do_terminate_instance" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:42:48.535 DEBUG oslo_concurrency.lockutils [req-81f65334-0669-48c4-a3b8-71ae81e26328 tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] Lock "393149d8-bc0d-4f72-afbc-954d8344f5e5-events" acquired by "_clear_events" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:42:48.536 DEBUG oslo_concurrency.lockutils [req-81f65334-0669-48c4-a3b8-71ae81e26328 tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] Lock "393149d8-bc0d-4f72-afbc-954d8344f5e5-events" released by "_clear_events" :: held 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:42:48.538 INFO nova.compute.manager [req-81f65334-0669-48c4-a3b8-71ae81e26328 tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] [instance: 393149d8-bc0d-4f72-afbc-954d8344f5e5] Terminating instance 2015-08-07 17:42:48.540 INFO nova.virt.xenapi.vmops [req-81f65334-0669-48c4-a3b8-71ae81e26328 tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] [instance: 393149d8-bc0d-4f72-afbc-954d8344f5e5] Destroying VM 2015-08-07 17:42:48.547 DEBUG nova.virt.xenapi.vm_utils [req-81f65334-0669-48c4-a3b8-71ae81e26328 tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] [instance: 393149d8-bc0d-4f72-afbc-954d8344f5e5] Shutting down VM (hard) hard_shutdown_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:348 2015-08-07 17:42:48.559 DEBUG nova.virt.xenapi.vmops [req-0a9ceeae-45f6-411e-b84e-54286c273f21 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Updating progress to 60 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:42:48.624 DEBUG nova.virt.xenapi.vm_utils [req-3c332db7-425b-4689-906d-c698a1a9c654 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Creating disk-type VBD for VM OpaqueRef:cbcec8e5-b52d-334c-0412-be86ed5864b8, VDI OpaqueRef:09e9709a-f641-86d8-d250-8d88592a8517 ... 
create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:42:48.634 DEBUG nova.virt.xenapi.vm_utils [req-3c332db7-425b-4689-906d-c698a1a9c654 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Created VBD OpaqueRef:a415e110-8ef9-93de-f8a7-648a303fda17 for VM OpaqueRef:cbcec8e5-b52d-334c-0412-be86ed5864b8, VDI OpaqueRef:09e9709a-f641-86d8-d250-8d88592a8517. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:42:49.018 DEBUG nova.virt.xenapi.vmops [req-0a9ceeae-45f6-411e-b84e-54286c273f21 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Updating progress to 80 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:42:49.038 DEBUG oslo_concurrency.lockutils [req-6f645ff8-4174-477b-abf3-6b29a19f3b7a tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] Lock "7aa9beb6-cabe-4ae3-a172-162c757bd718" acquired by "do_terminate_instance" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:42:49.039 DEBUG oslo_concurrency.lockutils [req-6f645ff8-4174-477b-abf3-6b29a19f3b7a tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] Lock "7aa9beb6-cabe-4ae3-a172-162c757bd718-events" acquired by "_clear_events" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:42:49.039 DEBUG oslo_concurrency.lockutils [req-6f645ff8-4174-477b-abf3-6b29a19f3b7a tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] Lock "7aa9beb6-cabe-4ae3-a172-162c757bd718-events" released by "_clear_events" :: held 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:42:49.041 INFO nova.compute.manager [req-6f645ff8-4174-477b-abf3-6b29a19f3b7a tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] [instance: 7aa9beb6-cabe-4ae3-a172-162c757bd718] Terminating instance 2015-08-07 17:42:49.043 INFO nova.virt.xenapi.vmops [req-6f645ff8-4174-477b-abf3-6b29a19f3b7a tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] [instance: 7aa9beb6-cabe-4ae3-a172-162c757bd718] Destroying VM 2015-08-07 17:42:49.053 DEBUG nova.virt.xenapi.vm_utils [req-6f645ff8-4174-477b-abf3-6b29a19f3b7a tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] [instance: 7aa9beb6-cabe-4ae3-a172-162c757bd718] Shutting down VM (hard) hard_shutdown_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:348 2015-08-07 17:42:49.076 DEBUG nova.virt.xenapi.vm_utils [req-3c332db7-425b-4689-906d-c698a1a9c654 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Created VDI OpaqueRef:e8a04c2f-0cc4-3819-3653-f9f32827f79d (config-2, 67108864, False) on OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99. create_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:509 2015-08-07 17:42:49.081 DEBUG nova.virt.xenapi.vm_utils [req-3c332db7-425b-4689-906d-c698a1a9c654 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Creating disk-type VBD for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:e8a04c2f-0cc4-3819-3653-f9f32827f79d ... 
create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:42:49.107 DEBUG nova.virt.xenapi.vm_utils [req-3c332db7-425b-4689-906d-c698a1a9c654 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Created VBD OpaqueRef:b6c457b4-5c52-5a42-f659-5db0a4a7e11e for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:e8a04c2f-0cc4-3819-3653-f9f32827f79d. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:42:49.107 DEBUG nova.virt.xenapi.vm_utils [req-3c332db7-425b-4689-906d-c698a1a9c654 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Plugging VBD OpaqueRef:b6c457b4-5c52-5a42-f659-5db0a4a7e11e ... vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2191 2015-08-07 17:42:49.108 DEBUG oslo_concurrency.lockutils [req-3c332db7-425b-4689-906d-c698a1a9c654 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_plug" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:42:49.520 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:42:49.521 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 3.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:42:50.116 DEBUG oslo_concurrency.lockutils [req-0a9ceeae-45f6-411e-b84e-54286c273f21 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "aa0c0819-08a6-4a79-93eb-f58885697f5b-events" acquired by "_clear_events" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:42:50.117 DEBUG oslo_concurrency.lockutils [req-0a9ceeae-45f6-411e-b84e-54286c273f21 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "aa0c0819-08a6-4a79-93eb-f58885697f5b-events" released by "_clear_events" :: held 0.001s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:42:50.789 DEBUG oslo_concurrency.lockutils [req-3c332db7-425b-4689-906d-c698a1a9c654 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_plug" :: held 1.681s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:42:50.790 DEBUG nova.virt.xenapi.vm_utils [req-3c332db7-425b-4689-906d-c698a1a9c654 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Plugging VBD OpaqueRef:b6c457b4-5c52-5a42-f659-5db0a4a7e11e done. 
vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2194 2015-08-07 17:42:50.797 DEBUG nova.virt.xenapi.vm_utils [req-3c332db7-425b-4689-906d-c698a1a9c654 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] VBD OpaqueRef:b6c457b4-5c52-5a42-f659-5db0a4a7e11e plugged as xvdc vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2197 2015-08-07 17:42:50.894 WARNING nova.virt.configdrive [req-3c332db7-425b-4689-906d-c698a1a9c654 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] The setting "always" will be deprecated in the Liberty version. Please use "True" instead 2015-08-07 17:42:50.895 DEBUG nova.objects.instance [req-3c332db7-425b-4689-906d-c698a1a9c654 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lazy-loading `ec2_ids' on Instance uuid 65774979-e5fe-4d9f-9f8c-48214aed768d obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:42:50.933 DEBUG oslo_concurrency.processutils [req-3c332db7-425b-4689-906d-c698a1a9c654 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Running cmd (subprocess): genisoimage -o /tmp/tmpcrkQSl/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmpoopnmX execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:42:51.040 DEBUG oslo_concurrency.processutils [req-3c332db7-425b-4689-906d-c698a1a9c654 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] CMD "genisoimage -o /tmp/tmpcrkQSl/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmpoopnmX" returned: 0 in 0.105s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:42:51.047 DEBUG oslo_concurrency.processutils [req-3c332db7-425b-4689-906d-c698a1a9c654 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmpcrkQSl/configdrive of=/dev/xvdc oflag=direct,sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:42:52.028 DEBUG nova.virt.xenapi.vmops [req-81f65334-0669-48c4-a3b8-71ae81e26328 tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] [instance: 393149d8-bc0d-4f72-afbc-954d8344f5e5] Destroying VDIs _destroy_vdis /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1428 2015-08-07 17:42:52.034 DEBUG oslo_concurrency.lockutils [req-0a9ceeae-45f6-411e-b84e-54286c273f21 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Acquired semaphore "refresh_cache-aa0c0819-08a6-4a79-93eb-f58885697f5b" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:197 2015-08-07 17:42:52.042 DEBUG nova.virt.xenapi.vm_utils [req-81f65334-0669-48c4-a3b8-71ae81e26328 tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] VDI 48721e3f-a85f-4341-a0a9-cb4c924dd00c is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:42:52.060 DEBUG nova.virt.xenapi.vm_utils [req-81f65334-0669-48c4-a3b8-71ae81e26328 tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] VDI a780103a-b463-4aad-af8a-3d1894116807 is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:42:52.203 DEBUG 
nova.network.base_api [req-0a9ceeae-45f6-411e-b84e-54286c273f21 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Updating instance_info_cache with network_info: [VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.2'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:d0:30:ae', 'active': False, 'type': u'bridge', 'id': u'f50b3fec-3100-4377-bded-e2cd8fa38482', 'qbg_params': None})] update_instance_cache_with_nw_info /opt/stack/new/nova/nova/network/base_api.py:43 2015-08-07 17:42:52.235 DEBUG oslo_concurrency.lockutils [req-0a9ceeae-45f6-411e-b84e-54286c273f21 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Releasing semaphore "refresh_cache-aa0c0819-08a6-4a79-93eb-f58885697f5b" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:210 2015-08-07 17:42:52.521 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:42:52.523 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 1.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:42:52.840 DEBUG nova.virt.xenapi.vm_utils [req-0a9ceeae-45f6-411e-b84e-54286c273f21 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Detected vhd format for image None determine_disk_image_type /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1633 2015-08-07 17:42:52.842 DEBUG nova.virt.xenapi.vmops [req-0a9ceeae-45f6-411e-b84e-54286c273f21 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Block device information present: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} create_disks_step /opt/stack/new/nova/nova/virt/xenapi/vmops.py:278 2015-08-07 17:42:53.160 DEBUG nova.virt.xenapi.vmops [req-6f645ff8-4174-477b-abf3-6b29a19f3b7a tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] [instance: 7aa9beb6-cabe-4ae3-a172-162c757bd718] Destroying VDIs _destroy_vdis /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1428 2015-08-07 17:42:53.168 DEBUG nova.virt.xenapi.vm_utils [req-6f645ff8-4174-477b-abf3-6b29a19f3b7a tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] VDI 
fbdcd852-cf3f-4b24-a50f-1c9197ccc589 is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:42:53.181 DEBUG nova.virt.xenapi.vm_utils [req-6f645ff8-4174-477b-abf3-6b29a19f3b7a tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] VDI 780bed86-fe25-46ae-a379-314100b3414c is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:42:53.364 DEBUG nova.virt.xenapi.vmops [req-81f65334-0669-48c4-a3b8-71ae81e26328 tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] [instance: 393149d8-bc0d-4f72-afbc-954d8344f5e5] Using RAW or VHD, skipping kernel and ramdisk deletion _destroy_kernel_ramdisk /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1456 2015-08-07 17:42:53.380 DEBUG nova.virt.xenapi.vm_utils [req-81f65334-0669-48c4-a3b8-71ae81e26328 tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] [instance: 393149d8-bc0d-4f72-afbc-954d8344f5e5] VM destroyed destroy_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:324 2015-08-07 17:42:53.381 DEBUG nova.compute.manager [req-81f65334-0669-48c4-a3b8-71ae81e26328 tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] [instance: 393149d8-bc0d-4f72-afbc-954d8344f5e5] Deallocating network for instance _deallocate_network /opt/stack/new/nova/nova/compute/manager.py:1792 2015-08-07 17:42:53.523 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:42:53.608 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Auditing locally available compute resources for node localhost.localdomain 2015-08-07 17:42:53.609 DEBUG nova.virt.xenapi.host [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Updating host stats update_status /opt/stack/new/nova/nova/virt/xenapi/host.py:234 2015-08-07 17:42:53.834 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" acquired by "do_scan" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:42:53.835 DEBUG nova.virt.xenapi.vm_utils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Scanning SR OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99 do_scan /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1839 2015-08-07 17:42:54.797 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" released by "do_scan" :: held 0.963s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:42:55.010 DEBUG oslo_concurrency.lockutils [req-0a9ceeae-45f6-411e-b84e-54286c273f21 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" acquired by "do_scan" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:42:55.011 DEBUG nova.virt.xenapi.vm_utils [req-0a9ceeae-45f6-411e-b84e-54286c273f21 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Scanning SR OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99 do_scan 
/opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1839 2015-08-07 17:42:55.094 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.88 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:42:55.097 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor: free VCPUs: 1 _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:524 2015-08-07 17:42:55.098 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor: assignable PCI devices: [] _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:532 2015-08-07 17:42:55.098 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor/Node resource view: name=localhost.localdomain free_ram=922MB free_disk=16GB free_vcpus=1 pci_devices=[] _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:546 2015-08-07 17:42:55.099 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "compute_resources" acquired by "_update_available_resource" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:42:55.608 DEBUG nova.compute.manager [req-81f65334-0669-48c4-a3b8-71ae81e26328 tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] [instance: 393149d8-bc0d-4f72-afbc-954d8344f5e5] terminating bdm BlockDeviceMapping(boot_index=0,connection_info=None,created_at=2015-08-07T17:41:37Z,delete_on_termination=True,deleted=False,deleted_at=None,destination_type='local',device_name='/dev/xvda',device_type='disk',disk_bus=None,guest_format=None,id=56,image_id='a69d8d55-1745-492f-be26-cc75d64fc94d',instance=,instance_uuid=393149d8-bc0d-4f72-afbc-954d8344f5e5,no_device=False,snapshot_id=None,source_type='image',updated_at=2015-08-07T17:41:38Z,volume_id=None,volume_size=None) _cleanup_volumes /opt/stack/new/nova/nova/compute/manager.py:2272 2015-08-07 17:42:55.702 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Updating from migration aa0c0819-08a6-4a79-93eb-f58885697f5b 2015-08-07 17:42:55.703 DEBUG nova.objects.instance [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lazy-loading `old_flavor' on Instance uuid aa0c0819-08a6-4a79-93eb-f58885697f5b obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:42:55.806 DEBUG oslo_concurrency.lockutils [req-0a9ceeae-45f6-411e-b84e-54286c273f21 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" released by "do_scan" :: held 0.796s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:42:55.882 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Total usable vcpus: 8, total allocated vcpus: 7 2015-08-07 17:42:55.883 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Final resource view: name=localhost.localdomain phys_ram=8187MB used_ram=1060MB phys_disk=27GB used_disk=0GB total_vcpus=8 used_vcpus=7 pci_stats=None 2015-08-07 17:42:56.010 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Compute_service record updated for devstack:localhost.localdomain 2015-08-07 17:42:56.011 DEBUG oslo_concurrency.lockutils 
[req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "compute_resources" released by "_update_available_resource" :: held 0.912s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:42:56.011 DEBUG oslo_concurrency.lockutils [req-81f65334-0669-48c4-a3b8-71ae81e26328 tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] Lock "compute_resources" acquired by "update_usage" :: waited 0.105s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:42:56.012 DEBUG nova.objects.instance [req-81f65334-0669-48c4-a3b8-71ae81e26328 tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] Lazy-loading `numa_topology' on Instance uuid 393149d8-bc0d-4f72-afbc-954d8344f5e5 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:42:56.014 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 8.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:42:56.113 DEBUG oslo_concurrency.lockutils [req-81f65334-0669-48c4-a3b8-71ae81e26328 tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] Lock "compute_resources" released by "update_usage" :: held 0.102s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:42:56.698 DEBUG oslo_concurrency.lockutils [req-81f65334-0669-48c4-a3b8-71ae81e26328 tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] Lock "393149d8-bc0d-4f72-afbc-954d8344f5e5" released by "do_terminate_instance" :: held 8.164s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:42:56.862 DEBUG nova.virt.xenapi.vmops [req-0a9ceeae-45f6-411e-b84e-54286c273f21 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Using PV kernel: True _create_vm_record /opt/stack/new/nova/nova/virt/xenapi/vmops.py:679 2015-08-07 17:42:56.873 DEBUG nova.virt.xenapi.vm_utils [req-0a9ceeae-45f6-411e-b84e-54286c273f21 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Created VM create_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:312 2015-08-07 17:42:56.874 DEBUG nova.virt.xenapi.vm_utils [req-0a9ceeae-45f6-411e-b84e-54286c273f21 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Creating disk-type VBD for VM OpaqueRef:10811bbb-e877-1050-e1a8-6132b3944854, VDI OpaqueRef:804c7172-01bb-c016-c429-5b8726e657ef ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:42:56.882 DEBUG nova.virt.xenapi.vm_utils [req-0a9ceeae-45f6-411e-b84e-54286c273f21 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Created VBD OpaqueRef:540bdd9d-22b6-a616-1ec9-5cf8bcd7b93b for VM OpaqueRef:10811bbb-e877-1050-e1a8-6132b3944854, VDI OpaqueRef:804c7172-01bb-c016-c429-5b8726e657ef. 
create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:42:57.330 DEBUG oslo_concurrency.processutils [req-3c332db7-425b-4689-906d-c698a1a9c654 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmpcrkQSl/configdrive of=/dev/xvdc oflag=direct,sync" returned: 0 in 6.283s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:42:57.332 DEBUG oslo_concurrency.processutils [req-3c332db7-425b-4689-906d-c698a1a9c654 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:42:57.415 DEBUG nova.virt.xenapi.vm_utils [req-0a9ceeae-45f6-411e-b84e-54286c273f21 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Created VDI OpaqueRef:6da66aaf-ce0c-4407-bd66-d18138f235e9 (config-2, 67108864, False) on OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99. create_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:509 2015-08-07 17:42:57.423 DEBUG nova.virt.xenapi.vm_utils [req-0a9ceeae-45f6-411e-b84e-54286c273f21 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Creating disk-type VBD for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:6da66aaf-ce0c-4407-bd66-d18138f235e9 ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:42:57.437 DEBUG nova.virt.xenapi.vm_utils [req-0a9ceeae-45f6-411e-b84e-54286c273f21 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Created VBD OpaqueRef:3678e895-80db-fa69-0365-b56d4e7f8400 for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:6da66aaf-ce0c-4407-bd66-d18138f235e9. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:42:57.438 DEBUG nova.virt.xenapi.vm_utils [req-0a9ceeae-45f6-411e-b84e-54286c273f21 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Plugging VBD OpaqueRef:3678e895-80db-fa69-0365-b56d4e7f8400 ... vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2191 2015-08-07 17:42:57.440 DEBUG oslo_concurrency.lockutils [req-0a9ceeae-45f6-411e-b84e-54286c273f21 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_plug" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:42:57.701 DEBUG oslo_concurrency.processutils [req-3c332db7-425b-4689-906d-c698a1a9c654 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf sync" returned: 0 in 0.369s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:42:57.702 DEBUG nova.virt.xenapi.vm_utils [req-3c332db7-425b-4689-906d-c698a1a9c654 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Destroying VBD for VDI OpaqueRef:e8a04c2f-0cc4-3819-3653-f9f32827f79d ... 
vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2207 2015-08-07 17:42:58.582 DEBUG oslo_concurrency.lockutils [req-0a9ceeae-45f6-411e-b84e-54286c273f21 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_plug" :: held 1.142s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:42:58.583 DEBUG nova.virt.xenapi.vm_utils [req-0a9ceeae-45f6-411e-b84e-54286c273f21 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Plugging VBD OpaqueRef:3678e895-80db-fa69-0365-b56d4e7f8400 done. vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2194 2015-08-07 17:42:58.584 DEBUG oslo_concurrency.lockutils [req-3c332db7-425b-4689-906d-c698a1a9c654 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_unplug" :: waited 0.881s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:42:58.586 DEBUG nova.virt.xenapi.vm_utils [req-0a9ceeae-45f6-411e-b84e-54286c273f21 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] VBD OpaqueRef:3678e895-80db-fa69-0365-b56d4e7f8400 plugged as xvdd vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2197 2015-08-07 17:42:58.681 WARNING nova.virt.configdrive [req-0a9ceeae-45f6-411e-b84e-54286c273f21 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] The setting "always" will be deprecated in the Liberty version. Please use "True" instead 2015-08-07 17:42:58.682 DEBUG nova.objects.instance [req-0a9ceeae-45f6-411e-b84e-54286c273f21 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lazy-loading `ec2_ids' on Instance uuid aa0c0819-08a6-4a79-93eb-f58885697f5b obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:42:58.722 DEBUG oslo_concurrency.processutils [req-0a9ceeae-45f6-411e-b84e-54286c273f21 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Running cmd (subprocess): genisoimage -o /tmp/tmptq0cVS/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmpPLZsmK execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:42:58.816 DEBUG oslo_concurrency.processutils [req-0a9ceeae-45f6-411e-b84e-54286c273f21 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] CMD "genisoimage -o /tmp/tmptq0cVS/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmpPLZsmK" returned: 0 in 0.093s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:42:58.822 DEBUG oslo_concurrency.processutils [req-0a9ceeae-45f6-411e-b84e-54286c273f21 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmptq0cVS/configdrive of=/dev/xvdd oflag=direct,sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:42:59.392 DEBUG oslo_concurrency.lockutils [req-3c332db7-425b-4689-906d-c698a1a9c654 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock 
"xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_unplug" :: held 0.807s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:42:59.401 DEBUG nova.virt.xenapi.vm_utils [req-3c332db7-425b-4689-906d-c698a1a9c654 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Destroying VBD for VDI OpaqueRef:e8a04c2f-0cc4-3819-3653-f9f32827f79d done. vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2215 2015-08-07 17:42:59.402 DEBUG nova.virt.xenapi.vm_utils [req-3c332db7-425b-4689-906d-c698a1a9c654 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Creating disk-type VBD for VM OpaqueRef:cbcec8e5-b52d-334c-0412-be86ed5864b8, VDI OpaqueRef:e8a04c2f-0cc4-3819-3653-f9f32827f79d ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:42:59.411 DEBUG nova.virt.xenapi.vm_utils [req-3c332db7-425b-4689-906d-c698a1a9c654 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Created VBD OpaqueRef:cb80b635-aecd-33f3-82ea-641d28f11015 for VM OpaqueRef:cbcec8e5-b52d-334c-0412-be86ed5864b8, VDI OpaqueRef:e8a04c2f-0cc4-3819-3653-f9f32827f79d. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:42:59.413 DEBUG nova.objects.instance [req-3c332db7-425b-4689-906d-c698a1a9c654 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lazy-loading `pci_devices' on Instance uuid 65774979-e5fe-4d9f-9f8c-48214aed768d obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:42:59.535 DEBUG nova.virt.xenapi.vmops [req-3c332db7-425b-4689-906d-c698a1a9c654 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 65774979-e5fe-4d9f-9f8c-48214aed768d] Updating progress to 50 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:43:00.134 DEBUG oslo_concurrency.lockutils [req-3c332db7-425b-4689-906d-c698a1a9c654 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "xenstore-65774979-e5fe-4d9f-9f8c-48214aed768d" acquired by "store_meta" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:43:00.135 DEBUG oslo_concurrency.lockutils [req-3c332db7-425b-4689-906d-c698a1a9c654 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "xenstore-65774979-e5fe-4d9f-9f8c-48214aed768d" released by "store_meta" :: held 0.001s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:43:00.135 DEBUG oslo_concurrency.lockutils [req-3c332db7-425b-4689-906d-c698a1a9c654 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "xenstore-65774979-e5fe-4d9f-9f8c-48214aed768d" acquired by "store_auto_disk_config" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:43:00.145 DEBUG oslo_concurrency.lockutils [req-3c332db7-425b-4689-906d-c698a1a9c654 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "xenstore-65774979-e5fe-4d9f-9f8c-48214aed768d" released by "store_auto_disk_config" :: held 0.009s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:43:00.146 DEBUG nova.virt.xenapi.vmops [req-3c332db7-425b-4689-906d-c698a1a9c654 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 65774979-e5fe-4d9f-9f8c-48214aed768d] Injecting hostname 
(tempest.common.compute-instance-268452849) into xenstore _inject_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1940 2015-08-07 17:43:00.146 DEBUG oslo_concurrency.lockutils [req-3c332db7-425b-4689-906d-c698a1a9c654 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "xenstore-65774979-e5fe-4d9f-9f8c-48214aed768d" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:43:00.159 DEBUG oslo_concurrency.lockutils [req-3c332db7-425b-4689-906d-c698a1a9c654 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "xenstore-65774979-e5fe-4d9f-9f8c-48214aed768d" released by "update_hostname" :: held 0.013s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:43:00.160 DEBUG nova.virt.xenapi.vmops [req-3c332db7-425b-4689-906d-c698a1a9c654 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 65774979-e5fe-4d9f-9f8c-48214aed768d] Injecting network info to xenstore inject_network_info /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1869 2015-08-07 17:43:00.160 DEBUG oslo_concurrency.lockutils [req-3c332db7-425b-4689-906d-c698a1a9c654 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "xenstore-65774979-e5fe-4d9f-9f8c-48214aed768d" acquired by "update_nwinfo" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:43:00.362 DEBUG oslo_concurrency.lockutils [req-3c332db7-425b-4689-906d-c698a1a9c654 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "xenstore-65774979-e5fe-4d9f-9f8c-48214aed768d" released by "update_nwinfo" :: held 0.202s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:43:00.363 DEBUG nova.virt.xenapi.vmops [req-3c332db7-425b-4689-906d-c698a1a9c654 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 65774979-e5fe-4d9f-9f8c-48214aed768d] Updating progress to 60 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:43:00.668 DEBUG nova.virt.xenapi.vmops [req-3c332db7-425b-4689-906d-c698a1a9c654 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 65774979-e5fe-4d9f-9f8c-48214aed768d] Creating vifs _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1892 2015-08-07 17:43:00.677 DEBUG nova.virt.xenapi.vmops [req-3c332db7-425b-4689-906d-c698a1a9c654 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 65774979-e5fe-4d9f-9f8c-48214aed768d] Creating VIF for network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1902 2015-08-07 17:43:00.686 DEBUG nova.virt.xenapi.vmops [req-3c332db7-425b-4689-906d-c698a1a9c654 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 65774979-e5fe-4d9f-9f8c-48214aed768d] Created VIF OpaqueRef:bb7072a6-dc64-5fdb-2400-856ef9a0797d, network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1906 2015-08-07 17:43:00.687 DEBUG nova.virt.xenapi.vmops [req-3c332db7-425b-4689-906d-c698a1a9c654 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 65774979-e5fe-4d9f-9f8c-48214aed768d] Updating progress to 70 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:43:01.150 DEBUG 
nova.virt.xenapi.vmops [req-3c332db7-425b-4689-906d-c698a1a9c654 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 65774979-e5fe-4d9f-9f8c-48214aed768d] Starting instance _start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:327 2015-08-07 17:43:01.829 DEBUG nova.virt.xenapi.vmops [req-6f645ff8-4174-477b-abf3-6b29a19f3b7a tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] [instance: 7aa9beb6-cabe-4ae3-a172-162c757bd718] Using RAW or VHD, skipping kernel and ramdisk deletion _destroy_kernel_ramdisk /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1456 2015-08-07 17:43:01.844 DEBUG nova.virt.xenapi.vm_utils [req-6f645ff8-4174-477b-abf3-6b29a19f3b7a tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] [instance: 7aa9beb6-cabe-4ae3-a172-162c757bd718] VM destroyed destroy_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:324 2015-08-07 17:43:01.845 DEBUG nova.compute.manager [req-6f645ff8-4174-477b-abf3-6b29a19f3b7a tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] [instance: 7aa9beb6-cabe-4ae3-a172-162c757bd718] Deallocating network for instance _deallocate_network /opt/stack/new/nova/nova/compute/manager.py:1792 2015-08-07 17:43:03.718 DEBUG nova.compute.manager [req-6f645ff8-4174-477b-abf3-6b29a19f3b7a tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] [instance: 7aa9beb6-cabe-4ae3-a172-162c757bd718] terminating bdm BlockDeviceMapping(boot_index=0,connection_info=None,created_at=2015-08-07T17:42:03Z,delete_on_termination=True,deleted=False,deleted_at=None,destination_type='local',device_name='/dev/xvda',device_type='disk',disk_bus=None,guest_format=None,id=57,image_id='a69d8d55-1745-492f-be26-cc75d64fc94d',instance=,instance_uuid=7aa9beb6-cabe-4ae3-a172-162c757bd718,no_device=False,snapshot_id=None,source_type='image',updated_at=2015-08-07T17:42:05Z,volume_id=None,volume_size=None) _cleanup_volumes /opt/stack/new/nova/nova/compute/manager.py:2272 2015-08-07 17:43:03.856 DEBUG oslo_concurrency.processutils [req-0a9ceeae-45f6-411e-b84e-54286c273f21 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmptq0cVS/configdrive of=/dev/xvdd oflag=direct,sync" returned: 0 in 5.035s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:43:03.859 DEBUG oslo_concurrency.processutils [req-0a9ceeae-45f6-411e-b84e-54286c273f21 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:43:03.954 DEBUG oslo_concurrency.lockutils [req-6f645ff8-4174-477b-abf3-6b29a19f3b7a tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] Lock "compute_resources" acquired by "update_usage" :: waited 0.001s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:43:03.957 DEBUG nova.objects.instance [req-6f645ff8-4174-477b-abf3-6b29a19f3b7a tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] Lazy-loading `numa_topology' on Instance uuid 7aa9beb6-cabe-4ae3-a172-162c757bd718 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:43:04.012 DEBUG 
oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:43:04.013 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 37.51 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:43:04.049 DEBUG oslo_concurrency.lockutils [req-6f645ff8-4174-477b-abf3-6b29a19f3b7a tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] Lock "compute_resources" released by "update_usage" :: held 0.095s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:43:04.220 DEBUG oslo_concurrency.processutils [req-0a9ceeae-45f6-411e-b84e-54286c273f21 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf sync" returned: 0 in 0.361s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:43:04.222 DEBUG nova.virt.xenapi.vm_utils [req-0a9ceeae-45f6-411e-b84e-54286c273f21 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Destroying VBD for VDI OpaqueRef:6da66aaf-ce0c-4407-bd66-d18138f235e9 ... vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2207 2015-08-07 17:43:04.224 DEBUG oslo_concurrency.lockutils [req-0a9ceeae-45f6-411e-b84e-54286c273f21 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_unplug" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:43:04.390 DEBUG oslo_concurrency.lockutils [req-6f645ff8-4174-477b-abf3-6b29a19f3b7a tempest-ListServersNegativeTestJSON-1486407346 tempest-ListServersNegativeTestJSON-2125822721] Lock "7aa9beb6-cabe-4ae3-a172-162c757bd718" released by "do_terminate_instance" :: held 15.352s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:43:05.057 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.92 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:43:05.712 DEBUG oslo_concurrency.lockutils [req-0a9ceeae-45f6-411e-b84e-54286c273f21 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_unplug" :: held 1.488s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:43:05.718 DEBUG nova.virt.xenapi.vm_utils [req-0a9ceeae-45f6-411e-b84e-54286c273f21 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Destroying VBD for VDI OpaqueRef:6da66aaf-ce0c-4407-bd66-d18138f235e9 done. vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2215 2015-08-07 17:43:05.719 DEBUG nova.virt.xenapi.vm_utils [req-0a9ceeae-45f6-411e-b84e-54286c273f21 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Creating disk-type VBD for VM OpaqueRef:10811bbb-e877-1050-e1a8-6132b3944854, VDI OpaqueRef:6da66aaf-ce0c-4407-bd66-d18138f235e9 ... 
create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:43:05.727 DEBUG nova.virt.xenapi.vm_utils [req-0a9ceeae-45f6-411e-b84e-54286c273f21 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Created VBD OpaqueRef:4ba44ba3-fcbd-40ae-008d-fd29d1856c9a for VM OpaqueRef:10811bbb-e877-1050-e1a8-6132b3944854, VDI OpaqueRef:6da66aaf-ce0c-4407-bd66-d18138f235e9. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:43:05.728 DEBUG nova.objects.instance [req-0a9ceeae-45f6-411e-b84e-54286c273f21 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lazy-loading `pci_devices' on Instance uuid aa0c0819-08a6-4a79-93eb-f58885697f5b obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:43:05.848 DEBUG oslo_concurrency.lockutils [req-0a9ceeae-45f6-411e-b84e-54286c273f21 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "xenstore-aa0c0819-08a6-4a79-93eb-f58885697f5b" acquired by "store_meta" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:43:05.858 DEBUG oslo_concurrency.lockutils [req-0a9ceeae-45f6-411e-b84e-54286c273f21 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "xenstore-aa0c0819-08a6-4a79-93eb-f58885697f5b" released by "store_meta" :: held 0.009s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:43:05.859 DEBUG oslo_concurrency.lockutils [req-0a9ceeae-45f6-411e-b84e-54286c273f21 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "xenstore-aa0c0819-08a6-4a79-93eb-f58885697f5b" acquired by "store_auto_disk_config" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:43:05.867 DEBUG oslo_concurrency.lockutils [req-0a9ceeae-45f6-411e-b84e-54286c273f21 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "xenstore-aa0c0819-08a6-4a79-93eb-f58885697f5b" released by "store_auto_disk_config" :: held 0.008s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:43:05.868 DEBUG nova.virt.xenapi.vmops [req-0a9ceeae-45f6-411e-b84e-54286c273f21 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Injecting network info to xenstore inject_network_info /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1869 2015-08-07 17:43:05.868 DEBUG oslo_concurrency.lockutils [req-0a9ceeae-45f6-411e-b84e-54286c273f21 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "xenstore-aa0c0819-08a6-4a79-93eb-f58885697f5b" acquired by "update_nwinfo" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:43:06.067 DEBUG oslo_concurrency.lockutils [req-0a9ceeae-45f6-411e-b84e-54286c273f21 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "xenstore-aa0c0819-08a6-4a79-93eb-f58885697f5b" released by "update_nwinfo" :: held 0.199s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:43:06.068 DEBUG nova.virt.xenapi.vmops [req-0a9ceeae-45f6-411e-b84e-54286c273f21 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Creating vifs _create_vifs 
/opt/stack/new/nova/nova/virt/xenapi/vmops.py:1892 2015-08-07 17:43:06.075 DEBUG nova.virt.xenapi.vmops [req-0a9ceeae-45f6-411e-b84e-54286c273f21 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Creating VIF for network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1902 2015-08-07 17:43:06.084 DEBUG nova.virt.xenapi.vmops [req-0a9ceeae-45f6-411e-b84e-54286c273f21 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Created VIF OpaqueRef:92a32dac-1b8f-d6be-01e8-a0dab1917c12, network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1906 2015-08-07 17:43:06.085 DEBUG nova.virt.xenapi.vmops [req-0a9ceeae-45f6-411e-b84e-54286c273f21 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Starting instance _start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:327 2015-08-07 17:43:09.525 DEBUG nova.virt.xenapi.vmops [req-3c332db7-425b-4689-906d-c698a1a9c654 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 65774979-e5fe-4d9f-9f8c-48214aed768d] Waiting for instance state to become running _wait_for_instance_to_start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:777 2015-08-07 17:43:09.547 DEBUG nova.virt.xenapi.vmops [req-3c332db7-425b-4689-906d-c698a1a9c654 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 65774979-e5fe-4d9f-9f8c-48214aed768d] Updating progress to 80 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:43:09.877 DEBUG nova.virt.xenapi.vmops [req-3c332db7-425b-4689-906d-c698a1a9c654 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 65774979-e5fe-4d9f-9f8c-48214aed768d] Skip agent setup, not enabled. 
_configure_new_instance_with_agent /opt/stack/new/nova/nova/virt/xenapi/vmops.py:788 2015-08-07 17:43:09.878 DEBUG nova.virt.xenapi.vmops [req-3c332db7-425b-4689-906d-c698a1a9c654 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 65774979-e5fe-4d9f-9f8c-48214aed768d] Removing hostname from xenstore _remove_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1949 2015-08-07 17:43:09.879 DEBUG oslo_concurrency.lockutils [req-3c332db7-425b-4689-906d-c698a1a9c654 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "xenstore-65774979-e5fe-4d9f-9f8c-48214aed768d" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:43:09.885 DEBUG oslo_concurrency.lockutils [req-3c332db7-425b-4689-906d-c698a1a9c654 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "xenstore-65774979-e5fe-4d9f-9f8c-48214aed768d" released by "update_hostname" :: held 0.006s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:43:09.886 DEBUG nova.virt.xenapi.vmops [req-3c332db7-425b-4689-906d-c698a1a9c654 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 65774979-e5fe-4d9f-9f8c-48214aed768d] Updating progress to 90 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:43:10.355 DEBUG nova.virt.xenapi.vmops [req-3c332db7-425b-4689-906d-c698a1a9c654 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 65774979-e5fe-4d9f-9f8c-48214aed768d] Updating progress to 100 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:43:10.545 DEBUG nova.compute.manager [req-3c332db7-425b-4689-906d-c698a1a9c654 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 65774979-e5fe-4d9f-9f8c-48214aed768d] Checking state _get_power_state /opt/stack/new/nova/nova/compute/manager.py:1296 2015-08-07 17:43:11.145 DEBUG oslo_concurrency.lockutils [req-3c332db7-425b-4689-906d-c698a1a9c654 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "65774979-e5fe-4d9f-9f8c-48214aed768d" released by "_locked_do_build_and_run_instance" :: held 28.051s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:43:12.329 DEBUG oslo_concurrency.lockutils [req-29f8549c-aa30-4dbc-a88a-db9056ddcb40 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "65774979-e5fe-4d9f-9f8c-48214aed768d" acquired by "do_terminate_instance" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:43:12.330 DEBUG oslo_concurrency.lockutils [req-29f8549c-aa30-4dbc-a88a-db9056ddcb40 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "65774979-e5fe-4d9f-9f8c-48214aed768d-events" acquired by "_clear_events" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:43:12.331 DEBUG oslo_concurrency.lockutils [req-29f8549c-aa30-4dbc-a88a-db9056ddcb40 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "65774979-e5fe-4d9f-9f8c-48214aed768d-events" released by "_clear_events" :: held 0.001s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:43:12.333 INFO nova.compute.manager [req-29f8549c-aa30-4dbc-a88a-db9056ddcb40 tempest-ServersTestJSON-1516276945 
tempest-ServersTestJSON-1601395983] [instance: 65774979-e5fe-4d9f-9f8c-48214aed768d] Terminating instance 2015-08-07 17:43:12.335 INFO nova.virt.xenapi.vmops [req-29f8549c-aa30-4dbc-a88a-db9056ddcb40 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 65774979-e5fe-4d9f-9f8c-48214aed768d] Destroying VM 2015-08-07 17:43:12.367 DEBUG nova.virt.xenapi.vm_utils [req-29f8549c-aa30-4dbc-a88a-db9056ddcb40 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 65774979-e5fe-4d9f-9f8c-48214aed768d] Shutting down VM (hard) hard_shutdown_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:348 2015-08-07 17:43:12.655 DEBUG nova.virt.xenapi.vmops [req-0a9ceeae-45f6-411e-b84e-54286c273f21 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Waiting for instance state to become running _wait_for_instance_to_start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:777 2015-08-07 17:43:12.671 DEBUG nova.virt.xenapi.vmops [req-0a9ceeae-45f6-411e-b84e-54286c273f21 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Updating progress to 100 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:43:13.636 DEBUG oslo_concurrency.lockutils [req-3dc8e0aa-1d1e-46c4-a733-eb4186cf38dc tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "aa0c0819-08a6-4a79-93eb-f58885697f5b" acquired by "do_confirm_resize" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:43:13.637 DEBUG nova.compute.manager [req-3dc8e0aa-1d1e-46c4-a733-eb4186cf38dc tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Going to confirm migration 5 do_confirm_resize /opt/stack/new/nova/nova/compute/manager.py:3243 2015-08-07 17:43:15.002 DEBUG oslo_concurrency.lockutils [req-3dc8e0aa-1d1e-46c4-a733-eb4186cf38dc tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Acquired semaphore "refresh_cache-aa0c0819-08a6-4a79-93eb-f58885697f5b" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:197 2015-08-07 17:43:15.064 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.92 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:43:15.185 DEBUG nova.network.base_api [req-3dc8e0aa-1d1e-46c4-a733-eb4186cf38dc tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Updating instance_info_cache with network_info: [VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.2'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, 
u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:d0:30:ae', 'active': False, 'type': u'bridge', 'id': u'f50b3fec-3100-4377-bded-e2cd8fa38482', 'qbg_params': None})] update_instance_cache_with_nw_info /opt/stack/new/nova/nova/network/base_api.py:43 2015-08-07 17:43:15.211 DEBUG oslo_concurrency.lockutils [req-3dc8e0aa-1d1e-46c4-a733-eb4186cf38dc tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Releasing semaphore "refresh_cache-aa0c0819-08a6-4a79-93eb-f58885697f5b" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:210 2015-08-07 17:43:15.218 WARNING nova.virt.xenapi.vm_utils [req-3dc8e0aa-1d1e-46c4-a733-eb4186cf38dc tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] VM already halted, skipping shutdown... 2015-08-07 17:43:15.232 DEBUG nova.virt.xenapi.vmops [req-3dc8e0aa-1d1e-46c4-a733-eb4186cf38dc tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Destroying VDIs _destroy_vdis /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1428 2015-08-07 17:43:15.243 DEBUG nova.virt.xenapi.vm_utils [req-3dc8e0aa-1d1e-46c4-a733-eb4186cf38dc tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] VDI c6f005a8-96a4-4151-b9d5-0836235b06a1 is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:43:15.255 DEBUG nova.virt.xenapi.vm_utils [req-3dc8e0aa-1d1e-46c4-a733-eb4186cf38dc tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] VDI 45656525-8c08-4390-906b-b940c2d637eb is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:43:15.433 DEBUG nova.virt.xenapi.vmops [req-29f8549c-aa30-4dbc-a88a-db9056ddcb40 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 65774979-e5fe-4d9f-9f8c-48214aed768d] Destroying VDIs _destroy_vdis /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1428 2015-08-07 17:43:15.445 DEBUG nova.virt.xenapi.vm_utils [req-29f8549c-aa30-4dbc-a88a-db9056ddcb40 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] VDI 976addd3-e1e8-440c-8b39-f78569376a8f is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:43:15.453 DEBUG nova.virt.xenapi.vm_utils [req-29f8549c-aa30-4dbc-a88a-db9056ddcb40 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] VDI 225c8784-8e5a-4430-989b-d4d4151ee716 is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:43:15.991 DEBUG nova.virt.xenapi.vmops [req-3dc8e0aa-1d1e-46c4-a733-eb4186cf38dc tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Using RAW or VHD, skipping kernel and ramdisk deletion _destroy_kernel_ramdisk /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1456 2015-08-07 17:43:16.005 DEBUG nova.virt.xenapi.vm_utils [req-3dc8e0aa-1d1e-46c4-a733-eb4186cf38dc tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] VM destroyed destroy_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:324 
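
The paired "Acquired semaphore ... / Releasing semaphore ..." and "Lock ... acquired by ... :: waited / released by ... :: held" messages throughout this log come from oslo.concurrency. A minimal sketch of the two idioms that typically produce them; the function names here are illustrative, not nova's:

    from oslo_concurrency import lockutils

    def refresh_instance_cache(instance_uuid):
        # Emits: Acquired semaphore "refresh_cache-<uuid>" ... / Releasing semaphore ...
        with lockutils.lock('refresh_cache-%s' % instance_uuid):
            pass  # refresh the instance's network info cache here

    @lockutils.synchronized('compute_resources')
    def update_usage():
        # Emits: Lock "compute_resources" acquired by "update_usage" :: waited ...s
        #        Lock "compute_resources" released by "update_usage" :: held ...s
        pass

The waited/held durations printed by the decorator are the quickest contention signal when reading a log like this one.
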
2015-08-07 17:43:16.071 DEBUG oslo_concurrency.lockutils [req-3dc8e0aa-1d1e-46c4-a733-eb4186cf38dc tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "compute_resources" acquired by "drop_move_claim" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:43:16.142 DEBUG oslo_concurrency.lockutils [req-3dc8e0aa-1d1e-46c4-a733-eb4186cf38dc tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "compute_resources" released by "drop_move_claim" :: held 0.071s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:43:16.577 DEBUG oslo_concurrency.lockutils [req-3dc8e0aa-1d1e-46c4-a733-eb4186cf38dc tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "aa0c0819-08a6-4a79-93eb-f58885697f5b" released by "do_confirm_resize" :: held 2.941s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:43:18.115 DEBUG oslo_concurrency.lockutils [req-7b0135e1-ce70-4c92-8b7f-aa13126defd5 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "aa0c0819-08a6-4a79-93eb-f58885697f5b" acquired by "do_terminate_instance" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:43:18.116 DEBUG oslo_concurrency.lockutils [req-7b0135e1-ce70-4c92-8b7f-aa13126defd5 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "aa0c0819-08a6-4a79-93eb-f58885697f5b-events" acquired by "_clear_events" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:43:18.116 DEBUG oslo_concurrency.lockutils [req-7b0135e1-ce70-4c92-8b7f-aa13126defd5 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "aa0c0819-08a6-4a79-93eb-f58885697f5b-events" released by "_clear_events" :: held 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:43:18.118 INFO nova.compute.manager [req-7b0135e1-ce70-4c92-8b7f-aa13126defd5 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Terminating instance 2015-08-07 17:43:18.120 INFO nova.virt.xenapi.vmops [req-7b0135e1-ce70-4c92-8b7f-aa13126defd5 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Destroying VM 2015-08-07 17:43:18.129 DEBUG nova.virt.xenapi.vm_utils [req-7b0135e1-ce70-4c92-8b7f-aa13126defd5 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Shutting down VM (hard) hard_shutdown_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:348 2015-08-07 17:43:19.092 DEBUG oslo_concurrency.lockutils [req-fed14245-d06d-4088-8e1b-33ee9146eb08 tempest-ServerPasswordTestJSON-92181681 tempest-ServerPasswordTestJSON-1938394173] Lock "4443fc1d-35f7-46c6-8139-e22c74dd9d86" acquired by "_locked_do_build_and_run_instance" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:43:19.144 INFO nova.compute.manager [req-fed14245-d06d-4088-8e1b-33ee9146eb08 tempest-ServerPasswordTestJSON-92181681 tempest-ServerPasswordTestJSON-1938394173] [instance: 4443fc1d-35f7-46c6-8139-e22c74dd9d86] Starting instance... 
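
The resource-claim messages that follow for instance 4443fc1d-35f7-46c6-8139-e22c74dd9d86 are internally consistent; the arithmetic below reads them back, assuming the then-default ram_allocation_ratio of 1.5 (an assumption, not stated in the log):

    # Reading the nova.compute.claims figures from the log below.
    flavor_ram_mb = 64                          # "Memory overhead for 64 MB instance; 5 MB"
    overhead_mb = 5
    claim_mb = flavor_ram_mb + overhead_mb      # 69 MB  -> "Attempting claim: memory 69 MB"
    total_mb, used_mb = 8187, 853.0             # "Total memory: 8187 MB, used: 853.00 MB"
    ram_allocation_ratio = 1.5                  # assumed default
    limit_mb = total_mb * ram_allocation_ratio  # 12280.50 MB -> "memory limit: 12280.50 MB"
    free_mb = limit_mb - used_mb                # 11427.50 MB -> "free: 11427.50 MB"
    assert claim_mb <= free_mb                  # hence "Claim successful"
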
2015-08-07 17:43:19.368 DEBUG oslo_concurrency.lockutils [req-fed14245-d06d-4088-8e1b-33ee9146eb08 tempest-ServerPasswordTestJSON-92181681 tempest-ServerPasswordTestJSON-1938394173] Lock "compute_resources" acquired by "instance_claim" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:43:19.369 DEBUG nova.compute.resource_tracker [req-fed14245-d06d-4088-8e1b-33ee9146eb08 tempest-ServerPasswordTestJSON-92181681 tempest-ServerPasswordTestJSON-1938394173] Memory overhead for 64 MB instance; 5 MB instance_claim /opt/stack/new/nova/nova/compute/resource_tracker.py:126 2015-08-07 17:43:19.377 INFO nova.compute.claims [req-fed14245-d06d-4088-8e1b-33ee9146eb08 tempest-ServerPasswordTestJSON-92181681 tempest-ServerPasswordTestJSON-1938394173] [instance: 4443fc1d-35f7-46c6-8139-e22c74dd9d86] Attempting claim: memory 69 MB, disk 0 GB 2015-08-07 17:43:19.378 INFO nova.compute.claims [req-fed14245-d06d-4088-8e1b-33ee9146eb08 tempest-ServerPasswordTestJSON-92181681 tempest-ServerPasswordTestJSON-1938394173] [instance: 4443fc1d-35f7-46c6-8139-e22c74dd9d86] Total memory: 8187 MB, used: 853.00 MB 2015-08-07 17:43:19.380 INFO nova.compute.claims [req-fed14245-d06d-4088-8e1b-33ee9146eb08 tempest-ServerPasswordTestJSON-92181681 tempest-ServerPasswordTestJSON-1938394173] [instance: 4443fc1d-35f7-46c6-8139-e22c74dd9d86] memory limit: 12280.50 MB, free: 11427.50 MB 2015-08-07 17:43:19.380 INFO nova.compute.claims [req-fed14245-d06d-4088-8e1b-33ee9146eb08 tempest-ServerPasswordTestJSON-92181681 tempest-ServerPasswordTestJSON-1938394173] [instance: 4443fc1d-35f7-46c6-8139-e22c74dd9d86] Total disk: 27 GB, used: 0.00 GB 2015-08-07 17:43:19.380 INFO nova.compute.claims [req-fed14245-d06d-4088-8e1b-33ee9146eb08 tempest-ServerPasswordTestJSON-92181681 tempest-ServerPasswordTestJSON-1938394173] [instance: 4443fc1d-35f7-46c6-8139-e22c74dd9d86] disk limit not specified, defaulting to unlimited 2015-08-07 17:43:19.409 DEBUG nova.compute.resources.vcpu [req-fed14245-d06d-4088-8e1b-33ee9146eb08 tempest-ServerPasswordTestJSON-92181681 tempest-ServerPasswordTestJSON-1938394173] Total CPUs: 8 VCPUs, used: 4.00 VCPUs test /opt/stack/new/nova/nova/compute/resources/vcpu.py:52 2015-08-07 17:43:19.410 DEBUG nova.compute.resources.vcpu [req-fed14245-d06d-4088-8e1b-33ee9146eb08 tempest-ServerPasswordTestJSON-92181681 tempest-ServerPasswordTestJSON-1938394173] CPUs limit not specified, defaulting to unlimited test /opt/stack/new/nova/nova/compute/resources/vcpu.py:56 2015-08-07 17:43:19.410 INFO nova.compute.claims [req-fed14245-d06d-4088-8e1b-33ee9146eb08 tempest-ServerPasswordTestJSON-92181681 tempest-ServerPasswordTestJSON-1938394173] [instance: 4443fc1d-35f7-46c6-8139-e22c74dd9d86] Claim successful 2015-08-07 17:43:19.726 DEBUG oslo_concurrency.lockutils [req-fed14245-d06d-4088-8e1b-33ee9146eb08 tempest-ServerPasswordTestJSON-92181681 tempest-ServerPasswordTestJSON-1938394173] Lock "compute_resources" released by "instance_claim" :: held 0.358s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:43:19.936 DEBUG oslo_concurrency.lockutils [req-fed14245-d06d-4088-8e1b-33ee9146eb08 tempest-ServerPasswordTestJSON-92181681 tempest-ServerPasswordTestJSON-1938394173] Lock "compute_resources" acquired by "update_usage" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:43:20.024 DEBUG oslo_concurrency.lockutils [req-fed14245-d06d-4088-8e1b-33ee9146eb08 
tempest-ServerPasswordTestJSON-92181681 tempest-ServerPasswordTestJSON-1938394173] Lock "compute_resources" released by "update_usage" :: held 0.088s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:43:20.025 DEBUG nova.compute.utils [req-fed14245-d06d-4088-8e1b-33ee9146eb08 tempest-ServerPasswordTestJSON-92181681 tempest-ServerPasswordTestJSON-1938394173] Using /dev/xvd instead of None get_next_device_name /opt/stack/new/nova/nova/compute/utils.py:163 2015-08-07 17:43:20.030 13318 DEBUG nova.compute.manager [-] [instance: 4443fc1d-35f7-46c6-8139-e22c74dd9d86] Allocating IP information in the background. _allocate_network_async /opt/stack/new/nova/nova/compute/manager.py:1513 2015-08-07 17:43:20.031 13318 DEBUG oslo_concurrency.lockutils [-] Acquired semaphore "refresh_cache-4443fc1d-35f7-46c6-8139-e22c74dd9d86" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:197 2015-08-07 17:43:20.432 DEBUG nova.virt.xenapi.vmops [req-7b0135e1-ce70-4c92-8b7f-aa13126defd5 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Destroying VDIs _destroy_vdis /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1428 2015-08-07 17:43:20.445 DEBUG nova.virt.xenapi.vm_utils [req-7b0135e1-ce70-4c92-8b7f-aa13126defd5 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] VDI c743ab87-7975-485a-803e-2c9f0840bdbd is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:43:20.454 DEBUG nova.virt.xenapi.vm_utils [req-7b0135e1-ce70-4c92-8b7f-aa13126defd5 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] VDI 48936524-e3d3-472d-9e27-95b4040e109e is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:43:20.535 DEBUG nova.virt.xenapi.vmops [req-fed14245-d06d-4088-8e1b-33ee9146eb08 tempest-ServerPasswordTestJSON-92181681 tempest-ServerPasswordTestJSON-1938394173] [instance: 4443fc1d-35f7-46c6-8139-e22c74dd9d86] Block device information present: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} spawn /opt/stack/new/nova/nova/virt/xenapi/vmops.py:408 2015-08-07 17:43:20.554 DEBUG nova.virt.xenapi.vm_utils [req-fed14245-d06d-4088-8e1b-33ee9146eb08 tempest-ServerPasswordTestJSON-92181681 tempest-ServerPasswordTestJSON-1938394173] Detected vhd format for image a69d8d55-1745-492f-be26-cc75d64fc94d determine_disk_image_type /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1633 2015-08-07 17:43:20.555 DEBUG nova.virt.xenapi.vmops [req-fed14245-d06d-4088-8e1b-33ee9146eb08 tempest-ServerPasswordTestJSON-92181681 tempest-ServerPasswordTestJSON-1938394173] [instance: 4443fc1d-35f7-46c6-8139-e22c74dd9d86] Updating progress to 10 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:43:20.794 DEBUG nova.virt.xenapi.vmops [req-fed14245-d06d-4088-8e1b-33ee9146eb08 tempest-ServerPasswordTestJSON-92181681 tempest-ServerPasswordTestJSON-1938394173] [instance: 4443fc1d-35f7-46c6-8139-e22c74dd9d86] block device info: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} _connect_cinder_volumes /opt/stack/new/nova/nova/virt/xenapi/vmops.py:376 2015-08-07 17:43:20.806 DEBUG oslo_concurrency.lockutils [req-fed14245-d06d-4088-8e1b-33ee9146eb08 tempest-ServerPasswordTestJSON-92181681 tempest-ServerPasswordTestJSON-1938394173] Lock 
"xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" acquired by "_create_cached_image_impl" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:43:21.146 DEBUG nova.virt.xenapi.vmops [req-7b0135e1-ce70-4c92-8b7f-aa13126defd5 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Using RAW or VHD, skipping kernel and ramdisk deletion _destroy_kernel_ramdisk /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1456 2015-08-07 17:43:21.159 DEBUG nova.virt.xenapi.vm_utils [req-7b0135e1-ce70-4c92-8b7f-aa13126defd5 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] VM destroyed destroy_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:324 2015-08-07 17:43:21.159 DEBUG nova.compute.manager [req-7b0135e1-ce70-4c92-8b7f-aa13126defd5 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Deallocating network for instance _deallocate_network /opt/stack/new/nova/nova/compute/manager.py:1792 2015-08-07 17:43:22.020 13318 DEBUG nova.network.base_api [-] [instance: 4443fc1d-35f7-46c6-8139-e22c74dd9d86] Updating instance_info_cache with network_info: [VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.7'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:c4:05:c4', 'active': False, 'type': u'bridge', 'id': u'49d4f597-93e6-4729-aead-23052845cc31', 'qbg_params': None})] update_instance_cache_with_nw_info /opt/stack/new/nova/nova/network/base_api.py:43 2015-08-07 17:43:22.051 13318 DEBUG oslo_concurrency.lockutils [-] Releasing semaphore "refresh_cache-4443fc1d-35f7-46c6-8139-e22c74dd9d86" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:210 2015-08-07 17:43:22.052 13318 DEBUG nova.compute.manager [-] [instance: 4443fc1d-35f7-46c6-8139-e22c74dd9d86] Instance network_info: |[VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.7'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 
'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:c4:05:c4', 'active': False, 'type': u'bridge', 'id': u'49d4f597-93e6-4729-aead-23052845cc31', 'qbg_params': None})]| _allocate_network_async /opt/stack/new/nova/nova/compute/manager.py:1531 2015-08-07 17:43:22.964 DEBUG nova.compute.manager [req-7b0135e1-ce70-4c92-8b7f-aa13126defd5 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] terminating bdm BlockDeviceMapping(boot_index=0,connection_info=None,created_at=2015-08-07T17:37:34Z,delete_on_termination=True,deleted=False,deleted_at=None,destination_type='local',device_name='/dev/xvda',device_type='disk',disk_bus=None,guest_format=None,id=48,image_id='a69d8d55-1745-492f-be26-cc75d64fc94d',instance=,instance_uuid=aa0c0819-08a6-4a79-93eb-f58885697f5b,no_device=False,snapshot_id=None,source_type='image',updated_at=2015-08-07T17:37:36Z,volume_id=None,volume_size=None) _cleanup_volumes /opt/stack/new/nova/nova/compute/manager.py:2272 2015-08-07 17:43:23.124 DEBUG oslo_concurrency.lockutils [req-7b0135e1-ce70-4c92-8b7f-aa13126defd5 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "compute_resources" acquired by "update_usage" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:43:23.125 DEBUG nova.objects.instance [req-7b0135e1-ce70-4c92-8b7f-aa13126defd5 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lazy-loading `numa_topology' on Instance uuid aa0c0819-08a6-4a79-93eb-f58885697f5b obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:43:23.201 DEBUG oslo_concurrency.lockutils [req-7b0135e1-ce70-4c92-8b7f-aa13126defd5 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "compute_resources" released by "update_usage" :: held 0.077s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:43:23.471 DEBUG oslo_concurrency.lockutils [req-7b0135e1-ce70-4c92-8b7f-aa13126defd5 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "aa0c0819-08a6-4a79-93eb-f58885697f5b" released by "do_terminate_instance" :: held 5.356s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:43:25.024 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.96 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:43:26.231 DEBUG oslo_concurrency.lockutils [req-680b1692-b42c-440c-8291-60b431b31d1a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "115845a8-7055-44f5-a3f5-222bd027f1ed" acquired by "_locked_do_build_and_run_instance" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:43:26.266 INFO nova.compute.manager [req-680b1692-b42c-440c-8291-60b431b31d1a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 115845a8-7055-44f5-a3f5-222bd027f1ed] Starting instance... 
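
The network_info dumps above (update_instance_cache_with_nw_info) all share one shape: a list of VIFs, each carrying a Network with its Subnets, FixedIPs and routes. Shown here as plain dicts for illustration, with values taken from the 4443fc1d entry; nova's nova.network.model classes are dict-like, so the same keys apply:

    vif = {
        'id': '49d4f597-93e6-4729-aead-23052845cc31',
        'address': 'fa:16:3e:c4:05:c4',            # the VIF's MAC
        'type': 'bridge',
        'network': {
            'bridge': 'vmnet',
            'label': 'private',
            'subnets': [{
                'cidr': '10.1.0.0/20',
                'gateway': {'address': '10.1.0.1'},
                'dns': [{'address': '8.8.4.4'}],
                'ips': [{'type': 'fixed', 'address': '10.1.0.7'}],
            }],
        },
    }

    fixed_ip = vif['network']['subnets'][0]['ips'][0]['address']   # '10.1.0.7'
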
2015-08-07 17:43:26.452 DEBUG oslo_concurrency.lockutils [req-680b1692-b42c-440c-8291-60b431b31d1a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "compute_resources" acquired by "instance_claim" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:43:26.453 DEBUG nova.compute.resource_tracker [req-680b1692-b42c-440c-8291-60b431b31d1a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Memory overhead for 64 MB instance; 5 MB instance_claim /opt/stack/new/nova/nova/compute/resource_tracker.py:126 2015-08-07 17:43:26.460 INFO nova.compute.claims [req-680b1692-b42c-440c-8291-60b431b31d1a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 115845a8-7055-44f5-a3f5-222bd027f1ed] Attempting claim: memory 69 MB, disk 0 GB 2015-08-07 17:43:26.460 INFO nova.compute.claims [req-680b1692-b42c-440c-8291-60b431b31d1a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 115845a8-7055-44f5-a3f5-222bd027f1ed] Total memory: 8187 MB, used: 788.00 MB 2015-08-07 17:43:26.461 INFO nova.compute.claims [req-680b1692-b42c-440c-8291-60b431b31d1a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 115845a8-7055-44f5-a3f5-222bd027f1ed] memory limit: 12280.50 MB, free: 11492.50 MB 2015-08-07 17:43:26.461 INFO nova.compute.claims [req-680b1692-b42c-440c-8291-60b431b31d1a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 115845a8-7055-44f5-a3f5-222bd027f1ed] Total disk: 27 GB, used: 0.00 GB 2015-08-07 17:43:26.463 INFO nova.compute.claims [req-680b1692-b42c-440c-8291-60b431b31d1a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 115845a8-7055-44f5-a3f5-222bd027f1ed] disk limit not specified, defaulting to unlimited 2015-08-07 17:43:26.483 DEBUG nova.compute.resources.vcpu [req-680b1692-b42c-440c-8291-60b431b31d1a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Total CPUs: 8 VCPUs, used: 4.00 VCPUs test /opt/stack/new/nova/nova/compute/resources/vcpu.py:52 2015-08-07 17:43:26.484 DEBUG nova.compute.resources.vcpu [req-680b1692-b42c-440c-8291-60b431b31d1a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] CPUs limit not specified, defaulting to unlimited test /opt/stack/new/nova/nova/compute/resources/vcpu.py:56 2015-08-07 17:43:26.484 INFO nova.compute.claims [req-680b1692-b42c-440c-8291-60b431b31d1a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 115845a8-7055-44f5-a3f5-222bd027f1ed] Claim successful 2015-08-07 17:43:26.739 DEBUG oslo_concurrency.lockutils [req-680b1692-b42c-440c-8291-60b431b31d1a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "compute_resources" released by "instance_claim" :: held 0.287s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:43:26.923 DEBUG oslo_concurrency.lockutils [req-680b1692-b42c-440c-8291-60b431b31d1a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "compute_resources" acquired by "update_usage" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:43:27.002 DEBUG oslo_concurrency.lockutils [req-680b1692-b42c-440c-8291-60b431b31d1a 
tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "compute_resources" released by "update_usage" :: held 0.080s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:43:27.003 DEBUG nova.compute.utils [req-680b1692-b42c-440c-8291-60b431b31d1a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Using /dev/xvd instead of None get_next_device_name /opt/stack/new/nova/nova/compute/utils.py:163 2015-08-07 17:43:27.007 13318 DEBUG nova.compute.manager [-] [instance: 115845a8-7055-44f5-a3f5-222bd027f1ed] Allocating IP information in the background. _allocate_network_async /opt/stack/new/nova/nova/compute/manager.py:1513 2015-08-07 17:43:27.008 13318 DEBUG oslo_concurrency.lockutils [-] Acquired semaphore "refresh_cache-115845a8-7055-44f5-a3f5-222bd027f1ed" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:197 2015-08-07 17:43:27.484 DEBUG nova.virt.xenapi.vmops [req-680b1692-b42c-440c-8291-60b431b31d1a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 115845a8-7055-44f5-a3f5-222bd027f1ed] Block device information present: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} spawn /opt/stack/new/nova/nova/virt/xenapi/vmops.py:408 2015-08-07 17:43:27.499 DEBUG nova.virt.xenapi.vm_utils [req-680b1692-b42c-440c-8291-60b431b31d1a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Detected vhd format for image a69d8d55-1745-492f-be26-cc75d64fc94d determine_disk_image_type /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1633 2015-08-07 17:43:27.502 DEBUG nova.virt.xenapi.vmops [req-680b1692-b42c-440c-8291-60b431b31d1a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 115845a8-7055-44f5-a3f5-222bd027f1ed] Updating progress to 10 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:43:27.722 DEBUG nova.virt.xenapi.vmops [req-680b1692-b42c-440c-8291-60b431b31d1a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 115845a8-7055-44f5-a3f5-222bd027f1ed] block device info: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} _connect_cinder_volumes /opt/stack/new/nova/nova/virt/xenapi/vmops.py:376 2015-08-07 17:43:28.887 13318 DEBUG nova.network.base_api [-] [instance: 115845a8-7055-44f5-a3f5-222bd027f1ed] Updating instance_info_cache with network_info: [VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.2'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 
'details': {}, 'address': u'fa:16:3e:97:17:64', 'active': False, 'type': u'bridge', 'id': u'd2af0ccf-a764-449d-b25c-7353a5007c58', 'qbg_params': None})] update_instance_cache_with_nw_info /opt/stack/new/nova/nova/network/base_api.py:43 2015-08-07 17:43:28.914 13318 DEBUG oslo_concurrency.lockutils [-] Releasing semaphore "refresh_cache-115845a8-7055-44f5-a3f5-222bd027f1ed" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:210 2015-08-07 17:43:28.915 13318 DEBUG nova.compute.manager [-] [instance: 115845a8-7055-44f5-a3f5-222bd027f1ed] Instance network_info: |[VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.2'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:97:17:64', 'active': False, 'type': u'bridge', 'id': u'd2af0ccf-a764-449d-b25c-7353a5007c58', 'qbg_params': None})]| _allocate_network_async /opt/stack/new/nova/nova/compute/manager.py:1531 2015-08-07 17:43:31.519 DEBUG nova.virt.xenapi.vmops [req-29f8549c-aa30-4dbc-a88a-db9056ddcb40 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 65774979-e5fe-4d9f-9f8c-48214aed768d] Using RAW or VHD, skipping kernel and ramdisk deletion _destroy_kernel_ramdisk /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1456 2015-08-07 17:43:31.532 DEBUG nova.virt.xenapi.vm_utils [req-29f8549c-aa30-4dbc-a88a-db9056ddcb40 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 65774979-e5fe-4d9f-9f8c-48214aed768d] VM destroyed destroy_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:324 2015-08-07 17:43:31.533 DEBUG nova.compute.manager [req-29f8549c-aa30-4dbc-a88a-db9056ddcb40 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 65774979-e5fe-4d9f-9f8c-48214aed768d] Deallocating network for instance _deallocate_network /opt/stack/new/nova/nova/compute/manager.py:1792 2015-08-07 17:43:32.981 DEBUG nova.compute.manager [req-29f8549c-aa30-4dbc-a88a-db9056ddcb40 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 65774979-e5fe-4d9f-9f8c-48214aed768d] terminating bdm BlockDeviceMapping(boot_index=0,connection_info=None,created_at=2015-08-07T17:42:42Z,delete_on_termination=True,deleted=False,deleted_at=None,destination_type='local',device_name='/dev/xvda',device_type='disk',disk_bus=None,guest_format=None,id=59,image_id='a69d8d55-1745-492f-be26-cc75d64fc94d',instance=,instance_uuid=65774979-e5fe-4d9f-9f8c-48214aed768d,no_device=False,snapshot_id=None,source_type='image',updated_at=2015-08-07T17:42:44Z,volume_id=None,volume_size=None) _cleanup_volumes /opt/stack/new/nova/nova/compute/manager.py:2272 2015-08-07 17:43:33.166 DEBUG 
oslo_concurrency.lockutils [req-29f8549c-aa30-4dbc-a88a-db9056ddcb40 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "compute_resources" acquired by "update_usage" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:43:33.168 DEBUG nova.objects.instance [req-29f8549c-aa30-4dbc-a88a-db9056ddcb40 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lazy-loading `numa_topology' on Instance uuid 65774979-e5fe-4d9f-9f8c-48214aed768d obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:43:33.252 DEBUG oslo_concurrency.lockutils [req-29f8549c-aa30-4dbc-a88a-db9056ddcb40 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "compute_resources" released by "update_usage" :: held 0.086s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:43:33.531 DEBUG oslo_concurrency.lockutils [req-29f8549c-aa30-4dbc-a88a-db9056ddcb40 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "65774979-e5fe-4d9f-9f8c-48214aed768d" released by "do_terminate_instance" :: held 21.202s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:43:35.117 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.86 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:43:38.583 DEBUG oslo_concurrency.lockutils [req-470c2303-439c-4f1a-b416-d67799166bd6 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "6c2e3b38-f3d2-472d-a728-f41167dcf407" acquired by "_locked_do_build_and_run_instance" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:43:38.661 INFO nova.compute.manager [req-470c2303-439c-4f1a-b416-d67799166bd6 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 6c2e3b38-f3d2-472d-a728-f41167dcf407] Starting instance... 
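
Several boots and teardowns interleave in this log; each tempest test carries its own req-... request id and instance uuid. A small, purely illustrative filter for following one operation at a time (the log file name in the usage line is an assumption):

    import sys

    REQ = 'req-fed14245-d06d-4088-8e1b-33ee9146eb08'   # the ServerPasswordTestJSON boot above
    UUID = '4443fc1d-35f7-46c6-8139-e22c74dd9d86'      # its instance uuid

    # Usage (hypothetical file name): python follow.py < screen-n-cpu.log
    for line in sys.stdin:
        if REQ in line or UUID in line:
            sys.stdout.write(line)
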
2015-08-07 17:43:38.911 DEBUG oslo_concurrency.lockutils [req-470c2303-439c-4f1a-b416-d67799166bd6 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "compute_resources" acquired by "instance_claim" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:43:38.912 DEBUG nova.compute.resource_tracker [req-470c2303-439c-4f1a-b416-d67799166bd6 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Memory overhead for 64 MB instance; 5 MB instance_claim /opt/stack/new/nova/nova/compute/resource_tracker.py:126 2015-08-07 17:43:38.920 INFO nova.compute.claims [req-470c2303-439c-4f1a-b416-d67799166bd6 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 6c2e3b38-f3d2-472d-a728-f41167dcf407] Attempting claim: memory 69 MB, disk 0 GB 2015-08-07 17:43:38.921 INFO nova.compute.claims [req-470c2303-439c-4f1a-b416-d67799166bd6 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 6c2e3b38-f3d2-472d-a728-f41167dcf407] Total memory: 8187 MB, used: 788.00 MB 2015-08-07 17:43:38.921 INFO nova.compute.claims [req-470c2303-439c-4f1a-b416-d67799166bd6 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 6c2e3b38-f3d2-472d-a728-f41167dcf407] memory limit: 12280.50 MB, free: 11492.50 MB 2015-08-07 17:43:38.922 INFO nova.compute.claims [req-470c2303-439c-4f1a-b416-d67799166bd6 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 6c2e3b38-f3d2-472d-a728-f41167dcf407] Total disk: 27 GB, used: 0.00 GB 2015-08-07 17:43:38.922 INFO nova.compute.claims [req-470c2303-439c-4f1a-b416-d67799166bd6 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 6c2e3b38-f3d2-472d-a728-f41167dcf407] disk limit not specified, defaulting to unlimited 2015-08-07 17:43:38.948 DEBUG nova.compute.resources.vcpu [req-470c2303-439c-4f1a-b416-d67799166bd6 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Total CPUs: 8 VCPUs, used: 4.00 VCPUs test /opt/stack/new/nova/nova/compute/resources/vcpu.py:52 2015-08-07 17:43:38.948 DEBUG nova.compute.resources.vcpu [req-470c2303-439c-4f1a-b416-d67799166bd6 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] CPUs limit not specified, defaulting to unlimited test /opt/stack/new/nova/nova/compute/resources/vcpu.py:56 2015-08-07 17:43:38.949 INFO nova.compute.claims [req-470c2303-439c-4f1a-b416-d67799166bd6 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 6c2e3b38-f3d2-472d-a728-f41167dcf407] Claim successful 2015-08-07 17:43:39.436 DEBUG oslo_concurrency.lockutils [req-470c2303-439c-4f1a-b416-d67799166bd6 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "compute_resources" released by "instance_claim" :: held 0.524s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:43:39.652 DEBUG oslo_concurrency.lockutils [req-470c2303-439c-4f1a-b416-d67799166bd6 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "compute_resources" acquired by "update_usage" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:43:39.763 DEBUG oslo_concurrency.lockutils [req-470c2303-439c-4f1a-b416-d67799166bd6 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "compute_resources" released by "update_usage" :: held 0.111s inner 
/usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:43:39.764 DEBUG nova.compute.utils [req-470c2303-439c-4f1a-b416-d67799166bd6 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Using /dev/xvd instead of None get_next_device_name /opt/stack/new/nova/nova/compute/utils.py:163 2015-08-07 17:43:39.770 13318 DEBUG nova.compute.manager [-] [instance: 6c2e3b38-f3d2-472d-a728-f41167dcf407] Allocating IP information in the background. _allocate_network_async /opt/stack/new/nova/nova/compute/manager.py:1513 2015-08-07 17:43:39.771 13318 DEBUG oslo_concurrency.lockutils [-] Acquired semaphore "refresh_cache-6c2e3b38-f3d2-472d-a728-f41167dcf407" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:197 2015-08-07 17:43:40.559 DEBUG nova.virt.xenapi.vmops [req-470c2303-439c-4f1a-b416-d67799166bd6 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 6c2e3b38-f3d2-472d-a728-f41167dcf407] Block device information present: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} spawn /opt/stack/new/nova/nova/virt/xenapi/vmops.py:408 2015-08-07 17:43:40.575 DEBUG nova.virt.xenapi.vm_utils [req-470c2303-439c-4f1a-b416-d67799166bd6 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Detected vhd format for image a69d8d55-1745-492f-be26-cc75d64fc94d determine_disk_image_type /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1633 2015-08-07 17:43:40.576 DEBUG nova.virt.xenapi.vmops [req-470c2303-439c-4f1a-b416-d67799166bd6 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 6c2e3b38-f3d2-472d-a728-f41167dcf407] Updating progress to 10 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:43:40.835 DEBUG nova.virt.xenapi.vmops [req-470c2303-439c-4f1a-b416-d67799166bd6 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 6c2e3b38-f3d2-472d-a728-f41167dcf407] block device info: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} _connect_cinder_volumes /opt/stack/new/nova/nova/virt/xenapi/vmops.py:376 2015-08-07 17:43:41.521 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:43:41.522 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:43:41.522 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 3.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:43:41.813 DEBUG nova.virt.xenapi.vm_utils [req-fed14245-d06d-4088-8e1b-33ee9146eb08 tempest-ServerPasswordTestJSON-92181681 tempest-ServerPasswordTestJSON-1938394173] Cloned VDI OpaqueRef:7763db67-ff28-ab5b-4b66-e9b59cabaf4d from VDI OpaqueRef:f392c64e-7886-a0cc-d3a0-780674167307 _clone_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:571 2015-08-07 17:43:42.178 13318 DEBUG nova.network.base_api [-] [instance: 6c2e3b38-f3d2-472d-a728-f41167dcf407] Updating instance_info_cache with network_info: [VIF({'profile': None, 'ovs_interfaceid': None, 
'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.5'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:7b:8e:8a', 'active': False, 'type': u'bridge', 'id': u'a60efa36-ab13-47a2-a5b1-26546735869d', 'qbg_params': None})] update_instance_cache_with_nw_info /opt/stack/new/nova/nova/network/base_api.py:43 2015-08-07 17:43:42.215 13318 DEBUG oslo_concurrency.lockutils [-] Releasing semaphore "refresh_cache-6c2e3b38-f3d2-472d-a728-f41167dcf407" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:210 2015-08-07 17:43:42.215 13318 DEBUG nova.compute.manager [-] [instance: 6c2e3b38-f3d2-472d-a728-f41167dcf407] Instance network_info: |[VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.5'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:7b:8e:8a', 'active': False, 'type': u'bridge', 'id': u'a60efa36-ab13-47a2-a5b1-26546735869d', 'qbg_params': None})]| _allocate_network_async /opt/stack/new/nova/nova/compute/manager.py:1531 2015-08-07 17:43:42.512 DEBUG oslo_concurrency.lockutils [req-fed14245-d06d-4088-8e1b-33ee9146eb08 tempest-ServerPasswordTestJSON-92181681 tempest-ServerPasswordTestJSON-1938394173] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" released by "_create_cached_image_impl" :: held 21.706s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:43:42.513 INFO nova.virt.xenapi.vm_utils [req-fed14245-d06d-4088-8e1b-33ee9146eb08 tempest-ServerPasswordTestJSON-92181681 tempest-ServerPasswordTestJSON-1938394173] Image creation data, cacheable: True, downloaded: False duration: 21.72 secs for image a69d8d55-1745-492f-be26-cc75d64fc94d 2015-08-07 17:43:42.514 DEBUG oslo_concurrency.lockutils [req-680b1692-b42c-440c-8291-60b431b31d1a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] 
Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" acquired by "_create_cached_image_impl" :: waited 14.782s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:43:43.541 DEBUG nova.virt.xenapi.vmops [req-fed14245-d06d-4088-8e1b-33ee9146eb08 tempest-ServerPasswordTestJSON-92181681 tempest-ServerPasswordTestJSON-1938394173] [instance: 4443fc1d-35f7-46c6-8139-e22c74dd9d86] Updating progress to 20 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:43:43.746 DEBUG nova.virt.xenapi.vm_utils [req-680b1692-b42c-440c-8291-60b431b31d1a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Cloned VDI OpaqueRef:f5e4114a-76cc-59db-146b-87cc2c5f941b from VDI OpaqueRef:f392c64e-7886-a0cc-d3a0-780674167307 _clone_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:571 2015-08-07 17:43:43.782 DEBUG nova.virt.xenapi.vmops [req-fed14245-d06d-4088-8e1b-33ee9146eb08 tempest-ServerPasswordTestJSON-92181681 tempest-ServerPasswordTestJSON-1938394173] [instance: 4443fc1d-35f7-46c6-8139-e22c74dd9d86] Updating progress to 30 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:43:44.048 DEBUG nova.virt.xenapi.vmops [req-fed14245-d06d-4088-8e1b-33ee9146eb08 tempest-ServerPasswordTestJSON-92181681 tempest-ServerPasswordTestJSON-1938394173] [instance: 4443fc1d-35f7-46c6-8139-e22c74dd9d86] Using PV kernel: True _create_vm_record /opt/stack/new/nova/nova/virt/xenapi/vmops.py:679 2015-08-07 17:43:44.061 DEBUG nova.virt.xenapi.vm_utils [req-fed14245-d06d-4088-8e1b-33ee9146eb08 tempest-ServerPasswordTestJSON-92181681 tempest-ServerPasswordTestJSON-1938394173] [instance: 4443fc1d-35f7-46c6-8139-e22c74dd9d86] Created VM create_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:312 2015-08-07 17:43:44.062 DEBUG nova.virt.xenapi.vmops [req-fed14245-d06d-4088-8e1b-33ee9146eb08 tempest-ServerPasswordTestJSON-92181681 tempest-ServerPasswordTestJSON-1938394173] [instance: 4443fc1d-35f7-46c6-8139-e22c74dd9d86] Updating progress to 40 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:43:44.323 DEBUG nova.virt.xenapi.vm_utils [req-fed14245-d06d-4088-8e1b-33ee9146eb08 tempest-ServerPasswordTestJSON-92181681 tempest-ServerPasswordTestJSON-1938394173] Creating disk-type VBD for VM OpaqueRef:4d340ae4-16b3-8c63-02e5-0a594ad5b0e3, VDI OpaqueRef:7763db67-ff28-ab5b-4b66-e9b59cabaf4d ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:43:44.332 DEBUG nova.virt.xenapi.vm_utils [req-fed14245-d06d-4088-8e1b-33ee9146eb08 tempest-ServerPasswordTestJSON-92181681 tempest-ServerPasswordTestJSON-1938394173] Created VBD OpaqueRef:bb448fa6-cf89-0dff-ea95-b2957f52ce99 for VM OpaqueRef:4d340ae4-16b3-8c63-02e5-0a594ad5b0e3, VDI OpaqueRef:7763db67-ff28-ab5b-4b66-e9b59cabaf4d. 
create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:43:44.509 DEBUG oslo_concurrency.lockutils [req-680b1692-b42c-440c-8291-60b431b31d1a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" released by "_create_cached_image_impl" :: held 1.994s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:43:44.509 INFO nova.virt.xenapi.vm_utils [req-680b1692-b42c-440c-8291-60b431b31d1a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Image creation data, cacheable: True, downloaded: False duration: 16.79 secs for image a69d8d55-1745-492f-be26-cc75d64fc94d 2015-08-07 17:43:44.510 DEBUG oslo_concurrency.lockutils [req-470c2303-439c-4f1a-b416-d67799166bd6 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" acquired by "_create_cached_image_impl" :: waited 3.664s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:43:44.529 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:43:44.530 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Starting heal instance info cache _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5315 2015-08-07 17:43:44.673 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Acquired semaphore "refresh_cache-af2ef72d-4895-4de0-bd40-aaa2ac498091" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:197 2015-08-07 17:43:44.758 DEBUG nova.virt.xenapi.vm_utils [req-fed14245-d06d-4088-8e1b-33ee9146eb08 tempest-ServerPasswordTestJSON-92181681 tempest-ServerPasswordTestJSON-1938394173] Created VDI OpaqueRef:18ea826c-baec-dfd1-c2eb-23f7f1d6b1fd (config-2, 67108864, False) on OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99. create_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:509 2015-08-07 17:43:44.764 DEBUG nova.virt.xenapi.vm_utils [req-fed14245-d06d-4088-8e1b-33ee9146eb08 tempest-ServerPasswordTestJSON-92181681 tempest-ServerPasswordTestJSON-1938394173] Creating disk-type VBD for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:18ea826c-baec-dfd1-c2eb-23f7f1d6b1fd ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:43:44.777 DEBUG nova.virt.xenapi.vm_utils [req-fed14245-d06d-4088-8e1b-33ee9146eb08 tempest-ServerPasswordTestJSON-92181681 tempest-ServerPasswordTestJSON-1938394173] Created VBD OpaqueRef:dca3cb61-3c0e-ad64-94f4-c816bd4b8d2c for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:18ea826c-baec-dfd1-c2eb-23f7f1d6b1fd. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:43:44.778 DEBUG nova.virt.xenapi.vm_utils [req-fed14245-d06d-4088-8e1b-33ee9146eb08 tempest-ServerPasswordTestJSON-92181681 tempest-ServerPasswordTestJSON-1938394173] Plugging VBD OpaqueRef:dca3cb61-3c0e-ad64-94f4-c816bd4b8d2c ... 
vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2191 2015-08-07 17:43:44.778 DEBUG oslo_concurrency.lockutils [req-fed14245-d06d-4088-8e1b-33ee9146eb08 tempest-ServerPasswordTestJSON-92181681 tempest-ServerPasswordTestJSON-1938394173] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_plug" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:43:44.835 DEBUG nova.network.base_api [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: af2ef72d-4895-4de0-bd40-aaa2ac498091] Updating instance_info_cache with network_info: [VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.4'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:14:49:68', 'active': False, 'type': u'bridge', 'id': u'e63eea3a-8bbb-438c-b0a8-394f201769a5', 'qbg_params': None})] update_instance_cache_with_nw_info /opt/stack/new/nova/nova/network/base_api.py:43 2015-08-07 17:43:44.869 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Releasing semaphore "refresh_cache-af2ef72d-4895-4de0-bd40-aaa2ac498091" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:210 2015-08-07 17:43:44.891 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: af2ef72d-4895-4de0-bd40-aaa2ac498091] Updated the network info_cache for instance _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5376 2015-08-07 17:43:44.892 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 0.98 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:43:45.051 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.93 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:43:45.328 DEBUG nova.virt.xenapi.vmops [req-680b1692-b42c-440c-8291-60b431b31d1a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 115845a8-7055-44f5-a3f5-222bd027f1ed] Updating progress to 20 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:43:45.862 DEBUG nova.virt.xenapi.vmops [req-680b1692-b42c-440c-8291-60b431b31d1a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 115845a8-7055-44f5-a3f5-222bd027f1ed] Updating progress to 30 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:43:45.876 DEBUG oslo_service.periodic_task 
[req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:43:45.877 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 1.64 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:43:46.148 DEBUG nova.virt.xenapi.vmops [req-680b1692-b42c-440c-8291-60b431b31d1a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 115845a8-7055-44f5-a3f5-222bd027f1ed] Using PV kernel: True _create_vm_record /opt/stack/new/nova/nova/virt/xenapi/vmops.py:679 2015-08-07 17:43:46.165 DEBUG nova.virt.xenapi.vm_utils [req-680b1692-b42c-440c-8291-60b431b31d1a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 115845a8-7055-44f5-a3f5-222bd027f1ed] Created VM create_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:312 2015-08-07 17:43:46.166 DEBUG nova.virt.xenapi.vmops [req-680b1692-b42c-440c-8291-60b431b31d1a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 115845a8-7055-44f5-a3f5-222bd027f1ed] Updating progress to 40 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:43:46.405 DEBUG nova.virt.xenapi.vm_utils [req-680b1692-b42c-440c-8291-60b431b31d1a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Creating disk-type VBD for VM OpaqueRef:16923d88-4c43-f4b2-bbea-5b3efca2ebd1, VDI OpaqueRef:f5e4114a-76cc-59db-146b-87cc2c5f941b ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:43:46.418 DEBUG nova.virt.xenapi.vm_utils [req-680b1692-b42c-440c-8291-60b431b31d1a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Created VBD OpaqueRef:f6d18438-950b-c4fa-7406-9709be49d041 for VM OpaqueRef:16923d88-4c43-f4b2-bbea-5b3efca2ebd1, VDI OpaqueRef:f5e4114a-76cc-59db-146b-87cc2c5f941b. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:43:46.459 DEBUG oslo_concurrency.lockutils [req-fed14245-d06d-4088-8e1b-33ee9146eb08 tempest-ServerPasswordTestJSON-92181681 tempest-ServerPasswordTestJSON-1938394173] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_plug" :: held 1.681s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:43:46.460 DEBUG nova.virt.xenapi.vm_utils [req-fed14245-d06d-4088-8e1b-33ee9146eb08 tempest-ServerPasswordTestJSON-92181681 tempest-ServerPasswordTestJSON-1938394173] Plugging VBD OpaqueRef:dca3cb61-3c0e-ad64-94f4-c816bd4b8d2c done. vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2194 2015-08-07 17:43:46.464 DEBUG nova.virt.xenapi.vm_utils [req-fed14245-d06d-4088-8e1b-33ee9146eb08 tempest-ServerPasswordTestJSON-92181681 tempest-ServerPasswordTestJSON-1938394173] VBD OpaqueRef:dca3cb61-3c0e-ad64-94f4-c816bd4b8d2c plugged as xvdc vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2197 2015-08-07 17:43:46.551 WARNING nova.virt.configdrive [req-fed14245-d06d-4088-8e1b-33ee9146eb08 tempest-ServerPasswordTestJSON-92181681 tempest-ServerPasswordTestJSON-1938394173] The setting "always" will be deprecated in the Liberty version. 
Please use "True" instead 2015-08-07 17:43:46.552 DEBUG nova.objects.instance [req-fed14245-d06d-4088-8e1b-33ee9146eb08 tempest-ServerPasswordTestJSON-92181681 tempest-ServerPasswordTestJSON-1938394173] Lazy-loading `ec2_ids' on Instance uuid 4443fc1d-35f7-46c6-8139-e22c74dd9d86 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:43:46.590 DEBUG oslo_concurrency.processutils [req-fed14245-d06d-4088-8e1b-33ee9146eb08 tempest-ServerPasswordTestJSON-92181681 tempest-ServerPasswordTestJSON-1938394173] Running cmd (subprocess): genisoimage -o /tmp/tmpVmf2I_/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmpH6NgCC execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:43:46.734 DEBUG oslo_concurrency.processutils [req-fed14245-d06d-4088-8e1b-33ee9146eb08 tempest-ServerPasswordTestJSON-92181681 tempest-ServerPasswordTestJSON-1938394173] CMD "genisoimage -o /tmp/tmpVmf2I_/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmpH6NgCC" returned: 0 in 0.143s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:43:46.743 DEBUG oslo_concurrency.processutils [req-fed14245-d06d-4088-8e1b-33ee9146eb08 tempest-ServerPasswordTestJSON-92181681 tempest-ServerPasswordTestJSON-1938394173] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmpVmf2I_/configdrive of=/dev/xvdc oflag=direct,sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:43:46.859 DEBUG nova.virt.xenapi.vm_utils [req-680b1692-b42c-440c-8291-60b431b31d1a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Created VDI OpaqueRef:ee003b5d-ffbb-2082-832b-bc6991ce472b (config-2, 67108864, False) on OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99. create_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:509 2015-08-07 17:43:46.867 DEBUG nova.virt.xenapi.vm_utils [req-680b1692-b42c-440c-8291-60b431b31d1a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Creating disk-type VBD for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:ee003b5d-ffbb-2082-832b-bc6991ce472b ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:43:46.880 DEBUG nova.virt.xenapi.vm_utils [req-680b1692-b42c-440c-8291-60b431b31d1a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Created VBD OpaqueRef:86a3e305-03b4-a11a-904a-08c173499a3d for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:ee003b5d-ffbb-2082-832b-bc6991ce472b. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:43:46.881 DEBUG nova.virt.xenapi.vm_utils [req-680b1692-b42c-440c-8291-60b431b31d1a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Plugging VBD OpaqueRef:86a3e305-03b4-a11a-904a-08c173499a3d ... 
vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2191 2015-08-07 17:43:46.882 DEBUG oslo_concurrency.lockutils [req-680b1692-b42c-440c-8291-60b431b31d1a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_plug" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:43:47.523 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:43:47.524 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /opt/stack/new/nova/nova/compute/manager.py:5976 2015-08-07 17:43:47.525 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 0.99 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:43:48.527 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:43:48.625 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 1.99 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:43:48.660 DEBUG oslo_concurrency.lockutils [req-680b1692-b42c-440c-8291-60b431b31d1a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_plug" :: held 1.778s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:43:48.660 DEBUG nova.virt.xenapi.vm_utils [req-680b1692-b42c-440c-8291-60b431b31d1a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Plugging VBD OpaqueRef:86a3e305-03b4-a11a-904a-08c173499a3d done. vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2194 2015-08-07 17:43:48.665 DEBUG nova.virt.xenapi.vm_utils [req-680b1692-b42c-440c-8291-60b431b31d1a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] VBD OpaqueRef:86a3e305-03b4-a11a-904a-08c173499a3d plugged as xvdd vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2197 2015-08-07 17:43:48.760 WARNING nova.virt.configdrive [req-680b1692-b42c-440c-8291-60b431b31d1a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] The setting "always" will be deprecated in the Liberty version. 
Please use "True" instead 2015-08-07 17:43:48.761 DEBUG nova.objects.instance [req-680b1692-b42c-440c-8291-60b431b31d1a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lazy-loading `ec2_ids' on Instance uuid 115845a8-7055-44f5-a3f5-222bd027f1ed obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:43:48.803 DEBUG oslo_concurrency.processutils [req-680b1692-b42c-440c-8291-60b431b31d1a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Running cmd (subprocess): genisoimage -o /tmp/tmp4oTUHh/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmpeFsFqX execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:43:48.942 DEBUG oslo_concurrency.processutils [req-680b1692-b42c-440c-8291-60b431b31d1a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] CMD "genisoimage -o /tmp/tmp4oTUHh/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmpeFsFqX" returned: 0 in 0.137s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:43:48.947 DEBUG oslo_concurrency.processutils [req-680b1692-b42c-440c-8291-60b431b31d1a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmp4oTUHh/configdrive of=/dev/xvdd oflag=direct,sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:43:50.620 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:43:50.623 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 1.90 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:43:52.521 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:43:53.008 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 1.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:43:54.008 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:43:54.060 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Auditing locally available compute resources for node localhost.localdomain 2015-08-07 17:43:54.062 DEBUG nova.virt.xenapi.host [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Updating host stats update_status /opt/stack/new/nova/nova/virt/xenapi/host.py:234 2015-08-07 17:43:54.281 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" acquired by "do_scan" :: waited 0.000s inner 
/usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:43:54.282 DEBUG nova.virt.xenapi.vm_utils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Scanning SR OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99 do_scan /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1839 2015-08-07 17:43:55.175 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.81 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:43:55.484 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" released by "do_scan" :: held 1.203s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:43:55.746 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor: free VCPUs: 1 _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:524 2015-08-07 17:43:55.747 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor: assignable PCI devices: [] _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:532 2015-08-07 17:43:55.748 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor/Node resource view: name=localhost.localdomain free_ram=922MB free_disk=16GB free_vcpus=1 pci_devices=[] _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:546 2015-08-07 17:43:55.749 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "compute_resources" acquired by "_update_available_resource" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:43:56.150 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Total usable vcpus: 8, total allocated vcpus: 7 2015-08-07 17:43:56.151 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Final resource view: name=localhost.localdomain phys_ram=8187MB used_ram=857MB phys_disk=27GB used_disk=0GB total_vcpus=8 used_vcpus=7 pci_stats=None 2015-08-07 17:43:56.310 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Compute_service record updated for devstack:localhost.localdomain 2015-08-07 17:43:56.310 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "compute_resources" released by "_update_available_resource" :: held 0.562s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:43:56.311 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 7.51 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:43:57.174 DEBUG oslo_concurrency.processutils [req-fed14245-d06d-4088-8e1b-33ee9146eb08 tempest-ServerPasswordTestJSON-92181681 tempest-ServerPasswordTestJSON-1938394173] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmpVmf2I_/configdrive of=/dev/xvdc oflag=direct,sync" returned: 0 in 10.431s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:43:57.176 DEBUG oslo_concurrency.processutils [req-fed14245-d06d-4088-8e1b-33ee9146eb08 tempest-ServerPasswordTestJSON-92181681 
tempest-ServerPasswordTestJSON-1938394173] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:43:57.840 DEBUG oslo_concurrency.processutils [req-fed14245-d06d-4088-8e1b-33ee9146eb08 tempest-ServerPasswordTestJSON-92181681 tempest-ServerPasswordTestJSON-1938394173] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf sync" returned: 0 in 0.663s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:43:57.843 DEBUG nova.virt.xenapi.vm_utils [req-fed14245-d06d-4088-8e1b-33ee9146eb08 tempest-ServerPasswordTestJSON-92181681 tempest-ServerPasswordTestJSON-1938394173] Destroying VBD for VDI OpaqueRef:18ea826c-baec-dfd1-c2eb-23f7f1d6b1fd ... vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2207 2015-08-07 17:43:57.844 DEBUG oslo_concurrency.lockutils [req-fed14245-d06d-4088-8e1b-33ee9146eb08 tempest-ServerPasswordTestJSON-92181681 tempest-ServerPasswordTestJSON-1938394173] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_unplug" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:43:58.893 DEBUG oslo_concurrency.lockutils [req-fed14245-d06d-4088-8e1b-33ee9146eb08 tempest-ServerPasswordTestJSON-92181681 tempest-ServerPasswordTestJSON-1938394173] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_unplug" :: held 1.049s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:43:58.901 DEBUG nova.virt.xenapi.vm_utils [req-fed14245-d06d-4088-8e1b-33ee9146eb08 tempest-ServerPasswordTestJSON-92181681 tempest-ServerPasswordTestJSON-1938394173] Destroying VBD for VDI OpaqueRef:18ea826c-baec-dfd1-c2eb-23f7f1d6b1fd done. vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2215 2015-08-07 17:43:58.902 DEBUG nova.virt.xenapi.vm_utils [req-fed14245-d06d-4088-8e1b-33ee9146eb08 tempest-ServerPasswordTestJSON-92181681 tempest-ServerPasswordTestJSON-1938394173] Creating disk-type VBD for VM OpaqueRef:4d340ae4-16b3-8c63-02e5-0a594ad5b0e3, VDI OpaqueRef:18ea826c-baec-dfd1-c2eb-23f7f1d6b1fd ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:43:58.912 DEBUG nova.virt.xenapi.vm_utils [req-fed14245-d06d-4088-8e1b-33ee9146eb08 tempest-ServerPasswordTestJSON-92181681 tempest-ServerPasswordTestJSON-1938394173] Created VBD OpaqueRef:046801bf-2899-9e4c-870b-731dc5207705 for VM OpaqueRef:4d340ae4-16b3-8c63-02e5-0a594ad5b0e3, VDI OpaqueRef:18ea826c-baec-dfd1-c2eb-23f7f1d6b1fd. 
create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:43:58.913 DEBUG nova.objects.instance [req-fed14245-d06d-4088-8e1b-33ee9146eb08 tempest-ServerPasswordTestJSON-92181681 tempest-ServerPasswordTestJSON-1938394173] Lazy-loading `pci_devices' on Instance uuid 4443fc1d-35f7-46c6-8139-e22c74dd9d86 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:43:59.051 DEBUG nova.virt.xenapi.vmops [req-fed14245-d06d-4088-8e1b-33ee9146eb08 tempest-ServerPasswordTestJSON-92181681 tempest-ServerPasswordTestJSON-1938394173] [instance: 4443fc1d-35f7-46c6-8139-e22c74dd9d86] Updating progress to 50 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:43:59.412 DEBUG oslo_concurrency.lockutils [req-fed14245-d06d-4088-8e1b-33ee9146eb08 tempest-ServerPasswordTestJSON-92181681 tempest-ServerPasswordTestJSON-1938394173] Lock "xenstore-4443fc1d-35f7-46c6-8139-e22c74dd9d86" acquired by "store_meta" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:43:59.412 DEBUG oslo_concurrency.lockutils [req-fed14245-d06d-4088-8e1b-33ee9146eb08 tempest-ServerPasswordTestJSON-92181681 tempest-ServerPasswordTestJSON-1938394173] Lock "xenstore-4443fc1d-35f7-46c6-8139-e22c74dd9d86" released by "store_meta" :: held 0.001s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:43:59.413 DEBUG oslo_concurrency.lockutils [req-fed14245-d06d-4088-8e1b-33ee9146eb08 tempest-ServerPasswordTestJSON-92181681 tempest-ServerPasswordTestJSON-1938394173] Lock "xenstore-4443fc1d-35f7-46c6-8139-e22c74dd9d86" acquired by "store_auto_disk_config" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:43:59.423 DEBUG oslo_concurrency.lockutils [req-fed14245-d06d-4088-8e1b-33ee9146eb08 tempest-ServerPasswordTestJSON-92181681 tempest-ServerPasswordTestJSON-1938394173] Lock "xenstore-4443fc1d-35f7-46c6-8139-e22c74dd9d86" released by "store_auto_disk_config" :: held 0.010s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:43:59.424 DEBUG nova.virt.xenapi.vmops [req-fed14245-d06d-4088-8e1b-33ee9146eb08 tempest-ServerPasswordTestJSON-92181681 tempest-ServerPasswordTestJSON-1938394173] [instance: 4443fc1d-35f7-46c6-8139-e22c74dd9d86] Injecting hostname (tempest.common.compute-instance-1334271492) into xenstore _inject_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1940 2015-08-07 17:43:59.424 DEBUG oslo_concurrency.lockutils [req-fed14245-d06d-4088-8e1b-33ee9146eb08 tempest-ServerPasswordTestJSON-92181681 tempest-ServerPasswordTestJSON-1938394173] Lock "xenstore-4443fc1d-35f7-46c6-8139-e22c74dd9d86" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:43:59.433 DEBUG oslo_concurrency.lockutils [req-fed14245-d06d-4088-8e1b-33ee9146eb08 tempest-ServerPasswordTestJSON-92181681 tempest-ServerPasswordTestJSON-1938394173] Lock "xenstore-4443fc1d-35f7-46c6-8139-e22c74dd9d86" released by "update_hostname" :: held 0.009s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:43:59.434 DEBUG nova.virt.xenapi.vmops [req-fed14245-d06d-4088-8e1b-33ee9146eb08 tempest-ServerPasswordTestJSON-92181681 tempest-ServerPasswordTestJSON-1938394173] [instance: 4443fc1d-35f7-46c6-8139-e22c74dd9d86] Injecting network info to xenstore inject_network_info 
/opt/stack/new/nova/nova/virt/xenapi/vmops.py:1869 2015-08-07 17:43:59.436 DEBUG oslo_concurrency.lockutils [req-fed14245-d06d-4088-8e1b-33ee9146eb08 tempest-ServerPasswordTestJSON-92181681 tempest-ServerPasswordTestJSON-1938394173] Lock "xenstore-4443fc1d-35f7-46c6-8139-e22c74dd9d86" acquired by "update_nwinfo" :: waited 0.001s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:43:59.555 DEBUG oslo_concurrency.processutils [req-680b1692-b42c-440c-8291-60b431b31d1a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmp4oTUHh/configdrive of=/dev/xvdd oflag=direct,sync" returned: 0 in 10.608s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:43:59.558 DEBUG oslo_concurrency.processutils [req-680b1692-b42c-440c-8291-60b431b31d1a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:43:59.645 DEBUG oslo_concurrency.lockutils [req-fed14245-d06d-4088-8e1b-33ee9146eb08 tempest-ServerPasswordTestJSON-92181681 tempest-ServerPasswordTestJSON-1938394173] Lock "xenstore-4443fc1d-35f7-46c6-8139-e22c74dd9d86" released by "update_nwinfo" :: held 0.209s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:43:59.649 DEBUG nova.virt.xenapi.vmops [req-fed14245-d06d-4088-8e1b-33ee9146eb08 tempest-ServerPasswordTestJSON-92181681 tempest-ServerPasswordTestJSON-1938394173] [instance: 4443fc1d-35f7-46c6-8139-e22c74dd9d86] Updating progress to 60 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:43:59.863 DEBUG oslo_concurrency.processutils [req-680b1692-b42c-440c-8291-60b431b31d1a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf sync" returned: 0 in 0.306s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:43:59.864 DEBUG nova.virt.xenapi.vm_utils [req-680b1692-b42c-440c-8291-60b431b31d1a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Destroying VBD for VDI OpaqueRef:ee003b5d-ffbb-2082-832b-bc6991ce472b ... 
vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2207 2015-08-07 17:43:59.865 DEBUG oslo_concurrency.lockutils [req-680b1692-b42c-440c-8291-60b431b31d1a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_unplug" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:43:59.920 DEBUG nova.virt.xenapi.vmops [req-fed14245-d06d-4088-8e1b-33ee9146eb08 tempest-ServerPasswordTestJSON-92181681 tempest-ServerPasswordTestJSON-1938394173] [instance: 4443fc1d-35f7-46c6-8139-e22c74dd9d86] Creating vifs _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1892 2015-08-07 17:43:59.939 DEBUG nova.virt.xenapi.vmops [req-fed14245-d06d-4088-8e1b-33ee9146eb08 tempest-ServerPasswordTestJSON-92181681 tempest-ServerPasswordTestJSON-1938394173] [instance: 4443fc1d-35f7-46c6-8139-e22c74dd9d86] Creating VIF for network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1902 2015-08-07 17:43:59.951 DEBUG nova.virt.xenapi.vmops [req-fed14245-d06d-4088-8e1b-33ee9146eb08 tempest-ServerPasswordTestJSON-92181681 tempest-ServerPasswordTestJSON-1938394173] [instance: 4443fc1d-35f7-46c6-8139-e22c74dd9d86] Created VIF OpaqueRef:7f28fc7b-11c4-9b8a-f75c-e2ac249371af, network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1906 2015-08-07 17:43:59.952 DEBUG nova.virt.xenapi.vmops [req-fed14245-d06d-4088-8e1b-33ee9146eb08 tempest-ServerPasswordTestJSON-92181681 tempest-ServerPasswordTestJSON-1938394173] [instance: 4443fc1d-35f7-46c6-8139-e22c74dd9d86] Updating progress to 70 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:44:00.171 DEBUG nova.virt.xenapi.vmops [req-fed14245-d06d-4088-8e1b-33ee9146eb08 tempest-ServerPasswordTestJSON-92181681 tempest-ServerPasswordTestJSON-1938394173] [instance: 4443fc1d-35f7-46c6-8139-e22c74dd9d86] Starting instance _start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:327 2015-08-07 17:44:00.629 DEBUG oslo_concurrency.lockutils [req-680b1692-b42c-440c-8291-60b431b31d1a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_unplug" :: held 0.763s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:44:00.636 DEBUG nova.virt.xenapi.vm_utils [req-680b1692-b42c-440c-8291-60b431b31d1a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Destroying VBD for VDI OpaqueRef:ee003b5d-ffbb-2082-832b-bc6991ce472b done. vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2215 2015-08-07 17:44:00.637 DEBUG nova.virt.xenapi.vm_utils [req-680b1692-b42c-440c-8291-60b431b31d1a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Creating disk-type VBD for VM OpaqueRef:16923d88-4c43-f4b2-bbea-5b3efca2ebd1, VDI OpaqueRef:ee003b5d-ffbb-2082-832b-bc6991ce472b ... 
create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:44:00.646 DEBUG nova.virt.xenapi.vm_utils [req-680b1692-b42c-440c-8291-60b431b31d1a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Created VBD OpaqueRef:4e3ab218-5aff-469a-51f1-998b85bee0a5 for VM OpaqueRef:16923d88-4c43-f4b2-bbea-5b3efca2ebd1, VDI OpaqueRef:ee003b5d-ffbb-2082-832b-bc6991ce472b. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:44:00.649 DEBUG nova.objects.instance [req-680b1692-b42c-440c-8291-60b431b31d1a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lazy-loading `pci_devices' on Instance uuid 115845a8-7055-44f5-a3f5-222bd027f1ed obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:44:00.773 DEBUG nova.virt.xenapi.vmops [req-680b1692-b42c-440c-8291-60b431b31d1a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 115845a8-7055-44f5-a3f5-222bd027f1ed] Updating progress to 50 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:44:01.045 DEBUG oslo_concurrency.lockutils [req-680b1692-b42c-440c-8291-60b431b31d1a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "xenstore-115845a8-7055-44f5-a3f5-222bd027f1ed" acquired by "store_meta" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:44:01.045 DEBUG oslo_concurrency.lockutils [req-680b1692-b42c-440c-8291-60b431b31d1a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "xenstore-115845a8-7055-44f5-a3f5-222bd027f1ed" released by "store_meta" :: held 0.001s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:44:01.046 DEBUG oslo_concurrency.lockutils [req-680b1692-b42c-440c-8291-60b431b31d1a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "xenstore-115845a8-7055-44f5-a3f5-222bd027f1ed" acquired by "store_auto_disk_config" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:44:01.058 DEBUG oslo_concurrency.lockutils [req-680b1692-b42c-440c-8291-60b431b31d1a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "xenstore-115845a8-7055-44f5-a3f5-222bd027f1ed" released by "store_auto_disk_config" :: held 0.012s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:44:01.059 DEBUG nova.virt.xenapi.vmops [req-680b1692-b42c-440c-8291-60b431b31d1a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 115845a8-7055-44f5-a3f5-222bd027f1ed] Injecting hostname (tempest.common.compute-instance-354169416) into xenstore _inject_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1940 2015-08-07 17:44:01.060 DEBUG oslo_concurrency.lockutils [req-680b1692-b42c-440c-8291-60b431b31d1a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "xenstore-115845a8-7055-44f5-a3f5-222bd027f1ed" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:44:01.067 DEBUG oslo_concurrency.lockutils [req-680b1692-b42c-440c-8291-60b431b31d1a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "xenstore-115845a8-7055-44f5-a3f5-222bd027f1ed" released by 
"update_hostname" :: held 0.007s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:44:01.068 DEBUG nova.virt.xenapi.vmops [req-680b1692-b42c-440c-8291-60b431b31d1a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 115845a8-7055-44f5-a3f5-222bd027f1ed] Injecting network info to xenstore inject_network_info /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1869 2015-08-07 17:44:01.068 DEBUG oslo_concurrency.lockutils [req-680b1692-b42c-440c-8291-60b431b31d1a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "xenstore-115845a8-7055-44f5-a3f5-222bd027f1ed" acquired by "update_nwinfo" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:44:01.250 DEBUG oslo_concurrency.lockutils [req-680b1692-b42c-440c-8291-60b431b31d1a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "xenstore-115845a8-7055-44f5-a3f5-222bd027f1ed" released by "update_nwinfo" :: held 0.181s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:44:01.250 DEBUG nova.virt.xenapi.vmops [req-680b1692-b42c-440c-8291-60b431b31d1a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 115845a8-7055-44f5-a3f5-222bd027f1ed] Updating progress to 60 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:44:01.509 DEBUG nova.virt.xenapi.vmops [req-680b1692-b42c-440c-8291-60b431b31d1a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 115845a8-7055-44f5-a3f5-222bd027f1ed] Creating vifs _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1892 2015-08-07 17:44:01.527 DEBUG nova.virt.xenapi.vmops [req-680b1692-b42c-440c-8291-60b431b31d1a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 115845a8-7055-44f5-a3f5-222bd027f1ed] Creating VIF for network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1902 2015-08-07 17:44:01.546 DEBUG nova.virt.xenapi.vmops [req-680b1692-b42c-440c-8291-60b431b31d1a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 115845a8-7055-44f5-a3f5-222bd027f1ed] Created VIF OpaqueRef:e3bf69a5-7428-87dd-c35c-2fe3916e535d, network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1906 2015-08-07 17:44:01.547 DEBUG nova.virt.xenapi.vmops [req-680b1692-b42c-440c-8291-60b431b31d1a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 115845a8-7055-44f5-a3f5-222bd027f1ed] Updating progress to 70 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:44:01.858 DEBUG nova.virt.xenapi.vmops [req-680b1692-b42c-440c-8291-60b431b31d1a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 115845a8-7055-44f5-a3f5-222bd027f1ed] Starting instance _start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:327 2015-08-07 17:44:02.844 DEBUG nova.virt.xenapi.vm_utils [req-470c2303-439c-4f1a-b416-d67799166bd6 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Cloned VDI OpaqueRef:1cde17ba-9010-a675-a19c-fa5bca372c57 from VDI OpaqueRef:f392c64e-7886-a0cc-d3a0-780674167307 _clone_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:571 2015-08-07 
17:44:03.743 DEBUG oslo_concurrency.lockutils [req-470c2303-439c-4f1a-b416-d67799166bd6 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" released by "_create_cached_image_impl" :: held 19.233s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:44:03.745 INFO nova.virt.xenapi.vm_utils [req-470c2303-439c-4f1a-b416-d67799166bd6 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Image creation data, cacheable: True, downloaded: False duration: 22.91 secs for image a69d8d55-1745-492f-be26-cc75d64fc94d 2015-08-07 17:44:03.824 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:44:03.825 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 37.69 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:44:04.538 DEBUG nova.virt.xenapi.vmops [req-470c2303-439c-4f1a-b416-d67799166bd6 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 6c2e3b38-f3d2-472d-a728-f41167dcf407] Updating progress to 20 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:44:04.762 DEBUG nova.virt.xenapi.vmops [req-470c2303-439c-4f1a-b416-d67799166bd6 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 6c2e3b38-f3d2-472d-a728-f41167dcf407] Updating progress to 30 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:44:04.997 DEBUG nova.virt.xenapi.vmops [req-470c2303-439c-4f1a-b416-d67799166bd6 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 6c2e3b38-f3d2-472d-a728-f41167dcf407] Using PV kernel: True _create_vm_record /opt/stack/new/nova/nova/virt/xenapi/vmops.py:679 2015-08-07 17:44:05.012 DEBUG nova.virt.xenapi.vm_utils [req-470c2303-439c-4f1a-b416-d67799166bd6 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 6c2e3b38-f3d2-472d-a728-f41167dcf407] Created VM create_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:312 2015-08-07 17:44:05.017 DEBUG nova.virt.xenapi.vmops [req-470c2303-439c-4f1a-b416-d67799166bd6 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 6c2e3b38-f3d2-472d-a728-f41167dcf407] Updating progress to 40 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:44:05.041 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.94 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:44:05.251 DEBUG nova.virt.xenapi.vm_utils [req-470c2303-439c-4f1a-b416-d67799166bd6 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Creating disk-type VBD for VM OpaqueRef:d171bd1a-d3c7-feac-e7db-2b33c4496e73, VDI OpaqueRef:1cde17ba-9010-a675-a19c-fa5bca372c57 ... 
create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:44:05.261 DEBUG nova.virt.xenapi.vm_utils [req-470c2303-439c-4f1a-b416-d67799166bd6 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Created VBD OpaqueRef:0290cdb3-4ae7-7cd8-99c2-34db5933f179 for VM OpaqueRef:d171bd1a-d3c7-feac-e7db-2b33c4496e73, VDI OpaqueRef:1cde17ba-9010-a675-a19c-fa5bca372c57. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:44:05.650 DEBUG nova.virt.xenapi.vm_utils [req-470c2303-439c-4f1a-b416-d67799166bd6 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Created VDI OpaqueRef:f9a68090-7cbd-9e44-16fd-5d280bb25a4f (config-2, 67108864, False) on OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99. create_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:509 2015-08-07 17:44:05.654 DEBUG nova.virt.xenapi.vm_utils [req-470c2303-439c-4f1a-b416-d67799166bd6 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Creating disk-type VBD for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:f9a68090-7cbd-9e44-16fd-5d280bb25a4f ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:44:05.667 DEBUG nova.virt.xenapi.vm_utils [req-470c2303-439c-4f1a-b416-d67799166bd6 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Created VBD OpaqueRef:3f4c2143-8e39-aa41-6910-dd832b1a1678 for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:f9a68090-7cbd-9e44-16fd-5d280bb25a4f. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:44:05.667 DEBUG nova.virt.xenapi.vm_utils [req-470c2303-439c-4f1a-b416-d67799166bd6 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Plugging VBD OpaqueRef:3f4c2143-8e39-aa41-6910-dd832b1a1678 ... vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2191 2015-08-07 17:44:05.668 DEBUG oslo_concurrency.lockutils [req-470c2303-439c-4f1a-b416-d67799166bd6 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_plug" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:44:07.414 DEBUG nova.virt.xenapi.vmops [req-fed14245-d06d-4088-8e1b-33ee9146eb08 tempest-ServerPasswordTestJSON-92181681 tempest-ServerPasswordTestJSON-1938394173] [instance: 4443fc1d-35f7-46c6-8139-e22c74dd9d86] Waiting for instance state to become running _wait_for_instance_to_start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:777 2015-08-07 17:44:07.472 DEBUG nova.virt.xenapi.vmops [req-fed14245-d06d-4088-8e1b-33ee9146eb08 tempest-ServerPasswordTestJSON-92181681 tempest-ServerPasswordTestJSON-1938394173] [instance: 4443fc1d-35f7-46c6-8139-e22c74dd9d86] Updating progress to 80 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:44:07.605 DEBUG oslo_concurrency.lockutils [req-470c2303-439c-4f1a-b416-d67799166bd6 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_plug" :: held 1.937s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:44:07.606 DEBUG nova.virt.xenapi.vm_utils [req-470c2303-439c-4f1a-b416-d67799166bd6 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Plugging VBD OpaqueRef:3f4c2143-8e39-aa41-6910-dd832b1a1678 done. 
vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2194 2015-08-07 17:44:07.610 DEBUG nova.virt.xenapi.vm_utils [req-470c2303-439c-4f1a-b416-d67799166bd6 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] VBD OpaqueRef:3f4c2143-8e39-aa41-6910-dd832b1a1678 plugged as xvdc vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2197 2015-08-07 17:44:07.709 WARNING nova.virt.configdrive [req-470c2303-439c-4f1a-b416-d67799166bd6 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] The setting "always" will be deprecated in the Liberty version. Please use "True" instead 2015-08-07 17:44:07.710 DEBUG nova.objects.instance [req-470c2303-439c-4f1a-b416-d67799166bd6 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lazy-loading `ec2_ids' on Instance uuid 6c2e3b38-f3d2-472d-a728-f41167dcf407 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:44:07.770 DEBUG nova.virt.xenapi.vmops [req-fed14245-d06d-4088-8e1b-33ee9146eb08 tempest-ServerPasswordTestJSON-92181681 tempest-ServerPasswordTestJSON-1938394173] [instance: 4443fc1d-35f7-46c6-8139-e22c74dd9d86] Skip agent setup, not enabled. _configure_new_instance_with_agent /opt/stack/new/nova/nova/virt/xenapi/vmops.py:788 2015-08-07 17:44:07.770 DEBUG nova.virt.xenapi.vmops [req-fed14245-d06d-4088-8e1b-33ee9146eb08 tempest-ServerPasswordTestJSON-92181681 tempest-ServerPasswordTestJSON-1938394173] [instance: 4443fc1d-35f7-46c6-8139-e22c74dd9d86] Removing hostname from xenstore _remove_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1949 2015-08-07 17:44:07.771 DEBUG oslo_concurrency.lockutils [req-fed14245-d06d-4088-8e1b-33ee9146eb08 tempest-ServerPasswordTestJSON-92181681 tempest-ServerPasswordTestJSON-1938394173] Lock "xenstore-4443fc1d-35f7-46c6-8139-e22c74dd9d86" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:44:07.778 DEBUG oslo_concurrency.lockutils [req-fed14245-d06d-4088-8e1b-33ee9146eb08 tempest-ServerPasswordTestJSON-92181681 tempest-ServerPasswordTestJSON-1938394173] Lock "xenstore-4443fc1d-35f7-46c6-8139-e22c74dd9d86" released by "update_hostname" :: held 0.007s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:44:07.779 DEBUG nova.virt.xenapi.vmops [req-fed14245-d06d-4088-8e1b-33ee9146eb08 tempest-ServerPasswordTestJSON-92181681 tempest-ServerPasswordTestJSON-1938394173] [instance: 4443fc1d-35f7-46c6-8139-e22c74dd9d86] Updating progress to 90 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:44:07.912 DEBUG oslo_concurrency.processutils [req-470c2303-439c-4f1a-b416-d67799166bd6 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Running cmd (subprocess): genisoimage -o /tmp/tmpa48Dij/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmpBKCyhD execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:44:08.033 DEBUG oslo_concurrency.processutils [req-470c2303-439c-4f1a-b416-d67799166bd6 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] CMD "genisoimage -o /tmp/tmpa48Dij/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmpBKCyhD" returned: 0 in 0.120s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 
17:44:08.040 DEBUG oslo_concurrency.processutils [req-470c2303-439c-4f1a-b416-d67799166bd6 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmpa48Dij/configdrive of=/dev/xvdc oflag=direct,sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:44:08.148 DEBUG nova.virt.xenapi.vmops [req-fed14245-d06d-4088-8e1b-33ee9146eb08 tempest-ServerPasswordTestJSON-92181681 tempest-ServerPasswordTestJSON-1938394173] [instance: 4443fc1d-35f7-46c6-8139-e22c74dd9d86] Updating progress to 100 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:44:08.591 DEBUG nova.compute.manager [req-fed14245-d06d-4088-8e1b-33ee9146eb08 tempest-ServerPasswordTestJSON-92181681 tempest-ServerPasswordTestJSON-1938394173] [instance: 4443fc1d-35f7-46c6-8139-e22c74dd9d86] Checking state _get_power_state /opt/stack/new/nova/nova/compute/manager.py:1296 2015-08-07 17:44:09.204 DEBUG oslo_concurrency.lockutils [req-fed14245-d06d-4088-8e1b-33ee9146eb08 tempest-ServerPasswordTestJSON-92181681 tempest-ServerPasswordTestJSON-1938394173] Lock "4443fc1d-35f7-46c6-8139-e22c74dd9d86" released by "_locked_do_build_and_run_instance" :: held 50.112s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:44:09.391 DEBUG nova.virt.xenapi.vmops [req-680b1692-b42c-440c-8291-60b431b31d1a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 115845a8-7055-44f5-a3f5-222bd027f1ed] Waiting for instance state to become running _wait_for_instance_to_start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:777 2015-08-07 17:44:09.410 DEBUG nova.virt.xenapi.vmops [req-680b1692-b42c-440c-8291-60b431b31d1a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 115845a8-7055-44f5-a3f5-222bd027f1ed] Updating progress to 80 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:44:09.779 DEBUG nova.virt.xenapi.vmops [req-680b1692-b42c-440c-8291-60b431b31d1a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 115845a8-7055-44f5-a3f5-222bd027f1ed] Skip agent setup, not enabled. 
_configure_new_instance_with_agent /opt/stack/new/nova/nova/virt/xenapi/vmops.py:788 2015-08-07 17:44:09.780 DEBUG nova.virt.xenapi.vmops [req-680b1692-b42c-440c-8291-60b431b31d1a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 115845a8-7055-44f5-a3f5-222bd027f1ed] Removing hostname from xenstore _remove_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1949 2015-08-07 17:44:09.781 DEBUG oslo_concurrency.lockutils [req-680b1692-b42c-440c-8291-60b431b31d1a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "xenstore-115845a8-7055-44f5-a3f5-222bd027f1ed" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:44:09.788 DEBUG oslo_concurrency.lockutils [req-680b1692-b42c-440c-8291-60b431b31d1a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "xenstore-115845a8-7055-44f5-a3f5-222bd027f1ed" released by "update_hostname" :: held 0.007s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:44:09.789 DEBUG nova.virt.xenapi.vmops [req-680b1692-b42c-440c-8291-60b431b31d1a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 115845a8-7055-44f5-a3f5-222bd027f1ed] Updating progress to 90 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:44:10.219 DEBUG nova.virt.xenapi.vmops [req-680b1692-b42c-440c-8291-60b431b31d1a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 115845a8-7055-44f5-a3f5-222bd027f1ed] Updating progress to 100 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:44:10.515 DEBUG nova.compute.manager [req-680b1692-b42c-440c-8291-60b431b31d1a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 115845a8-7055-44f5-a3f5-222bd027f1ed] Checking state _get_power_state /opt/stack/new/nova/nova/compute/manager.py:1296 2015-08-07 17:44:10.747 DEBUG oslo_concurrency.lockutils [req-d9026024-1f7d-445d-acf4-5c3b47d946b1 tempest-ServerPasswordTestJSON-92181681 tempest-ServerPasswordTestJSON-1938394173] Lock "4443fc1d-35f7-46c6-8139-e22c74dd9d86" acquired by "do_terminate_instance" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:44:10.748 DEBUG oslo_concurrency.lockutils [req-d9026024-1f7d-445d-acf4-5c3b47d946b1 tempest-ServerPasswordTestJSON-92181681 tempest-ServerPasswordTestJSON-1938394173] Lock "4443fc1d-35f7-46c6-8139-e22c74dd9d86-events" acquired by "_clear_events" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:44:10.748 DEBUG oslo_concurrency.lockutils [req-d9026024-1f7d-445d-acf4-5c3b47d946b1 tempest-ServerPasswordTestJSON-92181681 tempest-ServerPasswordTestJSON-1938394173] Lock "4443fc1d-35f7-46c6-8139-e22c74dd9d86-events" released by "_clear_events" :: held 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:44:10.751 INFO nova.compute.manager [req-d9026024-1f7d-445d-acf4-5c3b47d946b1 tempest-ServerPasswordTestJSON-92181681 tempest-ServerPasswordTestJSON-1938394173] [instance: 4443fc1d-35f7-46c6-8139-e22c74dd9d86] Terminating instance 2015-08-07 17:44:10.753 INFO nova.virt.xenapi.vmops [req-d9026024-1f7d-445d-acf4-5c3b47d946b1 tempest-ServerPasswordTestJSON-92181681 
tempest-ServerPasswordTestJSON-1938394173] [instance: 4443fc1d-35f7-46c6-8139-e22c74dd9d86] Destroying VM 2015-08-07 17:44:10.769 DEBUG nova.virt.xenapi.vm_utils [req-d9026024-1f7d-445d-acf4-5c3b47d946b1 tempest-ServerPasswordTestJSON-92181681 tempest-ServerPasswordTestJSON-1938394173] [instance: 4443fc1d-35f7-46c6-8139-e22c74dd9d86] Shutting down VM (hard) hard_shutdown_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:348 2015-08-07 17:44:11.263 DEBUG oslo_concurrency.lockutils [req-680b1692-b42c-440c-8291-60b431b31d1a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "115845a8-7055-44f5-a3f5-222bd027f1ed" released by "_locked_do_build_and_run_instance" :: held 45.032s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:44:11.736 DEBUG oslo_concurrency.lockutils [req-4f43722e-5303-4776-96e5-c9305ca4d668 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "115845a8-7055-44f5-a3f5-222bd027f1ed" acquired by "do_stop_instance" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:44:11.737 DEBUG nova.compute.manager [req-4f43722e-5303-4776-96e5-c9305ca4d668 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 115845a8-7055-44f5-a3f5-222bd027f1ed] Checking state _get_power_state /opt/stack/new/nova/nova/compute/manager.py:1296 2015-08-07 17:44:11.835 DEBUG nova.compute.manager [req-4f43722e-5303-4776-96e5-c9305ca4d668 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 115845a8-7055-44f5-a3f5-222bd027f1ed] Stopping instance; current vm_state: active, current task_state: powering-off, current DB power_state: 1, current VM power_state: 1 do_stop_instance /opt/stack/new/nova/nova/compute/manager.py:2404 2015-08-07 17:44:11.843 DEBUG nova.virt.xenapi.vm_utils [req-4f43722e-5303-4776-96e5-c9305ca4d668 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 115845a8-7055-44f5-a3f5-222bd027f1ed] Shutting down VM (hard) hard_shutdown_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:348 2015-08-07 17:44:14.631 DEBUG oslo_concurrency.processutils [req-470c2303-439c-4f1a-b416-d67799166bd6 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmpa48Dij/configdrive of=/dev/xvdc oflag=direct,sync" returned: 0 in 6.591s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:44:14.633 DEBUG oslo_concurrency.processutils [req-470c2303-439c-4f1a-b416-d67799166bd6 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:44:14.767 DEBUG nova.virt.xenapi.vmops [req-d9026024-1f7d-445d-acf4-5c3b47d946b1 tempest-ServerPasswordTestJSON-92181681 tempest-ServerPasswordTestJSON-1938394173] [instance: 4443fc1d-35f7-46c6-8139-e22c74dd9d86] Destroying VDIs _destroy_vdis /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1428 2015-08-07 17:44:14.789 DEBUG nova.virt.xenapi.vm_utils [req-d9026024-1f7d-445d-acf4-5c3b47d946b1 tempest-ServerPasswordTestJSON-92181681 tempest-ServerPasswordTestJSON-1938394173] VDI 038f02d2-fd73-4463-8a21-90c1fa79c768 is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 
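The surrounding entries trace the config-drive path for these instances: genisoimage builds an ISO9660 image in a temporary directory, then dd (run through nova-rootwrap) copies it onto the device the config-drive VBD was plugged as (xvdc/xvdd above), followed by a sync before the VBD is unplugged. The following is a minimal illustrative sketch of driving the same two commands with oslo.concurrency's processutils; the paths and the root-helper string are assumptions for illustration, not nova's actual temp file names or code.

from oslo_concurrency import processutils

# Hypothetical paths; the real ones are mkstemp/mkdtemp names such as the
# /tmp/tmpVmf2I_ and /tmp/tmpH6NgCC directories seen in the log above.
iso_path = '/tmp/configdrive'
metadata_dir = '/tmp/configdrive-data'
device = '/dev/xvdc'  # the device the config-drive VBD was plugged as, per the log

# Build the ISO9660 config drive (same genisoimage flags as in the log entries).
processutils.execute(
    'genisoimage', '-o', iso_path,
    '-ldots', '-allow-lowercase', '-allow-multidot', '-l',
    '-publisher', 'OpenStack Nova 12.0.0', '-quiet', '-J', '-r',
    '-V', 'config-2', metadata_dir)

# Copy the image onto the attached block device through the root wrapper,
# using direct/synchronous I/O so the data is on disk before the VBD is unplugged.
processutils.execute(
    'dd', 'if=%s' % iso_path, 'of=%s' % device, 'oflag=direct,sync',
    run_as_root=True,
    root_helper='sudo nova-rootwrap /etc/nova/rootwrap.conf')
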
2015-08-07 17:44:14.800 DEBUG nova.virt.xenapi.vm_utils [req-d9026024-1f7d-445d-acf4-5c3b47d946b1 tempest-ServerPasswordTestJSON-92181681 tempest-ServerPasswordTestJSON-1938394173] VDI dc24f783-1cd5-43a6-ad15-01b1cc3f625f is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:44:15.040 DEBUG oslo_concurrency.processutils [req-470c2303-439c-4f1a-b416-d67799166bd6 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf sync" returned: 0 in 0.407s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:44:15.041 DEBUG nova.virt.xenapi.vm_utils [req-470c2303-439c-4f1a-b416-d67799166bd6 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Destroying VBD for VDI OpaqueRef:f9a68090-7cbd-9e44-16fd-5d280bb25a4f ... vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2207 2015-08-07 17:44:15.042 DEBUG oslo_concurrency.lockutils [req-470c2303-439c-4f1a-b416-d67799166bd6 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_unplug" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:44:15.068 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.92 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:44:15.664 DEBUG nova.virt.xenapi.vmops [req-d9026024-1f7d-445d-acf4-5c3b47d946b1 tempest-ServerPasswordTestJSON-92181681 tempest-ServerPasswordTestJSON-1938394173] [instance: 4443fc1d-35f7-46c6-8139-e22c74dd9d86] Using RAW or VHD, skipping kernel and ramdisk deletion _destroy_kernel_ramdisk /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1456 2015-08-07 17:44:15.678 DEBUG nova.virt.xenapi.vm_utils [req-d9026024-1f7d-445d-acf4-5c3b47d946b1 tempest-ServerPasswordTestJSON-92181681 tempest-ServerPasswordTestJSON-1938394173] [instance: 4443fc1d-35f7-46c6-8139-e22c74dd9d86] VM destroyed destroy_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:324 2015-08-07 17:44:15.679 DEBUG nova.compute.manager [req-d9026024-1f7d-445d-acf4-5c3b47d946b1 tempest-ServerPasswordTestJSON-92181681 tempest-ServerPasswordTestJSON-1938394173] [instance: 4443fc1d-35f7-46c6-8139-e22c74dd9d86] Deallocating network for instance _deallocate_network /opt/stack/new/nova/nova/compute/manager.py:1792 2015-08-07 17:44:15.819 DEBUG nova.compute.manager [req-4f43722e-5303-4776-96e5-c9305ca4d668 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 115845a8-7055-44f5-a3f5-222bd027f1ed] Checking state _get_power_state /opt/stack/new/nova/nova/compute/manager.py:1296 2015-08-07 17:44:15.927 DEBUG oslo_concurrency.lockutils [req-470c2303-439c-4f1a-b416-d67799166bd6 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_unplug" :: held 0.884s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:44:15.942 DEBUG nova.virt.xenapi.vm_utils [req-470c2303-439c-4f1a-b416-d67799166bd6 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Destroying VBD for VDI OpaqueRef:f9a68090-7cbd-9e44-16fd-5d280bb25a4f done. 
vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2215 2015-08-07 17:44:15.942 DEBUG nova.virt.xenapi.vm_utils [req-470c2303-439c-4f1a-b416-d67799166bd6 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Creating disk-type VBD for VM OpaqueRef:d171bd1a-d3c7-feac-e7db-2b33c4496e73, VDI OpaqueRef:f9a68090-7cbd-9e44-16fd-5d280bb25a4f ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:44:15.953 DEBUG nova.virt.xenapi.vm_utils [req-470c2303-439c-4f1a-b416-d67799166bd6 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Created VBD OpaqueRef:61cdb25f-d5de-448c-c91c-47986af93189 for VM OpaqueRef:d171bd1a-d3c7-feac-e7db-2b33c4496e73, VDI OpaqueRef:f9a68090-7cbd-9e44-16fd-5d280bb25a4f. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:44:15.954 DEBUG nova.objects.instance [req-470c2303-439c-4f1a-b416-d67799166bd6 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lazy-loading `pci_devices' on Instance uuid 6c2e3b38-f3d2-472d-a728-f41167dcf407 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:44:16.051 DEBUG oslo_concurrency.lockutils [req-4f43722e-5303-4776-96e5-c9305ca4d668 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "115845a8-7055-44f5-a3f5-222bd027f1ed" released by "do_stop_instance" :: held 4.315s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:44:16.073 DEBUG nova.virt.xenapi.vmops [req-470c2303-439c-4f1a-b416-d67799166bd6 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 6c2e3b38-f3d2-472d-a728-f41167dcf407] Updating progress to 50 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:44:16.547 DEBUG oslo_concurrency.lockutils [req-470c2303-439c-4f1a-b416-d67799166bd6 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "xenstore-6c2e3b38-f3d2-472d-a728-f41167dcf407" acquired by "store_meta" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:44:16.548 DEBUG oslo_concurrency.lockutils [req-470c2303-439c-4f1a-b416-d67799166bd6 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "xenstore-6c2e3b38-f3d2-472d-a728-f41167dcf407" released by "store_meta" :: held 0.001s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:44:16.549 DEBUG oslo_concurrency.lockutils [req-470c2303-439c-4f1a-b416-d67799166bd6 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "xenstore-6c2e3b38-f3d2-472d-a728-f41167dcf407" acquired by "store_auto_disk_config" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:44:16.557 DEBUG oslo_concurrency.lockutils [req-470c2303-439c-4f1a-b416-d67799166bd6 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "xenstore-6c2e3b38-f3d2-472d-a728-f41167dcf407" released by "store_auto_disk_config" :: held 0.008s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:44:16.558 DEBUG nova.virt.xenapi.vmops [req-470c2303-439c-4f1a-b416-d67799166bd6 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 6c2e3b38-f3d2-472d-a728-f41167dcf407] Injecting hostname (tempest.common.compute-instance-1921240456) into xenstore _inject_hostname 
/opt/stack/new/nova/nova/virt/xenapi/vmops.py:1940 2015-08-07 17:44:16.558 DEBUG oslo_concurrency.lockutils [req-470c2303-439c-4f1a-b416-d67799166bd6 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "xenstore-6c2e3b38-f3d2-472d-a728-f41167dcf407" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:44:16.566 DEBUG oslo_concurrency.lockutils [req-470c2303-439c-4f1a-b416-d67799166bd6 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "xenstore-6c2e3b38-f3d2-472d-a728-f41167dcf407" released by "update_hostname" :: held 0.007s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:44:16.566 DEBUG nova.virt.xenapi.vmops [req-470c2303-439c-4f1a-b416-d67799166bd6 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 6c2e3b38-f3d2-472d-a728-f41167dcf407] Injecting network info to xenstore inject_network_info /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1869 2015-08-07 17:44:16.567 DEBUG oslo_concurrency.lockutils [req-470c2303-439c-4f1a-b416-d67799166bd6 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "xenstore-6c2e3b38-f3d2-472d-a728-f41167dcf407" acquired by "update_nwinfo" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:44:16.740 DEBUG oslo_concurrency.lockutils [req-470c2303-439c-4f1a-b416-d67799166bd6 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "xenstore-6c2e3b38-f3d2-472d-a728-f41167dcf407" released by "update_nwinfo" :: held 0.173s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:44:16.741 DEBUG nova.virt.xenapi.vmops [req-470c2303-439c-4f1a-b416-d67799166bd6 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 6c2e3b38-f3d2-472d-a728-f41167dcf407] Updating progress to 60 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:44:17.166 DEBUG nova.virt.xenapi.vmops [req-470c2303-439c-4f1a-b416-d67799166bd6 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 6c2e3b38-f3d2-472d-a728-f41167dcf407] Creating vifs _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1892 2015-08-07 17:44:17.173 DEBUG nova.virt.xenapi.vmops [req-470c2303-439c-4f1a-b416-d67799166bd6 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 6c2e3b38-f3d2-472d-a728-f41167dcf407] Creating VIF for network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1902 2015-08-07 17:44:17.182 DEBUG nova.virt.xenapi.vmops [req-470c2303-439c-4f1a-b416-d67799166bd6 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 6c2e3b38-f3d2-472d-a728-f41167dcf407] Created VIF OpaqueRef:daaa0bb2-2108-f240-36eb-78810356a50b, network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1906 2015-08-07 17:44:17.183 DEBUG nova.virt.xenapi.vmops [req-470c2303-439c-4f1a-b416-d67799166bd6 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 6c2e3b38-f3d2-472d-a728-f41167dcf407] Updating progress to 70 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:44:17.379 DEBUG nova.compute.manager [req-d9026024-1f7d-445d-acf4-5c3b47d946b1 
tempest-ServerPasswordTestJSON-92181681 tempest-ServerPasswordTestJSON-1938394173] [instance: 4443fc1d-35f7-46c6-8139-e22c74dd9d86] terminating bdm BlockDeviceMapping(boot_index=0,connection_info=None,created_at=2015-08-07T17:43:18Z,delete_on_termination=True,deleted=False,deleted_at=None,destination_type='local',device_name='/dev/xvda',device_type='disk',disk_bus=None,guest_format=None,id=61,image_id='a69d8d55-1745-492f-be26-cc75d64fc94d',instance=,instance_uuid=4443fc1d-35f7-46c6-8139-e22c74dd9d86,no_device=False,snapshot_id=None,source_type='image',updated_at=2015-08-07T17:43:20Z,volume_id=None,volume_size=None) _cleanup_volumes /opt/stack/new/nova/nova/compute/manager.py:2272 2015-08-07 17:44:17.425 DEBUG nova.virt.xenapi.vmops [req-470c2303-439c-4f1a-b416-d67799166bd6 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 6c2e3b38-f3d2-472d-a728-f41167dcf407] Starting instance _start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:327 2015-08-07 17:44:17.455 DEBUG nova.compute.manager [req-e7299f84-7d21-423d-9497-176a584cc6bf tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 115845a8-7055-44f5-a3f5-222bd027f1ed] Stashing vm_state: stopped _prep_resize /opt/stack/new/nova/nova/compute/manager.py:3514 2015-08-07 17:44:17.581 DEBUG oslo_concurrency.lockutils [req-d9026024-1f7d-445d-acf4-5c3b47d946b1 tempest-ServerPasswordTestJSON-92181681 tempest-ServerPasswordTestJSON-1938394173] Lock "compute_resources" acquired by "update_usage" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:44:17.583 DEBUG nova.objects.instance [req-d9026024-1f7d-445d-acf4-5c3b47d946b1 tempest-ServerPasswordTestJSON-92181681 tempest-ServerPasswordTestJSON-1938394173] Lazy-loading `numa_topology' on Instance uuid 4443fc1d-35f7-46c6-8139-e22c74dd9d86 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:44:17.670 DEBUG oslo_concurrency.lockutils [req-d9026024-1f7d-445d-acf4-5c3b47d946b1 tempest-ServerPasswordTestJSON-92181681 tempest-ServerPasswordTestJSON-1938394173] Lock "compute_resources" released by "update_usage" :: held 0.088s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:44:17.676 DEBUG oslo_concurrency.lockutils [req-e7299f84-7d21-423d-9497-176a584cc6bf tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "compute_resources" acquired by "resize_claim" :: waited 0.018s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:44:17.677 DEBUG nova.compute.resource_tracker [req-e7299f84-7d21-423d-9497-176a584cc6bf tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Memory overhead for 128 MB instance; 6 MB resize_claim /opt/stack/new/nova/nova/compute/resource_tracker.py:174 2015-08-07 17:44:17.687 INFO nova.compute.claims [req-e7299f84-7d21-423d-9497-176a584cc6bf tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 115845a8-7055-44f5-a3f5-222bd027f1ed] Attempting claim: memory 134 MB, disk 0 GB 2015-08-07 17:44:17.688 INFO nova.compute.claims [req-e7299f84-7d21-423d-9497-176a584cc6bf tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 115845a8-7055-44f5-a3f5-222bd027f1ed] Total memory: 8187 MB, used: 788.00 MB 2015-08-07 17:44:17.688 INFO nova.compute.claims [req-e7299f84-7d21-423d-9497-176a584cc6bf 
tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 115845a8-7055-44f5-a3f5-222bd027f1ed] memory limit: 12280.50 MB, free: 11492.50 MB 2015-08-07 17:44:17.689 INFO nova.compute.claims [req-e7299f84-7d21-423d-9497-176a584cc6bf tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 115845a8-7055-44f5-a3f5-222bd027f1ed] Total disk: 27 GB, used: 0.00 GB 2015-08-07 17:44:17.689 INFO nova.compute.claims [req-e7299f84-7d21-423d-9497-176a584cc6bf tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 115845a8-7055-44f5-a3f5-222bd027f1ed] disk limit not specified, defaulting to unlimited 2015-08-07 17:44:17.715 DEBUG nova.compute.resources.vcpu [req-e7299f84-7d21-423d-9497-176a584cc6bf tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Total CPUs: 8 VCPUs, used: 4.00 VCPUs test /opt/stack/new/nova/nova/compute/resources/vcpu.py:52 2015-08-07 17:44:17.716 DEBUG nova.compute.resources.vcpu [req-e7299f84-7d21-423d-9497-176a584cc6bf tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] CPUs limit not specified, defaulting to unlimited test /opt/stack/new/nova/nova/compute/resources/vcpu.py:56 2015-08-07 17:44:17.716 INFO nova.compute.claims [req-e7299f84-7d21-423d-9497-176a584cc6bf tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 115845a8-7055-44f5-a3f5-222bd027f1ed] Claim successful 2015-08-07 17:44:17.773 INFO nova.compute.resource_tracker [req-e7299f84-7d21-423d-9497-176a584cc6bf tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Updating from migration 115845a8-7055-44f5-a3f5-222bd027f1ed 2015-08-07 17:44:17.879 DEBUG oslo_concurrency.lockutils [req-e7299f84-7d21-423d-9497-176a584cc6bf tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "compute_resources" released by "resize_claim" :: held 0.204s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:44:17.880 INFO nova.compute.manager [req-e7299f84-7d21-423d-9497-176a584cc6bf tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 115845a8-7055-44f5-a3f5-222bd027f1ed] Migrating 2015-08-07 17:44:17.960 DEBUG oslo_concurrency.lockutils [req-e7299f84-7d21-423d-9497-176a584cc6bf tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Acquired semaphore "refresh_cache-115845a8-7055-44f5-a3f5-222bd027f1ed" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:197 2015-08-07 17:44:18.036 DEBUG oslo_concurrency.lockutils [req-d9026024-1f7d-445d-acf4-5c3b47d946b1 tempest-ServerPasswordTestJSON-92181681 tempest-ServerPasswordTestJSON-1938394173] Lock "4443fc1d-35f7-46c6-8139-e22c74dd9d86" released by "do_terminate_instance" :: held 7.289s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:44:18.130 DEBUG nova.network.base_api [req-e7299f84-7d21-423d-9497-176a584cc6bf tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 115845a8-7055-44f5-a3f5-222bd027f1ed] Updating instance_info_cache with network_info: [VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.2'})], 'version': 4, 'meta': 
{u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:97:17:64', 'active': False, 'type': u'bridge', 'id': u'd2af0ccf-a764-449d-b25c-7353a5007c58', 'qbg_params': None})] update_instance_cache_with_nw_info /opt/stack/new/nova/nova/network/base_api.py:43 2015-08-07 17:44:18.156 DEBUG oslo_concurrency.lockutils [req-e7299f84-7d21-423d-9497-176a584cc6bf tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Releasing semaphore "refresh_cache-115845a8-7055-44f5-a3f5-222bd027f1ed" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:210 2015-08-07 17:44:18.454 DEBUG nova.virt.xenapi.vmops [req-e7299f84-7d21-423d-9497-176a584cc6bf tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 115845a8-7055-44f5-a3f5-222bd027f1ed] Updating progress to 0 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:44:18.700 DEBUG nova.virt.xenapi.vm_utils [req-e7299f84-7d21-423d-9497-176a584cc6bf tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 115845a8-7055-44f5-a3f5-222bd027f1ed] Starting snapshot for VM _snapshot_attached_here_impl /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:714 2015-08-07 17:44:18.727 DEBUG oslo_concurrency.lockutils [req-e7299f84-7d21-423d-9497-176a584cc6bf tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" acquired by "do_scan" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:44:18.728 DEBUG nova.virt.xenapi.vm_utils [req-e7299f84-7d21-423d-9497-176a584cc6bf tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Scanning SR OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99 do_scan /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1839 2015-08-07 17:44:19.239 DEBUG oslo_concurrency.lockutils [req-e7299f84-7d21-423d-9497-176a584cc6bf tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" released by "do_scan" :: held 0.511s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:44:19.247 DEBUG nova.virt.xenapi.vm_utils [req-e7299f84-7d21-423d-9497-176a584cc6bf tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] VHD 98092af1-40ef-46f7-aa3b-2582b5c31cd7 has parent 4027f457-a9bb-499a-8844-79fc67f11377 _get_vhd_parent_uuid /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2006 2015-08-07 17:44:19.269 DEBUG nova.virt.xenapi.vm_utils [req-e7299f84-7d21-423d-9497-176a584cc6bf tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] VHD 9165e814-0a6c-4933-8fe4-c9b0bbbe1863 has parent 
4027f457-a9bb-499a-8844-79fc67f11377 _get_vhd_parent_uuid /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2006 2015-08-07 17:44:19.282 DEBUG nova.virt.xenapi.vm_utils [req-e7299f84-7d21-423d-9497-176a584cc6bf tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] VHD 98092af1-40ef-46f7-aa3b-2582b5c31cd7 has parent 4027f457-a9bb-499a-8844-79fc67f11377 _get_vhd_parent_uuid /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2006 2015-08-07 17:44:19.295 DEBUG nova.virt.xenapi.vm_utils [req-e7299f84-7d21-423d-9497-176a584cc6bf tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] VHD 60495381-4db4-4b06-83c7-7b5f778290c3 has parent 4027f457-a9bb-499a-8844-79fc67f11377 _get_vhd_parent_uuid /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2006 2015-08-07 17:44:19.305 DEBUG nova.virt.xenapi.vm_utils [req-e7299f84-7d21-423d-9497-176a584cc6bf tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] VHD f6006734-242b-4374-b2ac-eac20be70bf3 has parent 4027f457-a9bb-499a-8844-79fc67f11377 _get_vhd_parent_uuid /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2006 2015-08-07 17:44:19.316 DEBUG nova.virt.xenapi.vm_utils [req-e7299f84-7d21-423d-9497-176a584cc6bf tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] VHD adc68f5c-f9ed-4dfe-862f-8738fb519f84 has parent 4027f457-a9bb-499a-8844-79fc67f11377 _get_vhd_parent_uuid /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2006 2015-08-07 17:44:19.326 DEBUG nova.virt.xenapi.vm_utils [req-e7299f84-7d21-423d-9497-176a584cc6bf tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 115845a8-7055-44f5-a3f5-222bd027f1ed] No snapshots to remove. _delete_snapshots_in_vdi_chain /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:677 2015-08-07 17:44:20.472 DEBUG nova.virt.xenapi.vm_utils [req-e7299f84-7d21-423d-9497-176a584cc6bf tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 115845a8-7055-44f5-a3f5-222bd027f1ed] Parent has other children, coalesce is unlikely. 
_wait_for_vhd_coalesce /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2092 2015-08-07 17:44:20.484 DEBUG oslo_concurrency.lockutils [req-e7299f84-7d21-423d-9497-176a584cc6bf tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" acquired by "do_scan" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:44:20.485 DEBUG nova.virt.xenapi.vm_utils [req-e7299f84-7d21-423d-9497-176a584cc6bf tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Scanning SR OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99 do_scan /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1839 2015-08-07 17:44:20.914 DEBUG oslo_concurrency.lockutils [req-e7299f84-7d21-423d-9497-176a584cc6bf tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" released by "do_scan" :: held 0.431s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:44:20.934 DEBUG nova.virt.xenapi.vm_utils [req-e7299f84-7d21-423d-9497-176a584cc6bf tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] VHD a15da28f-6cb6-48b2-bdb9-6733805b05cd has parent 46b9c885-fd98-47c6-9835-1280c62d959d _get_vhd_parent_uuid /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2006 2015-08-07 17:44:20.942 DEBUG nova.virt.xenapi.vm_utils [req-e7299f84-7d21-423d-9497-176a584cc6bf tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] VHD 46b9c885-fd98-47c6-9835-1280c62d959d has parent 4027f457-a9bb-499a-8844-79fc67f11377 _get_vhd_parent_uuid /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2006 2015-08-07 17:44:20.950 DEBUG nova.virt.xenapi.vmops [req-e7299f84-7d21-423d-9497-176a584cc6bf tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 115845a8-7055-44f5-a3f5-222bd027f1ed] Updating progress to 20 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:44:21.166 DEBUG nova.virt.xenapi.vm_utils [req-e7299f84-7d21-423d-9497-176a584cc6bf tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 115845a8-7055-44f5-a3f5-222bd027f1ed] Migrating VHD '46b9c885-fd98-47c6-9835-1280c62d959d' with seq_num 1 migrate_vhd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2559 2015-08-07 17:44:21.499 DEBUG nova.virt.xenapi.vm_utils [req-e7299f84-7d21-423d-9497-176a584cc6bf tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 115845a8-7055-44f5-a3f5-222bd027f1ed] Migrating VHD '4027f457-a9bb-499a-8844-79fc67f11377' with seq_num 2 migrate_vhd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2559 2015-08-07 17:44:22.736 DEBUG nova.virt.xenapi.vmops [req-e7299f84-7d21-423d-9497-176a584cc6bf tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 115845a8-7055-44f5-a3f5-222bd027f1ed] Migrated root base vhds transfer_immutable_vhds /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1036 2015-08-07 17:44:22.737 DEBUG nova.virt.xenapi.vmops [req-e7299f84-7d21-423d-9497-176a584cc6bf tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 115845a8-7055-44f5-a3f5-222bd027f1ed] Updating progress to 40 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:44:22.990 DEBUG nova.virt.xenapi.vmops 
[req-e7299f84-7d21-423d-9497-176a584cc6bf tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 115845a8-7055-44f5-a3f5-222bd027f1ed] Migrated all base vhds. _process_ephemeral_chain_recursive /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1051 2015-08-07 17:44:22.995 DEBUG nova.virt.xenapi.vmops [req-e7299f84-7d21-423d-9497-176a584cc6bf tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 115845a8-7055-44f5-a3f5-222bd027f1ed] VM was already shutdown. _resize_ensure_vm_is_shutdown /opt/stack/new/nova/nova/virt/xenapi/vmops.py:924 2015-08-07 17:44:22.996 DEBUG nova.virt.xenapi.vm_utils [req-e7299f84-7d21-423d-9497-176a584cc6bf tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 115845a8-7055-44f5-a3f5-222bd027f1ed] Migrating VHD '98092af1-40ef-46f7-aa3b-2582b5c31cd7' with seq_num 0 migrate_vhd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2559 2015-08-07 17:44:23.258 DEBUG nova.virt.xenapi.vmops [req-e7299f84-7d21-423d-9497-176a584cc6bf tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 115845a8-7055-44f5-a3f5-222bd027f1ed] Updating progress to 60 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:44:23.553 DEBUG nova.virt.xenapi.vmops [req-e7299f84-7d21-423d-9497-176a584cc6bf tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 115845a8-7055-44f5-a3f5-222bd027f1ed] Updating progress to 80 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:44:23.642 DEBUG nova.virt.xenapi.vmops [req-470c2303-439c-4f1a-b416-d67799166bd6 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 6c2e3b38-f3d2-472d-a728-f41167dcf407] Waiting for instance state to become running _wait_for_instance_to_start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:777 2015-08-07 17:44:23.662 DEBUG nova.virt.xenapi.vmops [req-470c2303-439c-4f1a-b416-d67799166bd6 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 6c2e3b38-f3d2-472d-a728-f41167dcf407] Updating progress to 80 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:44:23.972 DEBUG nova.virt.xenapi.vmops [req-470c2303-439c-4f1a-b416-d67799166bd6 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 6c2e3b38-f3d2-472d-a728-f41167dcf407] Skip agent setup, not enabled. 
_configure_new_instance_with_agent /opt/stack/new/nova/nova/virt/xenapi/vmops.py:788 2015-08-07 17:44:23.973 DEBUG nova.virt.xenapi.vmops [req-470c2303-439c-4f1a-b416-d67799166bd6 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 6c2e3b38-f3d2-472d-a728-f41167dcf407] Removing hostname from xenstore _remove_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1949 2015-08-07 17:44:23.974 DEBUG oslo_concurrency.lockutils [req-470c2303-439c-4f1a-b416-d67799166bd6 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "xenstore-6c2e3b38-f3d2-472d-a728-f41167dcf407" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:44:23.979 DEBUG oslo_concurrency.lockutils [req-470c2303-439c-4f1a-b416-d67799166bd6 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "xenstore-6c2e3b38-f3d2-472d-a728-f41167dcf407" released by "update_hostname" :: held 0.005s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:44:23.980 DEBUG nova.virt.xenapi.vmops [req-470c2303-439c-4f1a-b416-d67799166bd6 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 6c2e3b38-f3d2-472d-a728-f41167dcf407] Updating progress to 90 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:44:24.196 DEBUG nova.virt.xenapi.vmops [req-470c2303-439c-4f1a-b416-d67799166bd6 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 6c2e3b38-f3d2-472d-a728-f41167dcf407] Updating progress to 100 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:44:24.434 DEBUG nova.compute.manager [req-470c2303-439c-4f1a-b416-d67799166bd6 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 6c2e3b38-f3d2-472d-a728-f41167dcf407] Checking state _get_power_state /opt/stack/new/nova/nova/compute/manager.py:1296 2015-08-07 17:44:24.544 DEBUG oslo_concurrency.lockutils [req-e7299f84-7d21-423d-9497-176a584cc6bf tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "115845a8-7055-44f5-a3f5-222bd027f1ed-events" acquired by "_clear_events" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:44:24.545 DEBUG oslo_concurrency.lockutils [req-e7299f84-7d21-423d-9497-176a584cc6bf tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "115845a8-7055-44f5-a3f5-222bd027f1ed-events" released by "_clear_events" :: held 0.001s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:44:24.833 DEBUG oslo_concurrency.lockutils [req-470c2303-439c-4f1a-b416-d67799166bd6 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "6c2e3b38-f3d2-472d-a728-f41167dcf407" released by "_locked_do_build_and_run_instance" :: held 46.250s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:44:24.981 DEBUG oslo_concurrency.lockutils [req-4e0bfc6b-ec8c-4332-ab48-3e7910b5e202 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Lock "b197c990-eecd-403b-b8d7-9e57e7053a16" acquired by "_locked_do_build_and_run_instance" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:44:25.038 INFO nova.compute.manager 
[req-4e0bfc6b-ec8c-4332-ab48-3e7910b5e202 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] [instance: b197c990-eecd-403b-b8d7-9e57e7053a16] Starting instance... 2015-08-07 17:44:25.049 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.94 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:44:25.252 DEBUG oslo_concurrency.lockutils [req-4e0bfc6b-ec8c-4332-ab48-3e7910b5e202 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Lock "compute_resources" acquired by "instance_claim" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:44:25.253 DEBUG nova.compute.resource_tracker [req-4e0bfc6b-ec8c-4332-ab48-3e7910b5e202 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Memory overhead for 64 MB instance; 5 MB instance_claim /opt/stack/new/nova/nova/compute/resource_tracker.py:126 2015-08-07 17:44:25.262 INFO nova.compute.claims [req-4e0bfc6b-ec8c-4332-ab48-3e7910b5e202 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] [instance: b197c990-eecd-403b-b8d7-9e57e7053a16] Attempting claim: memory 69 MB, disk 0 GB 2015-08-07 17:44:25.263 INFO nova.compute.claims [req-4e0bfc6b-ec8c-4332-ab48-3e7910b5e202 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] [instance: b197c990-eecd-403b-b8d7-9e57e7053a16] Total memory: 8187 MB, used: 922.00 MB 2015-08-07 17:44:25.263 INFO nova.compute.claims [req-4e0bfc6b-ec8c-4332-ab48-3e7910b5e202 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] [instance: b197c990-eecd-403b-b8d7-9e57e7053a16] memory limit: 12280.50 MB, free: 11358.50 MB 2015-08-07 17:44:25.264 INFO nova.compute.claims [req-4e0bfc6b-ec8c-4332-ab48-3e7910b5e202 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] [instance: b197c990-eecd-403b-b8d7-9e57e7053a16] Total disk: 27 GB, used: 0.00 GB 2015-08-07 17:44:25.265 INFO nova.compute.claims [req-4e0bfc6b-ec8c-4332-ab48-3e7910b5e202 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] [instance: b197c990-eecd-403b-b8d7-9e57e7053a16] disk limit not specified, defaulting to unlimited 2015-08-07 17:44:25.290 DEBUG nova.compute.resources.vcpu [req-4e0bfc6b-ec8c-4332-ab48-3e7910b5e202 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Total CPUs: 8 VCPUs, used: 5.00 VCPUs test /opt/stack/new/nova/nova/compute/resources/vcpu.py:52 2015-08-07 17:44:25.291 DEBUG nova.compute.resources.vcpu [req-4e0bfc6b-ec8c-4332-ab48-3e7910b5e202 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] CPUs limit not specified, defaulting to unlimited test /opt/stack/new/nova/nova/compute/resources/vcpu.py:56 2015-08-07 17:44:25.291 INFO nova.compute.claims [req-4e0bfc6b-ec8c-4332-ab48-3e7910b5e202 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] [instance: b197c990-eecd-403b-b8d7-9e57e7053a16] Claim successful 2015-08-07 17:44:25.617 DEBUG oslo_concurrency.lockutils [req-4e0bfc6b-ec8c-4332-ab48-3e7910b5e202 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Lock "compute_resources" released by 
"instance_claim" :: held 0.365s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:44:25.891 DEBUG oslo_concurrency.lockutils [req-4e0bfc6b-ec8c-4332-ab48-3e7910b5e202 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Lock "compute_resources" acquired by "update_usage" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:44:26.009 DEBUG oslo_concurrency.lockutils [req-4e0bfc6b-ec8c-4332-ab48-3e7910b5e202 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Lock "compute_resources" released by "update_usage" :: held 0.117s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:44:26.010 DEBUG nova.compute.utils [req-4e0bfc6b-ec8c-4332-ab48-3e7910b5e202 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Using /dev/xvd instead of None get_next_device_name /opt/stack/new/nova/nova/compute/utils.py:163 2015-08-07 17:44:26.015 13318 DEBUG nova.compute.manager [-] [instance: b197c990-eecd-403b-b8d7-9e57e7053a16] Allocating IP information in the background. _allocate_network_async /opt/stack/new/nova/nova/compute/manager.py:1513 2015-08-07 17:44:26.016 13318 DEBUG oslo_concurrency.lockutils [-] Acquired semaphore "refresh_cache-b197c990-eecd-403b-b8d7-9e57e7053a16" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:197 2015-08-07 17:44:26.633 DEBUG nova.virt.xenapi.vmops [req-4e0bfc6b-ec8c-4332-ab48-3e7910b5e202 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] [instance: b197c990-eecd-403b-b8d7-9e57e7053a16] Block device information present: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} spawn /opt/stack/new/nova/nova/virt/xenapi/vmops.py:408 2015-08-07 17:44:26.643 DEBUG oslo_concurrency.lockutils [req-e7299f84-7d21-423d-9497-176a584cc6bf tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Acquired semaphore "refresh_cache-115845a8-7055-44f5-a3f5-222bd027f1ed" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:197 2015-08-07 17:44:26.655 DEBUG nova.virt.xenapi.vm_utils [req-4e0bfc6b-ec8c-4332-ab48-3e7910b5e202 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Detected vhd format for image a69d8d55-1745-492f-be26-cc75d64fc94d determine_disk_image_type /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1633 2015-08-07 17:44:26.660 DEBUG nova.virt.xenapi.vmops [req-4e0bfc6b-ec8c-4332-ab48-3e7910b5e202 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] [instance: b197c990-eecd-403b-b8d7-9e57e7053a16] Updating progress to 10 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:44:26.778 DEBUG oslo_concurrency.lockutils [req-c5dabfe5-5c59-4659-87cc-672d426dad17 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "6c2e3b38-f3d2-472d-a728-f41167dcf407" acquired by "do_terminate_instance" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:44:26.779 DEBUG oslo_concurrency.lockutils [req-c5dabfe5-5c59-4659-87cc-672d426dad17 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "6c2e3b38-f3d2-472d-a728-f41167dcf407-events" acquired by 
"_clear_events" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:44:26.780 DEBUG oslo_concurrency.lockutils [req-c5dabfe5-5c59-4659-87cc-672d426dad17 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "6c2e3b38-f3d2-472d-a728-f41167dcf407-events" released by "_clear_events" :: held 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:44:26.781 INFO nova.compute.manager [req-c5dabfe5-5c59-4659-87cc-672d426dad17 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 6c2e3b38-f3d2-472d-a728-f41167dcf407] Terminating instance 2015-08-07 17:44:26.783 INFO nova.virt.xenapi.vmops [req-c5dabfe5-5c59-4659-87cc-672d426dad17 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 6c2e3b38-f3d2-472d-a728-f41167dcf407] Destroying VM 2015-08-07 17:44:26.794 DEBUG nova.virt.xenapi.vm_utils [req-c5dabfe5-5c59-4659-87cc-672d426dad17 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 6c2e3b38-f3d2-472d-a728-f41167dcf407] Shutting down VM (hard) hard_shutdown_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:348 2015-08-07 17:44:26.949 DEBUG nova.network.base_api [req-e7299f84-7d21-423d-9497-176a584cc6bf tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 115845a8-7055-44f5-a3f5-222bd027f1ed] Updating instance_info_cache with network_info: [VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.2'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:97:17:64', 'active': False, 'type': u'bridge', 'id': u'd2af0ccf-a764-449d-b25c-7353a5007c58', 'qbg_params': None})] update_instance_cache_with_nw_info /opt/stack/new/nova/nova/network/base_api.py:43 2015-08-07 17:44:26.988 DEBUG nova.virt.xenapi.vmops [req-4e0bfc6b-ec8c-4332-ab48-3e7910b5e202 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] [instance: b197c990-eecd-403b-b8d7-9e57e7053a16] block device info: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} _connect_cinder_volumes /opt/stack/new/nova/nova/virt/xenapi/vmops.py:376 2015-08-07 17:44:26.992 DEBUG oslo_concurrency.lockutils [req-e7299f84-7d21-423d-9497-176a584cc6bf tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Releasing semaphore "refresh_cache-115845a8-7055-44f5-a3f5-222bd027f1ed" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:210 2015-08-07 17:44:27.010 DEBUG oslo_concurrency.lockutils 
[req-4e0bfc6b-ec8c-4332-ab48-3e7910b5e202 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" acquired by "_create_cached_image_impl" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:44:27.309 DEBUG nova.virt.xenapi.vm_utils [req-e7299f84-7d21-423d-9497-176a584cc6bf tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Detected vhd format for image None determine_disk_image_type /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1633 2015-08-07 17:44:27.310 DEBUG nova.virt.xenapi.vmops [req-e7299f84-7d21-423d-9497-176a584cc6bf tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 115845a8-7055-44f5-a3f5-222bd027f1ed] Block device information present: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} create_disks_step /opt/stack/new/nova/nova/virt/xenapi/vmops.py:278 2015-08-07 17:44:28.426 DEBUG oslo_concurrency.lockutils [req-a32bf6ef-39d5-4885-ad00-3ab7a8a57cff tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Lock "b4eac7df-4935-4b77-8307-8c8cabe2c038" acquired by "_locked_do_build_and_run_instance" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:44:28.481 INFO nova.compute.manager [req-a32bf6ef-39d5-4885-ad00-3ab7a8a57cff tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] [instance: b4eac7df-4935-4b77-8307-8c8cabe2c038] Starting instance... 2015-08-07 17:44:28.615 13318 DEBUG nova.network.base_api [-] [instance: b197c990-eecd-403b-b8d7-9e57e7053a16] Updating instance_info_cache with network_info: [VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.7'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:be:cf:fd', 'active': False, 'type': u'bridge', 'id': u'65310656-cc34-4b05-84ec-9c385b0c36a8', 'qbg_params': None})] update_instance_cache_with_nw_info /opt/stack/new/nova/nova/network/base_api.py:43 2015-08-07 17:44:28.632 DEBUG oslo_concurrency.lockutils [req-e7299f84-7d21-423d-9497-176a584cc6bf tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" acquired by "do_scan" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:44:28.632 DEBUG nova.virt.xenapi.vm_utils [req-e7299f84-7d21-423d-9497-176a584cc6bf 
tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Scanning SR OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99 do_scan /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1839 2015-08-07 17:44:28.652 13318 DEBUG oslo_concurrency.lockutils [-] Releasing semaphore "refresh_cache-b197c990-eecd-403b-b8d7-9e57e7053a16" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:210 2015-08-07 17:44:28.653 13318 DEBUG nova.compute.manager [-] [instance: b197c990-eecd-403b-b8d7-9e57e7053a16] Instance network_info: |[VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.7'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:be:cf:fd', 'active': False, 'type': u'bridge', 'id': u'65310656-cc34-4b05-84ec-9c385b0c36a8', 'qbg_params': None})]| _allocate_network_async /opt/stack/new/nova/nova/compute/manager.py:1531 2015-08-07 17:44:28.703 DEBUG oslo_concurrency.lockutils [req-a32bf6ef-39d5-4885-ad00-3ab7a8a57cff tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Lock "compute_resources" acquired by "instance_claim" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:44:28.703 DEBUG nova.compute.resource_tracker [req-a32bf6ef-39d5-4885-ad00-3ab7a8a57cff tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Memory overhead for 64 MB instance; 5 MB instance_claim /opt/stack/new/nova/nova/compute/resource_tracker.py:126 2015-08-07 17:44:28.712 INFO nova.compute.claims [req-a32bf6ef-39d5-4885-ad00-3ab7a8a57cff tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] [instance: b4eac7df-4935-4b77-8307-8c8cabe2c038] Attempting claim: memory 69 MB, disk 0 GB 2015-08-07 17:44:28.713 INFO nova.compute.claims [req-a32bf6ef-39d5-4885-ad00-3ab7a8a57cff tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] [instance: b4eac7df-4935-4b77-8307-8c8cabe2c038] Total memory: 8187 MB, used: 991.00 MB 2015-08-07 17:44:28.713 INFO nova.compute.claims [req-a32bf6ef-39d5-4885-ad00-3ab7a8a57cff tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] [instance: b4eac7df-4935-4b77-8307-8c8cabe2c038] memory limit: 12280.50 MB, free: 11289.50 MB 2015-08-07 17:44:28.714 INFO nova.compute.claims [req-a32bf6ef-39d5-4885-ad00-3ab7a8a57cff tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] [instance: b4eac7df-4935-4b77-8307-8c8cabe2c038] Total disk: 27 GB, used: 0.00 GB 2015-08-07 17:44:28.714 INFO 
nova.compute.claims [req-a32bf6ef-39d5-4885-ad00-3ab7a8a57cff tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] [instance: b4eac7df-4935-4b77-8307-8c8cabe2c038] disk limit not specified, defaulting to unlimited 2015-08-07 17:44:28.743 DEBUG nova.compute.resources.vcpu [req-a32bf6ef-39d5-4885-ad00-3ab7a8a57cff tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Total CPUs: 8 VCPUs, used: 6.00 VCPUs test /opt/stack/new/nova/nova/compute/resources/vcpu.py:52 2015-08-07 17:44:28.743 DEBUG nova.compute.resources.vcpu [req-a32bf6ef-39d5-4885-ad00-3ab7a8a57cff tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] CPUs limit not specified, defaulting to unlimited test /opt/stack/new/nova/nova/compute/resources/vcpu.py:56 2015-08-07 17:44:28.744 INFO nova.compute.claims [req-a32bf6ef-39d5-4885-ad00-3ab7a8a57cff tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] [instance: b4eac7df-4935-4b77-8307-8c8cabe2c038] Claim successful 2015-08-07 17:44:28.942 DEBUG nova.virt.xenapi.vm_utils [req-4e0bfc6b-ec8c-4332-ab48-3e7910b5e202 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Cloned VDI OpaqueRef:7c84b050-5c44-b399-8b30-5a24de924861 from VDI OpaqueRef:f392c64e-7886-a0cc-d3a0-780674167307 _clone_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:571 2015-08-07 17:44:29.024 DEBUG oslo_concurrency.lockutils [req-a32bf6ef-39d5-4885-ad00-3ab7a8a57cff tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Lock "compute_resources" released by "instance_claim" :: held 0.321s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:44:29.208 DEBUG oslo_concurrency.lockutils [req-a32bf6ef-39d5-4885-ad00-3ab7a8a57cff tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Lock "compute_resources" acquired by "update_usage" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:44:29.284 DEBUG oslo_concurrency.lockutils [req-a32bf6ef-39d5-4885-ad00-3ab7a8a57cff tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Lock "compute_resources" released by "update_usage" :: held 0.076s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:44:29.285 DEBUG nova.compute.utils [req-a32bf6ef-39d5-4885-ad00-3ab7a8a57cff tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Using /dev/xvd instead of None get_next_device_name /opt/stack/new/nova/nova/compute/utils.py:163 2015-08-07 17:44:29.290 13318 DEBUG nova.compute.manager [-] [instance: b4eac7df-4935-4b77-8307-8c8cabe2c038] Allocating IP information in the background. 
_allocate_network_async /opt/stack/new/nova/nova/compute/manager.py:1513 2015-08-07 17:44:29.291 13318 DEBUG oslo_concurrency.lockutils [-] Acquired semaphore "refresh_cache-b4eac7df-4935-4b77-8307-8c8cabe2c038" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:197 2015-08-07 17:44:29.605 DEBUG oslo_concurrency.lockutils [req-4e0bfc6b-ec8c-4332-ab48-3e7910b5e202 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" released by "_create_cached_image_impl" :: held 2.595s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:44:29.606 INFO nova.virt.xenapi.vm_utils [req-4e0bfc6b-ec8c-4332-ab48-3e7910b5e202 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Image creation data, cacheable: True, downloaded: False duration: 2.62 secs for image a69d8d55-1745-492f-be26-cc75d64fc94d 2015-08-07 17:44:29.828 DEBUG nova.virt.xenapi.vmops [req-a32bf6ef-39d5-4885-ad00-3ab7a8a57cff tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] [instance: b4eac7df-4935-4b77-8307-8c8cabe2c038] Block device information present: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} spawn /opt/stack/new/nova/nova/virt/xenapi/vmops.py:408 2015-08-07 17:44:29.853 DEBUG nova.virt.xenapi.vm_utils [req-a32bf6ef-39d5-4885-ad00-3ab7a8a57cff tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Detected vhd format for image a69d8d55-1745-492f-be26-cc75d64fc94d determine_disk_image_type /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1633 2015-08-07 17:44:29.854 DEBUG nova.virt.xenapi.vmops [req-a32bf6ef-39d5-4885-ad00-3ab7a8a57cff tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] [instance: b4eac7df-4935-4b77-8307-8c8cabe2c038] Updating progress to 10 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:44:30.057 DEBUG nova.virt.xenapi.vmops [req-a32bf6ef-39d5-4885-ad00-3ab7a8a57cff tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] [instance: b4eac7df-4935-4b77-8307-8c8cabe2c038] block device info: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} _connect_cinder_volumes /opt/stack/new/nova/nova/virt/xenapi/vmops.py:376 2015-08-07 17:44:30.067 DEBUG oslo_concurrency.lockutils [req-a32bf6ef-39d5-4885-ad00-3ab7a8a57cff tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" acquired by "_create_cached_image_impl" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:44:30.224 DEBUG nova.virt.xenapi.vmops [req-c5dabfe5-5c59-4659-87cc-672d426dad17 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 6c2e3b38-f3d2-472d-a728-f41167dcf407] Destroying VDIs _destroy_vdis /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1428 2015-08-07 17:44:30.240 DEBUG nova.virt.xenapi.vm_utils [req-c5dabfe5-5c59-4659-87cc-672d426dad17 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] VDI 9165e814-0a6c-4933-8fe4-c9b0bbbe1863 is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:44:30.256 DEBUG 
nova.virt.xenapi.vm_utils [req-c5dabfe5-5c59-4659-87cc-672d426dad17 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] VDI 49119a21-3e70-488f-9e16-86a735fed31d is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:44:30.356 DEBUG nova.virt.xenapi.vmops [req-4e0bfc6b-ec8c-4332-ab48-3e7910b5e202 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] [instance: b197c990-eecd-403b-b8d7-9e57e7053a16] Updating progress to 20 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:44:30.573 DEBUG nova.virt.xenapi.vmops [req-4e0bfc6b-ec8c-4332-ab48-3e7910b5e202 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] [instance: b197c990-eecd-403b-b8d7-9e57e7053a16] Updating progress to 30 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:44:30.822 DEBUG nova.virt.xenapi.vmops [req-4e0bfc6b-ec8c-4332-ab48-3e7910b5e202 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] [instance: b197c990-eecd-403b-b8d7-9e57e7053a16] Using PV kernel: True _create_vm_record /opt/stack/new/nova/nova/virt/xenapi/vmops.py:679 2015-08-07 17:44:30.846 DEBUG nova.virt.xenapi.vm_utils [req-4e0bfc6b-ec8c-4332-ab48-3e7910b5e202 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] [instance: b197c990-eecd-403b-b8d7-9e57e7053a16] Created VM create_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:312 2015-08-07 17:44:30.847 DEBUG nova.virt.xenapi.vmops [req-4e0bfc6b-ec8c-4332-ab48-3e7910b5e202 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] [instance: b197c990-eecd-403b-b8d7-9e57e7053a16] Updating progress to 40 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:44:31.071 DEBUG nova.virt.xenapi.vm_utils [req-a32bf6ef-39d5-4885-ad00-3ab7a8a57cff tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Cloned VDI OpaqueRef:fbaac169-cc11-209d-c49d-96e2a6f51202 from VDI OpaqueRef:f392c64e-7886-a0cc-d3a0-780674167307 _clone_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:571 2015-08-07 17:44:31.121 DEBUG nova.virt.xenapi.vm_utils [req-4e0bfc6b-ec8c-4332-ab48-3e7910b5e202 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Creating disk-type VBD for VM OpaqueRef:09d9be4d-5d0c-9bac-a66d-c351fd83e376, VDI OpaqueRef:7c84b050-5c44-b399-8b30-5a24de924861 ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:44:31.130 DEBUG nova.virt.xenapi.vm_utils [req-4e0bfc6b-ec8c-4332-ab48-3e7910b5e202 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Created VBD OpaqueRef:2057178f-1d57-1613-8f18-26b1bd4b2009 for VM OpaqueRef:09d9be4d-5d0c-9bac-a66d-c351fd83e376, VDI OpaqueRef:7c84b050-5c44-b399-8b30-5a24de924861. 
create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:44:31.319 13318 DEBUG nova.network.base_api [-] [instance: b4eac7df-4935-4b77-8307-8c8cabe2c038] Updating instance_info_cache with network_info: [VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.8'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:ae:e2:5a', 'active': False, 'type': u'bridge', 'id': u'f9d7f841-808e-40fe-99f2-c904249e4655', 'qbg_params': None})] update_instance_cache_with_nw_info /opt/stack/new/nova/nova/network/base_api.py:43 2015-08-07 17:44:31.350 13318 DEBUG oslo_concurrency.lockutils [-] Releasing semaphore "refresh_cache-b4eac7df-4935-4b77-8307-8c8cabe2c038" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:210 2015-08-07 17:44:31.351 13318 DEBUG nova.compute.manager [-] [instance: b4eac7df-4935-4b77-8307-8c8cabe2c038] Instance network_info: |[VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.8'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:ae:e2:5a', 'active': False, 'type': u'bridge', 'id': u'f9d7f841-808e-40fe-99f2-c904249e4655', 'qbg_params': None})]| _allocate_network_async /opt/stack/new/nova/nova/compute/manager.py:1531 2015-08-07 17:44:31.461 DEBUG nova.virt.xenapi.vm_utils [req-4e0bfc6b-ec8c-4332-ab48-3e7910b5e202 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Created VDI OpaqueRef:249a65cd-90b6-f6a4-1a82-e543ed1d95f8 (config-2, 67108864, False) on OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99. 
create_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:509 2015-08-07 17:44:31.464 DEBUG nova.virt.xenapi.vm_utils [req-4e0bfc6b-ec8c-4332-ab48-3e7910b5e202 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Creating disk-type VBD for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:249a65cd-90b6-f6a4-1a82-e543ed1d95f8 ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:44:31.475 DEBUG nova.virt.xenapi.vm_utils [req-4e0bfc6b-ec8c-4332-ab48-3e7910b5e202 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Created VBD OpaqueRef:1f03c41c-7467-a6b0-6587-053d41be32b7 for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:249a65cd-90b6-f6a4-1a82-e543ed1d95f8. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:44:31.476 DEBUG nova.virt.xenapi.vm_utils [req-4e0bfc6b-ec8c-4332-ab48-3e7910b5e202 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Plugging VBD OpaqueRef:1f03c41c-7467-a6b0-6587-053d41be32b7 ... vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2191 2015-08-07 17:44:31.477 DEBUG oslo_concurrency.lockutils [req-4e0bfc6b-ec8c-4332-ab48-3e7910b5e202 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_plug" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:44:31.670 DEBUG oslo_concurrency.lockutils [req-a32bf6ef-39d5-4885-ad00-3ab7a8a57cff tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" released by "_create_cached_image_impl" :: held 1.602s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:44:31.670 INFO nova.virt.xenapi.vm_utils [req-a32bf6ef-39d5-4885-ad00-3ab7a8a57cff tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Image creation data, cacheable: True, downloaded: False duration: 1.61 secs for image a69d8d55-1745-492f-be26-cc75d64fc94d 2015-08-07 17:44:32.266 DEBUG nova.virt.xenapi.vmops [req-a32bf6ef-39d5-4885-ad00-3ab7a8a57cff tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] [instance: b4eac7df-4935-4b77-8307-8c8cabe2c038] Updating progress to 20 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:44:32.505 DEBUG nova.virt.xenapi.vmops [req-a32bf6ef-39d5-4885-ad00-3ab7a8a57cff tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] [instance: b4eac7df-4935-4b77-8307-8c8cabe2c038] Updating progress to 30 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:44:32.652 DEBUG oslo_concurrency.lockutils [req-4e0bfc6b-ec8c-4332-ab48-3e7910b5e202 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_plug" :: held 1.175s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:44:32.652 DEBUG nova.virt.xenapi.vm_utils [req-4e0bfc6b-ec8c-4332-ab48-3e7910b5e202 tempest-ServerRescueNegativeTestJSON-1228315677 
tempest-ServerRescueNegativeTestJSON-2125869541] Plugging VBD OpaqueRef:1f03c41c-7467-a6b0-6587-053d41be32b7 done. vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2194 2015-08-07 17:44:32.656 DEBUG nova.virt.xenapi.vm_utils [req-4e0bfc6b-ec8c-4332-ab48-3e7910b5e202 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] VBD OpaqueRef:1f03c41c-7467-a6b0-6587-053d41be32b7 plugged as xvdc vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2197 2015-08-07 17:44:32.714 DEBUG nova.virt.xenapi.vmops [req-a32bf6ef-39d5-4885-ad00-3ab7a8a57cff tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] [instance: b4eac7df-4935-4b77-8307-8c8cabe2c038] Using PV kernel: True _create_vm_record /opt/stack/new/nova/nova/virt/xenapi/vmops.py:679 2015-08-07 17:44:32.728 DEBUG nova.virt.xenapi.vm_utils [req-a32bf6ef-39d5-4885-ad00-3ab7a8a57cff tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] [instance: b4eac7df-4935-4b77-8307-8c8cabe2c038] Created VM create_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:312 2015-08-07 17:44:32.728 DEBUG nova.virt.xenapi.vmops [req-a32bf6ef-39d5-4885-ad00-3ab7a8a57cff tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] [instance: b4eac7df-4935-4b77-8307-8c8cabe2c038] Updating progress to 40 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:44:32.742 WARNING nova.virt.configdrive [req-4e0bfc6b-ec8c-4332-ab48-3e7910b5e202 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] The setting "always" will be deprecated in the Liberty version. Please use "True" instead 2015-08-07 17:44:32.743 DEBUG nova.objects.instance [req-4e0bfc6b-ec8c-4332-ab48-3e7910b5e202 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Lazy-loading `ec2_ids' on Instance uuid b197c990-eecd-403b-b8d7-9e57e7053a16 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:44:32.779 DEBUG oslo_concurrency.processutils [req-4e0bfc6b-ec8c-4332-ab48-3e7910b5e202 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Running cmd (subprocess): genisoimage -o /tmp/tmpFJ_8Qe/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmpkthfSB execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:44:32.878 DEBUG oslo_concurrency.processutils [req-4e0bfc6b-ec8c-4332-ab48-3e7910b5e202 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] CMD "genisoimage -o /tmp/tmpFJ_8Qe/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmpkthfSB" returned: 0 in 0.099s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:44:32.884 DEBUG oslo_concurrency.processutils [req-4e0bfc6b-ec8c-4332-ab48-3e7910b5e202 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmpFJ_8Qe/configdrive of=/dev/xvdc oflag=direct,sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:44:33.257 DEBUG nova.virt.xenapi.vm_utils 
[req-a32bf6ef-39d5-4885-ad00-3ab7a8a57cff tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Creating disk-type VBD for VM OpaqueRef:9817f64c-00e7-ff9b-d216-5d5113314224, VDI OpaqueRef:fbaac169-cc11-209d-c49d-96e2a6f51202 ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:44:33.271 DEBUG nova.virt.xenapi.vm_utils [req-a32bf6ef-39d5-4885-ad00-3ab7a8a57cff tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Created VBD OpaqueRef:44eb8696-1fa9-5125-6f88-9fd62a58526d for VM OpaqueRef:9817f64c-00e7-ff9b-d216-5d5113314224, VDI OpaqueRef:fbaac169-cc11-209d-c49d-96e2a6f51202. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:44:33.831 DEBUG nova.virt.xenapi.vm_utils [req-a32bf6ef-39d5-4885-ad00-3ab7a8a57cff tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Created VDI OpaqueRef:52b7c606-55a0-03ef-078b-9a9550f91433 (config-2, 67108864, False) on OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99. create_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:509 2015-08-07 17:44:33.834 DEBUG nova.virt.xenapi.vm_utils [req-a32bf6ef-39d5-4885-ad00-3ab7a8a57cff tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Creating disk-type VBD for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:52b7c606-55a0-03ef-078b-9a9550f91433 ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:44:33.853 DEBUG nova.virt.xenapi.vm_utils [req-a32bf6ef-39d5-4885-ad00-3ab7a8a57cff tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Created VBD OpaqueRef:8d370cd0-1b6f-7e5c-6132-646661f8728b for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:52b7c606-55a0-03ef-078b-9a9550f91433. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:44:33.858 DEBUG nova.virt.xenapi.vm_utils [req-a32bf6ef-39d5-4885-ad00-3ab7a8a57cff tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Plugging VBD OpaqueRef:8d370cd0-1b6f-7e5c-6132-646661f8728b ... vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2191 2015-08-07 17:44:33.859 DEBUG oslo_concurrency.lockutils [req-a32bf6ef-39d5-4885-ad00-3ab7a8a57cff tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_plug" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:44:35.111 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.88 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:44:35.489 DEBUG oslo_concurrency.lockutils [req-a32bf6ef-39d5-4885-ad00-3ab7a8a57cff tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_plug" :: held 1.630s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:44:35.490 DEBUG nova.virt.xenapi.vm_utils [req-a32bf6ef-39d5-4885-ad00-3ab7a8a57cff tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Plugging VBD OpaqueRef:8d370cd0-1b6f-7e5c-6132-646661f8728b done. 
vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2194 2015-08-07 17:44:35.493 DEBUG nova.virt.xenapi.vm_utils [req-a32bf6ef-39d5-4885-ad00-3ab7a8a57cff tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] VBD OpaqueRef:8d370cd0-1b6f-7e5c-6132-646661f8728b plugged as xvdd vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2197 2015-08-07 17:44:35.581 WARNING nova.virt.configdrive [req-a32bf6ef-39d5-4885-ad00-3ab7a8a57cff tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] The setting "always" will be deprecated in the Liberty version. Please use "True" instead 2015-08-07 17:44:35.582 DEBUG nova.objects.instance [req-a32bf6ef-39d5-4885-ad00-3ab7a8a57cff tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Lazy-loading `ec2_ids' on Instance uuid b4eac7df-4935-4b77-8307-8c8cabe2c038 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:44:35.624 DEBUG oslo_concurrency.processutils [req-a32bf6ef-39d5-4885-ad00-3ab7a8a57cff tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Running cmd (subprocess): genisoimage -o /tmp/tmpPFsGvQ/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmpWm6bTV execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:44:35.728 DEBUG oslo_concurrency.processutils [req-a32bf6ef-39d5-4885-ad00-3ab7a8a57cff tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] CMD "genisoimage -o /tmp/tmpPFsGvQ/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmpWm6bTV" returned: 0 in 0.104s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:44:35.734 DEBUG oslo_concurrency.processutils [req-a32bf6ef-39d5-4885-ad00-3ab7a8a57cff tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmpPFsGvQ/configdrive of=/dev/xvdd oflag=direct,sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:44:37.410 DEBUG oslo_concurrency.processutils [req-4e0bfc6b-ec8c-4332-ab48-3e7910b5e202 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmpFJ_8Qe/configdrive of=/dev/xvdc oflag=direct,sync" returned: 0 in 4.526s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:44:37.414 DEBUG oslo_concurrency.processutils [req-4e0bfc6b-ec8c-4332-ab48-3e7910b5e202 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:44:37.805 DEBUG oslo_concurrency.processutils [req-4e0bfc6b-ec8c-4332-ab48-3e7910b5e202 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf sync" returned: 0 in 0.390s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:44:37.808 DEBUG 
nova.virt.xenapi.vm_utils [req-4e0bfc6b-ec8c-4332-ab48-3e7910b5e202 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Destroying VBD for VDI OpaqueRef:249a65cd-90b6-f6a4-1a82-e543ed1d95f8 ... vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2207 2015-08-07 17:44:37.809 DEBUG oslo_concurrency.lockutils [req-4e0bfc6b-ec8c-4332-ab48-3e7910b5e202 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_unplug" :: waited 0.001s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:44:39.086 DEBUG oslo_concurrency.lockutils [req-4e0bfc6b-ec8c-4332-ab48-3e7910b5e202 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_unplug" :: held 1.277s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:44:39.094 DEBUG nova.virt.xenapi.vm_utils [req-4e0bfc6b-ec8c-4332-ab48-3e7910b5e202 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Destroying VBD for VDI OpaqueRef:249a65cd-90b6-f6a4-1a82-e543ed1d95f8 done. vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2215 2015-08-07 17:44:39.095 DEBUG nova.virt.xenapi.vm_utils [req-4e0bfc6b-ec8c-4332-ab48-3e7910b5e202 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Creating disk-type VBD for VM OpaqueRef:09d9be4d-5d0c-9bac-a66d-c351fd83e376, VDI OpaqueRef:249a65cd-90b6-f6a4-1a82-e543ed1d95f8 ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:44:39.105 DEBUG nova.virt.xenapi.vm_utils [req-4e0bfc6b-ec8c-4332-ab48-3e7910b5e202 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Created VBD OpaqueRef:522feb5d-8f34-3cef-b113-132c1a03ad2f for VM OpaqueRef:09d9be4d-5d0c-9bac-a66d-c351fd83e376, VDI OpaqueRef:249a65cd-90b6-f6a4-1a82-e543ed1d95f8. 
create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:44:39.106 DEBUG nova.objects.instance [req-4e0bfc6b-ec8c-4332-ab48-3e7910b5e202 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Lazy-loading `pci_devices' on Instance uuid b197c990-eecd-403b-b8d7-9e57e7053a16 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:44:39.225 DEBUG nova.virt.xenapi.vmops [req-4e0bfc6b-ec8c-4332-ab48-3e7910b5e202 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] [instance: b197c990-eecd-403b-b8d7-9e57e7053a16] Updating progress to 50 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:44:39.420 DEBUG nova.virt.xenapi.vmops [req-c5dabfe5-5c59-4659-87cc-672d426dad17 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 6c2e3b38-f3d2-472d-a728-f41167dcf407] Using RAW or VHD, skipping kernel and ramdisk deletion _destroy_kernel_ramdisk /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1456 2015-08-07 17:44:39.432 DEBUG nova.virt.xenapi.vm_utils [req-c5dabfe5-5c59-4659-87cc-672d426dad17 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 6c2e3b38-f3d2-472d-a728-f41167dcf407] VM destroyed destroy_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:324 2015-08-07 17:44:39.433 DEBUG nova.compute.manager [req-c5dabfe5-5c59-4659-87cc-672d426dad17 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 6c2e3b38-f3d2-472d-a728-f41167dcf407] Deallocating network for instance _deallocate_network /opt/stack/new/nova/nova/compute/manager.py:1792 2015-08-07 17:44:39.553 DEBUG oslo_concurrency.lockutils [req-4e0bfc6b-ec8c-4332-ab48-3e7910b5e202 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Lock "xenstore-b197c990-eecd-403b-b8d7-9e57e7053a16" acquired by "store_meta" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:44:39.553 DEBUG oslo_concurrency.lockutils [req-4e0bfc6b-ec8c-4332-ab48-3e7910b5e202 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Lock "xenstore-b197c990-eecd-403b-b8d7-9e57e7053a16" released by "store_meta" :: held 0.001s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:44:39.554 DEBUG oslo_concurrency.lockutils [req-4e0bfc6b-ec8c-4332-ab48-3e7910b5e202 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Lock "xenstore-b197c990-eecd-403b-b8d7-9e57e7053a16" acquired by "store_auto_disk_config" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:44:39.562 DEBUG oslo_concurrency.lockutils [req-4e0bfc6b-ec8c-4332-ab48-3e7910b5e202 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Lock "xenstore-b197c990-eecd-403b-b8d7-9e57e7053a16" released by "store_auto_disk_config" :: held 0.008s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:44:39.563 DEBUG nova.virt.xenapi.vmops [req-4e0bfc6b-ec8c-4332-ab48-3e7910b5e202 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] [instance: b197c990-eecd-403b-b8d7-9e57e7053a16] Injecting hostname (tempest.common.compute-instance-1031488919) into xenstore _inject_hostname 
/opt/stack/new/nova/nova/virt/xenapi/vmops.py:1940 2015-08-07 17:44:39.563 DEBUG oslo_concurrency.lockutils [req-4e0bfc6b-ec8c-4332-ab48-3e7910b5e202 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Lock "xenstore-b197c990-eecd-403b-b8d7-9e57e7053a16" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:44:39.569 DEBUG oslo_concurrency.lockutils [req-4e0bfc6b-ec8c-4332-ab48-3e7910b5e202 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Lock "xenstore-b197c990-eecd-403b-b8d7-9e57e7053a16" released by "update_hostname" :: held 0.006s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:44:39.570 DEBUG nova.virt.xenapi.vmops [req-4e0bfc6b-ec8c-4332-ab48-3e7910b5e202 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] [instance: b197c990-eecd-403b-b8d7-9e57e7053a16] Injecting network info to xenstore inject_network_info /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1869 2015-08-07 17:44:39.571 DEBUG oslo_concurrency.lockutils [req-4e0bfc6b-ec8c-4332-ab48-3e7910b5e202 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Lock "xenstore-b197c990-eecd-403b-b8d7-9e57e7053a16" acquired by "update_nwinfo" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:44:39.729 DEBUG oslo_concurrency.lockutils [req-4e0bfc6b-ec8c-4332-ab48-3e7910b5e202 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Lock "xenstore-b197c990-eecd-403b-b8d7-9e57e7053a16" released by "update_nwinfo" :: held 0.158s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:44:39.729 DEBUG nova.virt.xenapi.vmops [req-4e0bfc6b-ec8c-4332-ab48-3e7910b5e202 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] [instance: b197c990-eecd-403b-b8d7-9e57e7053a16] Updating progress to 60 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:44:40.016 DEBUG nova.virt.xenapi.vmops [req-4e0bfc6b-ec8c-4332-ab48-3e7910b5e202 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] [instance: b197c990-eecd-403b-b8d7-9e57e7053a16] Creating vifs _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1892 2015-08-07 17:44:40.022 DEBUG nova.virt.xenapi.vmops [req-4e0bfc6b-ec8c-4332-ab48-3e7910b5e202 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] [instance: b197c990-eecd-403b-b8d7-9e57e7053a16] Creating VIF for network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1902 2015-08-07 17:44:40.029 DEBUG nova.virt.xenapi.vmops [req-4e0bfc6b-ec8c-4332-ab48-3e7910b5e202 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] [instance: b197c990-eecd-403b-b8d7-9e57e7053a16] Created VIF OpaqueRef:f3744f78-e9cd-6c0f-2502-1c22d7c9f172, network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1906 2015-08-07 17:44:40.030 DEBUG nova.virt.xenapi.vmops [req-4e0bfc6b-ec8c-4332-ab48-3e7910b5e202 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] [instance: 
b197c990-eecd-403b-b8d7-9e57e7053a16] Updating progress to 70 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:44:40.224 DEBUG nova.virt.xenapi.vmops [req-4e0bfc6b-ec8c-4332-ab48-3e7910b5e202 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] [instance: b197c990-eecd-403b-b8d7-9e57e7053a16] Starting instance _start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:327 2015-08-07 17:44:40.289 DEBUG oslo_concurrency.processutils [req-a32bf6ef-39d5-4885-ad00-3ab7a8a57cff tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmpPFsGvQ/configdrive of=/dev/xvdd oflag=direct,sync" returned: 0 in 4.555s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:44:40.299 DEBUG oslo_concurrency.processutils [req-a32bf6ef-39d5-4885-ad00-3ab7a8a57cff tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:44:40.575 DEBUG oslo_concurrency.processutils [req-a32bf6ef-39d5-4885-ad00-3ab7a8a57cff tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf sync" returned: 0 in 0.275s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:44:40.578 DEBUG nova.virt.xenapi.vm_utils [req-a32bf6ef-39d5-4885-ad00-3ab7a8a57cff tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Destroying VBD for VDI OpaqueRef:52b7c606-55a0-03ef-078b-9a9550f91433 ... 
vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2207 2015-08-07 17:44:40.580 DEBUG oslo_concurrency.lockutils [req-a32bf6ef-39d5-4885-ad00-3ab7a8a57cff tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_unplug" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:44:40.913 DEBUG nova.compute.manager [req-c5dabfe5-5c59-4659-87cc-672d426dad17 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 6c2e3b38-f3d2-472d-a728-f41167dcf407] terminating bdm BlockDeviceMapping(boot_index=0,connection_info=None,created_at=2015-08-07T17:43:38Z,delete_on_termination=True,deleted=False,deleted_at=None,destination_type='local',device_name='/dev/xvda',device_type='disk',disk_bus=None,guest_format=None,id=63,image_id='a69d8d55-1745-492f-be26-cc75d64fc94d',instance=,instance_uuid=6c2e3b38-f3d2-472d-a728-f41167dcf407,no_device=False,snapshot_id=None,source_type='image',updated_at=2015-08-07T17:43:39Z,volume_id=None,volume_size=None) _cleanup_volumes /opt/stack/new/nova/nova/compute/manager.py:2272 2015-08-07 17:44:41.074 DEBUG oslo_concurrency.lockutils [req-c5dabfe5-5c59-4659-87cc-672d426dad17 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "compute_resources" acquired by "update_usage" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:44:41.076 DEBUG nova.objects.instance [req-c5dabfe5-5c59-4659-87cc-672d426dad17 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lazy-loading `numa_topology' on Instance uuid 6c2e3b38-f3d2-472d-a728-f41167dcf407 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:44:41.166 DEBUG oslo_concurrency.lockutils [req-c5dabfe5-5c59-4659-87cc-672d426dad17 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "compute_resources" released by "update_usage" :: held 0.092s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:44:41.402 DEBUG oslo_concurrency.lockutils [req-c5dabfe5-5c59-4659-87cc-672d426dad17 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "6c2e3b38-f3d2-472d-a728-f41167dcf407" released by "do_terminate_instance" :: held 14.624s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:44:41.521 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:44:41.522 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 1.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:44:41.726 DEBUG oslo_concurrency.lockutils [req-a32bf6ef-39d5-4885-ad00-3ab7a8a57cff tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_unplug" :: held 1.146s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:44:41.735 DEBUG nova.virt.xenapi.vm_utils [req-a32bf6ef-39d5-4885-ad00-3ab7a8a57cff 
tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Destroying VBD for VDI OpaqueRef:52b7c606-55a0-03ef-078b-9a9550f91433 done. vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2215 2015-08-07 17:44:41.735 DEBUG nova.virt.xenapi.vm_utils [req-a32bf6ef-39d5-4885-ad00-3ab7a8a57cff tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Creating disk-type VBD for VM OpaqueRef:9817f64c-00e7-ff9b-d216-5d5113314224, VDI OpaqueRef:52b7c606-55a0-03ef-078b-9a9550f91433 ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:44:41.743 DEBUG nova.virt.xenapi.vm_utils [req-a32bf6ef-39d5-4885-ad00-3ab7a8a57cff tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Created VBD OpaqueRef:5becc6d7-58a5-d106-fed0-d513ae21d973 for VM OpaqueRef:9817f64c-00e7-ff9b-d216-5d5113314224, VDI OpaqueRef:52b7c606-55a0-03ef-078b-9a9550f91433. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:44:41.745 DEBUG nova.objects.instance [req-a32bf6ef-39d5-4885-ad00-3ab7a8a57cff tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Lazy-loading `pci_devices' on Instance uuid b4eac7df-4935-4b77-8307-8c8cabe2c038 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:44:41.855 DEBUG nova.virt.xenapi.vmops [req-a32bf6ef-39d5-4885-ad00-3ab7a8a57cff tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] [instance: b4eac7df-4935-4b77-8307-8c8cabe2c038] Updating progress to 50 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:44:42.051 DEBUG oslo_concurrency.lockutils [req-a32bf6ef-39d5-4885-ad00-3ab7a8a57cff tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Lock "xenstore-b4eac7df-4935-4b77-8307-8c8cabe2c038" acquired by "store_meta" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:44:42.052 DEBUG oslo_concurrency.lockutils [req-a32bf6ef-39d5-4885-ad00-3ab7a8a57cff tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Lock "xenstore-b4eac7df-4935-4b77-8307-8c8cabe2c038" released by "store_meta" :: held 0.001s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:44:42.053 DEBUG oslo_concurrency.lockutils [req-a32bf6ef-39d5-4885-ad00-3ab7a8a57cff tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Lock "xenstore-b4eac7df-4935-4b77-8307-8c8cabe2c038" acquired by "store_auto_disk_config" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:44:42.059 DEBUG oslo_concurrency.lockutils [req-a32bf6ef-39d5-4885-ad00-3ab7a8a57cff tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Lock "xenstore-b4eac7df-4935-4b77-8307-8c8cabe2c038" released by "store_auto_disk_config" :: held 0.007s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:44:42.060 DEBUG nova.virt.xenapi.vmops [req-a32bf6ef-39d5-4885-ad00-3ab7a8a57cff tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] [instance: b4eac7df-4935-4b77-8307-8c8cabe2c038] Injecting hostname (tempest.common.compute-instance-155064922) into 
xenstore _inject_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1940 2015-08-07 17:44:42.061 DEBUG oslo_concurrency.lockutils [req-a32bf6ef-39d5-4885-ad00-3ab7a8a57cff tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Lock "xenstore-b4eac7df-4935-4b77-8307-8c8cabe2c038" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:44:42.068 DEBUG oslo_concurrency.lockutils [req-a32bf6ef-39d5-4885-ad00-3ab7a8a57cff tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Lock "xenstore-b4eac7df-4935-4b77-8307-8c8cabe2c038" released by "update_hostname" :: held 0.007s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:44:42.069 DEBUG nova.virt.xenapi.vmops [req-a32bf6ef-39d5-4885-ad00-3ab7a8a57cff tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] [instance: b4eac7df-4935-4b77-8307-8c8cabe2c038] Injecting network info to xenstore inject_network_info /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1869 2015-08-07 17:44:42.069 DEBUG oslo_concurrency.lockutils [req-a32bf6ef-39d5-4885-ad00-3ab7a8a57cff tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Lock "xenstore-b4eac7df-4935-4b77-8307-8c8cabe2c038" acquired by "update_nwinfo" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:44:42.270 DEBUG oslo_concurrency.lockutils [req-a32bf6ef-39d5-4885-ad00-3ab7a8a57cff tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Lock "xenstore-b4eac7df-4935-4b77-8307-8c8cabe2c038" released by "update_nwinfo" :: held 0.201s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:44:42.271 DEBUG nova.virt.xenapi.vmops [req-a32bf6ef-39d5-4885-ad00-3ab7a8a57cff tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] [instance: b4eac7df-4935-4b77-8307-8c8cabe2c038] Updating progress to 60 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:44:42.444 DEBUG nova.virt.xenapi.vmops [req-a32bf6ef-39d5-4885-ad00-3ab7a8a57cff tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] [instance: b4eac7df-4935-4b77-8307-8c8cabe2c038] Creating vifs _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1892 2015-08-07 17:44:42.449 DEBUG nova.virt.xenapi.vmops [req-a32bf6ef-39d5-4885-ad00-3ab7a8a57cff tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] [instance: b4eac7df-4935-4b77-8307-8c8cabe2c038] Creating VIF for network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1902 2015-08-07 17:44:42.456 DEBUG nova.virt.xenapi.vmops [req-a32bf6ef-39d5-4885-ad00-3ab7a8a57cff tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] [instance: b4eac7df-4935-4b77-8307-8c8cabe2c038] Created VIF OpaqueRef:af7da284-6d83-ca38-b963-1678b4b44e80, network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1906 2015-08-07 17:44:42.457 DEBUG nova.virt.xenapi.vmops [req-a32bf6ef-39d5-4885-ad00-3ab7a8a57cff tempest-ServerRescueNegativeTestJSON-1228315677 
tempest-ServerRescueNegativeTestJSON-2125869541] [instance: b4eac7df-4935-4b77-8307-8c8cabe2c038] Updating progress to 70 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:44:42.522 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:44:42.523 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 2.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:44:42.623 DEBUG nova.virt.xenapi.vmops [req-a32bf6ef-39d5-4885-ad00-3ab7a8a57cff tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] [instance: b4eac7df-4935-4b77-8307-8c8cabe2c038] Starting instance _start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:327 2015-08-07 17:44:42.762 DEBUG oslo_concurrency.lockutils [req-fadb9d5d-72d0-4cd3-8d77-b39c9fb51d68 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "420208c3-bf05-4834-98b5-1c8afda79f97" acquired by "_locked_do_build_and_run_instance" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:44:42.801 INFO nova.compute.manager [req-fadb9d5d-72d0-4cd3-8d77-b39c9fb51d68 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 420208c3-bf05-4834-98b5-1c8afda79f97] Starting instance... 2015-08-07 17:44:42.972 DEBUG oslo_concurrency.lockutils [req-fadb9d5d-72d0-4cd3-8d77-b39c9fb51d68 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "compute_resources" acquired by "instance_claim" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:44:42.973 DEBUG nova.compute.resource_tracker [req-fadb9d5d-72d0-4cd3-8d77-b39c9fb51d68 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Memory overhead for 64 MB instance; 5 MB instance_claim /opt/stack/new/nova/nova/compute/resource_tracker.py:126 2015-08-07 17:44:42.976 INFO nova.compute.claims [req-fadb9d5d-72d0-4cd3-8d77-b39c9fb51d68 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 420208c3-bf05-4834-98b5-1c8afda79f97] Attempting claim: memory 69 MB, disk 0 GB 2015-08-07 17:44:42.977 INFO nova.compute.claims [req-fadb9d5d-72d0-4cd3-8d77-b39c9fb51d68 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 420208c3-bf05-4834-98b5-1c8afda79f97] Total memory: 8187 MB, used: 991.00 MB 2015-08-07 17:44:42.977 INFO nova.compute.claims [req-fadb9d5d-72d0-4cd3-8d77-b39c9fb51d68 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 420208c3-bf05-4834-98b5-1c8afda79f97] memory limit: 12280.50 MB, free: 11289.50 MB 2015-08-07 17:44:42.977 INFO nova.compute.claims [req-fadb9d5d-72d0-4cd3-8d77-b39c9fb51d68 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 420208c3-bf05-4834-98b5-1c8afda79f97] Total disk: 27 GB, used: 0.00 GB 2015-08-07 17:44:42.978 INFO nova.compute.claims [req-fadb9d5d-72d0-4cd3-8d77-b39c9fb51d68 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 420208c3-bf05-4834-98b5-1c8afda79f97] disk limit not specified, defaulting to unlimited 2015-08-07 17:44:42.998 DEBUG nova.compute.resources.vcpu 
[req-fadb9d5d-72d0-4cd3-8d77-b39c9fb51d68 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Total CPUs: 8 VCPUs, used: 6.00 VCPUs test /opt/stack/new/nova/nova/compute/resources/vcpu.py:52 2015-08-07 17:44:42.999 DEBUG nova.compute.resources.vcpu [req-fadb9d5d-72d0-4cd3-8d77-b39c9fb51d68 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] CPUs limit not specified, defaulting to unlimited test /opt/stack/new/nova/nova/compute/resources/vcpu.py:56 2015-08-07 17:44:42.999 INFO nova.compute.claims [req-fadb9d5d-72d0-4cd3-8d77-b39c9fb51d68 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 420208c3-bf05-4834-98b5-1c8afda79f97] Claim successful 2015-08-07 17:44:43.219 DEBUG oslo_concurrency.lockutils [req-fadb9d5d-72d0-4cd3-8d77-b39c9fb51d68 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "compute_resources" released by "instance_claim" :: held 0.247s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:44:43.600 DEBUG oslo_concurrency.lockutils [req-fadb9d5d-72d0-4cd3-8d77-b39c9fb51d68 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "compute_resources" acquired by "update_usage" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:44:43.676 DEBUG oslo_concurrency.lockutils [req-fadb9d5d-72d0-4cd3-8d77-b39c9fb51d68 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "compute_resources" released by "update_usage" :: held 0.077s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:44:43.678 DEBUG nova.compute.utils [req-fadb9d5d-72d0-4cd3-8d77-b39c9fb51d68 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Using /dev/xvd instead of None get_next_device_name /opt/stack/new/nova/nova/compute/utils.py:163 2015-08-07 17:44:43.682 13318 DEBUG nova.compute.manager [-] [instance: 420208c3-bf05-4834-98b5-1c8afda79f97] Allocating IP information in the background. 
_allocate_network_async /opt/stack/new/nova/nova/compute/manager.py:1513 2015-08-07 17:44:43.684 13318 DEBUG oslo_concurrency.lockutils [-] Acquired semaphore "refresh_cache-420208c3-bf05-4834-98b5-1c8afda79f97" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:197 2015-08-07 17:44:44.072 DEBUG nova.virt.xenapi.vmops [req-fadb9d5d-72d0-4cd3-8d77-b39c9fb51d68 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 420208c3-bf05-4834-98b5-1c8afda79f97] Block device information present: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} spawn /opt/stack/new/nova/nova/virt/xenapi/vmops.py:408 2015-08-07 17:44:44.087 DEBUG nova.virt.xenapi.vm_utils [req-fadb9d5d-72d0-4cd3-8d77-b39c9fb51d68 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Detected vhd format for image a69d8d55-1745-492f-be26-cc75d64fc94d determine_disk_image_type /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1633 2015-08-07 17:44:44.088 DEBUG nova.virt.xenapi.vmops [req-fadb9d5d-72d0-4cd3-8d77-b39c9fb51d68 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 420208c3-bf05-4834-98b5-1c8afda79f97] Updating progress to 10 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:44:44.261 DEBUG nova.virt.xenapi.vmops [req-fadb9d5d-72d0-4cd3-8d77-b39c9fb51d68 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 420208c3-bf05-4834-98b5-1c8afda79f97] block device info: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} _connect_cinder_volumes /opt/stack/new/nova/nova/virt/xenapi/vmops.py:376 2015-08-07 17:44:44.272 DEBUG oslo_concurrency.lockutils [req-fadb9d5d-72d0-4cd3-8d77-b39c9fb51d68 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" acquired by "_create_cached_image_impl" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:44:44.521 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:44:44.522 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Starting heal instance info cache _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5315 2015-08-07 17:44:44.523 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Rebuilding the list of instances to heal _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5319 2015-08-07 17:44:44.602 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: b197c990-eecd-403b-b8d7-9e57e7053a16] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5329 2015-08-07 17:44:44.602 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: b4eac7df-4935-4b77-8307-8c8cabe2c038] Skipping network cache update for instance because it is Building. 
_heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5329 2015-08-07 17:44:44.603 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: 420208c3-bf05-4834-98b5-1c8afda79f97] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5329 2015-08-07 17:44:44.603 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Acquired semaphore "refresh_cache-69bd2eeb-96e0-42ba-9643-c7c085279a18" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:197 2015-08-07 17:44:44.604 DEBUG nova.objects.instance [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lazy-loading `flavor' on Instance uuid 69bd2eeb-96e0-42ba-9643-c7c085279a18 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:44:45.034 DEBUG nova.network.base_api [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: 69bd2eeb-96e0-42ba-9643-c7c085279a18] Updating instance_info_cache with network_info: [VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.6'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:5a:40:04', 'active': False, 'type': u'bridge', 'id': u'05bc5db0-7a1d-4a7a-afed-5f08489f238b', 'qbg_params': None})] update_instance_cache_with_nw_info /opt/stack/new/nova/nova/network/base_api.py:43 2015-08-07 17:44:45.043 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.95 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:44:45.057 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Releasing semaphore "refresh_cache-69bd2eeb-96e0-42ba-9643-c7c085279a18" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:210 2015-08-07 17:44:45.058 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: 69bd2eeb-96e0-42ba-9643-c7c085279a18] Updated the network info_cache for instance _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5376 2015-08-07 17:44:45.059 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 0.99 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:44:45.204 DEBUG nova.virt.xenapi.vm_utils [req-fadb9d5d-72d0-4cd3-8d77-b39c9fb51d68 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Cloned VDI OpaqueRef:3e6642f0-5b0e-60f3-82a3-b30d22baa2a6 from VDI OpaqueRef:f392c64e-7886-a0cc-d3a0-780674167307 _clone_vdi 
/opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:571 2015-08-07 17:44:45.416 13318 DEBUG nova.network.base_api [-] [instance: 420208c3-bf05-4834-98b5-1c8afda79f97] Updating instance_info_cache with network_info: [VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.5'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:4b:c0:98', 'active': False, 'type': u'bridge', 'id': u'42ca3625-fdf9-4708-a7e8-513c0601954f', 'qbg_params': None})] update_instance_cache_with_nw_info /opt/stack/new/nova/nova/network/base_api.py:43 2015-08-07 17:44:45.443 13318 DEBUG oslo_concurrency.lockutils [-] Releasing semaphore "refresh_cache-420208c3-bf05-4834-98b5-1c8afda79f97" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:210 2015-08-07 17:44:45.443 13318 DEBUG nova.compute.manager [-] [instance: 420208c3-bf05-4834-98b5-1c8afda79f97] Instance network_info: |[VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.5'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:4b:c0:98', 'active': False, 'type': u'bridge', 'id': u'42ca3625-fdf9-4708-a7e8-513c0601954f', 'qbg_params': None})]| _allocate_network_async /opt/stack/new/nova/nova/compute/manager.py:1531 2015-08-07 17:44:45.580 DEBUG nova.virt.xenapi.vmops [req-4e0bfc6b-ec8c-4332-ab48-3e7910b5e202 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] [instance: b197c990-eecd-403b-b8d7-9e57e7053a16] Waiting for instance state to become running _wait_for_instance_to_start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:777 2015-08-07 17:44:45.598 DEBUG nova.virt.xenapi.vmops [req-4e0bfc6b-ec8c-4332-ab48-3e7910b5e202 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] [instance: b197c990-eecd-403b-b8d7-9e57e7053a16] 
Updating progress to 80 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:44:45.764 DEBUG nova.virt.xenapi.vmops [req-4e0bfc6b-ec8c-4332-ab48-3e7910b5e202 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] [instance: b197c990-eecd-403b-b8d7-9e57e7053a16] Skip agent setup, not enabled. _configure_new_instance_with_agent /opt/stack/new/nova/nova/virt/xenapi/vmops.py:788 2015-08-07 17:44:45.769 DEBUG nova.virt.xenapi.vmops [req-4e0bfc6b-ec8c-4332-ab48-3e7910b5e202 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] [instance: b197c990-eecd-403b-b8d7-9e57e7053a16] Removing hostname from xenstore _remove_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1949 2015-08-07 17:44:45.770 DEBUG oslo_concurrency.lockutils [req-4e0bfc6b-ec8c-4332-ab48-3e7910b5e202 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Lock "xenstore-b197c990-eecd-403b-b8d7-9e57e7053a16" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:44:45.775 DEBUG oslo_concurrency.lockutils [req-4e0bfc6b-ec8c-4332-ab48-3e7910b5e202 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Lock "xenstore-b197c990-eecd-403b-b8d7-9e57e7053a16" released by "update_hostname" :: held 0.005s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:44:45.776 DEBUG nova.virt.xenapi.vmops [req-4e0bfc6b-ec8c-4332-ab48-3e7910b5e202 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] [instance: b197c990-eecd-403b-b8d7-9e57e7053a16] Updating progress to 90 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:44:45.870 DEBUG oslo_concurrency.lockutils [req-fadb9d5d-72d0-4cd3-8d77-b39c9fb51d68 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" released by "_create_cached_image_impl" :: held 1.598s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:44:45.871 INFO nova.virt.xenapi.vm_utils [req-fadb9d5d-72d0-4cd3-8d77-b39c9fb51d68 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Image creation data, cacheable: True, downloaded: False duration: 1.61 secs for image a69d8d55-1745-492f-be26-cc75d64fc94d 2015-08-07 17:44:45.928 DEBUG nova.virt.xenapi.vmops [req-4e0bfc6b-ec8c-4332-ab48-3e7910b5e202 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] [instance: b197c990-eecd-403b-b8d7-9e57e7053a16] Updating progress to 100 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:44:46.051 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:44:46.051 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 3.47 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:44:46.112 DEBUG nova.compute.manager [req-4e0bfc6b-ec8c-4332-ab48-3e7910b5e202 tempest-ServerRescueNegativeTestJSON-1228315677 
tempest-ServerRescueNegativeTestJSON-2125869541] [instance: b197c990-eecd-403b-b8d7-9e57e7053a16] Checking state _get_power_state /opt/stack/new/nova/nova/compute/manager.py:1296 2015-08-07 17:44:46.480 DEBUG nova.virt.xenapi.vmops [req-fadb9d5d-72d0-4cd3-8d77-b39c9fb51d68 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 420208c3-bf05-4834-98b5-1c8afda79f97] Updating progress to 20 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:44:46.541 DEBUG oslo_concurrency.lockutils [req-4e0bfc6b-ec8c-4332-ab48-3e7910b5e202 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Lock "b197c990-eecd-403b-b8d7-9e57e7053a16" released by "_locked_do_build_and_run_instance" :: held 21.560s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:44:46.626 DEBUG nova.virt.xenapi.vmops [req-fadb9d5d-72d0-4cd3-8d77-b39c9fb51d68 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 420208c3-bf05-4834-98b5-1c8afda79f97] Updating progress to 30 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:44:46.660 DEBUG oslo_concurrency.lockutils [req-e7299f84-7d21-423d-9497-176a584cc6bf tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" released by "do_scan" :: held 18.028s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:44:46.790 DEBUG nova.virt.xenapi.vmops [req-fadb9d5d-72d0-4cd3-8d77-b39c9fb51d68 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 420208c3-bf05-4834-98b5-1c8afda79f97] Using PV kernel: True _create_vm_record /opt/stack/new/nova/nova/virt/xenapi/vmops.py:679 2015-08-07 17:44:46.810 DEBUG nova.virt.xenapi.vm_utils [req-fadb9d5d-72d0-4cd3-8d77-b39c9fb51d68 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 420208c3-bf05-4834-98b5-1c8afda79f97] Created VM create_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:312 2015-08-07 17:44:46.811 DEBUG nova.virt.xenapi.vmops [req-fadb9d5d-72d0-4cd3-8d77-b39c9fb51d68 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 420208c3-bf05-4834-98b5-1c8afda79f97] Updating progress to 40 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:44:46.989 DEBUG nova.virt.xenapi.vm_utils [req-fadb9d5d-72d0-4cd3-8d77-b39c9fb51d68 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Creating disk-type VBD for VM OpaqueRef:160a41ef-1775-960e-de8d-90cce4529d16, VDI OpaqueRef:3e6642f0-5b0e-60f3-82a3-b30d22baa2a6 ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:44:46.998 DEBUG nova.virt.xenapi.vm_utils [req-fadb9d5d-72d0-4cd3-8d77-b39c9fb51d68 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Created VBD OpaqueRef:a2b11d63-3450-6347-464e-2939a2ec3c9d for VM OpaqueRef:160a41ef-1775-960e-de8d-90cce4529d16, VDI OpaqueRef:3e6642f0-5b0e-60f3-82a3-b30d22baa2a6. 
create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:44:47.253 DEBUG nova.virt.xenapi.vmops [req-e7299f84-7d21-423d-9497-176a584cc6bf tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 115845a8-7055-44f5-a3f5-222bd027f1ed] Using PV kernel: True _create_vm_record /opt/stack/new/nova/nova/virt/xenapi/vmops.py:679 2015-08-07 17:44:47.265 DEBUG nova.virt.xenapi.vm_utils [req-e7299f84-7d21-423d-9497-176a584cc6bf tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 115845a8-7055-44f5-a3f5-222bd027f1ed] Created VM create_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:312 2015-08-07 17:44:47.266 DEBUG nova.virt.xenapi.vm_utils [req-e7299f84-7d21-423d-9497-176a584cc6bf tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Creating disk-type VBD for VM OpaqueRef:ea1bb293-c1f4-f0df-ccf9-27d1e82a9f6d, VDI OpaqueRef:f3c324de-022e-64a9-0a51-1a2676aba693 ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:44:47.275 DEBUG nova.virt.xenapi.vm_utils [req-e7299f84-7d21-423d-9497-176a584cc6bf tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Created VBD OpaqueRef:eab7e735-b2c9-f2bc-25c0-c8f9b84d5bd3 for VM OpaqueRef:ea1bb293-c1f4-f0df-ccf9-27d1e82a9f6d, VDI OpaqueRef:f3c324de-022e-64a9-0a51-1a2676aba693. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:44:47.346 DEBUG nova.virt.xenapi.vm_utils [req-fadb9d5d-72d0-4cd3-8d77-b39c9fb51d68 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Created VDI OpaqueRef:8ead6548-7a98-e125-4d1d-fc307620b7e9 (config-2, 67108864, False) on OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99. create_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:509 2015-08-07 17:44:47.350 DEBUG nova.virt.xenapi.vm_utils [req-fadb9d5d-72d0-4cd3-8d77-b39c9fb51d68 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Creating disk-type VBD for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:8ead6548-7a98-e125-4d1d-fc307620b7e9 ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:44:47.363 DEBUG nova.virt.xenapi.vm_utils [req-fadb9d5d-72d0-4cd3-8d77-b39c9fb51d68 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Created VBD OpaqueRef:12289bee-34de-5580-32ae-5733b198a204 for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:8ead6548-7a98-e125-4d1d-fc307620b7e9. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:44:47.364 DEBUG nova.virt.xenapi.vm_utils [req-fadb9d5d-72d0-4cd3-8d77-b39c9fb51d68 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Plugging VBD OpaqueRef:12289bee-34de-5580-32ae-5733b198a204 ... 
vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2191 2015-08-07 17:44:47.364 DEBUG oslo_concurrency.lockutils [req-fadb9d5d-72d0-4cd3-8d77-b39c9fb51d68 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_plug" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:44:48.329 DEBUG nova.virt.xenapi.vmops [req-a32bf6ef-39d5-4885-ad00-3ab7a8a57cff tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] [instance: b4eac7df-4935-4b77-8307-8c8cabe2c038] Waiting for instance state to become running _wait_for_instance_to_start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:777 2015-08-07 17:44:48.359 DEBUG nova.virt.xenapi.vmops [req-a32bf6ef-39d5-4885-ad00-3ab7a8a57cff tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] [instance: b4eac7df-4935-4b77-8307-8c8cabe2c038] Updating progress to 80 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:44:48.509 DEBUG oslo_concurrency.lockutils [req-fadb9d5d-72d0-4cd3-8d77-b39c9fb51d68 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_plug" :: held 1.145s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:44:48.510 DEBUG nova.virt.xenapi.vm_utils [req-fadb9d5d-72d0-4cd3-8d77-b39c9fb51d68 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Plugging VBD OpaqueRef:12289bee-34de-5580-32ae-5733b198a204 done. vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2194 2015-08-07 17:44:48.513 DEBUG nova.virt.xenapi.vm_utils [req-fadb9d5d-72d0-4cd3-8d77-b39c9fb51d68 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] VBD OpaqueRef:12289bee-34de-5580-32ae-5733b198a204 plugged as xvdc vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2197 2015-08-07 17:44:48.538 DEBUG nova.virt.xenapi.vmops [req-a32bf6ef-39d5-4885-ad00-3ab7a8a57cff tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] [instance: b4eac7df-4935-4b77-8307-8c8cabe2c038] Skip agent setup, not enabled. 
_configure_new_instance_with_agent /opt/stack/new/nova/nova/virt/xenapi/vmops.py:788 2015-08-07 17:44:48.539 DEBUG nova.virt.xenapi.vmops [req-a32bf6ef-39d5-4885-ad00-3ab7a8a57cff tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] [instance: b4eac7df-4935-4b77-8307-8c8cabe2c038] Removing hostname from xenstore _remove_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1949 2015-08-07 17:44:48.540 DEBUG oslo_concurrency.lockutils [req-a32bf6ef-39d5-4885-ad00-3ab7a8a57cff tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Lock "xenstore-b4eac7df-4935-4b77-8307-8c8cabe2c038" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:44:48.544 DEBUG oslo_concurrency.lockutils [req-a32bf6ef-39d5-4885-ad00-3ab7a8a57cff tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Lock "xenstore-b4eac7df-4935-4b77-8307-8c8cabe2c038" released by "update_hostname" :: held 0.005s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:44:48.545 DEBUG nova.virt.xenapi.vmops [req-a32bf6ef-39d5-4885-ad00-3ab7a8a57cff tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] [instance: b4eac7df-4935-4b77-8307-8c8cabe2c038] Updating progress to 90 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:44:48.590 WARNING nova.virt.configdrive [req-fadb9d5d-72d0-4cd3-8d77-b39c9fb51d68 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] The setting "always" will be deprecated in the Liberty version. Please use "True" instead 2015-08-07 17:44:48.591 DEBUG nova.objects.instance [req-fadb9d5d-72d0-4cd3-8d77-b39c9fb51d68 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lazy-loading `ec2_ids' on Instance uuid 420208c3-bf05-4834-98b5-1c8afda79f97 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:44:48.620 DEBUG oslo_concurrency.processutils [req-fadb9d5d-72d0-4cd3-8d77-b39c9fb51d68 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Running cmd (subprocess): genisoimage -o /tmp/tmpUnykeA/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmpuzgeuL execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:44:48.707 DEBUG oslo_concurrency.processutils [req-fadb9d5d-72d0-4cd3-8d77-b39c9fb51d68 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] CMD "genisoimage -o /tmp/tmpUnykeA/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmpuzgeuL" returned: 0 in 0.087s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:44:48.713 DEBUG oslo_concurrency.processutils [req-fadb9d5d-72d0-4cd3-8d77-b39c9fb51d68 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmpUnykeA/configdrive of=/dev/xvdc oflag=direct,sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:44:48.799 DEBUG nova.virt.xenapi.vmops [req-a32bf6ef-39d5-4885-ad00-3ab7a8a57cff tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] [instance: 
b4eac7df-4935-4b77-8307-8c8cabe2c038] Updating progress to 100 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:44:49.111 DEBUG nova.compute.manager [req-a32bf6ef-39d5-4885-ad00-3ab7a8a57cff tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] [instance: b4eac7df-4935-4b77-8307-8c8cabe2c038] Checking state _get_power_state /opt/stack/new/nova/nova/compute/manager.py:1296 2015-08-07 17:44:49.443 DEBUG oslo_concurrency.lockutils [req-a32bf6ef-39d5-4885-ad00-3ab7a8a57cff tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Lock "b4eac7df-4935-4b77-8307-8c8cabe2c038" released by "_locked_do_build_and_run_instance" :: held 21.016s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:44:49.521 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:44:49.522 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /opt/stack/new/nova/nova/compute/manager.py:5976 2015-08-07 17:44:49.523 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 3.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:44:50.897 INFO nova.compute.manager [req-8cb2558f-e53d-45f5-bdab-f7df53dff28d tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] [instance: b4eac7df-4935-4b77-8307-8c8cabe2c038] Rescuing 2015-08-07 17:44:50.899 DEBUG oslo_concurrency.lockutils [req-8cb2558f-e53d-45f5-bdab-f7df53dff28d tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Acquired semaphore "refresh_cache-b4eac7df-4935-4b77-8307-8c8cabe2c038" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:197 2015-08-07 17:44:51.012 DEBUG nova.network.base_api [req-8cb2558f-e53d-45f5-bdab-f7df53dff28d tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] [instance: b4eac7df-4935-4b77-8307-8c8cabe2c038] Updating instance_info_cache with network_info: [VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.8'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:ae:e2:5a', 'active': False, 'type': u'bridge', 'id': u'f9d7f841-808e-40fe-99f2-c904249e4655', 
'qbg_params': None})] update_instance_cache_with_nw_info /opt/stack/new/nova/nova/network/base_api.py:43 2015-08-07 17:44:51.034 DEBUG oslo_concurrency.lockutils [req-8cb2558f-e53d-45f5-bdab-f7df53dff28d tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Releasing semaphore "refresh_cache-b4eac7df-4935-4b77-8307-8c8cabe2c038" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:210 2015-08-07 17:44:51.372 DEBUG nova.virt.xenapi.vm_utils [req-8cb2558f-e53d-45f5-bdab-f7df53dff28d tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] [instance: b4eac7df-4935-4b77-8307-8c8cabe2c038] Shutting down VM (hard) hard_shutdown_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:348 2015-08-07 17:44:52.119 DEBUG oslo_concurrency.processutils [req-fadb9d5d-72d0-4cd3-8d77-b39c9fb51d68 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmpUnykeA/configdrive of=/dev/xvdc oflag=direct,sync" returned: 0 in 3.406s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:44:52.121 DEBUG oslo_concurrency.processutils [req-fadb9d5d-72d0-4cd3-8d77-b39c9fb51d68 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:44:52.436 DEBUG oslo_concurrency.processutils [req-fadb9d5d-72d0-4cd3-8d77-b39c9fb51d68 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf sync" returned: 0 in 0.315s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:44:52.440 DEBUG nova.virt.xenapi.vm_utils [req-fadb9d5d-72d0-4cd3-8d77-b39c9fb51d68 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Destroying VBD for VDI OpaqueRef:8ead6548-7a98-e125-4d1d-fc307620b7e9 ... 
vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2207 2015-08-07 17:44:52.442 DEBUG oslo_concurrency.lockutils [req-fadb9d5d-72d0-4cd3-8d77-b39c9fb51d68 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_unplug" :: waited 0.001s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:44:52.521 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:44:52.523 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 1.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:44:53.521 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:44:53.556 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Auditing locally available compute resources for node localhost.localdomain 2015-08-07 17:44:53.557 DEBUG nova.virt.xenapi.host [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Updating host stats update_status /opt/stack/new/nova/nova/virt/xenapi/host.py:234 2015-08-07 17:44:53.699 DEBUG oslo_concurrency.lockutils [req-fadb9d5d-72d0-4cd3-8d77-b39c9fb51d68 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_unplug" :: held 1.257s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:44:53.713 DEBUG nova.virt.xenapi.vm_utils [req-fadb9d5d-72d0-4cd3-8d77-b39c9fb51d68 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Destroying VBD for VDI OpaqueRef:8ead6548-7a98-e125-4d1d-fc307620b7e9 done. vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2215 2015-08-07 17:44:53.714 DEBUG nova.virt.xenapi.vm_utils [req-fadb9d5d-72d0-4cd3-8d77-b39c9fb51d68 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Creating disk-type VBD for VM OpaqueRef:160a41ef-1775-960e-de8d-90cce4529d16, VDI OpaqueRef:8ead6548-7a98-e125-4d1d-fc307620b7e9 ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:44:53.731 DEBUG nova.virt.xenapi.vm_utils [req-fadb9d5d-72d0-4cd3-8d77-b39c9fb51d68 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Created VBD OpaqueRef:9ecbb87f-31df-0b0d-e41b-6779b79cc452 for VM OpaqueRef:160a41ef-1775-960e-de8d-90cce4529d16, VDI OpaqueRef:8ead6548-7a98-e125-4d1d-fc307620b7e9. 
create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:44:53.733 DEBUG nova.objects.instance [req-fadb9d5d-72d0-4cd3-8d77-b39c9fb51d68 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lazy-loading `pci_devices' on Instance uuid 420208c3-bf05-4834-98b5-1c8afda79f97 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:44:53.760 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" acquired by "do_scan" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:44:53.761 DEBUG nova.virt.xenapi.vm_utils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Scanning SR OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99 do_scan /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1839 2015-08-07 17:44:53.808 WARNING nova.virt.xenapi.vm_utils [req-8cb2558f-e53d-45f5-bdab-f7df53dff28d tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] [instance: b4eac7df-4935-4b77-8307-8c8cabe2c038] VM already halted, skipping shutdown... 2015-08-07 17:44:53.834 DEBUG nova.virt.xenapi.vm_utils [req-8cb2558f-e53d-45f5-bdab-f7df53dff28d tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Detected vhd format for image a69d8d55-1745-492f-be26-cc75d64fc94d determine_disk_image_type /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1633 2015-08-07 17:44:53.835 DEBUG nova.virt.xenapi.vmops [req-8cb2558f-e53d-45f5-bdab-f7df53dff28d tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] [instance: b4eac7df-4935-4b77-8307-8c8cabe2c038] Updating progress to 9 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:44:53.843 DEBUG nova.virt.xenapi.vmops [req-fadb9d5d-72d0-4cd3-8d77-b39c9fb51d68 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 420208c3-bf05-4834-98b5-1c8afda79f97] Updating progress to 50 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:44:53.999 DEBUG oslo_concurrency.lockutils [req-8cb2558f-e53d-45f5-bdab-f7df53dff28d tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" acquired by "_create_cached_image_impl" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:44:54.045 DEBUG oslo_concurrency.lockutils [req-fadb9d5d-72d0-4cd3-8d77-b39c9fb51d68 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "xenstore-420208c3-bf05-4834-98b5-1c8afda79f97" acquired by "store_meta" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:44:54.046 DEBUG oslo_concurrency.lockutils [req-fadb9d5d-72d0-4cd3-8d77-b39c9fb51d68 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "xenstore-420208c3-bf05-4834-98b5-1c8afda79f97" released by "store_meta" :: held 0.001s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:44:54.046 DEBUG oslo_concurrency.lockutils [req-fadb9d5d-72d0-4cd3-8d77-b39c9fb51d68 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "xenstore-420208c3-bf05-4834-98b5-1c8afda79f97" acquired by "store_auto_disk_config" :: waited 0.000s inner 
/usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:44:54.053 DEBUG oslo_concurrency.lockutils [req-fadb9d5d-72d0-4cd3-8d77-b39c9fb51d68 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "xenstore-420208c3-bf05-4834-98b5-1c8afda79f97" released by "store_auto_disk_config" :: held 0.007s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:44:54.055 DEBUG nova.virt.xenapi.vmops [req-fadb9d5d-72d0-4cd3-8d77-b39c9fb51d68 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 420208c3-bf05-4834-98b5-1c8afda79f97] Injecting hostname (tempest-server-255189760) into xenstore _inject_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1940 2015-08-07 17:44:54.056 DEBUG oslo_concurrency.lockutils [req-fadb9d5d-72d0-4cd3-8d77-b39c9fb51d68 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "xenstore-420208c3-bf05-4834-98b5-1c8afda79f97" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:44:54.064 DEBUG oslo_concurrency.lockutils [req-fadb9d5d-72d0-4cd3-8d77-b39c9fb51d68 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "xenstore-420208c3-bf05-4834-98b5-1c8afda79f97" released by "update_hostname" :: held 0.008s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:44:54.064 DEBUG nova.virt.xenapi.vmops [req-fadb9d5d-72d0-4cd3-8d77-b39c9fb51d68 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 420208c3-bf05-4834-98b5-1c8afda79f97] Injecting network info to xenstore inject_network_info /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1869 2015-08-07 17:44:54.065 DEBUG oslo_concurrency.lockutils [req-fadb9d5d-72d0-4cd3-8d77-b39c9fb51d68 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "xenstore-420208c3-bf05-4834-98b5-1c8afda79f97" acquired by "update_nwinfo" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:44:54.206 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" released by "do_scan" :: held 0.445s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:44:54.239 DEBUG oslo_concurrency.lockutils [req-fadb9d5d-72d0-4cd3-8d77-b39c9fb51d68 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "xenstore-420208c3-bf05-4834-98b5-1c8afda79f97" released by "update_nwinfo" :: held 0.174s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:44:54.240 DEBUG nova.virt.xenapi.vmops [req-fadb9d5d-72d0-4cd3-8d77-b39c9fb51d68 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 420208c3-bf05-4834-98b5-1c8afda79f97] Updating progress to 60 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:44:54.386 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor: free VCPUs: 0 _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:524 2015-08-07 17:44:54.387 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor: assignable PCI devices: [] _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:532 
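
The repeated acquired/released lines above come from oslo.concurrency's lock helpers: a named internal lock is taken per resource (for example "xenstore-<instance uuid>", "sr-scan-<SR ref>", or "compute_resources") so that concurrent requests serialize only on that resource. A minimal sketch of the pattern, assuming hypothetical helper names (write_to_xenstore and update_usage are placeholders, not nova's code):

from oslo_concurrency import lockutils

def write_to_xenstore(instance_uuid, key, value):
    # Per-resource lock name mirrors the "xenstore-<uuid>" names in the log;
    # the body runs only while the named internal lock is held.
    with lockutils.lock('xenstore-%s' % instance_uuid):
        pass  # hypothetical xenstore write would go here

# Decorator form with a fixed name, as with the "compute_resources" lock
# acquired by "update_usage" later in this log.
@lockutils.synchronized('compute_resources')
def update_usage(claim):
    pass  # hypothetical resource accounting
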
2015-08-07 17:44:54.387 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor/Node resource view: name=localhost.localdomain free_ram=855MB free_disk=16GB free_vcpus=0 pci_devices=[] _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:546 2015-08-07 17:44:54.388 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "compute_resources" acquired by "_update_available_resource" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:44:54.428 DEBUG nova.virt.xenapi.vmops [req-fadb9d5d-72d0-4cd3-8d77-b39c9fb51d68 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 420208c3-bf05-4834-98b5-1c8afda79f97] Creating vifs _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1892 2015-08-07 17:44:54.433 DEBUG nova.virt.xenapi.vmops [req-fadb9d5d-72d0-4cd3-8d77-b39c9fb51d68 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 420208c3-bf05-4834-98b5-1c8afda79f97] Creating VIF for network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1902 2015-08-07 17:44:54.441 DEBUG nova.virt.xenapi.vmops [req-fadb9d5d-72d0-4cd3-8d77-b39c9fb51d68 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 420208c3-bf05-4834-98b5-1c8afda79f97] Created VIF OpaqueRef:28b594d7-ad96-f104-d198-48922c0aea7e, network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1906 2015-08-07 17:44:54.442 DEBUG nova.virt.xenapi.vmops [req-fadb9d5d-72d0-4cd3-8d77-b39c9fb51d68 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 420208c3-bf05-4834-98b5-1c8afda79f97] Updating progress to 70 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:44:54.646 DEBUG nova.virt.xenapi.vmops [req-fadb9d5d-72d0-4cd3-8d77-b39c9fb51d68 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 420208c3-bf05-4834-98b5-1c8afda79f97] Starting instance _start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:327 2015-08-07 17:44:54.808 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Updating from migration 115845a8-7055-44f5-a3f5-222bd027f1ed 2015-08-07 17:44:54.809 DEBUG nova.objects.instance [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lazy-loading `old_flavor' on Instance uuid 115845a8-7055-44f5-a3f5-222bd027f1ed obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:44:54.946 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Total usable vcpus: 8, total allocated vcpus: 8 2015-08-07 17:44:54.947 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Final resource view: name=localhost.localdomain phys_ram=8187MB used_ram=1060MB phys_disk=27GB used_disk=0GB total_vcpus=8 used_vcpus=8 pci_stats=None 2015-08-07 17:44:54.996 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Compute_service record updated for devstack:localhost.localdomain 2015-08-07 17:44:54.997 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "compute_resources" released by "_update_available_resource" :: held 0.609s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:44:54.999 
DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:44:55.000 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 9.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:44:55.026 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.97 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:44:57.889 DEBUG nova.virt.xenapi.vm_utils [req-8cb2558f-e53d-45f5-bdab-f7df53dff28d tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Cloned VDI OpaqueRef:4403ade9-a1a8-7ad8-73f6-d8d9ad92a7cd from VDI OpaqueRef:f392c64e-7886-a0cc-d3a0-780674167307 _clone_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:571 2015-08-07 17:44:58.406 DEBUG oslo_concurrency.lockutils [req-8cb2558f-e53d-45f5-bdab-f7df53dff28d tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" released by "_create_cached_image_impl" :: held 4.406s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:44:58.407 INFO nova.virt.xenapi.vm_utils [req-8cb2558f-e53d-45f5-bdab-f7df53dff28d tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Image creation data, cacheable: True, downloaded: False duration: 4.41 secs for image a69d8d55-1745-492f-be26-cc75d64fc94d 2015-08-07 17:44:58.902 DEBUG nova.virt.xenapi.vmops [req-8cb2558f-e53d-45f5-bdab-f7df53dff28d tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] [instance: b4eac7df-4935-4b77-8307-8c8cabe2c038] Updating progress to 18 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:44:59.065 DEBUG nova.virt.xenapi.vmops [req-8cb2558f-e53d-45f5-bdab-f7df53dff28d tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] [instance: b4eac7df-4935-4b77-8307-8c8cabe2c038] Updating progress to 27 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:44:59.182 DEBUG nova.virt.xenapi.vmops [req-fadb9d5d-72d0-4cd3-8d77-b39c9fb51d68 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 420208c3-bf05-4834-98b5-1c8afda79f97] Waiting for instance state to become running _wait_for_instance_to_start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:777 2015-08-07 17:44:59.204 DEBUG nova.virt.xenapi.vmops [req-fadb9d5d-72d0-4cd3-8d77-b39c9fb51d68 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 420208c3-bf05-4834-98b5-1c8afda79f97] Updating progress to 80 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:44:59.220 DEBUG nova.virt.xenapi.vmops [req-8cb2558f-e53d-45f5-bdab-f7df53dff28d tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] [instance: b4eac7df-4935-4b77-8307-8c8cabe2c038] Using PV kernel: True _create_vm_record /opt/stack/new/nova/nova/virt/xenapi/vmops.py:679 2015-08-07 17:44:59.232 DEBUG nova.virt.xenapi.vm_utils [req-8cb2558f-e53d-45f5-bdab-f7df53dff28d 
tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] [instance: b4eac7df-4935-4b77-8307-8c8cabe2c038] Created VM create_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:312 2015-08-07 17:44:59.232 DEBUG nova.virt.xenapi.vmops [req-8cb2558f-e53d-45f5-bdab-f7df53dff28d tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] [instance: b4eac7df-4935-4b77-8307-8c8cabe2c038] Updating progress to 36 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:44:59.356 DEBUG nova.virt.xenapi.vmops [req-fadb9d5d-72d0-4cd3-8d77-b39c9fb51d68 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 420208c3-bf05-4834-98b5-1c8afda79f97] Skip agent setup, not enabled. _configure_new_instance_with_agent /opt/stack/new/nova/nova/virt/xenapi/vmops.py:788 2015-08-07 17:44:59.356 DEBUG nova.virt.xenapi.vmops [req-fadb9d5d-72d0-4cd3-8d77-b39c9fb51d68 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 420208c3-bf05-4834-98b5-1c8afda79f97] Removing hostname from xenstore _remove_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1949 2015-08-07 17:44:59.357 DEBUG oslo_concurrency.lockutils [req-fadb9d5d-72d0-4cd3-8d77-b39c9fb51d68 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "xenstore-420208c3-bf05-4834-98b5-1c8afda79f97" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:44:59.360 DEBUG oslo_concurrency.lockutils [req-fadb9d5d-72d0-4cd3-8d77-b39c9fb51d68 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "xenstore-420208c3-bf05-4834-98b5-1c8afda79f97" released by "update_hostname" :: held 0.004s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:44:59.361 DEBUG nova.virt.xenapi.vmops [req-fadb9d5d-72d0-4cd3-8d77-b39c9fb51d68 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 420208c3-bf05-4834-98b5-1c8afda79f97] Updating progress to 90 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:44:59.372 DEBUG nova.virt.xenapi.vmops [req-8cb2558f-e53d-45f5-bdab-f7df53dff28d tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] [instance: b4eac7df-4935-4b77-8307-8c8cabe2c038] auto_disk_config value not found in rescue image_properties. Setting value to False _attach_disks /opt/stack/new/nova/nova/virt/xenapi/vmops.py:714 2015-08-07 17:44:59.372 DEBUG nova.virt.xenapi.vm_utils [req-8cb2558f-e53d-45f5-bdab-f7df53dff28d tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Creating disk-type VBD for VM OpaqueRef:7358035d-1087-d21f-4c67-d3269f0f6904, VDI OpaqueRef:4403ade9-a1a8-7ad8-73f6-d8d9ad92a7cd ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:44:59.380 DEBUG nova.virt.xenapi.vm_utils [req-8cb2558f-e53d-45f5-bdab-f7df53dff28d tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Created VBD OpaqueRef:f05e0d5d-2cf4-0bfb-407d-8e5ca57c785d for VM OpaqueRef:7358035d-1087-d21f-4c67-d3269f0f6904, VDI OpaqueRef:4403ade9-a1a8-7ad8-73f6-d8d9ad92a7cd.

create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:44:59.564 DEBUG nova.virt.xenapi.vmops [req-fadb9d5d-72d0-4cd3-8d77-b39c9fb51d68 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 420208c3-bf05-4834-98b5-1c8afda79f97] Updating progress to 100 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:44:59.655 DEBUG nova.virt.xenapi.vm_utils [req-8cb2558f-e53d-45f5-bdab-f7df53dff28d tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Created VDI OpaqueRef:0b59a911-1424-a3ce-6f05-3117e22e7fcc (config-2, 67108864, False) on OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99. create_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:509 2015-08-07 17:44:59.658 DEBUG nova.virt.xenapi.vm_utils [req-8cb2558f-e53d-45f5-bdab-f7df53dff28d tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Creating disk-type VBD for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:0b59a911-1424-a3ce-6f05-3117e22e7fcc ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:44:59.669 DEBUG nova.virt.xenapi.vm_utils [req-8cb2558f-e53d-45f5-bdab-f7df53dff28d tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Created VBD OpaqueRef:4f28e634-c289-6ff7-f552-b42b0a5d766f for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:0b59a911-1424-a3ce-6f05-3117e22e7fcc. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:44:59.670 DEBUG nova.virt.xenapi.vm_utils [req-8cb2558f-e53d-45f5-bdab-f7df53dff28d tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Plugging VBD OpaqueRef:4f28e634-c289-6ff7-f552-b42b0a5d766f ... 
vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2191 2015-08-07 17:44:59.671 DEBUG oslo_concurrency.lockutils [req-8cb2558f-e53d-45f5-bdab-f7df53dff28d tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_plug" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:44:59.704 DEBUG nova.compute.manager [req-fadb9d5d-72d0-4cd3-8d77-b39c9fb51d68 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 420208c3-bf05-4834-98b5-1c8afda79f97] Checking state _get_power_state /opt/stack/new/nova/nova/compute/manager.py:1296 2015-08-07 17:44:59.930 DEBUG oslo_concurrency.lockutils [req-fadb9d5d-72d0-4cd3-8d77-b39c9fb51d68 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "420208c3-bf05-4834-98b5-1c8afda79f97" released by "_locked_do_build_and_run_instance" :: held 17.168s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:45:00.604 DEBUG oslo_concurrency.lockutils [req-8cb2558f-e53d-45f5-bdab-f7df53dff28d tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_plug" :: held 0.933s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:45:00.604 DEBUG nova.virt.xenapi.vm_utils [req-8cb2558f-e53d-45f5-bdab-f7df53dff28d tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Plugging VBD OpaqueRef:4f28e634-c289-6ff7-f552-b42b0a5d766f done. vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2194 2015-08-07 17:45:00.607 DEBUG nova.virt.xenapi.vm_utils [req-8cb2558f-e53d-45f5-bdab-f7df53dff28d tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] VBD OpaqueRef:4f28e634-c289-6ff7-f552-b42b0a5d766f plugged as xvdc vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2197 2015-08-07 17:45:00.679 WARNING nova.virt.configdrive [req-8cb2558f-e53d-45f5-bdab-f7df53dff28d tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] The setting "always" will be deprecated in the Liberty version. 
Please use "True" instead 2015-08-07 17:45:00.680 DEBUG nova.objects.instance [req-8cb2558f-e53d-45f5-bdab-f7df53dff28d tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Lazy-loading `ec2_ids' on Instance uuid b4eac7df-4935-4b77-8307-8c8cabe2c038 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:45:00.701 DEBUG oslo_concurrency.processutils [req-8cb2558f-e53d-45f5-bdab-f7df53dff28d tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Running cmd (subprocess): genisoimage -o /tmp/tmpnhmtNU/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmpJGydqV execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:45:00.789 DEBUG oslo_concurrency.processutils [req-8cb2558f-e53d-45f5-bdab-f7df53dff28d tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] CMD "genisoimage -o /tmp/tmpnhmtNU/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmpJGydqV" returned: 0 in 0.088s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:45:00.796 DEBUG oslo_concurrency.processutils [req-8cb2558f-e53d-45f5-bdab-f7df53dff28d tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmpnhmtNU/configdrive of=/dev/xvdc oflag=direct,sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:45:03.199 DEBUG oslo_concurrency.lockutils [req-488e179e-bcd3-42a3-a23c-ab18e152f6d3 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "127dae8a-26f4-4d23-862d-d23bbd64ca16" acquired by "_locked_do_build_and_run_instance" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:45:03.263 INFO nova.compute.manager [req-488e179e-bcd3-42a3-a23c-ab18e152f6d3 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 127dae8a-26f4-4d23-862d-d23bbd64ca16] Starting instance... 
2015-08-07 17:45:03.565 DEBUG oslo_concurrency.lockutils [req-488e179e-bcd3-42a3-a23c-ab18e152f6d3 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "compute_resources" acquired by "instance_claim" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:45:03.566 DEBUG nova.compute.resource_tracker [req-488e179e-bcd3-42a3-a23c-ab18e152f6d3 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Memory overhead for 64 MB instance; 5 MB instance_claim /opt/stack/new/nova/nova/compute/resource_tracker.py:126 2015-08-07 17:45:03.576 INFO nova.compute.claims [req-488e179e-bcd3-42a3-a23c-ab18e152f6d3 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 127dae8a-26f4-4d23-862d-d23bbd64ca16] Attempting claim: memory 69 MB, disk 0 GB 2015-08-07 17:45:03.577 INFO nova.compute.claims [req-488e179e-bcd3-42a3-a23c-ab18e152f6d3 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 127dae8a-26f4-4d23-862d-d23bbd64ca16] Total memory: 8187 MB, used: 1060.00 MB 2015-08-07 17:45:03.577 INFO nova.compute.claims [req-488e179e-bcd3-42a3-a23c-ab18e152f6d3 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 127dae8a-26f4-4d23-862d-d23bbd64ca16] memory limit: 12280.50 MB, free: 11220.50 MB 2015-08-07 17:45:03.578 INFO nova.compute.claims [req-488e179e-bcd3-42a3-a23c-ab18e152f6d3 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 127dae8a-26f4-4d23-862d-d23bbd64ca16] Total disk: 27 GB, used: 0.00 GB 2015-08-07 17:45:03.578 INFO nova.compute.claims [req-488e179e-bcd3-42a3-a23c-ab18e152f6d3 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 127dae8a-26f4-4d23-862d-d23bbd64ca16] disk limit not specified, defaulting to unlimited 2015-08-07 17:45:03.605 DEBUG nova.compute.resources.vcpu [req-488e179e-bcd3-42a3-a23c-ab18e152f6d3 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Total CPUs: 8 VCPUs, used: 7.00 VCPUs test /opt/stack/new/nova/nova/compute/resources/vcpu.py:52 2015-08-07 17:45:03.606 DEBUG nova.compute.resources.vcpu [req-488e179e-bcd3-42a3-a23c-ab18e152f6d3 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] CPUs limit not specified, defaulting to unlimited test /opt/stack/new/nova/nova/compute/resources/vcpu.py:56 2015-08-07 17:45:03.607 INFO nova.compute.claims [req-488e179e-bcd3-42a3-a23c-ab18e152f6d3 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 127dae8a-26f4-4d23-862d-d23bbd64ca16] Claim successful 2015-08-07 17:45:03.984 DEBUG oslo_concurrency.lockutils [req-488e179e-bcd3-42a3-a23c-ab18e152f6d3 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "compute_resources" released by "instance_claim" :: held 0.419s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:45:04.001 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:45:04.002 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 37.52 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:45:04.244 DEBUG oslo_concurrency.lockutils 
[req-488e179e-bcd3-42a3-a23c-ab18e152f6d3 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "compute_resources" acquired by "update_usage" :: waited 0.002s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:45:04.359 DEBUG oslo_concurrency.lockutils [req-488e179e-bcd3-42a3-a23c-ab18e152f6d3 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "compute_resources" released by "update_usage" :: held 0.116s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:45:04.360 DEBUG nova.compute.utils [req-488e179e-bcd3-42a3-a23c-ab18e152f6d3 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Using /dev/xvd instead of None get_next_device_name /opt/stack/new/nova/nova/compute/utils.py:163 2015-08-07 17:45:04.365 13318 DEBUG nova.compute.manager [-] [instance: 127dae8a-26f4-4d23-862d-d23bbd64ca16] Allocating IP information in the background. _allocate_network_async /opt/stack/new/nova/nova/compute/manager.py:1513 2015-08-07 17:45:04.367 13318 DEBUG oslo_concurrency.lockutils [-] Acquired semaphore "refresh_cache-127dae8a-26f4-4d23-862d-d23bbd64ca16" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:197 2015-08-07 17:45:04.922 DEBUG nova.virt.xenapi.vmops [req-488e179e-bcd3-42a3-a23c-ab18e152f6d3 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 127dae8a-26f4-4d23-862d-d23bbd64ca16] Block device information present: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} spawn /opt/stack/new/nova/nova/virt/xenapi/vmops.py:408 2015-08-07 17:45:04.944 DEBUG nova.virt.xenapi.vm_utils [req-488e179e-bcd3-42a3-a23c-ab18e152f6d3 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Detected vhd format for image a69d8d55-1745-492f-be26-cc75d64fc94d determine_disk_image_type /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1633 2015-08-07 17:45:04.944 DEBUG nova.virt.xenapi.vmops [req-488e179e-bcd3-42a3-a23c-ab18e152f6d3 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 127dae8a-26f4-4d23-862d-d23bbd64ca16] Updating progress to 10 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:45:05.074 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.93 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:45:05.280 DEBUG nova.virt.xenapi.vmops [req-488e179e-bcd3-42a3-a23c-ab18e152f6d3 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 127dae8a-26f4-4d23-862d-d23bbd64ca16] block device info: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} _connect_cinder_volumes /opt/stack/new/nova/nova/virt/xenapi/vmops.py:376 2015-08-07 17:45:05.290 DEBUG oslo_concurrency.lockutils [req-488e179e-bcd3-42a3-a23c-ab18e152f6d3 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" acquired by "_create_cached_image_impl" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:45:07.240 13318 DEBUG nova.network.base_api [-] [instance: 127dae8a-26f4-4d23-862d-d23bbd64ca16] Updating instance_info_cache with network_info: [VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': 
u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.9'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:1f:d6:88', 'active': False, 'type': u'bridge', 'id': u'98c7f15c-9852-40e8-8f4c-9b3c40c51a1f', 'qbg_params': None})] update_instance_cache_with_nw_info /opt/stack/new/nova/nova/network/base_api.py:43 2015-08-07 17:45:07.255 DEBUG nova.virt.xenapi.vm_utils [req-488e179e-bcd3-42a3-a23c-ab18e152f6d3 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Cloned VDI OpaqueRef:c25f7e7c-e5c3-c86a-a84e-01d83a649bd1 from VDI OpaqueRef:f392c64e-7886-a0cc-d3a0-780674167307 _clone_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:571 2015-08-07 17:45:07.275 13318 DEBUG oslo_concurrency.lockutils [-] Releasing semaphore "refresh_cache-127dae8a-26f4-4d23-862d-d23bbd64ca16" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:210 2015-08-07 17:45:07.276 13318 DEBUG nova.compute.manager [-] [instance: 127dae8a-26f4-4d23-862d-d23bbd64ca16] Instance network_info: |[VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.9'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:1f:d6:88', 'active': False, 'type': u'bridge', 'id': u'98c7f15c-9852-40e8-8f4c-9b3c40c51a1f', 'qbg_params': None})]| _allocate_network_async /opt/stack/new/nova/nova/compute/manager.py:1531 2015-08-07 17:45:07.706 DEBUG oslo_concurrency.processutils [req-8cb2558f-e53d-45f5-bdab-f7df53dff28d tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmpnhmtNU/configdrive of=/dev/xvdc oflag=direct,sync" returned: 0 in 6.910s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:45:07.708 DEBUG oslo_concurrency.processutils [req-8cb2558f-e53d-45f5-bdab-f7df53dff28d tempest-ServerRescueNegativeTestJSON-1228315677 
tempest-ServerRescueNegativeTestJSON-2125869541] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:45:08.045 DEBUG oslo_concurrency.processutils [req-8cb2558f-e53d-45f5-bdab-f7df53dff28d tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf sync" returned: 0 in 0.336s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:45:08.049 DEBUG nova.virt.xenapi.vm_utils [req-8cb2558f-e53d-45f5-bdab-f7df53dff28d tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Destroying VBD for VDI OpaqueRef:0b59a911-1424-a3ce-6f05-3117e22e7fcc ... vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2207 2015-08-07 17:45:08.050 DEBUG oslo_concurrency.lockutils [req-8cb2558f-e53d-45f5-bdab-f7df53dff28d tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_unplug" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:45:08.112 DEBUG oslo_concurrency.lockutils [req-488e179e-bcd3-42a3-a23c-ab18e152f6d3 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" released by "_create_cached_image_impl" :: held 2.822s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:45:08.113 INFO nova.virt.xenapi.vm_utils [req-488e179e-bcd3-42a3-a23c-ab18e152f6d3 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Image creation data, cacheable: True, downloaded: False duration: 2.83 secs for image a69d8d55-1745-492f-be26-cc75d64fc94d 2015-08-07 17:45:08.747 DEBUG nova.virt.xenapi.vmops [req-488e179e-bcd3-42a3-a23c-ab18e152f6d3 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 127dae8a-26f4-4d23-862d-d23bbd64ca16] Updating progress to 20 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:45:08.843 DEBUG oslo_concurrency.lockutils [req-8cb2558f-e53d-45f5-bdab-f7df53dff28d tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_unplug" :: held 0.793s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:45:08.850 DEBUG nova.virt.xenapi.vm_utils [req-8cb2558f-e53d-45f5-bdab-f7df53dff28d tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Destroying VBD for VDI OpaqueRef:0b59a911-1424-a3ce-6f05-3117e22e7fcc done. vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2215 2015-08-07 17:45:08.851 DEBUG nova.virt.xenapi.vm_utils [req-8cb2558f-e53d-45f5-bdab-f7df53dff28d tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Creating disk-type VBD for VM OpaqueRef:7358035d-1087-d21f-4c67-d3269f0f6904, VDI OpaqueRef:0b59a911-1424-a3ce-6f05-3117e22e7fcc ... 
create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:45:08.860 DEBUG nova.virt.xenapi.vm_utils [req-8cb2558f-e53d-45f5-bdab-f7df53dff28d tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Created VBD OpaqueRef:3dc7389c-0ab5-6b2e-1365-bae1b57a3448 for VM OpaqueRef:7358035d-1087-d21f-4c67-d3269f0f6904, VDI OpaqueRef:0b59a911-1424-a3ce-6f05-3117e22e7fcc. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:45:08.861 DEBUG nova.objects.instance [req-8cb2558f-e53d-45f5-bdab-f7df53dff28d tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Lazy-loading `pci_devices' on Instance uuid b4eac7df-4935-4b77-8307-8c8cabe2c038 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:45:08.985 DEBUG nova.virt.xenapi.vmops [req-8cb2558f-e53d-45f5-bdab-f7df53dff28d tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] [instance: b4eac7df-4935-4b77-8307-8c8cabe2c038] Updating progress to 45 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:45:08.999 DEBUG nova.virt.xenapi.vmops [req-488e179e-bcd3-42a3-a23c-ab18e152f6d3 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 127dae8a-26f4-4d23-862d-d23bbd64ca16] Updating progress to 30 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:45:09.178 DEBUG oslo_concurrency.lockutils [req-8cb2558f-e53d-45f5-bdab-f7df53dff28d tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Lock "xenstore-b4eac7df-4935-4b77-8307-8c8cabe2c038" acquired by "store_meta" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:45:09.178 DEBUG oslo_concurrency.lockutils [req-8cb2558f-e53d-45f5-bdab-f7df53dff28d tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Lock "xenstore-b4eac7df-4935-4b77-8307-8c8cabe2c038" released by "store_meta" :: held 0.001s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:45:09.179 DEBUG oslo_concurrency.lockutils [req-8cb2558f-e53d-45f5-bdab-f7df53dff28d tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Lock "xenstore-b4eac7df-4935-4b77-8307-8c8cabe2c038" acquired by "store_auto_disk_config" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:45:09.186 DEBUG oslo_concurrency.lockutils [req-8cb2558f-e53d-45f5-bdab-f7df53dff28d tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Lock "xenstore-b4eac7df-4935-4b77-8307-8c8cabe2c038" released by "store_auto_disk_config" :: held 0.008s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:45:09.187 DEBUG nova.virt.xenapi.vmops [req-8cb2558f-e53d-45f5-bdab-f7df53dff28d tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] [instance: b4eac7df-4935-4b77-8307-8c8cabe2c038] Injecting hostname (RESCUE-tempest.common.compute-instance-155064922) into xenstore _inject_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1940 2015-08-07 17:45:09.188 DEBUG oslo_concurrency.lockutils [req-8cb2558f-e53d-45f5-bdab-f7df53dff28d tempest-ServerRescueNegativeTestJSON-1228315677 
tempest-ServerRescueNegativeTestJSON-2125869541] Lock "xenstore-b4eac7df-4935-4b77-8307-8c8cabe2c038" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:45:09.194 DEBUG nova.virt.xenapi.vmops [req-488e179e-bcd3-42a3-a23c-ab18e152f6d3 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 127dae8a-26f4-4d23-862d-d23bbd64ca16] Using PV kernel: True _create_vm_record /opt/stack/new/nova/nova/virt/xenapi/vmops.py:679 2015-08-07 17:45:09.202 DEBUG oslo_concurrency.lockutils [req-8cb2558f-e53d-45f5-bdab-f7df53dff28d tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Lock "xenstore-b4eac7df-4935-4b77-8307-8c8cabe2c038" released by "update_hostname" :: held 0.014s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:45:09.203 DEBUG nova.virt.xenapi.vmops [req-8cb2558f-e53d-45f5-bdab-f7df53dff28d tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] [instance: b4eac7df-4935-4b77-8307-8c8cabe2c038] Injecting network info to xenstore inject_network_info /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1869 2015-08-07 17:45:09.204 DEBUG oslo_concurrency.lockutils [req-8cb2558f-e53d-45f5-bdab-f7df53dff28d tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Lock "xenstore-b4eac7df-4935-4b77-8307-8c8cabe2c038" acquired by "update_nwinfo" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:45:09.212 DEBUG nova.virt.xenapi.vm_utils [req-488e179e-bcd3-42a3-a23c-ab18e152f6d3 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 127dae8a-26f4-4d23-862d-d23bbd64ca16] Created VM create_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:312 2015-08-07 17:45:09.212 DEBUG nova.virt.xenapi.vmops [req-488e179e-bcd3-42a3-a23c-ab18e152f6d3 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 127dae8a-26f4-4d23-862d-d23bbd64ca16] Updating progress to 40 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:45:09.375 DEBUG oslo_concurrency.lockutils [req-8cb2558f-e53d-45f5-bdab-f7df53dff28d tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Lock "xenstore-b4eac7df-4935-4b77-8307-8c8cabe2c038" released by "update_nwinfo" :: held 0.171s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:45:09.376 DEBUG nova.virt.xenapi.vmops [req-8cb2558f-e53d-45f5-bdab-f7df53dff28d tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] [instance: b4eac7df-4935-4b77-8307-8c8cabe2c038] Updating progress to 55 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:45:09.399 DEBUG nova.virt.xenapi.vm_utils [req-488e179e-bcd3-42a3-a23c-ab18e152f6d3 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Creating disk-type VBD for VM OpaqueRef:7080a1fb-1888-2a74-0eb0-665f96bf2ee3, VDI OpaqueRef:c25f7e7c-e5c3-c86a-a84e-01d83a649bd1 ... 
create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:45:09.406 DEBUG nova.virt.xenapi.vm_utils [req-488e179e-bcd3-42a3-a23c-ab18e152f6d3 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Created VBD OpaqueRef:d0a686e0-94b0-5e66-ca1c-6a06d62be7ee for VM OpaqueRef:7080a1fb-1888-2a74-0eb0-665f96bf2ee3, VDI OpaqueRef:c25f7e7c-e5c3-c86a-a84e-01d83a649bd1. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:45:09.543 DEBUG nova.virt.xenapi.vmops [req-8cb2558f-e53d-45f5-bdab-f7df53dff28d tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] [instance: b4eac7df-4935-4b77-8307-8c8cabe2c038] Creating vifs _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1892 2015-08-07 17:45:09.551 DEBUG nova.virt.xenapi.vmops [req-8cb2558f-e53d-45f5-bdab-f7df53dff28d tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] [instance: b4eac7df-4935-4b77-8307-8c8cabe2c038] Creating VIF for network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1902 2015-08-07 17:45:09.559 DEBUG nova.virt.xenapi.vmops [req-8cb2558f-e53d-45f5-bdab-f7df53dff28d tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] [instance: b4eac7df-4935-4b77-8307-8c8cabe2c038] Created VIF OpaqueRef:8e5472bb-df10-563b-9538-569e9aa53441, network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1906 2015-08-07 17:45:09.560 DEBUG nova.virt.xenapi.vmops [req-8cb2558f-e53d-45f5-bdab-f7df53dff28d tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] [instance: b4eac7df-4935-4b77-8307-8c8cabe2c038] Updating progress to 64 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:45:09.837 DEBUG nova.virt.xenapi.vm_utils [req-8cb2558f-e53d-45f5-bdab-f7df53dff28d tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Creating disk-type VBD for VM OpaqueRef:7358035d-1087-d21f-4c67-d3269f0f6904, VDI OpaqueRef:fbaac169-cc11-209d-c49d-96e2a6f51202 ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:45:09.847 DEBUG nova.virt.xenapi.vm_utils [req-8cb2558f-e53d-45f5-bdab-f7df53dff28d tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Created VBD OpaqueRef:08410c96-8046-2576-8cc1-5cbfbd277791 for VM OpaqueRef:7358035d-1087-d21f-4c67-d3269f0f6904, VDI OpaqueRef:fbaac169-cc11-209d-c49d-96e2a6f51202. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:45:09.849 DEBUG nova.virt.xenapi.vmops [req-8cb2558f-e53d-45f5-bdab-f7df53dff28d tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] [instance: b4eac7df-4935-4b77-8307-8c8cabe2c038] Updating progress to 73 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:45:09.876 DEBUG nova.virt.xenapi.vm_utils [req-488e179e-bcd3-42a3-a23c-ab18e152f6d3 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Created VDI OpaqueRef:2e9a651b-50a9-7e21-a0ad-ed8f893e4356 (config-2, 67108864, False) on OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99. 
create_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:509 2015-08-07 17:45:09.881 DEBUG nova.virt.xenapi.vm_utils [req-488e179e-bcd3-42a3-a23c-ab18e152f6d3 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Creating disk-type VBD for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:2e9a651b-50a9-7e21-a0ad-ed8f893e4356 ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:45:09.893 DEBUG nova.virt.xenapi.vm_utils [req-488e179e-bcd3-42a3-a23c-ab18e152f6d3 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Created VBD OpaqueRef:6a67025d-df10-7201-dccd-e3cacaf91962 for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:2e9a651b-50a9-7e21-a0ad-ed8f893e4356. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:45:09.894 DEBUG nova.virt.xenapi.vm_utils [req-488e179e-bcd3-42a3-a23c-ab18e152f6d3 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Plugging VBD OpaqueRef:6a67025d-df10-7201-dccd-e3cacaf91962 ... vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2191 2015-08-07 17:45:09.894 DEBUG oslo_concurrency.lockutils [req-488e179e-bcd3-42a3-a23c-ab18e152f6d3 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_plug" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:45:10.029 DEBUG nova.virt.xenapi.vmops [req-8cb2558f-e53d-45f5-bdab-f7df53dff28d tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] [instance: b4eac7df-4935-4b77-8307-8c8cabe2c038] Starting instance _start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:327 2015-08-07 17:45:11.066 DEBUG oslo_concurrency.lockutils [req-488e179e-bcd3-42a3-a23c-ab18e152f6d3 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_plug" :: held 1.172s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:45:11.067 DEBUG nova.virt.xenapi.vm_utils [req-488e179e-bcd3-42a3-a23c-ab18e152f6d3 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Plugging VBD OpaqueRef:6a67025d-df10-7201-dccd-e3cacaf91962 done. vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2194 2015-08-07 17:45:11.070 DEBUG nova.virt.xenapi.vm_utils [req-488e179e-bcd3-42a3-a23c-ab18e152f6d3 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] VBD OpaqueRef:6a67025d-df10-7201-dccd-e3cacaf91962 plugged as xvdc vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2197 2015-08-07 17:45:11.149 WARNING nova.virt.configdrive [req-488e179e-bcd3-42a3-a23c-ab18e152f6d3 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] The setting "always" will be deprecated in the Liberty version. 
Please use "True" instead 2015-08-07 17:45:11.150 DEBUG nova.objects.instance [req-488e179e-bcd3-42a3-a23c-ab18e152f6d3 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lazy-loading `ec2_ids' on Instance uuid 127dae8a-26f4-4d23-862d-d23bbd64ca16 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:45:11.185 DEBUG oslo_concurrency.processutils [req-488e179e-bcd3-42a3-a23c-ab18e152f6d3 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Running cmd (subprocess): genisoimage -o /tmp/tmpjfakYd/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmpqDy4sh execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:45:11.264 DEBUG oslo_concurrency.processutils [req-488e179e-bcd3-42a3-a23c-ab18e152f6d3 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] CMD "genisoimage -o /tmp/tmpjfakYd/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmpqDy4sh" returned: 0 in 0.079s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:45:11.271 DEBUG oslo_concurrency.processutils [req-488e179e-bcd3-42a3-a23c-ab18e152f6d3 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmpjfakYd/configdrive of=/dev/xvdc oflag=direct,sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:45:14.197 DEBUG oslo_concurrency.processutils [req-488e179e-bcd3-42a3-a23c-ab18e152f6d3 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmpjfakYd/configdrive of=/dev/xvdc oflag=direct,sync" returned: 0 in 2.926s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:45:14.200 DEBUG oslo_concurrency.processutils [req-488e179e-bcd3-42a3-a23c-ab18e152f6d3 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:45:14.419 DEBUG oslo_concurrency.processutils [req-488e179e-bcd3-42a3-a23c-ab18e152f6d3 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf sync" returned: 0 in 0.218s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:45:14.421 DEBUG nova.virt.xenapi.vm_utils [req-488e179e-bcd3-42a3-a23c-ab18e152f6d3 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Destroying VBD for VDI OpaqueRef:2e9a651b-50a9-7e21-a0ad-ed8f893e4356 ... 
vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2207 2015-08-07 17:45:14.423 DEBUG oslo_concurrency.lockutils [req-488e179e-bcd3-42a3-a23c-ab18e152f6d3 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_unplug" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:45:14.647 DEBUG nova.virt.xenapi.vm_utils [req-e7299f84-7d21-423d-9497-176a584cc6bf tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Created VDI OpaqueRef:ff1c8691-bfb0-4c9c-0dcc-d48b5247769f (config-2, 67108864, False) on OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99. create_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:509 2015-08-07 17:45:14.651 DEBUG nova.virt.xenapi.vm_utils [req-e7299f84-7d21-423d-9497-176a584cc6bf tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Creating disk-type VBD for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:ff1c8691-bfb0-4c9c-0dcc-d48b5247769f ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:45:14.671 DEBUG nova.virt.xenapi.vm_utils [req-e7299f84-7d21-423d-9497-176a584cc6bf tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Created VBD OpaqueRef:5b23c64e-5efa-d3da-b4f0-d8a4287f4eb8 for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:ff1c8691-bfb0-4c9c-0dcc-d48b5247769f. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:45:14.671 DEBUG nova.virt.xenapi.vm_utils [req-e7299f84-7d21-423d-9497-176a584cc6bf tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Plugging VBD OpaqueRef:5b23c64e-5efa-d3da-b4f0-d8a4287f4eb8 ... vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2191 2015-08-07 17:45:15.049 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.96 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:45:15.090 DEBUG oslo_concurrency.lockutils [req-488e179e-bcd3-42a3-a23c-ab18e152f6d3 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_unplug" :: held 0.667s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:45:15.091 DEBUG oslo_concurrency.lockutils [req-e7299f84-7d21-423d-9497-176a584cc6bf tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_plug" :: waited 0.418s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:45:15.095 DEBUG nova.virt.xenapi.vm_utils [req-488e179e-bcd3-42a3-a23c-ab18e152f6d3 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Destroying VBD for VDI OpaqueRef:2e9a651b-50a9-7e21-a0ad-ed8f893e4356 done. vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2215 2015-08-07 17:45:15.096 DEBUG nova.virt.xenapi.vm_utils [req-488e179e-bcd3-42a3-a23c-ab18e152f6d3 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Creating disk-type VBD for VM OpaqueRef:7080a1fb-1888-2a74-0eb0-665f96bf2ee3, VDI OpaqueRef:2e9a651b-50a9-7e21-a0ad-ed8f893e4356 ... 
create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:45:15.121 DEBUG nova.virt.xenapi.vm_utils [req-488e179e-bcd3-42a3-a23c-ab18e152f6d3 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Created VBD OpaqueRef:c2961096-b459-2a4a-99cc-8251120940d0 for VM OpaqueRef:7080a1fb-1888-2a74-0eb0-665f96bf2ee3, VDI OpaqueRef:2e9a651b-50a9-7e21-a0ad-ed8f893e4356. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:45:15.122 DEBUG nova.objects.instance [req-488e179e-bcd3-42a3-a23c-ab18e152f6d3 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lazy-loading `pci_devices' on Instance uuid 127dae8a-26f4-4d23-862d-d23bbd64ca16 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:45:15.225 DEBUG nova.virt.xenapi.vmops [req-488e179e-bcd3-42a3-a23c-ab18e152f6d3 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 127dae8a-26f4-4d23-862d-d23bbd64ca16] Updating progress to 50 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:45:15.393 DEBUG oslo_concurrency.lockutils [req-488e179e-bcd3-42a3-a23c-ab18e152f6d3 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "xenstore-127dae8a-26f4-4d23-862d-d23bbd64ca16" acquired by "store_meta" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:45:15.394 DEBUG oslo_concurrency.lockutils [req-488e179e-bcd3-42a3-a23c-ab18e152f6d3 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "xenstore-127dae8a-26f4-4d23-862d-d23bbd64ca16" released by "store_meta" :: held 0.001s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:45:15.394 DEBUG oslo_concurrency.lockutils [req-488e179e-bcd3-42a3-a23c-ab18e152f6d3 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "xenstore-127dae8a-26f4-4d23-862d-d23bbd64ca16" acquired by "store_auto_disk_config" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:45:15.405 DEBUG oslo_concurrency.lockutils [req-488e179e-bcd3-42a3-a23c-ab18e152f6d3 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "xenstore-127dae8a-26f4-4d23-862d-d23bbd64ca16" released by "store_auto_disk_config" :: held 0.011s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:45:15.406 DEBUG nova.virt.xenapi.vmops [req-488e179e-bcd3-42a3-a23c-ab18e152f6d3 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 127dae8a-26f4-4d23-862d-d23bbd64ca16] Injecting hostname (tempest-server-255189760) into xenstore _inject_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1940 2015-08-07 17:45:15.406 DEBUG oslo_concurrency.lockutils [req-488e179e-bcd3-42a3-a23c-ab18e152f6d3 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "xenstore-127dae8a-26f4-4d23-862d-d23bbd64ca16" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:45:15.413 DEBUG oslo_concurrency.lockutils [req-488e179e-bcd3-42a3-a23c-ab18e152f6d3 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "xenstore-127dae8a-26f4-4d23-862d-d23bbd64ca16" released by "update_hostname" :: held 0.007s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:45:15.414 DEBUG 
nova.virt.xenapi.vmops [req-488e179e-bcd3-42a3-a23c-ab18e152f6d3 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 127dae8a-26f4-4d23-862d-d23bbd64ca16] Injecting network info to xenstore inject_network_info /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1869 2015-08-07 17:45:15.415 DEBUG oslo_concurrency.lockutils [req-488e179e-bcd3-42a3-a23c-ab18e152f6d3 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "xenstore-127dae8a-26f4-4d23-862d-d23bbd64ca16" acquired by "update_nwinfo" :: waited 0.001s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:45:15.561 DEBUG oslo_concurrency.lockutils [req-488e179e-bcd3-42a3-a23c-ab18e152f6d3 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "xenstore-127dae8a-26f4-4d23-862d-d23bbd64ca16" released by "update_nwinfo" :: held 0.146s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:45:15.562 DEBUG nova.virt.xenapi.vmops [req-488e179e-bcd3-42a3-a23c-ab18e152f6d3 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 127dae8a-26f4-4d23-862d-d23bbd64ca16] Updating progress to 60 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:45:15.704 DEBUG nova.virt.xenapi.vmops [req-488e179e-bcd3-42a3-a23c-ab18e152f6d3 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 127dae8a-26f4-4d23-862d-d23bbd64ca16] Creating vifs _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1892 2015-08-07 17:45:15.711 DEBUG nova.virt.xenapi.vmops [req-488e179e-bcd3-42a3-a23c-ab18e152f6d3 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 127dae8a-26f4-4d23-862d-d23bbd64ca16] Creating VIF for network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1902 2015-08-07 17:45:15.717 DEBUG nova.virt.xenapi.vmops [req-488e179e-bcd3-42a3-a23c-ab18e152f6d3 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 127dae8a-26f4-4d23-862d-d23bbd64ca16] Created VIF OpaqueRef:34115bb3-5dd1-8ad6-f38a-02132dd5e806, network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1906 2015-08-07 17:45:15.718 DEBUG nova.virt.xenapi.vmops [req-488e179e-bcd3-42a3-a23c-ab18e152f6d3 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 127dae8a-26f4-4d23-862d-d23bbd64ca16] Updating progress to 70 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:45:16.071 DEBUG nova.virt.xenapi.vmops [req-488e179e-bcd3-42a3-a23c-ab18e152f6d3 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 127dae8a-26f4-4d23-862d-d23bbd64ca16] Starting instance _start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:327 2015-08-07 17:45:16.245 DEBUG oslo_concurrency.lockutils [req-e7299f84-7d21-423d-9497-176a584cc6bf tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_plug" :: held 1.154s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:45:16.245 DEBUG nova.virt.xenapi.vm_utils [req-e7299f84-7d21-423d-9497-176a584cc6bf tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Plugging VBD OpaqueRef:5b23c64e-5efa-d3da-b4f0-d8a4287f4eb8 
done. vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2194 2015-08-07 17:45:16.248 DEBUG nova.virt.xenapi.vm_utils [req-e7299f84-7d21-423d-9497-176a584cc6bf tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] VBD OpaqueRef:5b23c64e-5efa-d3da-b4f0-d8a4287f4eb8 plugged as xvdd vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2197 2015-08-07 17:45:16.315 WARNING nova.virt.configdrive [req-e7299f84-7d21-423d-9497-176a584cc6bf tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] The setting "always" will be deprecated in the Liberty version. Please use "True" instead 2015-08-07 17:45:16.316 DEBUG nova.objects.instance [req-e7299f84-7d21-423d-9497-176a584cc6bf tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lazy-loading `ec2_ids' on Instance uuid 115845a8-7055-44f5-a3f5-222bd027f1ed obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:45:16.347 DEBUG oslo_concurrency.processutils [req-e7299f84-7d21-423d-9497-176a584cc6bf tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Running cmd (subprocess): genisoimage -o /tmp/tmp6ZZ3D1/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmpMY1Q88 execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:45:16.423 DEBUG oslo_concurrency.processutils [req-e7299f84-7d21-423d-9497-176a584cc6bf tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] CMD "genisoimage -o /tmp/tmp6ZZ3D1/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmpMY1Q88" returned: 0 in 0.076s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:45:16.428 DEBUG oslo_concurrency.processutils [req-e7299f84-7d21-423d-9497-176a584cc6bf tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmp6ZZ3D1/configdrive of=/dev/xvdd oflag=direct,sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:45:16.985 DEBUG nova.virt.xenapi.vmops [req-8cb2558f-e53d-45f5-bdab-f7df53dff28d tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] [instance: b4eac7df-4935-4b77-8307-8c8cabe2c038] Waiting for instance state to become running _wait_for_instance_to_start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:777 2015-08-07 17:45:17.004 DEBUG nova.virt.xenapi.vmops [req-8cb2558f-e53d-45f5-bdab-f7df53dff28d tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] [instance: b4eac7df-4935-4b77-8307-8c8cabe2c038] Updating progress to 82 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:45:17.241 DEBUG nova.virt.xenapi.vmops [req-8cb2558f-e53d-45f5-bdab-f7df53dff28d tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] [instance: b4eac7df-4935-4b77-8307-8c8cabe2c038] Skip agent setup, not enabled. 
_configure_new_instance_with_agent /opt/stack/new/nova/nova/virt/xenapi/vmops.py:788 2015-08-07 17:45:17.242 DEBUG nova.virt.xenapi.vmops [req-8cb2558f-e53d-45f5-bdab-f7df53dff28d tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] [instance: b4eac7df-4935-4b77-8307-8c8cabe2c038] Removing hostname from xenstore _remove_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1949 2015-08-07 17:45:17.243 DEBUG oslo_concurrency.lockutils [req-8cb2558f-e53d-45f5-bdab-f7df53dff28d tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Lock "xenstore-b4eac7df-4935-4b77-8307-8c8cabe2c038" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:45:17.247 DEBUG oslo_concurrency.lockutils [req-8cb2558f-e53d-45f5-bdab-f7df53dff28d tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Lock "xenstore-b4eac7df-4935-4b77-8307-8c8cabe2c038" released by "update_hostname" :: held 0.005s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:45:17.248 DEBUG nova.virt.xenapi.vmops [req-8cb2558f-e53d-45f5-bdab-f7df53dff28d tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] [instance: b4eac7df-4935-4b77-8307-8c8cabe2c038] Updating progress to 91 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:45:17.559 DEBUG nova.virt.xenapi.vmops [req-8cb2558f-e53d-45f5-bdab-f7df53dff28d tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] [instance: b4eac7df-4935-4b77-8307-8c8cabe2c038] Updating progress to 100 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:45:17.866 DEBUG nova.compute.manager [req-8cb2558f-e53d-45f5-bdab-f7df53dff28d tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] [instance: b4eac7df-4935-4b77-8307-8c8cabe2c038] Checking state _get_power_state /opt/stack/new/nova/nova/compute/manager.py:1296 2015-08-07 17:45:19.357 INFO nova.compute.manager [req-722a1bdd-7ec8-4b81-8f82-1b72cf4d7e9a tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] [instance: b197c990-eecd-403b-b8d7-9e57e7053a16] Pausing 2015-08-07 17:45:19.432 DEBUG nova.compute.manager [req-722a1bdd-7ec8-4b81-8f82-1b72cf4d7e9a tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] [instance: b197c990-eecd-403b-b8d7-9e57e7053a16] Checking state _get_power_state /opt/stack/new/nova/nova/compute/manager.py:1296 2015-08-07 17:45:19.685 DEBUG oslo_concurrency.processutils [req-e7299f84-7d21-423d-9497-176a584cc6bf tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmp6ZZ3D1/configdrive of=/dev/xvdd oflag=direct,sync" returned: 0 in 3.257s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:45:19.688 DEBUG oslo_concurrency.processutils [req-e7299f84-7d21-423d-9497-176a584cc6bf tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:45:19.930 DEBUG oslo_concurrency.processutils 
[req-e7299f84-7d21-423d-9497-176a584cc6bf tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf sync" returned: 0 in 0.241s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:45:19.933 DEBUG nova.virt.xenapi.vm_utils [req-e7299f84-7d21-423d-9497-176a584cc6bf tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Destroying VBD for VDI OpaqueRef:ff1c8691-bfb0-4c9c-0dcc-d48b5247769f ... vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2207 2015-08-07 17:45:19.935 DEBUG oslo_concurrency.lockutils [req-e7299f84-7d21-423d-9497-176a584cc6bf tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_unplug" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:45:20.993 DEBUG oslo_concurrency.lockutils [req-e7299f84-7d21-423d-9497-176a584cc6bf tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_unplug" :: held 1.058s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:45:21.002 DEBUG nova.virt.xenapi.vm_utils [req-e7299f84-7d21-423d-9497-176a584cc6bf tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Destroying VBD for VDI OpaqueRef:ff1c8691-bfb0-4c9c-0dcc-d48b5247769f done. vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2215 2015-08-07 17:45:21.003 DEBUG nova.virt.xenapi.vm_utils [req-e7299f84-7d21-423d-9497-176a584cc6bf tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Creating disk-type VBD for VM OpaqueRef:ea1bb293-c1f4-f0df-ccf9-27d1e82a9f6d, VDI OpaqueRef:ff1c8691-bfb0-4c9c-0dcc-d48b5247769f ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:45:21.014 DEBUG nova.virt.xenapi.vm_utils [req-e7299f84-7d21-423d-9497-176a584cc6bf tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Created VBD OpaqueRef:38142e36-bd8f-5219-82d7-f6055eb75985 for VM OpaqueRef:ea1bb293-c1f4-f0df-ccf9-27d1e82a9f6d, VDI OpaqueRef:ff1c8691-bfb0-4c9c-0dcc-d48b5247769f. 
create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:45:21.016 DEBUG nova.objects.instance [req-e7299f84-7d21-423d-9497-176a584cc6bf tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lazy-loading `pci_devices' on Instance uuid 115845a8-7055-44f5-a3f5-222bd027f1ed obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:45:21.121 DEBUG oslo_concurrency.lockutils [req-e7299f84-7d21-423d-9497-176a584cc6bf tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "xenstore-115845a8-7055-44f5-a3f5-222bd027f1ed" acquired by "store_meta" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:45:21.122 DEBUG oslo_concurrency.lockutils [req-e7299f84-7d21-423d-9497-176a584cc6bf tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "xenstore-115845a8-7055-44f5-a3f5-222bd027f1ed" released by "store_meta" :: held 0.001s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:45:21.123 DEBUG oslo_concurrency.lockutils [req-e7299f84-7d21-423d-9497-176a584cc6bf tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "xenstore-115845a8-7055-44f5-a3f5-222bd027f1ed" acquired by "store_auto_disk_config" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:45:21.132 DEBUG oslo_concurrency.lockutils [req-e7299f84-7d21-423d-9497-176a584cc6bf tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "xenstore-115845a8-7055-44f5-a3f5-222bd027f1ed" released by "store_auto_disk_config" :: held 0.010s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:45:21.134 DEBUG nova.virt.xenapi.vmops [req-e7299f84-7d21-423d-9497-176a584cc6bf tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 115845a8-7055-44f5-a3f5-222bd027f1ed] Injecting network info to xenstore inject_network_info /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1869 2015-08-07 17:45:21.134 DEBUG oslo_concurrency.lockutils [req-e7299f84-7d21-423d-9497-176a584cc6bf tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "xenstore-115845a8-7055-44f5-a3f5-222bd027f1ed" acquired by "update_nwinfo" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:45:21.191 INFO nova.compute.manager [req-d6efbd86-7581-4306-8517-4d0653d902ba tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] [instance: b197c990-eecd-403b-b8d7-9e57e7053a16] Unpausing 2015-08-07 17:45:21.329 DEBUG nova.compute.manager [req-d6efbd86-7581-4306-8517-4d0653d902ba tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] [instance: b197c990-eecd-403b-b8d7-9e57e7053a16] Checking state _get_power_state /opt/stack/new/nova/nova/compute/manager.py:1296 2015-08-07 17:45:21.334 DEBUG oslo_concurrency.lockutils [req-e7299f84-7d21-423d-9497-176a584cc6bf tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "xenstore-115845a8-7055-44f5-a3f5-222bd027f1ed" released by "update_nwinfo" :: held 0.200s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:45:21.335 DEBUG nova.virt.xenapi.vmops [req-e7299f84-7d21-423d-9497-176a584cc6bf 
tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 115845a8-7055-44f5-a3f5-222bd027f1ed] Creating vifs _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1892 2015-08-07 17:45:21.344 DEBUG nova.virt.xenapi.vmops [req-e7299f84-7d21-423d-9497-176a584cc6bf tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 115845a8-7055-44f5-a3f5-222bd027f1ed] Creating VIF for network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1902 2015-08-07 17:45:21.356 DEBUG nova.virt.xenapi.vmops [req-e7299f84-7d21-423d-9497-176a584cc6bf tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 115845a8-7055-44f5-a3f5-222bd027f1ed] Created VIF OpaqueRef:db4f01d1-012d-322b-02f1-0a5ca03a4e71, network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1906 2015-08-07 17:45:21.357 DEBUG nova.virt.xenapi.vmops [req-e7299f84-7d21-423d-9497-176a584cc6bf tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 115845a8-7055-44f5-a3f5-222bd027f1ed] Updating progress to 100 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:45:22.312 DEBUG nova.virt.xenapi.vmops [req-488e179e-bcd3-42a3-a23c-ab18e152f6d3 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 127dae8a-26f4-4d23-862d-d23bbd64ca16] Waiting for instance state to become running _wait_for_instance_to_start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:777 2015-08-07 17:45:22.321 DEBUG nova.virt.xenapi.vmops [req-488e179e-bcd3-42a3-a23c-ab18e152f6d3 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 127dae8a-26f4-4d23-862d-d23bbd64ca16] Updating progress to 80 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:45:22.494 DEBUG nova.virt.xenapi.vmops [req-488e179e-bcd3-42a3-a23c-ab18e152f6d3 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 127dae8a-26f4-4d23-862d-d23bbd64ca16] Skip agent setup, not enabled. 
_configure_new_instance_with_agent /opt/stack/new/nova/nova/virt/xenapi/vmops.py:788 2015-08-07 17:45:22.494 DEBUG nova.virt.xenapi.vmops [req-488e179e-bcd3-42a3-a23c-ab18e152f6d3 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 127dae8a-26f4-4d23-862d-d23bbd64ca16] Removing hostname from xenstore _remove_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1949 2015-08-07 17:45:22.495 DEBUG oslo_concurrency.lockutils [req-488e179e-bcd3-42a3-a23c-ab18e152f6d3 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "xenstore-127dae8a-26f4-4d23-862d-d23bbd64ca16" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:45:22.500 DEBUG oslo_concurrency.lockutils [req-488e179e-bcd3-42a3-a23c-ab18e152f6d3 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "xenstore-127dae8a-26f4-4d23-862d-d23bbd64ca16" released by "update_hostname" :: held 0.005s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:45:22.500 DEBUG nova.virt.xenapi.vmops [req-488e179e-bcd3-42a3-a23c-ab18e152f6d3 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 127dae8a-26f4-4d23-862d-d23bbd64ca16] Updating progress to 90 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:45:22.666 DEBUG nova.virt.xenapi.vmops [req-488e179e-bcd3-42a3-a23c-ab18e152f6d3 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 127dae8a-26f4-4d23-862d-d23bbd64ca16] Updating progress to 100 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:45:22.820 DEBUG nova.compute.manager [req-488e179e-bcd3-42a3-a23c-ab18e152f6d3 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 127dae8a-26f4-4d23-862d-d23bbd64ca16] Checking state _get_power_state /opt/stack/new/nova/nova/compute/manager.py:1296 2015-08-07 17:45:23.087 DEBUG oslo_concurrency.lockutils [req-488e179e-bcd3-42a3-a23c-ab18e152f6d3 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "127dae8a-26f4-4d23-862d-d23bbd64ca16" released by "_locked_do_build_and_run_instance" :: held 19.888s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:45:23.123 DEBUG oslo_concurrency.lockutils [req-1fb361b3-2086-4c88-bec7-6f18100195ec tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "115845a8-7055-44f5-a3f5-222bd027f1ed" acquired by "do_confirm_resize" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:45:23.123 DEBUG nova.compute.manager [req-1fb361b3-2086-4c88-bec7-6f18100195ec tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 115845a8-7055-44f5-a3f5-222bd027f1ed] Going to confirm migration 6 do_confirm_resize /opt/stack/new/nova/nova/compute/manager.py:3243 2015-08-07 17:45:24.020 DEBUG oslo_concurrency.lockutils [req-1fb361b3-2086-4c88-bec7-6f18100195ec tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Acquired semaphore "refresh_cache-115845a8-7055-44f5-a3f5-222bd027f1ed" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:197 2015-08-07 17:45:24.167 DEBUG nova.network.base_api [req-1fb361b3-2086-4c88-bec7-6f18100195ec tempest-ServerActionsTestJSON-2080221373 
tempest-ServerActionsTestJSON-2017217826] [instance: 115845a8-7055-44f5-a3f5-222bd027f1ed] Updating instance_info_cache with network_info: [VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.2'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:97:17:64', 'active': False, 'type': u'bridge', 'id': u'd2af0ccf-a764-449d-b25c-7353a5007c58', 'qbg_params': None})] update_instance_cache_with_nw_info /opt/stack/new/nova/nova/network/base_api.py:43 2015-08-07 17:45:24.203 DEBUG oslo_concurrency.lockutils [req-1fb361b3-2086-4c88-bec7-6f18100195ec tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Releasing semaphore "refresh_cache-115845a8-7055-44f5-a3f5-222bd027f1ed" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:210 2015-08-07 17:45:24.208 WARNING nova.virt.xenapi.vm_utils [req-1fb361b3-2086-4c88-bec7-6f18100195ec tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 115845a8-7055-44f5-a3f5-222bd027f1ed] VM already halted, skipping shutdown... 
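[Editor's note] The oslo_concurrency.processutils entries above trace the config-drive sequence Nova runs for each of these instances: genisoimage builds an ISO labelled config-2 from a temporary metadata directory, dd copies the image onto the plugged VBD device through nova-rootwrap, and a final sync flushes it before the VBD is unplugged. The following is a minimal illustrative sketch of that same sequence, not Nova's actual implementation (which lives in nova.virt.configdrive and the XenAPI vmops/vm_utils code); it assumes only that oslo.concurrency is installed, and the function name, paths, and device argument are hypothetical placeholders.

    # Illustrative sketch only -- not Nova's code. It reproduces the three
    # commands visible in the log using oslo_concurrency.processutils.
    from oslo_concurrency import processutils

    # Root helper string as it appears in the logged commands.
    ROOT_HELPER = 'sudo nova-rootwrap /etc/nova/rootwrap.conf'

    def build_and_write_configdrive(metadata_dir, iso_path, target_dev):
        # 1. Build the ISO9660 config drive image (same flags as logged).
        processutils.execute(
            'genisoimage', '-o', iso_path,
            '-ldots', '-allow-lowercase', '-allow-multidot', '-l',
            '-publisher', 'OpenStack Nova 12.0.0',
            '-quiet', '-J', '-r', '-V', 'config-2', metadata_dir)
        # 2. Write the image onto the attached VBD device; oflag=direct,sync
        #    bypasses the page cache and flushes each block as it is written.
        processutils.execute(
            'dd', 'if=%s' % iso_path, 'of=%s' % target_dev,
            'oflag=direct,sync',
            run_as_root=True, root_helper=ROOT_HELPER)
        # 3. Final sync so the data is durable before the VBD is unplugged.
        processutils.execute('sync', run_as_root=True, root_helper=ROOT_HELPER)

The direct,sync flags plus the trailing sync are why the dd steps above take several seconds (2.926s and 3.257s for a 64 MiB config-2 VDI): the write is forced down to the hypervisor's storage before Nova destroys the VBD and hands the disk to the guest.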
2015-08-07 17:45:24.217 DEBUG nova.virt.xenapi.vmops [req-1fb361b3-2086-4c88-bec7-6f18100195ec tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 115845a8-7055-44f5-a3f5-222bd027f1ed] Destroying VDIs _destroy_vdis /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1428 2015-08-07 17:45:24.224 DEBUG nova.virt.xenapi.vm_utils [req-1fb361b3-2086-4c88-bec7-6f18100195ec tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] VDI af2dddaa-3b62-4306-8eb4-08cbd9b685cd is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:45:24.270 DEBUG nova.virt.xenapi.vm_utils [req-1fb361b3-2086-4c88-bec7-6f18100195ec tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] VDI 98092af1-40ef-46f7-aa3b-2582b5c31cd7 is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:45:24.724 DEBUG oslo_concurrency.lockutils [req-f446e31a-feb1-4926-bc23-f9b94bd71366 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "420208c3-bf05-4834-98b5-1c8afda79f97" acquired by "do_terminate_instance" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:45:24.725 DEBUG oslo_concurrency.lockutils [req-f446e31a-feb1-4926-bc23-f9b94bd71366 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "420208c3-bf05-4834-98b5-1c8afda79f97-events" acquired by "_clear_events" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:45:24.725 DEBUG oslo_concurrency.lockutils [req-f446e31a-feb1-4926-bc23-f9b94bd71366 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "420208c3-bf05-4834-98b5-1c8afda79f97-events" released by "_clear_events" :: held 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:45:24.727 INFO nova.compute.manager [req-f446e31a-feb1-4926-bc23-f9b94bd71366 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 420208c3-bf05-4834-98b5-1c8afda79f97] Terminating instance 2015-08-07 17:45:24.728 INFO nova.virt.xenapi.vmops [req-f446e31a-feb1-4926-bc23-f9b94bd71366 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 420208c3-bf05-4834-98b5-1c8afda79f97] Destroying VM 2015-08-07 17:45:24.734 DEBUG nova.virt.xenapi.vm_utils [req-f446e31a-feb1-4926-bc23-f9b94bd71366 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 420208c3-bf05-4834-98b5-1c8afda79f97] Shutting down VM (hard) hard_shutdown_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:348 2015-08-07 17:45:24.907 DEBUG nova.virt.xenapi.vmops [req-1fb361b3-2086-4c88-bec7-6f18100195ec tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 115845a8-7055-44f5-a3f5-222bd027f1ed] Using RAW or VHD, skipping kernel and ramdisk deletion _destroy_kernel_ramdisk /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1456 2015-08-07 17:45:24.918 DEBUG nova.virt.xenapi.vm_utils [req-1fb361b3-2086-4c88-bec7-6f18100195ec tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 115845a8-7055-44f5-a3f5-222bd027f1ed] VM destroyed destroy_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:324 2015-08-07 17:45:24.949 DEBUG oslo_concurrency.lockutils [req-1fb361b3-2086-4c88-bec7-6f18100195ec tempest-ServerActionsTestJSON-2080221373 
tempest-ServerActionsTestJSON-2017217826] Lock "compute_resources" acquired by "drop_move_claim" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:45:24.996 DEBUG oslo_concurrency.lockutils [req-1fb361b3-2086-4c88-bec7-6f18100195ec tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "compute_resources" released by "drop_move_claim" :: held 0.048s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:45:24.997 DEBUG nova.compute.manager [req-1fb361b3-2086-4c88-bec7-6f18100195ec tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 115845a8-7055-44f5-a3f5-222bd027f1ed] Resized/migrated instance is powered off. Setting vm_state to 'stopped'. _confirm_resize /opt/stack/new/nova/nova/compute/manager.py:3331 2015-08-07 17:45:25.055 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.95 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:45:25.071 DEBUG oslo_concurrency.lockutils [req-441a0373-8d87-41ce-a960-a552a47e28a6 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "127dae8a-26f4-4d23-862d-d23bbd64ca16" acquired by "do_terminate_instance" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:45:25.072 DEBUG oslo_concurrency.lockutils [req-441a0373-8d87-41ce-a960-a552a47e28a6 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "127dae8a-26f4-4d23-862d-d23bbd64ca16-events" acquired by "_clear_events" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:45:25.072 DEBUG oslo_concurrency.lockutils [req-441a0373-8d87-41ce-a960-a552a47e28a6 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "127dae8a-26f4-4d23-862d-d23bbd64ca16-events" released by "_clear_events" :: held 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:45:25.074 INFO nova.compute.manager [req-441a0373-8d87-41ce-a960-a552a47e28a6 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 127dae8a-26f4-4d23-862d-d23bbd64ca16] Terminating instance 2015-08-07 17:45:25.075 INFO nova.virt.xenapi.vmops [req-441a0373-8d87-41ce-a960-a552a47e28a6 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 127dae8a-26f4-4d23-862d-d23bbd64ca16] Destroying VM 2015-08-07 17:45:25.082 DEBUG nova.virt.xenapi.vm_utils [req-441a0373-8d87-41ce-a960-a552a47e28a6 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 127dae8a-26f4-4d23-862d-d23bbd64ca16] Shutting down VM (hard) hard_shutdown_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:348 2015-08-07 17:45:25.152 DEBUG oslo_concurrency.lockutils [req-1fb361b3-2086-4c88-bec7-6f18100195ec tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "115845a8-7055-44f5-a3f5-222bd027f1ed" released by "do_confirm_resize" :: held 2.029s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:45:25.900 DEBUG oslo_concurrency.lockutils [req-a5066ddc-a9ff-4a23-888a-38d6fc7333ab tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Acquired semaphore "refresh_cache-115845a8-7055-44f5-a3f5-222bd027f1ed" lock 
/usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:197 2015-08-07 17:45:25.915 INFO nova.compute.manager [req-b850b310-ed13-4ef7-a461-6bb5190862a0 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] [instance: b197c990-eecd-403b-b8d7-9e57e7053a16] Rescuing 2015-08-07 17:45:25.916 DEBUG oslo_concurrency.lockutils [req-b850b310-ed13-4ef7-a461-6bb5190862a0 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Acquired semaphore "refresh_cache-b197c990-eecd-403b-b8d7-9e57e7053a16" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:197 2015-08-07 17:45:26.042 DEBUG nova.network.base_api [req-b850b310-ed13-4ef7-a461-6bb5190862a0 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] [instance: b197c990-eecd-403b-b8d7-9e57e7053a16] Updating instance_info_cache with network_info: [VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.7'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:be:cf:fd', 'active': False, 'type': u'bridge', 'id': u'65310656-cc34-4b05-84ec-9c385b0c36a8', 'qbg_params': None})] update_instance_cache_with_nw_info /opt/stack/new/nova/nova/network/base_api.py:43 2015-08-07 17:45:26.060 DEBUG oslo_concurrency.lockutils [req-b850b310-ed13-4ef7-a461-6bb5190862a0 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Releasing semaphore "refresh_cache-b197c990-eecd-403b-b8d7-9e57e7053a16" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:210 2015-08-07 17:45:26.098 DEBUG nova.network.base_api [req-a5066ddc-a9ff-4a23-888a-38d6fc7333ab tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 115845a8-7055-44f5-a3f5-222bd027f1ed] Updating instance_info_cache with network_info: [VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.2'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': 
True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:97:17:64', 'active': False, 'type': u'bridge', 'id': u'd2af0ccf-a764-449d-b25c-7353a5007c58', 'qbg_params': None})] update_instance_cache_with_nw_info /opt/stack/new/nova/nova/network/base_api.py:43 2015-08-07 17:45:26.125 DEBUG oslo_concurrency.lockutils [req-a5066ddc-a9ff-4a23-888a-38d6fc7333ab tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Releasing semaphore "refresh_cache-115845a8-7055-44f5-a3f5-222bd027f1ed" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:210 2015-08-07 17:45:26.156 DEBUG nova.virt.xenapi.vmops [req-a5066ddc-a9ff-4a23-888a-38d6fc7333ab tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 115845a8-7055-44f5-a3f5-222bd027f1ed] Starting instance _start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:327 2015-08-07 17:45:26.433 DEBUG nova.virt.xenapi.vm_utils [req-b850b310-ed13-4ef7-a461-6bb5190862a0 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] [instance: b197c990-eecd-403b-b8d7-9e57e7053a16] Shutting down VM (hard) hard_shutdown_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:348 2015-08-07 17:45:27.237 DEBUG nova.virt.xenapi.vmops [req-f446e31a-feb1-4926-bc23-f9b94bd71366 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 420208c3-bf05-4834-98b5-1c8afda79f97] Destroying VDIs _destroy_vdis /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1428 2015-08-07 17:45:27.254 DEBUG nova.virt.xenapi.vm_utils [req-f446e31a-feb1-4926-bc23-f9b94bd71366 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] VDI 90584c4f-c50d-4795-97bf-945a75b50d40 is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:45:27.271 DEBUG nova.virt.xenapi.vm_utils [req-f446e31a-feb1-4926-bc23-f9b94bd71366 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] VDI 24b02e4a-d010-4940-9d15-c646a8b0a56e is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:45:27.437 DEBUG nova.virt.xenapi.vmops [req-441a0373-8d87-41ce-a960-a552a47e28a6 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 127dae8a-26f4-4d23-862d-d23bbd64ca16] Destroying VDIs _destroy_vdis /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1428 2015-08-07 17:45:27.449 DEBUG nova.virt.xenapi.vm_utils [req-441a0373-8d87-41ce-a960-a552a47e28a6 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] VDI 669668f4-2f5c-4b24-bb7f-666d0262c630 is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:45:27.464 DEBUG nova.virt.xenapi.vm_utils [req-441a0373-8d87-41ce-a960-a552a47e28a6 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] VDI 120cd7df-c823-4a92-b924-1b1018ec6a0c is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:45:27.935 DEBUG nova.virt.xenapi.vmops [req-f446e31a-feb1-4926-bc23-f9b94bd71366 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 420208c3-bf05-4834-98b5-1c8afda79f97] Using RAW or VHD, skipping kernel and ramdisk deletion _destroy_kernel_ramdisk /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1456 2015-08-07 17:45:27.945 DEBUG 
nova.virt.xenapi.vm_utils [req-f446e31a-feb1-4926-bc23-f9b94bd71366 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 420208c3-bf05-4834-98b5-1c8afda79f97] VM destroyed destroy_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:324 2015-08-07 17:45:27.945 DEBUG nova.compute.manager [req-f446e31a-feb1-4926-bc23-f9b94bd71366 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 420208c3-bf05-4834-98b5-1c8afda79f97] Deallocating network for instance _deallocate_network /opt/stack/new/nova/nova/compute/manager.py:1792 2015-08-07 17:45:28.729 WARNING nova.virt.xenapi.vm_utils [req-b850b310-ed13-4ef7-a461-6bb5190862a0 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] [instance: b197c990-eecd-403b-b8d7-9e57e7053a16] VM already halted, skipping shutdown... 2015-08-07 17:45:28.755 DEBUG nova.virt.xenapi.vm_utils [req-b850b310-ed13-4ef7-a461-6bb5190862a0 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Detected vhd format for image a69d8d55-1745-492f-be26-cc75d64fc94d determine_disk_image_type /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1633 2015-08-07 17:45:28.756 DEBUG nova.virt.xenapi.vmops [req-b850b310-ed13-4ef7-a461-6bb5190862a0 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] [instance: b197c990-eecd-403b-b8d7-9e57e7053a16] Updating progress to 9 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:45:28.921 DEBUG oslo_concurrency.lockutils [req-b850b310-ed13-4ef7-a461-6bb5190862a0 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" acquired by "_create_cached_image_impl" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:45:29.206 DEBUG nova.compute.manager [req-f446e31a-feb1-4926-bc23-f9b94bd71366 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 420208c3-bf05-4834-98b5-1c8afda79f97] terminating bdm BlockDeviceMapping(boot_index=0,connection_info=None,created_at=2015-08-07T17:44:42Z,delete_on_termination=True,deleted=False,deleted_at=None,destination_type='local',device_name='/dev/xvda',device_type='disk',disk_bus=None,guest_format=None,id=66,image_id='a69d8d55-1745-492f-be26-cc75d64fc94d',instance=,instance_uuid=420208c3-bf05-4834-98b5-1c8afda79f97,no_device=False,snapshot_id=None,source_type='image',updated_at=2015-08-07T17:44:43Z,volume_id=None,volume_size=None) _cleanup_volumes /opt/stack/new/nova/nova/compute/manager.py:2272 2015-08-07 17:45:29.352 DEBUG oslo_concurrency.lockutils [req-f446e31a-feb1-4926-bc23-f9b94bd71366 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "compute_resources" acquired by "update_usage" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:45:29.354 DEBUG nova.objects.instance [req-f446e31a-feb1-4926-bc23-f9b94bd71366 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lazy-loading `numa_topology' on Instance uuid 420208c3-bf05-4834-98b5-1c8afda79f97 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:45:29.418 DEBUG oslo_concurrency.lockutils [req-f446e31a-feb1-4926-bc23-f9b94bd71366 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "compute_resources" released by 
"update_usage" :: held 0.066s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:45:29.445 DEBUG nova.virt.xenapi.vmops [req-441a0373-8d87-41ce-a960-a552a47e28a6 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 127dae8a-26f4-4d23-862d-d23bbd64ca16] Using RAW or VHD, skipping kernel and ramdisk deletion _destroy_kernel_ramdisk /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1456 2015-08-07 17:45:29.457 DEBUG nova.virt.xenapi.vm_utils [req-441a0373-8d87-41ce-a960-a552a47e28a6 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 127dae8a-26f4-4d23-862d-d23bbd64ca16] VM destroyed destroy_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:324 2015-08-07 17:45:29.458 DEBUG nova.compute.manager [req-441a0373-8d87-41ce-a960-a552a47e28a6 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 127dae8a-26f4-4d23-862d-d23bbd64ca16] Deallocating network for instance _deallocate_network /opt/stack/new/nova/nova/compute/manager.py:1792 2015-08-07 17:45:29.652 DEBUG oslo_concurrency.lockutils [req-f446e31a-feb1-4926-bc23-f9b94bd71366 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "420208c3-bf05-4834-98b5-1c8afda79f97" released by "do_terminate_instance" :: held 4.927s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:45:30.666 DEBUG nova.compute.manager [req-441a0373-8d87-41ce-a960-a552a47e28a6 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 127dae8a-26f4-4d23-862d-d23bbd64ca16] terminating bdm BlockDeviceMapping(boot_index=0,connection_info=None,created_at=2015-08-07T17:45:02Z,delete_on_termination=True,deleted=False,deleted_at=None,destination_type='local',device_name='/dev/xvda',device_type='disk',disk_bus=None,guest_format=None,id=67,image_id='a69d8d55-1745-492f-be26-cc75d64fc94d',instance=,instance_uuid=127dae8a-26f4-4d23-862d-d23bbd64ca16,no_device=False,snapshot_id=None,source_type='image',updated_at=2015-08-07T17:45:04Z,volume_id=None,volume_size=None) _cleanup_volumes /opt/stack/new/nova/nova/compute/manager.py:2272 2015-08-07 17:45:30.798 DEBUG oslo_concurrency.lockutils [req-441a0373-8d87-41ce-a960-a552a47e28a6 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "compute_resources" acquired by "update_usage" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:45:30.798 DEBUG nova.objects.instance [req-441a0373-8d87-41ce-a960-a552a47e28a6 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lazy-loading `numa_topology' on Instance uuid 127dae8a-26f4-4d23-862d-d23bbd64ca16 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:45:30.859 DEBUG oslo_concurrency.lockutils [req-441a0373-8d87-41ce-a960-a552a47e28a6 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "compute_resources" released by "update_usage" :: held 0.061s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:45:31.122 DEBUG oslo_concurrency.lockutils [req-441a0373-8d87-41ce-a960-a552a47e28a6 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "127dae8a-26f4-4d23-862d-d23bbd64ca16" released by "do_terminate_instance" :: held 6.050s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:45:31.803 DEBUG nova.compute.manager 
[req-a5066ddc-a9ff-4a23-888a-38d6fc7333ab tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 115845a8-7055-44f5-a3f5-222bd027f1ed] Checking state _get_power_state /opt/stack/new/nova/nova/compute/manager.py:1296 2015-08-07 17:45:32.532 DEBUG oslo_concurrency.lockutils [req-bcf68026-02fe-4ee2-822a-2781eb743305 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "1d45573b-0e05-4067-ab3e-56616334b7d9" acquired by "_locked_do_build_and_run_instance" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:45:32.572 INFO nova.compute.manager [req-bcf68026-02fe-4ee2-822a-2781eb743305 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 1d45573b-0e05-4067-ab3e-56616334b7d9] Starting instance... 2015-08-07 17:45:32.708 DEBUG oslo_concurrency.lockutils [req-bcf68026-02fe-4ee2-822a-2781eb743305 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "compute_resources" acquired by "instance_claim" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:45:32.709 DEBUG nova.compute.resource_tracker [req-bcf68026-02fe-4ee2-822a-2781eb743305 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Memory overhead for 64 MB instance; 5 MB instance_claim /opt/stack/new/nova/nova/compute/resource_tracker.py:126 2015-08-07 17:45:32.714 INFO nova.compute.claims [req-bcf68026-02fe-4ee2-822a-2781eb743305 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 1d45573b-0e05-4067-ab3e-56616334b7d9] Attempting claim: memory 69 MB, disk 0 GB 2015-08-07 17:45:32.714 INFO nova.compute.claims [req-bcf68026-02fe-4ee2-822a-2781eb743305 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 1d45573b-0e05-4067-ab3e-56616334b7d9] Total memory: 8187 MB, used: 922.00 MB 2015-08-07 17:45:32.715 INFO nova.compute.claims [req-bcf68026-02fe-4ee2-822a-2781eb743305 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 1d45573b-0e05-4067-ab3e-56616334b7d9] memory limit: 12280.50 MB, free: 11358.50 MB 2015-08-07 17:45:32.715 INFO nova.compute.claims [req-bcf68026-02fe-4ee2-822a-2781eb743305 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 1d45573b-0e05-4067-ab3e-56616334b7d9] Total disk: 27 GB, used: 0.00 GB 2015-08-07 17:45:32.716 INFO nova.compute.claims [req-bcf68026-02fe-4ee2-822a-2781eb743305 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 1d45573b-0e05-4067-ab3e-56616334b7d9] disk limit not specified, defaulting to unlimited 2015-08-07 17:45:32.735 DEBUG nova.compute.resources.vcpu [req-bcf68026-02fe-4ee2-822a-2781eb743305 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Total CPUs: 8 VCPUs, used: 5.00 VCPUs test /opt/stack/new/nova/nova/compute/resources/vcpu.py:52 2015-08-07 17:45:32.736 DEBUG nova.compute.resources.vcpu [req-bcf68026-02fe-4ee2-822a-2781eb743305 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] CPUs limit not specified, defaulting to unlimited test /opt/stack/new/nova/nova/compute/resources/vcpu.py:56 2015-08-07 17:45:32.736 INFO nova.compute.claims [req-bcf68026-02fe-4ee2-822a-2781eb743305 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 1d45573b-0e05-4067-ab3e-56616334b7d9] Claim successful 2015-08-07 17:45:32.970 DEBUG oslo_concurrency.lockutils 
[req-bcf68026-02fe-4ee2-822a-2781eb743305 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "compute_resources" released by "instance_claim" :: held 0.263s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:45:33.095 DEBUG oslo_concurrency.lockutils [req-4f9d2b45-cd72-423b-aba9-6392c3507d33 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "115845a8-7055-44f5-a3f5-222bd027f1ed" acquired by "do_terminate_instance" :: waited 0.001s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:45:33.096 DEBUG oslo_concurrency.lockutils [req-4f9d2b45-cd72-423b-aba9-6392c3507d33 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "115845a8-7055-44f5-a3f5-222bd027f1ed-events" acquired by "_clear_events" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:45:33.097 DEBUG oslo_concurrency.lockutils [req-4f9d2b45-cd72-423b-aba9-6392c3507d33 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "115845a8-7055-44f5-a3f5-222bd027f1ed-events" released by "_clear_events" :: held 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:45:33.100 INFO nova.compute.manager [req-4f9d2b45-cd72-423b-aba9-6392c3507d33 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 115845a8-7055-44f5-a3f5-222bd027f1ed] Terminating instance 2015-08-07 17:45:33.103 INFO nova.virt.xenapi.vmops [req-4f9d2b45-cd72-423b-aba9-6392c3507d33 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 115845a8-7055-44f5-a3f5-222bd027f1ed] Destroying VM 2015-08-07 17:45:33.110 DEBUG nova.virt.xenapi.vm_utils [req-4f9d2b45-cd72-423b-aba9-6392c3507d33 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 115845a8-7055-44f5-a3f5-222bd027f1ed] Shutting down VM (hard) hard_shutdown_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:348 2015-08-07 17:45:33.148 DEBUG oslo_concurrency.lockutils [req-bcf68026-02fe-4ee2-822a-2781eb743305 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "compute_resources" acquired by "update_usage" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:45:33.242 DEBUG oslo_concurrency.lockutils [req-bcf68026-02fe-4ee2-822a-2781eb743305 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "compute_resources" released by "update_usage" :: held 0.095s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:45:33.244 DEBUG nova.compute.utils [req-bcf68026-02fe-4ee2-822a-2781eb743305 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Using /dev/xvd instead of None get_next_device_name /opt/stack/new/nova/nova/compute/utils.py:163 2015-08-07 17:45:33.250 13318 DEBUG nova.compute.manager [-] [instance: 1d45573b-0e05-4067-ab3e-56616334b7d9] Allocating IP information in the background. 
_allocate_network_async /opt/stack/new/nova/nova/compute/manager.py:1513 2015-08-07 17:45:33.251 13318 DEBUG oslo_concurrency.lockutils [-] Acquired semaphore "refresh_cache-1d45573b-0e05-4067-ab3e-56616334b7d9" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:197 2015-08-07 17:45:33.598 DEBUG nova.virt.xenapi.vmops [req-bcf68026-02fe-4ee2-822a-2781eb743305 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 1d45573b-0e05-4067-ab3e-56616334b7d9] Block device information present: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} spawn /opt/stack/new/nova/nova/virt/xenapi/vmops.py:408 2015-08-07 17:45:33.613 DEBUG nova.virt.xenapi.vm_utils [req-bcf68026-02fe-4ee2-822a-2781eb743305 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Detected vhd format for image a69d8d55-1745-492f-be26-cc75d64fc94d determine_disk_image_type /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1633 2015-08-07 17:45:33.613 DEBUG nova.virt.xenapi.vmops [req-bcf68026-02fe-4ee2-822a-2781eb743305 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 1d45573b-0e05-4067-ab3e-56616334b7d9] Updating progress to 10 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:45:33.775 DEBUG nova.virt.xenapi.vmops [req-bcf68026-02fe-4ee2-822a-2781eb743305 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 1d45573b-0e05-4067-ab3e-56616334b7d9] block device info: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} _connect_cinder_volumes /opt/stack/new/nova/nova/virt/xenapi/vmops.py:376 2015-08-07 17:45:34.833 DEBUG nova.virt.xenapi.vmops [req-4f9d2b45-cd72-423b-aba9-6392c3507d33 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 115845a8-7055-44f5-a3f5-222bd027f1ed] Destroying VDIs _destroy_vdis /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1428 2015-08-07 17:45:34.843 DEBUG nova.virt.xenapi.vm_utils [req-4f9d2b45-cd72-423b-aba9-6392c3507d33 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] VDI be718769-8188-4d39-a229-061438de06d8 is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:45:34.850 DEBUG nova.virt.xenapi.vm_utils [req-4f9d2b45-cd72-423b-aba9-6392c3507d33 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] VDI 2df6d554-6118-435e-b4a0-11110b456f96 is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:45:34.859 13318 DEBUG nova.network.base_api [-] [instance: 1d45573b-0e05-4067-ab3e-56616334b7d9] Updating instance_info_cache with network_info: [VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.5'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, 
u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:c3:c1:47', 'active': False, 'type': u'bridge', 'id': u'09f98fba-79c3-4e06-a368-97256fe28bda', 'qbg_params': None})] update_instance_cache_with_nw_info /opt/stack/new/nova/nova/network/base_api.py:43 2015-08-07 17:45:34.883 13318 DEBUG oslo_concurrency.lockutils [-] Releasing semaphore "refresh_cache-1d45573b-0e05-4067-ab3e-56616334b7d9" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:210 2015-08-07 17:45:34.884 13318 DEBUG nova.compute.manager [-] [instance: 1d45573b-0e05-4067-ab3e-56616334b7d9] Instance network_info: |[VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.5'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:c3:c1:47', 'active': False, 'type': u'bridge', 'id': u'09f98fba-79c3-4e06-a368-97256fe28bda', 'qbg_params': None})]| _allocate_network_async /opt/stack/new/nova/nova/compute/manager.py:1531 2015-08-07 17:45:35.059 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.95 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:45:35.416 DEBUG nova.virt.xenapi.vmops [req-4f9d2b45-cd72-423b-aba9-6392c3507d33 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 115845a8-7055-44f5-a3f5-222bd027f1ed] Using RAW or VHD, skipping kernel and ramdisk deletion _destroy_kernel_ramdisk /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1456 2015-08-07 17:45:35.427 DEBUG nova.virt.xenapi.vm_utils [req-4f9d2b45-cd72-423b-aba9-6392c3507d33 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 115845a8-7055-44f5-a3f5-222bd027f1ed] VM destroyed destroy_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:324 2015-08-07 17:45:35.428 DEBUG nova.compute.manager [req-4f9d2b45-cd72-423b-aba9-6392c3507d33 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 115845a8-7055-44f5-a3f5-222bd027f1ed] Deallocating network for instance _deallocate_network /opt/stack/new/nova/nova/compute/manager.py:1792 2015-08-07 17:45:36.650 DEBUG nova.compute.manager [req-4f9d2b45-cd72-423b-aba9-6392c3507d33 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 115845a8-7055-44f5-a3f5-222bd027f1ed] terminating bdm 
BlockDeviceMapping(boot_index=0,connection_info=None,created_at=2015-08-07T17:43:25Z,delete_on_termination=True,deleted=False,deleted_at=None,destination_type='local',device_name='/dev/xvda',device_type='disk',disk_bus=None,guest_format=None,id=62,image_id='a69d8d55-1745-492f-be26-cc75d64fc94d',instance=,instance_uuid=115845a8-7055-44f5-a3f5-222bd027f1ed,no_device=False,snapshot_id=None,source_type='image',updated_at=2015-08-07T17:43:27Z,volume_id=None,volume_size=None) _cleanup_volumes /opt/stack/new/nova/nova/compute/manager.py:2272 2015-08-07 17:45:36.784 DEBUG oslo_concurrency.lockutils [req-4f9d2b45-cd72-423b-aba9-6392c3507d33 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "compute_resources" acquired by "update_usage" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:45:36.785 DEBUG nova.objects.instance [req-4f9d2b45-cd72-423b-aba9-6392c3507d33 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lazy-loading `numa_topology' on Instance uuid 115845a8-7055-44f5-a3f5-222bd027f1ed obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:45:36.871 DEBUG oslo_concurrency.lockutils [req-4f9d2b45-cd72-423b-aba9-6392c3507d33 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "compute_resources" released by "update_usage" :: held 0.087s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:45:37.080 DEBUG oslo_concurrency.lockutils [req-4f9d2b45-cd72-423b-aba9-6392c3507d33 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "115845a8-7055-44f5-a3f5-222bd027f1ed" released by "do_terminate_instance" :: held 3.985s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:45:37.371 DEBUG nova.virt.xenapi.vm_utils [req-b850b310-ed13-4ef7-a461-6bb5190862a0 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Cloned VDI OpaqueRef:33713e8d-a92d-9d8f-5756-0302e1746acf from VDI OpaqueRef:f392c64e-7886-a0cc-d3a0-780674167307 _clone_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:571 2015-08-07 17:45:37.849 DEBUG oslo_concurrency.lockutils [req-b850b310-ed13-4ef7-a461-6bb5190862a0 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" released by "_create_cached_image_impl" :: held 8.928s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:45:37.850 INFO nova.virt.xenapi.vm_utils [req-b850b310-ed13-4ef7-a461-6bb5190862a0 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Image creation data, cacheable: True, downloaded: False duration: 8.94 secs for image a69d8d55-1745-492f-be26-cc75d64fc94d 2015-08-07 17:45:37.851 DEBUG oslo_concurrency.lockutils [req-bcf68026-02fe-4ee2-822a-2781eb743305 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" acquired by "_create_cached_image_impl" :: waited 4.069s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:45:38.597 DEBUG nova.virt.xenapi.vm_utils [req-bcf68026-02fe-4ee2-822a-2781eb743305 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Cloned VDI OpaqueRef:596c95db-b0cf-e84d-9b7a-4c35d9a3a32e 
from VDI OpaqueRef:f392c64e-7886-a0cc-d3a0-780674167307 _clone_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:571 2015-08-07 17:45:38.727 DEBUG nova.virt.xenapi.vmops [req-b850b310-ed13-4ef7-a461-6bb5190862a0 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] [instance: b197c990-eecd-403b-b8d7-9e57e7053a16] Updating progress to 18 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:45:38.894 DEBUG nova.virt.xenapi.vmops [req-b850b310-ed13-4ef7-a461-6bb5190862a0 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] [instance: b197c990-eecd-403b-b8d7-9e57e7053a16] Updating progress to 27 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:45:39.110 DEBUG oslo_concurrency.lockutils [req-bcf68026-02fe-4ee2-822a-2781eb743305 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" released by "_create_cached_image_impl" :: held 1.259s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:45:39.111 INFO nova.virt.xenapi.vm_utils [req-bcf68026-02fe-4ee2-822a-2781eb743305 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Image creation data, cacheable: True, downloaded: False duration: 5.34 secs for image a69d8d55-1745-492f-be26-cc75d64fc94d 2015-08-07 17:45:39.181 DEBUG oslo_concurrency.lockutils [req-643ce0fd-618e-4151-84d4-433f90e4911a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "3d063a71-d4d9-49d1-ae99-79d2dc71c6d3" acquired by "_locked_do_build_and_run_instance" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:45:39.218 INFO nova.compute.manager [req-643ce0fd-618e-4151-84d4-433f90e4911a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 3d063a71-d4d9-49d1-ae99-79d2dc71c6d3] Starting instance... 
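Note on the "xenapi-image-cache" lock messages above: the two build requests (req-b850b310 and req-bcf68026) serialize on a lock named "xenapi-image-cache" plus the Glance image id, so only one request at a time touches the cached VDI for image a69d8d55-1745-492f-be26-cc75d64fc94d. The second request reports waited 4.069s while the first held the lock, then only held it 1.259s itself because the image was already cached (downloaded: False in both entries). Below is a minimal sketch of that pattern, assuming oslo.concurrency's synchronized decorator; the in-process dict cache and helper callables are illustrative stand-ins, not Nova's actual SR-backed image cache code.

    from oslo_concurrency import lockutils

    _VDI_CACHE = {}  # illustrative in-process stand-in for the SR image cache


    def create_cached_image(image_id, fetch_image, clone_vdi):
        """Serialize cache population per image, then hand back a clone."""

        @lockutils.synchronized('xenapi-image-cache' + image_id)
        def _create_cached_image_impl():
            downloaded = False
            if image_id not in _VDI_CACHE:
                # Only the first request for this image pays the download cost;
                # later requests wait on the lock and find the cache populated.
                _VDI_CACHE[image_id] = fetch_image(image_id)
                downloaded = True
            return clone_vdi(_VDI_CACHE[image_id]), downloaded

        return _create_cached_image_impl()

Keying the lock by image id keeps builds of different images running in parallel; only concurrent builds of the same image queue up, which is what the waited/held timings in the lockutils messages show.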
2015-08-07 17:45:39.246 DEBUG nova.virt.xenapi.vmops [req-b850b310-ed13-4ef7-a461-6bb5190862a0 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] [instance: b197c990-eecd-403b-b8d7-9e57e7053a16] Using PV kernel: True _create_vm_record /opt/stack/new/nova/nova/virt/xenapi/vmops.py:679 2015-08-07 17:45:39.258 DEBUG nova.virt.xenapi.vm_utils [req-b850b310-ed13-4ef7-a461-6bb5190862a0 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] [instance: b197c990-eecd-403b-b8d7-9e57e7053a16] Created VM create_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:312 2015-08-07 17:45:39.259 DEBUG nova.virt.xenapi.vmops [req-b850b310-ed13-4ef7-a461-6bb5190862a0 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] [instance: b197c990-eecd-403b-b8d7-9e57e7053a16] Updating progress to 36 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:45:39.407 DEBUG oslo_concurrency.lockutils [req-643ce0fd-618e-4151-84d4-433f90e4911a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "compute_resources" acquired by "instance_claim" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:45:39.408 DEBUG nova.compute.resource_tracker [req-643ce0fd-618e-4151-84d4-433f90e4911a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Memory overhead for 64 MB instance; 5 MB instance_claim /opt/stack/new/nova/nova/compute/resource_tracker.py:126 2015-08-07 17:45:39.412 INFO nova.compute.claims [req-643ce0fd-618e-4151-84d4-433f90e4911a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 3d063a71-d4d9-49d1-ae99-79d2dc71c6d3] Attempting claim: memory 69 MB, disk 0 GB 2015-08-07 17:45:39.413 INFO nova.compute.claims [req-643ce0fd-618e-4151-84d4-433f90e4911a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 3d063a71-d4d9-49d1-ae99-79d2dc71c6d3] Total memory: 8187 MB, used: 857.00 MB 2015-08-07 17:45:39.413 INFO nova.compute.claims [req-643ce0fd-618e-4151-84d4-433f90e4911a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 3d063a71-d4d9-49d1-ae99-79d2dc71c6d3] memory limit: 12280.50 MB, free: 11423.50 MB 2015-08-07 17:45:39.413 INFO nova.compute.claims [req-643ce0fd-618e-4151-84d4-433f90e4911a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 3d063a71-d4d9-49d1-ae99-79d2dc71c6d3] Total disk: 27 GB, used: 0.00 GB 2015-08-07 17:45:39.414 INFO nova.compute.claims [req-643ce0fd-618e-4151-84d4-433f90e4911a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 3d063a71-d4d9-49d1-ae99-79d2dc71c6d3] disk limit not specified, defaulting to unlimited 2015-08-07 17:45:39.418 DEBUG nova.virt.xenapi.vmops [req-b850b310-ed13-4ef7-a461-6bb5190862a0 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] [instance: b197c990-eecd-403b-b8d7-9e57e7053a16] auto_disk_config value not found inrescue image_properties. 
Setting value to False _attach_disks /opt/stack/new/nova/nova/virt/xenapi/vmops.py:714 2015-08-07 17:45:39.419 DEBUG nova.virt.xenapi.vm_utils [req-b850b310-ed13-4ef7-a461-6bb5190862a0 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Creating disk-type VBD for VM OpaqueRef:d5acf784-9ebe-5493-c25b-9d35e2c83f53, VDI OpaqueRef:33713e8d-a92d-9d8f-5756-0302e1746acf ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:45:39.425 DEBUG nova.virt.xenapi.vm_utils [req-b850b310-ed13-4ef7-a461-6bb5190862a0 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Created VBD OpaqueRef:1fc790a5-5178-2f1a-6a6e-277f5d8021b4 for VM OpaqueRef:d5acf784-9ebe-5493-c25b-9d35e2c83f53, VDI OpaqueRef:33713e8d-a92d-9d8f-5756-0302e1746acf. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:45:39.442 DEBUG nova.compute.resources.vcpu [req-643ce0fd-618e-4151-84d4-433f90e4911a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Total CPUs: 8 VCPUs, used: 5.00 VCPUs test /opt/stack/new/nova/nova/compute/resources/vcpu.py:52 2015-08-07 17:45:39.442 DEBUG nova.compute.resources.vcpu [req-643ce0fd-618e-4151-84d4-433f90e4911a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] CPUs limit not specified, defaulting to unlimited test /opt/stack/new/nova/nova/compute/resources/vcpu.py:56 2015-08-07 17:45:39.443 INFO nova.compute.claims [req-643ce0fd-618e-4151-84d4-433f90e4911a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 3d063a71-d4d9-49d1-ae99-79d2dc71c6d3] Claim successful 2015-08-07 17:45:39.621 DEBUG nova.virt.xenapi.vmops [req-bcf68026-02fe-4ee2-822a-2781eb743305 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 1d45573b-0e05-4067-ab3e-56616334b7d9] Updating progress to 20 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:45:39.669 DEBUG oslo_concurrency.lockutils [req-643ce0fd-618e-4151-84d4-433f90e4911a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "compute_resources" released by "instance_claim" :: held 0.261s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:45:39.699 DEBUG nova.virt.xenapi.vm_utils [req-b850b310-ed13-4ef7-a461-6bb5190862a0 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Created VDI OpaqueRef:a3b31e45-721d-acbf-927c-7b0345100c53 (config-2, 67108864, False) on OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99. create_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:509 2015-08-07 17:45:39.703 DEBUG nova.virt.xenapi.vm_utils [req-b850b310-ed13-4ef7-a461-6bb5190862a0 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Creating disk-type VBD for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:a3b31e45-721d-acbf-927c-7b0345100c53 ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:45:39.714 DEBUG nova.virt.xenapi.vm_utils [req-b850b310-ed13-4ef7-a461-6bb5190862a0 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Created VBD OpaqueRef:59d362eb-1675-4925-3fbd-fff5712db964 for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:a3b31e45-721d-acbf-927c-7b0345100c53. 
create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:45:39.715 DEBUG nova.virt.xenapi.vm_utils [req-b850b310-ed13-4ef7-a461-6bb5190862a0 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Plugging VBD OpaqueRef:59d362eb-1675-4925-3fbd-fff5712db964 ... vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2191 2015-08-07 17:45:39.715 DEBUG oslo_concurrency.lockutils [req-b850b310-ed13-4ef7-a461-6bb5190862a0 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_plug" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:45:39.791 DEBUG nova.virt.xenapi.vmops [req-bcf68026-02fe-4ee2-822a-2781eb743305 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 1d45573b-0e05-4067-ab3e-56616334b7d9] Updating progress to 30 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:45:39.834 DEBUG oslo_concurrency.lockutils [req-643ce0fd-618e-4151-84d4-433f90e4911a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "compute_resources" acquired by "update_usage" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:45:39.892 DEBUG oslo_concurrency.lockutils [req-643ce0fd-618e-4151-84d4-433f90e4911a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "compute_resources" released by "update_usage" :: held 0.057s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:45:39.892 DEBUG nova.compute.utils [req-643ce0fd-618e-4151-84d4-433f90e4911a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Using /dev/xvd instead of None get_next_device_name /opt/stack/new/nova/nova/compute/utils.py:163 2015-08-07 17:45:39.895 13318 DEBUG nova.compute.manager [-] [instance: 3d063a71-d4d9-49d1-ae99-79d2dc71c6d3] Allocating IP information in the background. 
_allocate_network_async /opt/stack/new/nova/nova/compute/manager.py:1513 2015-08-07 17:45:39.895 13318 DEBUG oslo_concurrency.lockutils [-] Acquired semaphore "refresh_cache-3d063a71-d4d9-49d1-ae99-79d2dc71c6d3" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:197 2015-08-07 17:45:39.957 DEBUG nova.virt.xenapi.vmops [req-bcf68026-02fe-4ee2-822a-2781eb743305 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 1d45573b-0e05-4067-ab3e-56616334b7d9] Using PV kernel: True _create_vm_record /opt/stack/new/nova/nova/virt/xenapi/vmops.py:679 2015-08-07 17:45:39.969 DEBUG nova.virt.xenapi.vm_utils [req-bcf68026-02fe-4ee2-822a-2781eb743305 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 1d45573b-0e05-4067-ab3e-56616334b7d9] Created VM create_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:312 2015-08-07 17:45:39.970 DEBUG nova.virt.xenapi.vmops [req-bcf68026-02fe-4ee2-822a-2781eb743305 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 1d45573b-0e05-4067-ab3e-56616334b7d9] Updating progress to 40 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:45:40.135 DEBUG nova.virt.xenapi.vm_utils [req-bcf68026-02fe-4ee2-822a-2781eb743305 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Creating disk-type VBD for VM OpaqueRef:12b385ee-1d62-9dff-3d70-42a33afc7d76, VDI OpaqueRef:596c95db-b0cf-e84d-9b7a-4c35d9a3a32e ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:45:40.142 DEBUG nova.virt.xenapi.vm_utils [req-bcf68026-02fe-4ee2-822a-2781eb743305 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Created VBD OpaqueRef:b54855f8-35e2-0ed4-7797-2219d767b15c for VM OpaqueRef:12b385ee-1d62-9dff-3d70-42a33afc7d76, VDI OpaqueRef:596c95db-b0cf-e84d-9b7a-4c35d9a3a32e. 
create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:45:40.273 DEBUG nova.virt.xenapi.vmops [req-643ce0fd-618e-4151-84d4-433f90e4911a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 3d063a71-d4d9-49d1-ae99-79d2dc71c6d3] Block device information present: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} spawn /opt/stack/new/nova/nova/virt/xenapi/vmops.py:408 2015-08-07 17:45:40.290 DEBUG nova.virt.xenapi.vm_utils [req-643ce0fd-618e-4151-84d4-433f90e4911a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Detected vhd format for image a69d8d55-1745-492f-be26-cc75d64fc94d determine_disk_image_type /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1633 2015-08-07 17:45:40.291 DEBUG nova.virt.xenapi.vmops [req-643ce0fd-618e-4151-84d4-433f90e4911a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 3d063a71-d4d9-49d1-ae99-79d2dc71c6d3] Updating progress to 10 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:45:40.456 DEBUG nova.virt.xenapi.vmops [req-643ce0fd-618e-4151-84d4-433f90e4911a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 3d063a71-d4d9-49d1-ae99-79d2dc71c6d3] block device info: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} _connect_cinder_volumes /opt/stack/new/nova/nova/virt/xenapi/vmops.py:376 2015-08-07 17:45:40.464 DEBUG oslo_concurrency.lockutils [req-643ce0fd-618e-4151-84d4-433f90e4911a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" acquired by "_create_cached_image_impl" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:45:40.574 DEBUG nova.virt.xenapi.vm_utils [req-bcf68026-02fe-4ee2-822a-2781eb743305 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Created VDI OpaqueRef:6ef8f7cd-9140-eef9-f429-c0d567e0a9dd (config-2, 67108864, False) on OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99. create_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:509 2015-08-07 17:45:40.577 DEBUG nova.virt.xenapi.vm_utils [req-bcf68026-02fe-4ee2-822a-2781eb743305 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Creating disk-type VBD for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:6ef8f7cd-9140-eef9-f429-c0d567e0a9dd ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:45:40.588 DEBUG nova.virt.xenapi.vm_utils [req-bcf68026-02fe-4ee2-822a-2781eb743305 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Created VBD OpaqueRef:3e68557d-e3d6-a20a-ea2e-3237d13c809e for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:6ef8f7cd-9140-eef9-f429-c0d567e0a9dd. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:45:40.589 DEBUG nova.virt.xenapi.vm_utils [req-bcf68026-02fe-4ee2-822a-2781eb743305 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Plugging VBD OpaqueRef:3e68557d-e3d6-a20a-ea2e-3237d13c809e ... 
vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2191 2015-08-07 17:45:40.770 DEBUG oslo_concurrency.lockutils [req-b850b310-ed13-4ef7-a461-6bb5190862a0 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_plug" :: held 1.055s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:45:40.771 DEBUG nova.virt.xenapi.vm_utils [req-b850b310-ed13-4ef7-a461-6bb5190862a0 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Plugging VBD OpaqueRef:59d362eb-1675-4925-3fbd-fff5712db964 done. vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2194 2015-08-07 17:45:40.772 DEBUG oslo_concurrency.lockutils [req-bcf68026-02fe-4ee2-822a-2781eb743305 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_plug" :: waited 0.183s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:45:40.774 DEBUG nova.virt.xenapi.vm_utils [req-b850b310-ed13-4ef7-a461-6bb5190862a0 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] VBD OpaqueRef:59d362eb-1675-4925-3fbd-fff5712db964 plugged as xvdc vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2197 2015-08-07 17:45:40.845 WARNING nova.virt.configdrive [req-b850b310-ed13-4ef7-a461-6bb5190862a0 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] The setting "always" will be deprecated in the Liberty version. Please use "True" instead 2015-08-07 17:45:40.846 DEBUG nova.objects.instance [req-b850b310-ed13-4ef7-a461-6bb5190862a0 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Lazy-loading `ec2_ids' on Instance uuid b197c990-eecd-403b-b8d7-9e57e7053a16 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:45:40.880 DEBUG oslo_concurrency.processutils [req-b850b310-ed13-4ef7-a461-6bb5190862a0 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Running cmd (subprocess): genisoimage -o /tmp/tmp3HwvWP/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmpBkU2FJ execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:45:40.971 DEBUG oslo_concurrency.processutils [req-b850b310-ed13-4ef7-a461-6bb5190862a0 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] CMD "genisoimage -o /tmp/tmp3HwvWP/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmpBkU2FJ" returned: 0 in 0.091s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:45:40.977 DEBUG oslo_concurrency.processutils [req-b850b310-ed13-4ef7-a461-6bb5190862a0 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmp3HwvWP/configdrive of=/dev/xvdc oflag=direct,sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:45:41.522 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 
None None] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:45:41.524 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 3.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:45:41.825 13318 DEBUG nova.network.base_api [-] [instance: 3d063a71-d4d9-49d1-ae99-79d2dc71c6d3] Updating instance_info_cache with network_info: [VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.2'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:ba:d2:8b', 'active': False, 'type': u'bridge', 'id': u'fba2136d-b403-4996-9701-9a9aeef179dd', 'qbg_params': None})] update_instance_cache_with_nw_info /opt/stack/new/nova/nova/network/base_api.py:43 2015-08-07 17:45:41.854 13318 DEBUG oslo_concurrency.lockutils [-] Releasing semaphore "refresh_cache-3d063a71-d4d9-49d1-ae99-79d2dc71c6d3" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:210 2015-08-07 17:45:41.855 13318 DEBUG nova.compute.manager [-] [instance: 3d063a71-d4d9-49d1-ae99-79d2dc71c6d3] Instance network_info: |[VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.2'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:ba:d2:8b', 'active': False, 'type': u'bridge', 'id': u'fba2136d-b403-4996-9701-9a9aeef179dd', 'qbg_params': None})]| _allocate_network_async /opt/stack/new/nova/nova/compute/manager.py:1531 2015-08-07 17:45:42.655 DEBUG oslo_concurrency.lockutils [req-bcf68026-02fe-4ee2-822a-2781eb743305 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by 
"synchronized_plug" :: held 1.883s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:45:42.656 DEBUG nova.virt.xenapi.vm_utils [req-bcf68026-02fe-4ee2-822a-2781eb743305 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Plugging VBD OpaqueRef:3e68557d-e3d6-a20a-ea2e-3237d13c809e done. vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2194 2015-08-07 17:45:42.658 DEBUG nova.virt.xenapi.vm_utils [req-bcf68026-02fe-4ee2-822a-2781eb743305 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] VBD OpaqueRef:3e68557d-e3d6-a20a-ea2e-3237d13c809e plugged as xvdd vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2197 2015-08-07 17:45:42.737 WARNING nova.virt.configdrive [req-bcf68026-02fe-4ee2-822a-2781eb743305 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] The setting "always" will be deprecated in the Liberty version. Please use "True" instead 2015-08-07 17:45:42.738 DEBUG nova.objects.instance [req-bcf68026-02fe-4ee2-822a-2781eb743305 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lazy-loading `ec2_ids' on Instance uuid 1d45573b-0e05-4067-ab3e-56616334b7d9 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:45:42.774 DEBUG oslo_concurrency.processutils [req-bcf68026-02fe-4ee2-822a-2781eb743305 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Running cmd (subprocess): genisoimage -o /tmp/tmpYK2kRD/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmp0e5LDM execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:45:42.865 DEBUG oslo_concurrency.processutils [req-bcf68026-02fe-4ee2-822a-2781eb743305 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] CMD "genisoimage -o /tmp/tmpYK2kRD/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmp0e5LDM" returned: 0 in 0.091s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:45:42.870 DEBUG oslo_concurrency.processutils [req-bcf68026-02fe-4ee2-822a-2781eb743305 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmpYK2kRD/configdrive of=/dev/xvdd oflag=direct,sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:45:44.523 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:45:44.525 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 0.99 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:45:45.050 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.96 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:45:45.522 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_bandwidth_usage run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 
17:45:45.523 INFO nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Updating bandwidth usage cache
2015-08-07 17:45:45.550 DEBUG oslo_concurrency.processutils [req-b850b310-ed13-4ef7-a461-6bb5190862a0 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmp3HwvWP/configdrive of=/dev/xvdc oflag=direct,sync" returned: 0 in 4.573s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260
2015-08-07 17:45:45.552 DEBUG oslo_concurrency.processutils [req-b850b310-ed13-4ef7-a461-6bb5190862a0 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230
2015-08-07 17:45:45.932 ERROR oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Error during ComputeManager._poll_bandwidth_usage
2015-08-07 17:45:45.932 13318 ERROR oslo_service.periodic_task Traceback (most recent call last):
2015-08-07 17:45:45.932 13318 ERROR oslo_service.periodic_task File "/usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py", line 218, in run_periodic_tasks
2015-08-07 17:45:45.932 13318 ERROR oslo_service.periodic_task task(self, context)
2015-08-07 17:45:45.932 13318 ERROR oslo_service.periodic_task File "/opt/stack/new/nova/nova/compute/manager.py", line 5680, in _poll_bandwidth_usage
2015-08-07 17:45:45.932 13318 ERROR oslo_service.periodic_task update_cells=update_cells)
2015-08-07 17:45:45.932 13318 ERROR oslo_service.periodic_task File "/opt/stack/new/nova/nova/objects/base.py", line 493, in wrapper
2015-08-07 17:45:45.932 13318 ERROR oslo_service.periodic_task return fn(obj, *args, **kwargs)
2015-08-07 17:45:45.932 13318 ERROR oslo_service.periodic_task File "/usr/local/lib/python2.7/dist-packages/oslo_versionedobjects/base.py", line 195, in wrapper
2015-08-07 17:45:45.932 13318 ERROR oslo_service.periodic_task ctxt, self, fn.__name__, args, kwargs)
2015-08-07 17:45:45.932 13318 ERROR oslo_service.periodic_task File "/opt/stack/new/nova/nova/conductor/rpcapi.py", line 248, in object_action
2015-08-07 17:45:45.932 13318 ERROR oslo_service.periodic_task objmethod=objmethod, args=args, kwargs=kwargs)
2015-08-07 17:45:45.932 13318 ERROR oslo_service.periodic_task File "/usr/local/lib/python2.7/dist-packages/oslo_messaging/rpc/client.py", line 158, in call
2015-08-07 17:45:45.932 13318 ERROR oslo_service.periodic_task retry=self.retry)
2015-08-07 17:45:45.932 13318 ERROR oslo_service.periodic_task File "/usr/local/lib/python2.7/dist-packages/oslo_messaging/transport.py", line 90, in _send
2015-08-07 17:45:45.932 13318 ERROR oslo_service.periodic_task timeout=timeout, retry=retry)
2015-08-07 17:45:45.932 13318 ERROR oslo_service.periodic_task File "/usr/local/lib/python2.7/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 431, in send
2015-08-07 17:45:45.932 13318 ERROR oslo_service.periodic_task retry=retry)
2015-08-07 17:45:45.932 13318 ERROR oslo_service.periodic_task File "/usr/local/lib/python2.7/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 422, in _send
2015-08-07 17:45:45.932 13318 ERROR oslo_service.periodic_task raise result
2015-08-07 17:45:45.932 13318 ERROR oslo_service.periodic_task TypeError: 'NoneType' object has no attribute '__getitem__'
2015-08-07 17:45:45.932 13318 ERROR oslo_service.periodic_task Traceback (most recent call last):
2015-08-07 17:45:45.932 13318 ERROR oslo_service.periodic_task
2015-08-07 17:45:45.932 13318 ERROR oslo_service.periodic_task File "/opt/stack/new/nova/nova/conductor/manager.py", line 442, in _object_dispatch
2015-08-07 17:45:45.932 13318 ERROR oslo_service.periodic_task return getattr(target, method)(*args, **kwargs)
2015-08-07 17:45:45.932 13318 ERROR oslo_service.periodic_task
2015-08-07 17:45:45.932 13318 ERROR oslo_service.periodic_task File "/opt/stack/new/nova/nova/objects/base.py", line 493, in wrapper
2015-08-07 17:45:45.932 13318 ERROR oslo_service.periodic_task return fn(obj, *args, **kwargs)
2015-08-07 17:45:45.932 13318 ERROR oslo_service.periodic_task
2015-08-07 17:45:45.932 13318 ERROR oslo_service.periodic_task File "/usr/local/lib/python2.7/dist-packages/oslo_versionedobjects/base.py", line 211, in wrapper
2015-08-07 17:45:45.932 13318 ERROR oslo_service.periodic_task return fn(self, *args, **kwargs)
2015-08-07 17:45:45.932 13318 ERROR oslo_service.periodic_task
2015-08-07 17:45:45.932 13318 ERROR oslo_service.periodic_task File "/opt/stack/new/nova/nova/objects/bandwidth_usage.py", line 69, in create
2015-08-07 17:45:45.932 13318 ERROR oslo_service.periodic_task self._from_db_object(self._context, self, db_bw_usage)
2015-08-07 17:45:45.932 13318 ERROR oslo_service.periodic_task
2015-08-07 17:45:45.932 13318 ERROR oslo_service.periodic_task File "/opt/stack/new/nova/nova/objects/bandwidth_usage.py", line 42, in _from_db_object
2015-08-07 17:45:45.932 13318 ERROR oslo_service.periodic_task bw_usage[field] = db_bw_usage['uuid']
2015-08-07 17:45:45.932 13318 ERROR oslo_service.periodic_task
2015-08-07 17:45:45.932 13318 ERROR oslo_service.periodic_task TypeError: 'NoneType' object has no attribute '__getitem__'
2015-08-07 17:45:45.932 13318 ERROR oslo_service.periodic_task
2015-08-07 17:45:45.932 13318 ERROR oslo_service.periodic_task
2015-08-07 17:45:45.934 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 0.58 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117
2015-08-07 17:45:46.168 DEBUG oslo_concurrency.processutils [req-b850b310-ed13-4ef7-a461-6bb5190862a0 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf sync" returned: 0 in 0.616s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260
2015-08-07 17:45:46.169 DEBUG nova.virt.xenapi.vm_utils [req-b850b310-ed13-4ef7-a461-6bb5190862a0 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Destroying VBD for VDI OpaqueRef:a3b31e45-721d-acbf-927c-7b0345100c53 ...
vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2207 2015-08-07 17:45:46.171 DEBUG oslo_concurrency.lockutils [req-b850b310-ed13-4ef7-a461-6bb5190862a0 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_unplug" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:45:46.513 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:45:46.514 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 0.01 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:45:46.521 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:45:46.521 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Starting heal instance info cache _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5315 2015-08-07 17:45:46.613 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Acquired semaphore "refresh_cache-af2ef72d-4895-4de0-bd40-aaa2ac498091" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:197 2015-08-07 17:45:46.722 DEBUG nova.network.base_api [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: af2ef72d-4895-4de0-bd40-aaa2ac498091] Updating instance_info_cache with network_info: [VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.4'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:14:49:68', 'active': False, 'type': u'bridge', 'id': u'e63eea3a-8bbb-438c-b0a8-394f201769a5', 'qbg_params': None})] update_instance_cache_with_nw_info /opt/stack/new/nova/nova/network/base_api.py:43 2015-08-07 17:45:46.748 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Releasing semaphore "refresh_cache-af2ef72d-4895-4de0-bd40-aaa2ac498091" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:210 2015-08-07 17:45:46.749 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: af2ef72d-4895-4de0-bd40-aaa2ac498091] Updated the network 
info_cache for instance _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5376 2015-08-07 17:45:46.750 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 2.76 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:45:47.694 DEBUG oslo_concurrency.processutils [req-bcf68026-02fe-4ee2-822a-2781eb743305 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmpYK2kRD/configdrive of=/dev/xvdd oflag=direct,sync" returned: 0 in 4.824s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:45:47.696 DEBUG oslo_concurrency.processutils [req-bcf68026-02fe-4ee2-822a-2781eb743305 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:45:47.917 DEBUG oslo_concurrency.lockutils [req-b850b310-ed13-4ef7-a461-6bb5190862a0 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_unplug" :: held 1.746s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:45:47.928 DEBUG nova.virt.xenapi.vm_utils [req-b850b310-ed13-4ef7-a461-6bb5190862a0 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Destroying VBD for VDI OpaqueRef:a3b31e45-721d-acbf-927c-7b0345100c53 done. vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2215 2015-08-07 17:45:47.929 DEBUG nova.virt.xenapi.vm_utils [req-b850b310-ed13-4ef7-a461-6bb5190862a0 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Creating disk-type VBD for VM OpaqueRef:d5acf784-9ebe-5493-c25b-9d35e2c83f53, VDI OpaqueRef:a3b31e45-721d-acbf-927c-7b0345100c53 ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:45:47.940 DEBUG nova.virt.xenapi.vm_utils [req-b850b310-ed13-4ef7-a461-6bb5190862a0 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Created VBD OpaqueRef:ff01e22d-5476-9273-2525-587af02b4bec for VM OpaqueRef:d5acf784-9ebe-5493-c25b-9d35e2c83f53, VDI OpaqueRef:a3b31e45-721d-acbf-927c-7b0345100c53. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:45:47.948 DEBUG nova.objects.instance [req-b850b310-ed13-4ef7-a461-6bb5190862a0 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Lazy-loading `pci_devices' on Instance uuid b197c990-eecd-403b-b8d7-9e57e7053a16 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:45:47.979 DEBUG oslo_concurrency.processutils [req-bcf68026-02fe-4ee2-822a-2781eb743305 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf sync" returned: 0 in 0.283s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:45:47.980 DEBUG nova.virt.xenapi.vm_utils [req-bcf68026-02fe-4ee2-822a-2781eb743305 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Destroying VBD for VDI OpaqueRef:6ef8f7cd-9140-eef9-f429-c0d567e0a9dd ... 
vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2207 2015-08-07 17:45:47.982 DEBUG oslo_concurrency.lockutils [req-bcf68026-02fe-4ee2-822a-2781eb743305 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_unplug" :: waited 0.001s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:45:48.039 DEBUG nova.virt.xenapi.vmops [req-b850b310-ed13-4ef7-a461-6bb5190862a0 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] [instance: b197c990-eecd-403b-b8d7-9e57e7053a16] Updating progress to 45 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:45:48.220 DEBUG oslo_concurrency.lockutils [req-b850b310-ed13-4ef7-a461-6bb5190862a0 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Lock "xenstore-b197c990-eecd-403b-b8d7-9e57e7053a16" acquired by "store_meta" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:45:48.221 DEBUG oslo_concurrency.lockutils [req-b850b310-ed13-4ef7-a461-6bb5190862a0 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Lock "xenstore-b197c990-eecd-403b-b8d7-9e57e7053a16" released by "store_meta" :: held 0.001s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:45:48.221 DEBUG oslo_concurrency.lockutils [req-b850b310-ed13-4ef7-a461-6bb5190862a0 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Lock "xenstore-b197c990-eecd-403b-b8d7-9e57e7053a16" acquired by "store_auto_disk_config" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:45:48.231 DEBUG oslo_concurrency.lockutils [req-b850b310-ed13-4ef7-a461-6bb5190862a0 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Lock "xenstore-b197c990-eecd-403b-b8d7-9e57e7053a16" released by "store_auto_disk_config" :: held 0.009s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:45:48.232 DEBUG nova.virt.xenapi.vmops [req-b850b310-ed13-4ef7-a461-6bb5190862a0 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] [instance: b197c990-eecd-403b-b8d7-9e57e7053a16] Injecting hostname (RESCUE-tempest.common.compute-instance-1031488919) into xenstore _inject_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1940 2015-08-07 17:45:48.233 DEBUG oslo_concurrency.lockutils [req-b850b310-ed13-4ef7-a461-6bb5190862a0 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Lock "xenstore-b197c990-eecd-403b-b8d7-9e57e7053a16" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:45:48.241 DEBUG oslo_concurrency.lockutils [req-b850b310-ed13-4ef7-a461-6bb5190862a0 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Lock "xenstore-b197c990-eecd-403b-b8d7-9e57e7053a16" released by "update_hostname" :: held 0.008s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:45:48.241 DEBUG nova.virt.xenapi.vmops [req-b850b310-ed13-4ef7-a461-6bb5190862a0 
tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] [instance: b197c990-eecd-403b-b8d7-9e57e7053a16] Injecting network info to xenstore inject_network_info /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1869 2015-08-07 17:45:48.242 DEBUG oslo_concurrency.lockutils [req-b850b310-ed13-4ef7-a461-6bb5190862a0 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Lock "xenstore-b197c990-eecd-403b-b8d7-9e57e7053a16" acquired by "update_nwinfo" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:45:48.402 DEBUG oslo_concurrency.lockutils [req-b850b310-ed13-4ef7-a461-6bb5190862a0 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Lock "xenstore-b197c990-eecd-403b-b8d7-9e57e7053a16" released by "update_nwinfo" :: held 0.160s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:45:48.403 DEBUG nova.virt.xenapi.vmops [req-b850b310-ed13-4ef7-a461-6bb5190862a0 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] [instance: b197c990-eecd-403b-b8d7-9e57e7053a16] Updating progress to 55 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:45:48.583 DEBUG nova.virt.xenapi.vmops [req-b850b310-ed13-4ef7-a461-6bb5190862a0 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] [instance: b197c990-eecd-403b-b8d7-9e57e7053a16] Creating vifs _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1892 2015-08-07 17:45:48.590 DEBUG nova.virt.xenapi.vmops [req-b850b310-ed13-4ef7-a461-6bb5190862a0 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] [instance: b197c990-eecd-403b-b8d7-9e57e7053a16] Creating VIF for network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1902 2015-08-07 17:45:48.598 DEBUG nova.virt.xenapi.vmops [req-b850b310-ed13-4ef7-a461-6bb5190862a0 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] [instance: b197c990-eecd-403b-b8d7-9e57e7053a16] Created VIF OpaqueRef:8723cfcb-e7e1-3a27-d13a-d1c84fd385d3, network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1906 2015-08-07 17:45:48.599 DEBUG nova.virt.xenapi.vmops [req-b850b310-ed13-4ef7-a461-6bb5190862a0 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] [instance: b197c990-eecd-403b-b8d7-9e57e7053a16] Updating progress to 64 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:45:48.633 DEBUG oslo_concurrency.lockutils [req-bcf68026-02fe-4ee2-822a-2781eb743305 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_unplug" :: held 0.651s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:45:48.640 DEBUG nova.virt.xenapi.vm_utils [req-bcf68026-02fe-4ee2-822a-2781eb743305 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Destroying VBD for VDI OpaqueRef:6ef8f7cd-9140-eef9-f429-c0d567e0a9dd done. 
vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2215 2015-08-07 17:45:48.641 DEBUG nova.virt.xenapi.vm_utils [req-bcf68026-02fe-4ee2-822a-2781eb743305 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Creating disk-type VBD for VM OpaqueRef:12b385ee-1d62-9dff-3d70-42a33afc7d76, VDI OpaqueRef:6ef8f7cd-9140-eef9-f429-c0d567e0a9dd ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:45:48.648 DEBUG nova.virt.xenapi.vm_utils [req-bcf68026-02fe-4ee2-822a-2781eb743305 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Created VBD OpaqueRef:b43b8158-c8b1-0f8b-5a32-ebb99b0384ea for VM OpaqueRef:12b385ee-1d62-9dff-3d70-42a33afc7d76, VDI OpaqueRef:6ef8f7cd-9140-eef9-f429-c0d567e0a9dd. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:45:48.648 DEBUG nova.objects.instance [req-bcf68026-02fe-4ee2-822a-2781eb743305 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lazy-loading `pci_devices' on Instance uuid 1d45573b-0e05-4067-ab3e-56616334b7d9 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:45:48.744 DEBUG nova.virt.xenapi.vmops [req-bcf68026-02fe-4ee2-822a-2781eb743305 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 1d45573b-0e05-4067-ab3e-56616334b7d9] Updating progress to 50 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:45:48.764 DEBUG nova.virt.xenapi.vm_utils [req-b850b310-ed13-4ef7-a461-6bb5190862a0 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Creating disk-type VBD for VM OpaqueRef:d5acf784-9ebe-5493-c25b-9d35e2c83f53, VDI OpaqueRef:7c84b050-5c44-b399-8b30-5a24de924861 ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:45:48.770 DEBUG nova.virt.xenapi.vm_utils [req-b850b310-ed13-4ef7-a461-6bb5190862a0 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Created VBD OpaqueRef:640af533-4579-8384-b2af-a80a2290feee for VM OpaqueRef:d5acf784-9ebe-5493-c25b-9d35e2c83f53, VDI OpaqueRef:7c84b050-5c44-b399-8b30-5a24de924861. 
create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:45:48.771 DEBUG nova.virt.xenapi.vmops [req-b850b310-ed13-4ef7-a461-6bb5190862a0 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] [instance: b197c990-eecd-403b-b8d7-9e57e7053a16] Updating progress to 73 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:45:48.902 DEBUG oslo_concurrency.lockutils [req-bcf68026-02fe-4ee2-822a-2781eb743305 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "xenstore-1d45573b-0e05-4067-ab3e-56616334b7d9" acquired by "store_meta" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:45:48.902 DEBUG oslo_concurrency.lockutils [req-bcf68026-02fe-4ee2-822a-2781eb743305 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "xenstore-1d45573b-0e05-4067-ab3e-56616334b7d9" released by "store_meta" :: held 0.001s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:45:48.903 DEBUG oslo_concurrency.lockutils [req-bcf68026-02fe-4ee2-822a-2781eb743305 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "xenstore-1d45573b-0e05-4067-ab3e-56616334b7d9" acquired by "store_auto_disk_config" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:45:48.909 DEBUG oslo_concurrency.lockutils [req-bcf68026-02fe-4ee2-822a-2781eb743305 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "xenstore-1d45573b-0e05-4067-ab3e-56616334b7d9" released by "store_auto_disk_config" :: held 0.006s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:45:48.910 DEBUG nova.virt.xenapi.vmops [req-bcf68026-02fe-4ee2-822a-2781eb743305 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 1d45573b-0e05-4067-ab3e-56616334b7d9] Injecting hostname (tempest.common.compute-instance-1871117673) into xenstore _inject_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1940 2015-08-07 17:45:48.910 DEBUG oslo_concurrency.lockutils [req-bcf68026-02fe-4ee2-822a-2781eb743305 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "xenstore-1d45573b-0e05-4067-ab3e-56616334b7d9" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:45:48.916 DEBUG nova.virt.xenapi.vmops [req-b850b310-ed13-4ef7-a461-6bb5190862a0 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] [instance: b197c990-eecd-403b-b8d7-9e57e7053a16] Starting instance _start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:327 2015-08-07 17:45:48.949 DEBUG oslo_concurrency.lockutils [req-bcf68026-02fe-4ee2-822a-2781eb743305 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "xenstore-1d45573b-0e05-4067-ab3e-56616334b7d9" released by "update_hostname" :: held 0.039s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:45:48.950 DEBUG nova.virt.xenapi.vmops [req-bcf68026-02fe-4ee2-822a-2781eb743305 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 1d45573b-0e05-4067-ab3e-56616334b7d9] Injecting network info to xenstore inject_network_info /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1869 2015-08-07 17:45:48.950 DEBUG oslo_concurrency.lockutils 
[req-bcf68026-02fe-4ee2-822a-2781eb743305 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "xenstore-1d45573b-0e05-4067-ab3e-56616334b7d9" acquired by "update_nwinfo" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:45:49.111 DEBUG oslo_concurrency.lockutils [req-bcf68026-02fe-4ee2-822a-2781eb743305 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "xenstore-1d45573b-0e05-4067-ab3e-56616334b7d9" released by "update_nwinfo" :: held 0.160s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:45:49.111 DEBUG nova.virt.xenapi.vmops [req-bcf68026-02fe-4ee2-822a-2781eb743305 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 1d45573b-0e05-4067-ab3e-56616334b7d9] Updating progress to 60 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:45:49.274 DEBUG nova.virt.xenapi.vmops [req-bcf68026-02fe-4ee2-822a-2781eb743305 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 1d45573b-0e05-4067-ab3e-56616334b7d9] Creating vifs _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1892 2015-08-07 17:45:49.279 DEBUG nova.virt.xenapi.vmops [req-bcf68026-02fe-4ee2-822a-2781eb743305 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 1d45573b-0e05-4067-ab3e-56616334b7d9] Creating VIF for network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1902 2015-08-07 17:45:49.286 DEBUG nova.virt.xenapi.vmops [req-bcf68026-02fe-4ee2-822a-2781eb743305 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 1d45573b-0e05-4067-ab3e-56616334b7d9] Created VIF OpaqueRef:8ec7f3c7-5faf-657b-2be4-2242aada48f0, network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1906 2015-08-07 17:45:49.287 DEBUG nova.virt.xenapi.vmops [req-bcf68026-02fe-4ee2-822a-2781eb743305 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 1d45573b-0e05-4067-ab3e-56616334b7d9] Updating progress to 70 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:45:49.472 DEBUG nova.virt.xenapi.vmops [req-bcf68026-02fe-4ee2-822a-2781eb743305 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 1d45573b-0e05-4067-ab3e-56616334b7d9] Starting instance _start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:327 2015-08-07 17:45:49.514 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:45:49.578 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 1.01 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:45:50.586 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:45:50.586 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /opt/stack/new/nova/nova/compute/manager.py:5976 2015-08-07 17:45:50.587 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 2.93 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:45:53.521 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:45:53.522 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 1.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:45:54.433 DEBUG nova.virt.xenapi.vmops [req-bcf68026-02fe-4ee2-822a-2781eb743305 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 1d45573b-0e05-4067-ab3e-56616334b7d9] Waiting for instance state to become running _wait_for_instance_to_start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:777 2015-08-07 17:45:54.451 DEBUG nova.virt.xenapi.vmops [req-bcf68026-02fe-4ee2-822a-2781eb743305 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 1d45573b-0e05-4067-ab3e-56616334b7d9] Updating progress to 80 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:45:54.521 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:45:54.522 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 1.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:45:54.641 DEBUG nova.virt.xenapi.vmops [req-bcf68026-02fe-4ee2-822a-2781eb743305 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 1d45573b-0e05-4067-ab3e-56616334b7d9] Skip agent setup, not enabled. 
_configure_new_instance_with_agent /opt/stack/new/nova/nova/virt/xenapi/vmops.py:788 2015-08-07 17:45:54.641 DEBUG nova.virt.xenapi.vmops [req-bcf68026-02fe-4ee2-822a-2781eb743305 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 1d45573b-0e05-4067-ab3e-56616334b7d9] Removing hostname from xenstore _remove_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1949 2015-08-07 17:45:54.642 DEBUG oslo_concurrency.lockutils [req-bcf68026-02fe-4ee2-822a-2781eb743305 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "xenstore-1d45573b-0e05-4067-ab3e-56616334b7d9" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:45:54.645 DEBUG oslo_concurrency.lockutils [req-bcf68026-02fe-4ee2-822a-2781eb743305 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "xenstore-1d45573b-0e05-4067-ab3e-56616334b7d9" released by "update_hostname" :: held 0.004s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:45:54.646 DEBUG nova.virt.xenapi.vmops [req-bcf68026-02fe-4ee2-822a-2781eb743305 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 1d45573b-0e05-4067-ab3e-56616334b7d9] Updating progress to 90 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:45:54.800 DEBUG nova.virt.xenapi.vmops [req-bcf68026-02fe-4ee2-822a-2781eb743305 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 1d45573b-0e05-4067-ab3e-56616334b7d9] Updating progress to 100 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:45:54.989 DEBUG nova.compute.manager [req-bcf68026-02fe-4ee2-822a-2781eb743305 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 1d45573b-0e05-4067-ab3e-56616334b7d9] Checking state _get_power_state /opt/stack/new/nova/nova/compute/manager.py:1296 2015-08-07 17:45:55.056 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.96 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:45:55.207 DEBUG nova.virt.xenapi.vmops [req-b850b310-ed13-4ef7-a461-6bb5190862a0 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] [instance: b197c990-eecd-403b-b8d7-9e57e7053a16] Waiting for instance state to become running _wait_for_instance_to_start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:777 2015-08-07 17:45:55.225 DEBUG nova.virt.xenapi.vmops [req-b850b310-ed13-4ef7-a461-6bb5190862a0 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] [instance: b197c990-eecd-403b-b8d7-9e57e7053a16] Updating progress to 82 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:45:55.263 DEBUG oslo_concurrency.lockutils [req-bcf68026-02fe-4ee2-822a-2781eb743305 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "1d45573b-0e05-4067-ab3e-56616334b7d9" released by "_locked_do_build_and_run_instance" :: held 22.731s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:45:55.398 DEBUG nova.virt.xenapi.vmops [req-b850b310-ed13-4ef7-a461-6bb5190862a0 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] [instance: b197c990-eecd-403b-b8d7-9e57e7053a16] Skip agent setup, not 
enabled. _configure_new_instance_with_agent /opt/stack/new/nova/nova/virt/xenapi/vmops.py:788 2015-08-07 17:45:55.398 DEBUG nova.virt.xenapi.vmops [req-b850b310-ed13-4ef7-a461-6bb5190862a0 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] [instance: b197c990-eecd-403b-b8d7-9e57e7053a16] Removing hostname from xenstore _remove_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1949 2015-08-07 17:45:55.399 DEBUG oslo_concurrency.lockutils [req-b850b310-ed13-4ef7-a461-6bb5190862a0 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Lock "xenstore-b197c990-eecd-403b-b8d7-9e57e7053a16" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:45:55.403 DEBUG oslo_concurrency.lockutils [req-b850b310-ed13-4ef7-a461-6bb5190862a0 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Lock "xenstore-b197c990-eecd-403b-b8d7-9e57e7053a16" released by "update_hostname" :: held 0.004s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:45:55.404 DEBUG nova.virt.xenapi.vmops [req-b850b310-ed13-4ef7-a461-6bb5190862a0 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] [instance: b197c990-eecd-403b-b8d7-9e57e7053a16] Updating progress to 91 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:45:55.522 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:45:55.559 DEBUG nova.virt.xenapi.vmops [req-b850b310-ed13-4ef7-a461-6bb5190862a0 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] [instance: b197c990-eecd-403b-b8d7-9e57e7053a16] Updating progress to 100 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:45:55.566 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Auditing locally available compute resources for node localhost.localdomain 2015-08-07 17:45:55.566 DEBUG nova.virt.xenapi.host [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Updating host stats update_status /opt/stack/new/nova/nova/virt/xenapi/host.py:234 2015-08-07 17:45:55.742 DEBUG nova.compute.manager [req-b850b310-ed13-4ef7-a461-6bb5190862a0 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] [instance: b197c990-eecd-403b-b8d7-9e57e7053a16] Checking state _get_power_state /opt/stack/new/nova/nova/compute/manager.py:1296 2015-08-07 17:45:55.752 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" acquired by "do_scan" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:45:55.752 DEBUG nova.virt.xenapi.vm_utils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Scanning SR OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99 do_scan /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1839 2015-08-07 17:45:56.221 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" released by "do_scan" :: held 
0.470s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:45:56.410 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor: free VCPUs: -2 _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:524 2015-08-07 17:45:56.411 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor: assignable PCI devices: [] _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:532 2015-08-07 17:45:56.411 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor/Node resource view: name=localhost.localdomain free_ram=721MB free_disk=16GB free_vcpus=-2 pci_devices=[] _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:546 2015-08-07 17:45:56.413 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "compute_resources" acquired by "_update_available_resource" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:45:56.728 DEBUG oslo_concurrency.lockutils [req-d92b61bd-838b-4d33-b268-c0d927bdc49e tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "1d45573b-0e05-4067-ab3e-56616334b7d9" acquired by "do_terminate_instance" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:45:56.729 DEBUG oslo_concurrency.lockutils [req-d92b61bd-838b-4d33-b268-c0d927bdc49e tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "1d45573b-0e05-4067-ab3e-56616334b7d9-events" acquired by "_clear_events" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:45:56.730 DEBUG oslo_concurrency.lockutils [req-d92b61bd-838b-4d33-b268-c0d927bdc49e tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "1d45573b-0e05-4067-ab3e-56616334b7d9-events" released by "_clear_events" :: held 0.001s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:45:56.731 INFO nova.compute.manager [req-d92b61bd-838b-4d33-b268-c0d927bdc49e tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 1d45573b-0e05-4067-ab3e-56616334b7d9] Terminating instance 2015-08-07 17:45:56.732 INFO nova.virt.xenapi.vmops [req-d92b61bd-838b-4d33-b268-c0d927bdc49e tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 1d45573b-0e05-4067-ab3e-56616334b7d9] Destroying VM 2015-08-07 17:45:56.754 DEBUG nova.virt.xenapi.vm_utils [req-d92b61bd-838b-4d33-b268-c0d927bdc49e tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 1d45573b-0e05-4067-ab3e-56616334b7d9] Shutting down VM (hard) hard_shutdown_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:348 2015-08-07 17:45:56.788 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Total usable vcpus: 8, total allocated vcpus: 10 2015-08-07 17:45:56.789 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Final resource view: name=localhost.localdomain phys_ram=8187MB used_ram=926MB phys_disk=27GB used_disk=0GB total_vcpus=8 used_vcpus=10 pci_stats=None 2015-08-07 17:45:56.868 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Compute_service record updated for 
devstack:localhost.localdomain 2015-08-07 17:45:56.869 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "compute_resources" released by "_update_available_resource" :: held 0.456s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:45:56.870 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 8.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:45:57.259 INFO nova.compute.manager [req-d757c681-8bd5-42d9-a8ed-533b4d0dc713 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] [instance: b197c990-eecd-403b-b8d7-9e57e7053a16] Unrescuing 2015-08-07 17:45:57.260 DEBUG oslo_concurrency.lockutils [req-d757c681-8bd5-42d9-a8ed-533b4d0dc713 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Acquired semaphore "refresh_cache-b197c990-eecd-403b-b8d7-9e57e7053a16" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:197 2015-08-07 17:45:57.404 DEBUG nova.network.base_api [req-d757c681-8bd5-42d9-a8ed-533b4d0dc713 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] [instance: b197c990-eecd-403b-b8d7-9e57e7053a16] Updating instance_info_cache with network_info: [VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.7'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:be:cf:fd', 'active': False, 'type': u'bridge', 'id': u'65310656-cc34-4b05-84ec-9c385b0c36a8', 'qbg_params': None})] update_instance_cache_with_nw_info /opt/stack/new/nova/nova/network/base_api.py:43 2015-08-07 17:45:57.429 DEBUG oslo_concurrency.lockutils [req-d757c681-8bd5-42d9-a8ed-533b4d0dc713 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Releasing semaphore "refresh_cache-b197c990-eecd-403b-b8d7-9e57e7053a16" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:210 2015-08-07 17:45:57.626 DEBUG nova.virt.xenapi.vm_utils [req-643ce0fd-618e-4151-84d4-433f90e4911a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Cloned VDI OpaqueRef:5284c951-8a65-5a80-3c77-c9685fc46b27 from VDI OpaqueRef:f392c64e-7886-a0cc-d3a0-780674167307 _clone_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:571 2015-08-07 17:45:58.139 DEBUG oslo_concurrency.lockutils [req-643ce0fd-618e-4151-84d4-433f90e4911a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock 
"xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" released by "_create_cached_image_impl" :: held 17.676s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:45:58.140 INFO nova.virt.xenapi.vm_utils [req-643ce0fd-618e-4151-84d4-433f90e4911a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Image creation data, cacheable: True, downloaded: False duration: 17.68 secs for image a69d8d55-1745-492f-be26-cc75d64fc94d 2015-08-07 17:45:58.678 DEBUG nova.virt.xenapi.vmops [req-643ce0fd-618e-4151-84d4-433f90e4911a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 3d063a71-d4d9-49d1-ae99-79d2dc71c6d3] Updating progress to 20 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:45:58.874 DEBUG nova.virt.xenapi.vmops [req-643ce0fd-618e-4151-84d4-433f90e4911a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 3d063a71-d4d9-49d1-ae99-79d2dc71c6d3] Updating progress to 30 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:45:59.022 DEBUG nova.virt.xenapi.vmops [req-643ce0fd-618e-4151-84d4-433f90e4911a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 3d063a71-d4d9-49d1-ae99-79d2dc71c6d3] Using PV kernel: True _create_vm_record /opt/stack/new/nova/nova/virt/xenapi/vmops.py:679 2015-08-07 17:45:59.037 DEBUG nova.virt.xenapi.vm_utils [req-643ce0fd-618e-4151-84d4-433f90e4911a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 3d063a71-d4d9-49d1-ae99-79d2dc71c6d3] Created VM create_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:312 2015-08-07 17:45:59.038 DEBUG nova.virt.xenapi.vmops [req-643ce0fd-618e-4151-84d4-433f90e4911a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 3d063a71-d4d9-49d1-ae99-79d2dc71c6d3] Updating progress to 40 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:45:59.209 DEBUG nova.virt.xenapi.vm_utils [req-643ce0fd-618e-4151-84d4-433f90e4911a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Creating disk-type VBD for VM OpaqueRef:46ca7ff7-782b-3c79-ad45-d6f793292aa2, VDI OpaqueRef:5284c951-8a65-5a80-3c77-c9685fc46b27 ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:45:59.218 DEBUG nova.virt.xenapi.vm_utils [req-643ce0fd-618e-4151-84d4-433f90e4911a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Created VBD OpaqueRef:1490ad12-a92d-fd24-dc02-9a9742d0fca3 for VM OpaqueRef:46ca7ff7-782b-3c79-ad45-d6f793292aa2, VDI OpaqueRef:5284c951-8a65-5a80-3c77-c9685fc46b27. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:45:59.491 DEBUG nova.virt.xenapi.vm_utils [req-643ce0fd-618e-4151-84d4-433f90e4911a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Created VDI OpaqueRef:7166a5d6-fe6b-a301-efd4-630c0982cf4d (config-2, 67108864, False) on OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99. 
create_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:509 2015-08-07 17:45:59.494 DEBUG nova.virt.xenapi.vm_utils [req-643ce0fd-618e-4151-84d4-433f90e4911a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Creating disk-type VBD for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:7166a5d6-fe6b-a301-efd4-630c0982cf4d ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:45:59.504 DEBUG nova.virt.xenapi.vm_utils [req-643ce0fd-618e-4151-84d4-433f90e4911a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Created VBD OpaqueRef:fe74fa88-dc96-8a15-e4a9-7d012a7e546c for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:7166a5d6-fe6b-a301-efd4-630c0982cf4d. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:45:59.505 DEBUG nova.virt.xenapi.vm_utils [req-643ce0fd-618e-4151-84d4-433f90e4911a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Plugging VBD OpaqueRef:fe74fa88-dc96-8a15-e4a9-7d012a7e546c ... vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2191 2015-08-07 17:45:59.506 DEBUG oslo_concurrency.lockutils [req-643ce0fd-618e-4151-84d4-433f90e4911a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_plug" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:45:59.657 DEBUG nova.virt.xenapi.vmops [req-d92b61bd-838b-4d33-b268-c0d927bdc49e tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 1d45573b-0e05-4067-ab3e-56616334b7d9] Destroying VDIs _destroy_vdis /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1428 2015-08-07 17:45:59.667 DEBUG nova.virt.xenapi.vm_utils [req-d92b61bd-838b-4d33-b268-c0d927bdc49e tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] VDI 998ac43d-9913-4e52-8c65-3ab55ee41a9d is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:45:59.678 DEBUG nova.virt.xenapi.vm_utils [req-d92b61bd-838b-4d33-b268-c0d927bdc49e tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] VDI f7a8660e-4198-4033-b401-10b26147801d is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:46:00.395 DEBUG nova.virt.xenapi.vmops [req-d92b61bd-838b-4d33-b268-c0d927bdc49e tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 1d45573b-0e05-4067-ab3e-56616334b7d9] Using RAW or VHD, skipping kernel and ramdisk deletion _destroy_kernel_ramdisk /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1456 2015-08-07 17:46:00.416 DEBUG nova.virt.xenapi.vm_utils [req-d92b61bd-838b-4d33-b268-c0d927bdc49e tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 1d45573b-0e05-4067-ab3e-56616334b7d9] VM destroyed destroy_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:324 2015-08-07 17:46:00.417 DEBUG nova.compute.manager [req-d92b61bd-838b-4d33-b268-c0d927bdc49e tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 1d45573b-0e05-4067-ab3e-56616334b7d9] Deallocating network for instance _deallocate_network /opt/stack/new/nova/nova/compute/manager.py:1792 2015-08-07 17:46:00.785 DEBUG oslo_concurrency.lockutils [req-643ce0fd-618e-4151-84d4-433f90e4911a tempest-ServerActionsTestJSON-2080221373 
tempest-ServerActionsTestJSON-2017217826] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_plug" :: held 1.280s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:46:00.786 DEBUG nova.virt.xenapi.vm_utils [req-643ce0fd-618e-4151-84d4-433f90e4911a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Plugging VBD OpaqueRef:fe74fa88-dc96-8a15-e4a9-7d012a7e546c done. vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2194 2015-08-07 17:46:00.788 DEBUG nova.virt.xenapi.vm_utils [req-643ce0fd-618e-4151-84d4-433f90e4911a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] VBD OpaqueRef:fe74fa88-dc96-8a15-e4a9-7d012a7e546c plugged as xvdc vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2197 2015-08-07 17:46:00.851 WARNING nova.virt.configdrive [req-643ce0fd-618e-4151-84d4-433f90e4911a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] The setting "always" will be deprecated in the Liberty version. Please use "True" instead 2015-08-07 17:46:00.852 DEBUG nova.objects.instance [req-643ce0fd-618e-4151-84d4-433f90e4911a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lazy-loading `ec2_ids' on Instance uuid 3d063a71-d4d9-49d1-ae99-79d2dc71c6d3 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:46:00.885 DEBUG oslo_concurrency.processutils [req-643ce0fd-618e-4151-84d4-433f90e4911a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Running cmd (subprocess): genisoimage -o /tmp/tmp2pnmxA/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmpg4ElXT execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:46:00.969 DEBUG oslo_concurrency.processutils [req-643ce0fd-618e-4151-84d4-433f90e4911a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] CMD "genisoimage -o /tmp/tmp2pnmxA/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmpg4ElXT" returned: 0 in 0.084s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:46:00.973 DEBUG oslo_concurrency.processutils [req-643ce0fd-618e-4151-84d4-433f90e4911a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmp2pnmxA/configdrive of=/dev/xvdc oflag=direct,sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:46:01.085 DEBUG nova.virt.xenapi.vm_utils [req-d757c681-8bd5-42d9-a8ed-533b4d0dc713 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] VDI e0a3ba41-bd7a-4693-b034-8d6817333512 is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:46:01.095 DEBUG nova.virt.xenapi.vm_utils [req-d757c681-8bd5-42d9-a8ed-533b4d0dc713 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] VDI 3392b62c-45ab-4df5-9041-8f9ee1d478eb is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:46:01.104 DEBUG nova.virt.xenapi.vm_utils [req-d757c681-8bd5-42d9-a8ed-533b4d0dc713 
tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] VDI 45f0444b-41c6-401c-93d6-0e9b8dcc5f76 is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:46:02.354 DEBUG nova.compute.manager [req-d92b61bd-838b-4d33-b268-c0d927bdc49e tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 1d45573b-0e05-4067-ab3e-56616334b7d9] terminating bdm BlockDeviceMapping(boot_index=0,connection_info=None,created_at=2015-08-07T17:45:32Z,delete_on_termination=True,deleted=False,deleted_at=None,destination_type='local',device_name='/dev/xvda',device_type='disk',disk_bus=None,guest_format=None,id=68,image_id='a69d8d55-1745-492f-be26-cc75d64fc94d',instance=,instance_uuid=1d45573b-0e05-4067-ab3e-56616334b7d9,no_device=False,snapshot_id=None,source_type='image',updated_at=2015-08-07T17:45:33Z,volume_id=None,volume_size=None) _cleanup_volumes /opt/stack/new/nova/nova/compute/manager.py:2272 2015-08-07 17:46:02.402 DEBUG nova.virt.xenapi.vmops [req-d757c681-8bd5-42d9-a8ed-533b4d0dc713 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] [instance: b197c990-eecd-403b-b8d7-9e57e7053a16] Starting instance _start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:327 2015-08-07 17:46:02.542 DEBUG oslo_concurrency.lockutils [req-d92b61bd-838b-4d33-b268-c0d927bdc49e tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "compute_resources" acquired by "update_usage" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:46:02.544 DEBUG nova.objects.instance [req-d92b61bd-838b-4d33-b268-c0d927bdc49e tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lazy-loading `numa_topology' on Instance uuid 1d45573b-0e05-4067-ab3e-56616334b7d9 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:46:02.636 DEBUG oslo_concurrency.lockutils [req-d92b61bd-838b-4d33-b268-c0d927bdc49e tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "compute_resources" released by "update_usage" :: held 0.094s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:46:03.206 DEBUG oslo_concurrency.lockutils [req-d92b61bd-838b-4d33-b268-c0d927bdc49e tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "1d45573b-0e05-4067-ab3e-56616334b7d9" released by "do_terminate_instance" :: held 6.478s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:46:04.736 DEBUG oslo_concurrency.processutils [req-643ce0fd-618e-4151-84d4-433f90e4911a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmp2pnmxA/configdrive of=/dev/xvdc oflag=direct,sync" returned: 0 in 3.764s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:46:04.739 DEBUG oslo_concurrency.processutils [req-643ce0fd-618e-4151-84d4-433f90e4911a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:46:04.874 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks 
/usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:46:04.887 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 16.64 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:46:04.888 DEBUG oslo_concurrency.lockutils [req-db4bf694-e571-4a3a-bd4c-0043a9735783 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "5efe11cd-8475-4ce1-b0c5-86c4d930d74f" acquired by "_locked_do_build_and_run_instance" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:46:04.944 INFO nova.compute.manager [req-db4bf694-e571-4a3a-bd4c-0043a9735783 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 5efe11cd-8475-4ce1-b0c5-86c4d930d74f] Starting instance... 2015-08-07 17:46:05.030 DEBUG oslo_concurrency.processutils [req-643ce0fd-618e-4151-84d4-433f90e4911a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf sync" returned: 0 in 0.291s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:46:05.031 DEBUG nova.virt.xenapi.vm_utils [req-643ce0fd-618e-4151-84d4-433f90e4911a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Destroying VBD for VDI OpaqueRef:7166a5d6-fe6b-a301-efd4-630c0982cf4d ... vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2207 2015-08-07 17:46:05.032 DEBUG oslo_concurrency.lockutils [req-643ce0fd-618e-4151-84d4-433f90e4911a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_unplug" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:46:05.052 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.96 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:46:05.182 DEBUG oslo_concurrency.lockutils [req-db4bf694-e571-4a3a-bd4c-0043a9735783 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "compute_resources" acquired by "instance_claim" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:46:05.183 DEBUG nova.compute.resource_tracker [req-db4bf694-e571-4a3a-bd4c-0043a9735783 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Memory overhead for 64 MB instance; 5 MB instance_claim /opt/stack/new/nova/nova/compute/resource_tracker.py:126 2015-08-07 17:46:05.188 INFO nova.compute.claims [req-db4bf694-e571-4a3a-bd4c-0043a9735783 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 5efe11cd-8475-4ce1-b0c5-86c4d930d74f] Attempting claim: memory 69 MB, disk 0 GB 2015-08-07 17:46:05.188 INFO nova.compute.claims [req-db4bf694-e571-4a3a-bd4c-0043a9735783 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 5efe11cd-8475-4ce1-b0c5-86c4d930d74f] Total memory: 8187 MB, used: 857.00 MB 2015-08-07 17:46:05.188 INFO nova.compute.claims [req-db4bf694-e571-4a3a-bd4c-0043a9735783 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 5efe11cd-8475-4ce1-b0c5-86c4d930d74f] memory limit: 12280.50 MB, free: 11423.50 MB 2015-08-07 17:46:05.189 
INFO nova.compute.claims [req-db4bf694-e571-4a3a-bd4c-0043a9735783 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 5efe11cd-8475-4ce1-b0c5-86c4d930d74f] Total disk: 27 GB, used: 0.00 GB 2015-08-07 17:46:05.189 INFO nova.compute.claims [req-db4bf694-e571-4a3a-bd4c-0043a9735783 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 5efe11cd-8475-4ce1-b0c5-86c4d930d74f] disk limit not specified, defaulting to unlimited 2015-08-07 17:46:05.209 DEBUG nova.compute.resources.vcpu [req-db4bf694-e571-4a3a-bd4c-0043a9735783 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Total CPUs: 8 VCPUs, used: 5.00 VCPUs test /opt/stack/new/nova/nova/compute/resources/vcpu.py:52 2015-08-07 17:46:05.209 DEBUG nova.compute.resources.vcpu [req-db4bf694-e571-4a3a-bd4c-0043a9735783 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] CPUs limit not specified, defaulting to unlimited test /opt/stack/new/nova/nova/compute/resources/vcpu.py:56 2015-08-07 17:46:05.210 INFO nova.compute.claims [req-db4bf694-e571-4a3a-bd4c-0043a9735783 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 5efe11cd-8475-4ce1-b0c5-86c4d930d74f] Claim successful 2015-08-07 17:46:05.464 DEBUG oslo_concurrency.lockutils [req-db4bf694-e571-4a3a-bd4c-0043a9735783 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "compute_resources" released by "instance_claim" :: held 0.282s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:46:05.643 DEBUG oslo_concurrency.lockutils [req-db4bf694-e571-4a3a-bd4c-0043a9735783 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "compute_resources" acquired by "update_usage" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:46:05.720 DEBUG oslo_concurrency.lockutils [req-db4bf694-e571-4a3a-bd4c-0043a9735783 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "compute_resources" released by "update_usage" :: held 0.076s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:46:05.721 DEBUG nova.compute.utils [req-db4bf694-e571-4a3a-bd4c-0043a9735783 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Using /dev/xvd instead of None get_next_device_name /opt/stack/new/nova/nova/compute/utils.py:163 2015-08-07 17:46:05.725 13318 DEBUG nova.compute.manager [-] [instance: 5efe11cd-8475-4ce1-b0c5-86c4d930d74f] Allocating IP information in the background. _allocate_network_async /opt/stack/new/nova/nova/compute/manager.py:1513 2015-08-07 17:46:05.726 13318 DEBUG oslo_concurrency.lockutils [-] Acquired semaphore "refresh_cache-5efe11cd-8475-4ce1-b0c5-86c4d930d74f" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:197 2015-08-07 17:46:05.735 DEBUG oslo_concurrency.lockutils [req-643ce0fd-618e-4151-84d4-433f90e4911a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_unplug" :: held 0.703s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:46:05.749 DEBUG nova.virt.xenapi.vm_utils [req-643ce0fd-618e-4151-84d4-433f90e4911a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Destroying VBD for VDI OpaqueRef:7166a5d6-fe6b-a301-efd4-630c0982cf4d done. 
vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2215 2015-08-07 17:46:05.752 DEBUG nova.virt.xenapi.vm_utils [req-643ce0fd-618e-4151-84d4-433f90e4911a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Creating disk-type VBD for VM OpaqueRef:46ca7ff7-782b-3c79-ad45-d6f793292aa2, VDI OpaqueRef:7166a5d6-fe6b-a301-efd4-630c0982cf4d ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:46:05.761 DEBUG nova.virt.xenapi.vm_utils [req-643ce0fd-618e-4151-84d4-433f90e4911a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Created VBD OpaqueRef:a5f86599-198f-7c29-5f3b-60f077b0cc5d for VM OpaqueRef:46ca7ff7-782b-3c79-ad45-d6f793292aa2, VDI OpaqueRef:7166a5d6-fe6b-a301-efd4-630c0982cf4d. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:46:05.762 DEBUG nova.objects.instance [req-643ce0fd-618e-4151-84d4-433f90e4911a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lazy-loading `pci_devices' on Instance uuid 3d063a71-d4d9-49d1-ae99-79d2dc71c6d3 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:46:05.880 DEBUG nova.virt.xenapi.vmops [req-643ce0fd-618e-4151-84d4-433f90e4911a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 3d063a71-d4d9-49d1-ae99-79d2dc71c6d3] Updating progress to 50 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:46:06.063 DEBUG oslo_concurrency.lockutils [req-643ce0fd-618e-4151-84d4-433f90e4911a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "xenstore-3d063a71-d4d9-49d1-ae99-79d2dc71c6d3" acquired by "store_meta" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:46:06.064 DEBUG oslo_concurrency.lockutils [req-643ce0fd-618e-4151-84d4-433f90e4911a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "xenstore-3d063a71-d4d9-49d1-ae99-79d2dc71c6d3" released by "store_meta" :: held 0.001s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:46:06.064 DEBUG oslo_concurrency.lockutils [req-643ce0fd-618e-4151-84d4-433f90e4911a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "xenstore-3d063a71-d4d9-49d1-ae99-79d2dc71c6d3" acquired by "store_auto_disk_config" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:46:06.072 DEBUG oslo_concurrency.lockutils [req-643ce0fd-618e-4151-84d4-433f90e4911a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "xenstore-3d063a71-d4d9-49d1-ae99-79d2dc71c6d3" released by "store_auto_disk_config" :: held 0.008s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:46:06.073 DEBUG nova.virt.xenapi.vmops [req-643ce0fd-618e-4151-84d4-433f90e4911a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 3d063a71-d4d9-49d1-ae99-79d2dc71c6d3] Injecting hostname (tempest.common.compute-instance-1517593106) into xenstore _inject_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1940 2015-08-07 17:46:06.075 DEBUG oslo_concurrency.lockutils [req-643ce0fd-618e-4151-84d4-433f90e4911a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock 
"xenstore-3d063a71-d4d9-49d1-ae99-79d2dc71c6d3" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:46:06.083 DEBUG oslo_concurrency.lockutils [req-643ce0fd-618e-4151-84d4-433f90e4911a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "xenstore-3d063a71-d4d9-49d1-ae99-79d2dc71c6d3" released by "update_hostname" :: held 0.009s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:46:06.085 DEBUG nova.virt.xenapi.vmops [req-643ce0fd-618e-4151-84d4-433f90e4911a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 3d063a71-d4d9-49d1-ae99-79d2dc71c6d3] Injecting network info to xenstore inject_network_info /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1869 2015-08-07 17:46:06.085 DEBUG oslo_concurrency.lockutils [req-643ce0fd-618e-4151-84d4-433f90e4911a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "xenstore-3d063a71-d4d9-49d1-ae99-79d2dc71c6d3" acquired by "update_nwinfo" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:46:06.120 DEBUG nova.virt.xenapi.vmops [req-db4bf694-e571-4a3a-bd4c-0043a9735783 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 5efe11cd-8475-4ce1-b0c5-86c4d930d74f] Block device information present: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} spawn /opt/stack/new/nova/nova/virt/xenapi/vmops.py:408 2015-08-07 17:46:06.138 DEBUG nova.virt.xenapi.vm_utils [req-db4bf694-e571-4a3a-bd4c-0043a9735783 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Detected vhd format for image a69d8d55-1745-492f-be26-cc75d64fc94d determine_disk_image_type /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1633 2015-08-07 17:46:06.139 DEBUG nova.virt.xenapi.vmops [req-db4bf694-e571-4a3a-bd4c-0043a9735783 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 5efe11cd-8475-4ce1-b0c5-86c4d930d74f] Updating progress to 10 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:46:06.248 DEBUG oslo_concurrency.lockutils [req-643ce0fd-618e-4151-84d4-433f90e4911a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "xenstore-3d063a71-d4d9-49d1-ae99-79d2dc71c6d3" released by "update_nwinfo" :: held 0.163s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:46:06.249 DEBUG nova.virt.xenapi.vmops [req-643ce0fd-618e-4151-84d4-433f90e4911a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 3d063a71-d4d9-49d1-ae99-79d2dc71c6d3] Updating progress to 60 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:46:06.319 DEBUG nova.virt.xenapi.vmops [req-db4bf694-e571-4a3a-bd4c-0043a9735783 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 5efe11cd-8475-4ce1-b0c5-86c4d930d74f] block device info: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} _connect_cinder_volumes /opt/stack/new/nova/nova/virt/xenapi/vmops.py:376 2015-08-07 17:46:06.329 DEBUG oslo_concurrency.lockutils [req-db4bf694-e571-4a3a-bd4c-0043a9735783 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock 
"xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" acquired by "_create_cached_image_impl" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:46:06.415 DEBUG nova.virt.xenapi.vmops [req-643ce0fd-618e-4151-84d4-433f90e4911a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 3d063a71-d4d9-49d1-ae99-79d2dc71c6d3] Creating vifs _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1892 2015-08-07 17:46:06.422 DEBUG nova.virt.xenapi.vmops [req-643ce0fd-618e-4151-84d4-433f90e4911a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 3d063a71-d4d9-49d1-ae99-79d2dc71c6d3] Creating VIF for network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1902 2015-08-07 17:46:06.431 DEBUG nova.virt.xenapi.vmops [req-643ce0fd-618e-4151-84d4-433f90e4911a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 3d063a71-d4d9-49d1-ae99-79d2dc71c6d3] Created VIF OpaqueRef:06b3cd6f-c5f0-3b3c-ce71-f61a09f47fb8, network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1906 2015-08-07 17:46:06.432 DEBUG nova.virt.xenapi.vmops [req-643ce0fd-618e-4151-84d4-433f90e4911a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 3d063a71-d4d9-49d1-ae99-79d2dc71c6d3] Updating progress to 70 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:46:06.847 DEBUG nova.virt.xenapi.vmops [req-643ce0fd-618e-4151-84d4-433f90e4911a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 3d063a71-d4d9-49d1-ae99-79d2dc71c6d3] Starting instance _start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:327 2015-08-07 17:46:07.243 DEBUG nova.virt.xenapi.vm_utils [req-db4bf694-e571-4a3a-bd4c-0043a9735783 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Cloned VDI OpaqueRef:b684f6c8-5eeb-dc58-50f1-41ee8e9407c1 from VDI OpaqueRef:f392c64e-7886-a0cc-d3a0-780674167307 _clone_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:571 2015-08-07 17:46:07.444 13318 DEBUG nova.network.base_api [-] [instance: 5efe11cd-8475-4ce1-b0c5-86c4d930d74f] Updating instance_info_cache with network_info: [VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.5'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:9c:3a:b4', 'active': False, 'type': u'bridge', 'id': u'34e8d6bb-b3f7-41b8-9edb-7ec782c97f2f', 'qbg_params': None})] update_instance_cache_with_nw_info 
/opt/stack/new/nova/nova/network/base_api.py:43 2015-08-07 17:46:07.470 13318 DEBUG oslo_concurrency.lockutils [-] Releasing semaphore "refresh_cache-5efe11cd-8475-4ce1-b0c5-86c4d930d74f" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:210 2015-08-07 17:46:07.471 13318 DEBUG nova.compute.manager [-] [instance: 5efe11cd-8475-4ce1-b0c5-86c4d930d74f] Instance network_info: |[VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.5'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:9c:3a:b4', 'active': False, 'type': u'bridge', 'id': u'34e8d6bb-b3f7-41b8-9edb-7ec782c97f2f', 'qbg_params': None})]| _allocate_network_async /opt/stack/new/nova/nova/compute/manager.py:1531 2015-08-07 17:46:07.866 DEBUG oslo_concurrency.lockutils [req-db4bf694-e571-4a3a-bd4c-0043a9735783 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" released by "_create_cached_image_impl" :: held 1.537s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:46:07.866 INFO nova.virt.xenapi.vm_utils [req-db4bf694-e571-4a3a-bd4c-0043a9735783 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Image creation data, cacheable: True, downloaded: False duration: 1.55 secs for image a69d8d55-1745-492f-be26-cc75d64fc94d 2015-08-07 17:46:08.482 DEBUG nova.virt.xenapi.vmops [req-db4bf694-e571-4a3a-bd4c-0043a9735783 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 5efe11cd-8475-4ce1-b0c5-86c4d930d74f] Updating progress to 20 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:46:08.686 DEBUG nova.virt.xenapi.vmops [req-db4bf694-e571-4a3a-bd4c-0043a9735783 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 5efe11cd-8475-4ce1-b0c5-86c4d930d74f] Updating progress to 30 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:46:08.864 DEBUG nova.compute.manager [req-d757c681-8bd5-42d9-a8ed-533b4d0dc713 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] [instance: b197c990-eecd-403b-b8d7-9e57e7053a16] Checking state _get_power_state /opt/stack/new/nova/nova/compute/manager.py:1296 2015-08-07 17:46:08.876 DEBUG nova.virt.xenapi.vmops [req-db4bf694-e571-4a3a-bd4c-0043a9735783 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 5efe11cd-8475-4ce1-b0c5-86c4d930d74f] Using PV kernel: True _create_vm_record /opt/stack/new/nova/nova/virt/xenapi/vmops.py:679 2015-08-07 17:46:08.895 DEBUG nova.virt.xenapi.vm_utils 
[req-db4bf694-e571-4a3a-bd4c-0043a9735783 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 5efe11cd-8475-4ce1-b0c5-86c4d930d74f] Created VM create_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:312 2015-08-07 17:46:08.896 DEBUG nova.virt.xenapi.vmops [req-db4bf694-e571-4a3a-bd4c-0043a9735783 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 5efe11cd-8475-4ce1-b0c5-86c4d930d74f] Updating progress to 40 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:46:09.090 DEBUG nova.virt.xenapi.vm_utils [req-db4bf694-e571-4a3a-bd4c-0043a9735783 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Creating disk-type VBD for VM OpaqueRef:5d414fe8-311b-2867-14cf-176b194196db, VDI OpaqueRef:b684f6c8-5eeb-dc58-50f1-41ee8e9407c1 ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:46:09.098 DEBUG nova.virt.xenapi.vm_utils [req-db4bf694-e571-4a3a-bd4c-0043a9735783 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Created VBD OpaqueRef:b9d4a5ac-3b59-c09a-dc64-cded5e408de2 for VM OpaqueRef:5d414fe8-311b-2867-14cf-176b194196db, VDI OpaqueRef:b684f6c8-5eeb-dc58-50f1-41ee8e9407c1. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:46:09.403 DEBUG nova.virt.xenapi.vm_utils [req-db4bf694-e571-4a3a-bd4c-0043a9735783 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Created VDI OpaqueRef:8639feb5-c7d1-6978-644c-3acf99f85867 (config-2, 67108864, False) on OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99. create_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:509 2015-08-07 17:46:09.407 DEBUG nova.virt.xenapi.vm_utils [req-db4bf694-e571-4a3a-bd4c-0043a9735783 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Creating disk-type VBD for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:8639feb5-c7d1-6978-644c-3acf99f85867 ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:46:09.417 DEBUG nova.virt.xenapi.vm_utils [req-db4bf694-e571-4a3a-bd4c-0043a9735783 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Created VBD OpaqueRef:9311e6e3-4773-5b34-d42f-e58c38fdcec7 for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:8639feb5-c7d1-6978-644c-3acf99f85867. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:46:09.417 DEBUG nova.virt.xenapi.vm_utils [req-db4bf694-e571-4a3a-bd4c-0043a9735783 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Plugging VBD OpaqueRef:9311e6e3-4773-5b34-d42f-e58c38fdcec7 ... 
vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2191 2015-08-07 17:46:09.418 DEBUG oslo_concurrency.lockutils [req-db4bf694-e571-4a3a-bd4c-0043a9735783 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_plug" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:46:10.607 DEBUG oslo_concurrency.lockutils [req-db4bf694-e571-4a3a-bd4c-0043a9735783 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_plug" :: held 1.189s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:46:10.608 DEBUG nova.virt.xenapi.vm_utils [req-db4bf694-e571-4a3a-bd4c-0043a9735783 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Plugging VBD OpaqueRef:9311e6e3-4773-5b34-d42f-e58c38fdcec7 done. vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2194 2015-08-07 17:46:10.611 DEBUG nova.virt.xenapi.vm_utils [req-db4bf694-e571-4a3a-bd4c-0043a9735783 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] VBD OpaqueRef:9311e6e3-4773-5b34-d42f-e58c38fdcec7 plugged as xvdc vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2197 2015-08-07 17:46:10.689 WARNING nova.virt.configdrive [req-db4bf694-e571-4a3a-bd4c-0043a9735783 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] The setting "always" will be deprecated in the Liberty version. Please use "True" instead 2015-08-07 17:46:10.689 DEBUG nova.objects.instance [req-db4bf694-e571-4a3a-bd4c-0043a9735783 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lazy-loading `ec2_ids' on Instance uuid 5efe11cd-8475-4ce1-b0c5-86c4d930d74f obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:46:10.724 DEBUG oslo_concurrency.processutils [req-db4bf694-e571-4a3a-bd4c-0043a9735783 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Running cmd (subprocess): genisoimage -o /tmp/tmpco4uD6/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmpApRTv5 execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:46:10.823 DEBUG oslo_concurrency.processutils [req-db4bf694-e571-4a3a-bd4c-0043a9735783 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] CMD "genisoimage -o /tmp/tmpco4uD6/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmpApRTv5" returned: 0 in 0.099s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:46:10.828 DEBUG oslo_concurrency.processutils [req-db4bf694-e571-4a3a-bd4c-0043a9735783 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmpco4uD6/configdrive of=/dev/xvdc oflag=direct,sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:46:12.861 DEBUG nova.virt.xenapi.vmops [req-643ce0fd-618e-4151-84d4-433f90e4911a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 3d063a71-d4d9-49d1-ae99-79d2dc71c6d3] Waiting for instance state to become running _wait_for_instance_to_start 
/opt/stack/new/nova/nova/virt/xenapi/vmops.py:777 2015-08-07 17:46:12.886 DEBUG nova.virt.xenapi.vmops [req-643ce0fd-618e-4151-84d4-433f90e4911a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 3d063a71-d4d9-49d1-ae99-79d2dc71c6d3] Updating progress to 80 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:46:13.195 DEBUG nova.virt.xenapi.vmops [req-643ce0fd-618e-4151-84d4-433f90e4911a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 3d063a71-d4d9-49d1-ae99-79d2dc71c6d3] Skip agent setup, not enabled. _configure_new_instance_with_agent /opt/stack/new/nova/nova/virt/xenapi/vmops.py:788 2015-08-07 17:46:13.196 DEBUG nova.virt.xenapi.vmops [req-643ce0fd-618e-4151-84d4-433f90e4911a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 3d063a71-d4d9-49d1-ae99-79d2dc71c6d3] Removing hostname from xenstore _remove_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1949 2015-08-07 17:46:13.197 DEBUG oslo_concurrency.lockutils [req-643ce0fd-618e-4151-84d4-433f90e4911a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "xenstore-3d063a71-d4d9-49d1-ae99-79d2dc71c6d3" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:46:13.202 DEBUG oslo_concurrency.lockutils [req-643ce0fd-618e-4151-84d4-433f90e4911a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "xenstore-3d063a71-d4d9-49d1-ae99-79d2dc71c6d3" released by "update_hostname" :: held 0.005s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:46:13.204 DEBUG nova.virt.xenapi.vmops [req-643ce0fd-618e-4151-84d4-433f90e4911a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 3d063a71-d4d9-49d1-ae99-79d2dc71c6d3] Updating progress to 90 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:46:13.494 DEBUG nova.virt.xenapi.vmops [req-643ce0fd-618e-4151-84d4-433f90e4911a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 3d063a71-d4d9-49d1-ae99-79d2dc71c6d3] Updating progress to 100 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:46:13.851 DEBUG nova.compute.manager [req-643ce0fd-618e-4151-84d4-433f90e4911a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 3d063a71-d4d9-49d1-ae99-79d2dc71c6d3] Checking state _get_power_state /opt/stack/new/nova/nova/compute/manager.py:1296 2015-08-07 17:46:14.351 DEBUG oslo_concurrency.lockutils [req-643ce0fd-618e-4151-84d4-433f90e4911a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "3d063a71-d4d9-49d1-ae99-79d2dc71c6d3" released by "_locked_do_build_and_run_instance" :: held 35.170s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:46:14.803 DEBUG oslo_concurrency.lockutils [req-70a58d5e-f11a-442d-837a-6055a34fcc4e tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Lock "b197c990-eecd-403b-b8d7-9e57e7053a16" acquired by "do_reserve" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:46:14.810 DEBUG oslo_concurrency.processutils 
[req-db4bf694-e571-4a3a-bd4c-0043a9735783 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmpco4uD6/configdrive of=/dev/xvdc oflag=direct,sync" returned: 0 in 3.982s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:46:14.812 DEBUG oslo_concurrency.processutils [req-db4bf694-e571-4a3a-bd4c-0043a9735783 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:46:14.900 DEBUG nova.compute.utils [req-70a58d5e-f11a-442d-837a-6055a34fcc4e tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Using /dev/xvd instead of /dev/vd get_next_device_name /opt/stack/new/nova/nova/compute/utils.py:163 2015-08-07 17:46:14.940 DEBUG oslo_concurrency.lockutils [req-70a58d5e-f11a-442d-837a-6055a34fcc4e tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Lock "b197c990-eecd-403b-b8d7-9e57e7053a16" released by "do_reserve" :: held 0.137s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:46:15.086 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.93 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:46:15.096 DEBUG oslo_concurrency.processutils [req-db4bf694-e571-4a3a-bd4c-0043a9735783 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf sync" returned: 0 in 0.284s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:46:15.097 DEBUG nova.virt.xenapi.vm_utils [req-db4bf694-e571-4a3a-bd4c-0043a9735783 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Destroying VBD for VDI OpaqueRef:8639feb5-c7d1-6978-644c-3acf99f85867 ... 
vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2207 2015-08-07 17:46:15.098 DEBUG oslo_concurrency.lockutils [req-db4bf694-e571-4a3a-bd4c-0043a9735783 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_unplug" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:46:15.365 DEBUG oslo_concurrency.lockutils [req-70a58d5e-f11a-442d-837a-6055a34fcc4e tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Lock "b197c990-eecd-403b-b8d7-9e57e7053a16" acquired by "do_attach_volume" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:46:15.366 INFO nova.compute.manager [req-70a58d5e-f11a-442d-837a-6055a34fcc4e tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] [instance: b197c990-eecd-403b-b8d7-9e57e7053a16] Attaching volume a01057b3-070e-4f81-8c61-20ad9d422110 to /dev/xvdb 2015-08-07 17:46:15.370 DEBUG keystoneclient.session [req-70a58d5e-f11a-442d-837a-6055a34fcc4e tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] REQ: curl -g -i -X GET http://192.168.33.1:8776/v2/540c7ca9d6cc474f9ee1a73634b4d224/volumes/a01057b3-070e-4f81-8c61-20ad9d422110 -H "User-Agent: python-cinderclient" -H "Accept: application/json" -H "X-Auth-Token: {SHA1}4e8cb0df11d791ca7f3b97b007f8e2c8eed87494" _http_log_request /usr/local/lib/python2.7/dist-packages/keystoneclient/session.py:195 2015-08-07 17:46:15.595 DEBUG keystoneclient.session [req-70a58d5e-f11a-442d-837a-6055a34fcc4e tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] RESP: [200] content-length: 971 x-compute-request-id: req-b201f33e-d729-407f-ae2b-6213f61986d4 connection: keep-alive date: Fri, 07 Aug 2015 17:46:15 GMT content-type: application/json x-openstack-request-id: req-b201f33e-d729-407f-ae2b-6213f61986d4 RESP BODY: {"volume": {"attachments": [], "links": [{"href": "http://192.168.33.1:8776/v2/540c7ca9d6cc474f9ee1a73634b4d224/volumes/a01057b3-070e-4f81-8c61-20ad9d422110", "rel": "self"}, {"href": "http://192.168.33.1:8776/540c7ca9d6cc474f9ee1a73634b4d224/volumes/a01057b3-070e-4f81-8c61-20ad9d422110", "rel": "bookmark"}], "availability_zone": "nova", "encrypted": false, "os-volume-replication:extended_status": null, "volume_type": "lvmdriver-1", "snapshot_id": null, "id": "a01057b3-070e-4f81-8c61-20ad9d422110", "size": 1, "user_id": "da936c8d9f124177aa99a3021442fb3f", "os-vol-tenant-attr:tenant_id": "540c7ca9d6cc474f9ee1a73634b4d224", "metadata": {}, "status": "attaching", "description": null, "multiattach": false, "source_volid": null, "consistencygroup_id": null, "name": "tempest-ServerRescueNegativeTestJSON_volume-1585733813", "bootable": "false", "created_at": "2015-08-07T17:46:12.000000", "os-volume-replication:driver_data": null, "replication_status": "disabled"}} _http_log_response /usr/local/lib/python2.7/dist-packages/keystoneclient/session.py:224 2015-08-07 17:46:15.597 DEBUG keystoneclient.session [req-70a58d5e-f11a-442d-837a-6055a34fcc4e tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] REQ: curl -g -i -X POST http://192.168.33.1:8776/v2/540c7ca9d6cc474f9ee1a73634b4d224/volumes/a01057b3-070e-4f81-8c61-20ad9d422110/action -H "User-Agent: python-cinderclient" -H "Content-Type: 
application/json" -H "Accept: application/json" -H "X-Auth-Token: {SHA1}4e8cb0df11d791ca7f3b97b007f8e2c8eed87494" -d '{"os-initialize_connection": {"connector": {"ip": "192.168.33.2", "initiator": "iqn.2015-08.com.example:3b035c79", "host": "localhost.localdomain"}}}' _http_log_request /usr/local/lib/python2.7/dist-packages/keystoneclient/session.py:195 2015-08-07 17:46:15.700 DEBUG oslo_concurrency.lockutils [req-db4bf694-e571-4a3a-bd4c-0043a9735783 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_unplug" :: held 0.602s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:46:15.707 DEBUG nova.compute.manager [req-1775fc0d-c79e-4ebe-9495-c9fcf5679951 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 3d063a71-d4d9-49d1-ae99-79d2dc71c6d3] Stashing vm_state: active _prep_resize /opt/stack/new/nova/nova/compute/manager.py:3514 2015-08-07 17:46:15.717 DEBUG nova.virt.xenapi.vm_utils [req-db4bf694-e571-4a3a-bd4c-0043a9735783 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Destroying VBD for VDI OpaqueRef:8639feb5-c7d1-6978-644c-3acf99f85867 done. vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2215 2015-08-07 17:46:15.718 DEBUG nova.virt.xenapi.vm_utils [req-db4bf694-e571-4a3a-bd4c-0043a9735783 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Creating disk-type VBD for VM OpaqueRef:5d414fe8-311b-2867-14cf-176b194196db, VDI OpaqueRef:8639feb5-c7d1-6978-644c-3acf99f85867 ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:46:15.726 DEBUG nova.virt.xenapi.vm_utils [req-db4bf694-e571-4a3a-bd4c-0043a9735783 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Created VBD OpaqueRef:d173a4f7-2227-2bcf-d978-2d89e2d76efa for VM OpaqueRef:5d414fe8-311b-2867-14cf-176b194196db, VDI OpaqueRef:8639feb5-c7d1-6978-644c-3acf99f85867. 
create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:46:15.727 DEBUG nova.objects.instance [req-db4bf694-e571-4a3a-bd4c-0043a9735783 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lazy-loading `pci_devices' on Instance uuid 5efe11cd-8475-4ce1-b0c5-86c4d930d74f obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:46:15.851 DEBUG oslo_concurrency.lockutils [req-1775fc0d-c79e-4ebe-9495-c9fcf5679951 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "compute_resources" acquired by "resize_claim" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:46:15.851 DEBUG nova.compute.resource_tracker [req-1775fc0d-c79e-4ebe-9495-c9fcf5679951 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Memory overhead for 128 MB instance; 6 MB resize_claim /opt/stack/new/nova/nova/compute/resource_tracker.py:174 2015-08-07 17:46:15.861 INFO nova.compute.claims [req-1775fc0d-c79e-4ebe-9495-c9fcf5679951 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 3d063a71-d4d9-49d1-ae99-79d2dc71c6d3] Attempting claim: memory 134 MB, disk 0 GB 2015-08-07 17:46:15.862 INFO nova.compute.claims [req-1775fc0d-c79e-4ebe-9495-c9fcf5679951 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 3d063a71-d4d9-49d1-ae99-79d2dc71c6d3] Total memory: 8187 MB, used: 926.00 MB 2015-08-07 17:46:15.862 INFO nova.compute.claims [req-1775fc0d-c79e-4ebe-9495-c9fcf5679951 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 3d063a71-d4d9-49d1-ae99-79d2dc71c6d3] memory limit: 12280.50 MB, free: 11354.50 MB 2015-08-07 17:46:15.863 INFO nova.compute.claims [req-1775fc0d-c79e-4ebe-9495-c9fcf5679951 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 3d063a71-d4d9-49d1-ae99-79d2dc71c6d3] Total disk: 27 GB, used: 0.00 GB 2015-08-07 17:46:15.863 INFO nova.compute.claims [req-1775fc0d-c79e-4ebe-9495-c9fcf5679951 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 3d063a71-d4d9-49d1-ae99-79d2dc71c6d3] disk limit not specified, defaulting to unlimited 2015-08-07 17:46:15.885 DEBUG nova.compute.resources.vcpu [req-1775fc0d-c79e-4ebe-9495-c9fcf5679951 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Total CPUs: 8 VCPUs, used: 6.00 VCPUs test /opt/stack/new/nova/nova/compute/resources/vcpu.py:52 2015-08-07 17:46:15.886 DEBUG nova.compute.resources.vcpu [req-1775fc0d-c79e-4ebe-9495-c9fcf5679951 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] CPUs limit not specified, defaulting to unlimited test /opt/stack/new/nova/nova/compute/resources/vcpu.py:56 2015-08-07 17:46:15.887 INFO nova.compute.claims [req-1775fc0d-c79e-4ebe-9495-c9fcf5679951 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 3d063a71-d4d9-49d1-ae99-79d2dc71c6d3] Claim successful 2015-08-07 17:46:15.916 INFO nova.compute.resource_tracker [req-1775fc0d-c79e-4ebe-9495-c9fcf5679951 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Updating from migration 3d063a71-d4d9-49d1-ae99-79d2dc71c6d3 2015-08-07 17:46:15.987 DEBUG oslo_concurrency.lockutils [req-1775fc0d-c79e-4ebe-9495-c9fcf5679951 tempest-ServerActionsTestJSON-2080221373 
tempest-ServerActionsTestJSON-2017217826] Lock "compute_resources" released by "resize_claim" :: held 0.136s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:46:15.988 INFO nova.compute.manager [req-1775fc0d-c79e-4ebe-9495-c9fcf5679951 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 3d063a71-d4d9-49d1-ae99-79d2dc71c6d3] Migrating 2015-08-07 17:46:16.021 DEBUG nova.virt.xenapi.vmops [req-db4bf694-e571-4a3a-bd4c-0043a9735783 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 5efe11cd-8475-4ce1-b0c5-86c4d930d74f] Updating progress to 50 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:46:16.038 DEBUG oslo_concurrency.lockutils [req-1775fc0d-c79e-4ebe-9495-c9fcf5679951 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Acquired semaphore "refresh_cache-3d063a71-d4d9-49d1-ae99-79d2dc71c6d3" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:197 2015-08-07 17:46:16.163 DEBUG nova.network.base_api [req-1775fc0d-c79e-4ebe-9495-c9fcf5679951 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 3d063a71-d4d9-49d1-ae99-79d2dc71c6d3] Updating instance_info_cache with network_info: [VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.2'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:ba:d2:8b', 'active': False, 'type': u'bridge', 'id': u'fba2136d-b403-4996-9701-9a9aeef179dd', 'qbg_params': None})] update_instance_cache_with_nw_info /opt/stack/new/nova/nova/network/base_api.py:43 2015-08-07 17:46:16.187 DEBUG oslo_concurrency.lockutils [req-1775fc0d-c79e-4ebe-9495-c9fcf5679951 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Releasing semaphore "refresh_cache-3d063a71-d4d9-49d1-ae99-79d2dc71c6d3" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:210 2015-08-07 17:46:16.459 DEBUG oslo_concurrency.lockutils [req-db4bf694-e571-4a3a-bd4c-0043a9735783 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "xenstore-5efe11cd-8475-4ce1-b0c5-86c4d930d74f" acquired by "store_meta" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:46:16.462 DEBUG oslo_concurrency.lockutils [req-db4bf694-e571-4a3a-bd4c-0043a9735783 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "xenstore-5efe11cd-8475-4ce1-b0c5-86c4d930d74f" released by "store_meta" :: held 0.003s inner 
/usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:46:16.462 DEBUG oslo_concurrency.lockutils [req-db4bf694-e571-4a3a-bd4c-0043a9735783 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "xenstore-5efe11cd-8475-4ce1-b0c5-86c4d930d74f" acquired by "store_auto_disk_config" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:46:16.469 DEBUG oslo_concurrency.lockutils [req-db4bf694-e571-4a3a-bd4c-0043a9735783 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "xenstore-5efe11cd-8475-4ce1-b0c5-86c4d930d74f" released by "store_auto_disk_config" :: held 0.007s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:46:16.470 DEBUG nova.virt.xenapi.vmops [req-db4bf694-e571-4a3a-bd4c-0043a9735783 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 5efe11cd-8475-4ce1-b0c5-86c4d930d74f] Injecting hostname (tempest.common.compute-instance-1195700289) into xenstore _inject_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1940 2015-08-07 17:46:16.471 DEBUG oslo_concurrency.lockutils [req-db4bf694-e571-4a3a-bd4c-0043a9735783 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "xenstore-5efe11cd-8475-4ce1-b0c5-86c4d930d74f" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:46:16.476 DEBUG oslo_concurrency.lockutils [req-db4bf694-e571-4a3a-bd4c-0043a9735783 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "xenstore-5efe11cd-8475-4ce1-b0c5-86c4d930d74f" released by "update_hostname" :: held 0.005s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:46:16.476 DEBUG nova.virt.xenapi.vmops [req-db4bf694-e571-4a3a-bd4c-0043a9735783 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 5efe11cd-8475-4ce1-b0c5-86c4d930d74f] Injecting network info to xenstore inject_network_info /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1869 2015-08-07 17:46:16.477 DEBUG oslo_concurrency.lockutils [req-db4bf694-e571-4a3a-bd4c-0043a9735783 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "xenstore-5efe11cd-8475-4ce1-b0c5-86c4d930d74f" acquired by "update_nwinfo" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:46:16.513 DEBUG nova.virt.xenapi.vmops [req-1775fc0d-c79e-4ebe-9495-c9fcf5679951 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 3d063a71-d4d9-49d1-ae99-79d2dc71c6d3] Updating progress to 0 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:46:16.610 DEBUG oslo_concurrency.lockutils [req-db4bf694-e571-4a3a-bd4c-0043a9735783 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "xenstore-5efe11cd-8475-4ce1-b0c5-86c4d930d74f" released by "update_nwinfo" :: held 0.133s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:46:16.611 DEBUG nova.virt.xenapi.vmops [req-db4bf694-e571-4a3a-bd4c-0043a9735783 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 5efe11cd-8475-4ce1-b0c5-86c4d930d74f] Updating progress to 60 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:46:16.682 DEBUG nova.virt.xenapi.vm_utils 
[req-1775fc0d-c79e-4ebe-9495-c9fcf5679951 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 3d063a71-d4d9-49d1-ae99-79d2dc71c6d3] Starting snapshot for VM _snapshot_attached_here_impl /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:714 2015-08-07 17:46:16.693 DEBUG oslo_concurrency.lockutils [req-1775fc0d-c79e-4ebe-9495-c9fcf5679951 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" acquired by "do_scan" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:46:16.694 DEBUG nova.virt.xenapi.vm_utils [req-1775fc0d-c79e-4ebe-9495-c9fcf5679951 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Scanning SR OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99 do_scan /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1839 2015-08-07 17:46:16.794 DEBUG nova.virt.xenapi.vmops [req-db4bf694-e571-4a3a-bd4c-0043a9735783 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 5efe11cd-8475-4ce1-b0c5-86c4d930d74f] Creating vifs _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1892 2015-08-07 17:46:16.798 DEBUG nova.virt.xenapi.vmops [req-db4bf694-e571-4a3a-bd4c-0043a9735783 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 5efe11cd-8475-4ce1-b0c5-86c4d930d74f] Creating VIF for network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1902 2015-08-07 17:46:16.804 DEBUG nova.virt.xenapi.vmops [req-db4bf694-e571-4a3a-bd4c-0043a9735783 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 5efe11cd-8475-4ce1-b0c5-86c4d930d74f] Created VIF OpaqueRef:9fad1785-6d01-882b-bc06-616fd228e375, network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1906 2015-08-07 17:46:16.805 DEBUG nova.virt.xenapi.vmops [req-db4bf694-e571-4a3a-bd4c-0043a9735783 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 5efe11cd-8475-4ce1-b0c5-86c4d930d74f] Updating progress to 70 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:46:17.010 DEBUG nova.virt.xenapi.vmops [req-db4bf694-e571-4a3a-bd4c-0043a9735783 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 5efe11cd-8475-4ce1-b0c5-86c4d930d74f] Starting instance _start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:327 2015-08-07 17:46:17.162 DEBUG oslo_concurrency.lockutils [req-1775fc0d-c79e-4ebe-9495-c9fcf5679951 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" released by "do_scan" :: held 0.469s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:46:17.168 DEBUG nova.virt.xenapi.vm_utils [req-1775fc0d-c79e-4ebe-9495-c9fcf5679951 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] VHD 0b7e67ef-6bcc-475b-aade-ac5ee0d57f7c has parent 4027f457-a9bb-499a-8844-79fc67f11377 _get_vhd_parent_uuid /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2006 2015-08-07 17:46:17.183 DEBUG nova.virt.xenapi.vm_utils [req-1775fc0d-c79e-4ebe-9495-c9fcf5679951 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] VHD e6c9cc27-9f53-4eb3-b93e-15c0626b66bb has parent 
4027f457-a9bb-499a-8844-79fc67f11377 _get_vhd_parent_uuid /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2006 2015-08-07 17:46:17.189 DEBUG nova.virt.xenapi.vm_utils [req-1775fc0d-c79e-4ebe-9495-c9fcf5679951 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] VHD 0b7e67ef-6bcc-475b-aade-ac5ee0d57f7c has parent 4027f457-a9bb-499a-8844-79fc67f11377 _get_vhd_parent_uuid /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2006 2015-08-07 17:46:17.195 DEBUG nova.virt.xenapi.vm_utils [req-1775fc0d-c79e-4ebe-9495-c9fcf5679951 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] VHD 13edb23e-e06e-4bba-9ec5-747d004b90e0 has parent 4027f457-a9bb-499a-8844-79fc67f11377 _get_vhd_parent_uuid /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2006 2015-08-07 17:46:17.209 DEBUG nova.virt.xenapi.vm_utils [req-1775fc0d-c79e-4ebe-9495-c9fcf5679951 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] VHD 8bbe08a0-c32e-40eb-a34d-97ec9902b7af has parent 4027f457-a9bb-499a-8844-79fc67f11377 _get_vhd_parent_uuid /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2006 2015-08-07 17:46:17.215 DEBUG nova.virt.xenapi.vm_utils [req-1775fc0d-c79e-4ebe-9495-c9fcf5679951 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] VHD 45f0444b-41c6-401c-93d6-0e9b8dcc5f76 has parent 4027f457-a9bb-499a-8844-79fc67f11377 _get_vhd_parent_uuid /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2006 2015-08-07 17:46:17.224 DEBUG nova.virt.xenapi.vm_utils [req-1775fc0d-c79e-4ebe-9495-c9fcf5679951 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] VHD 60495381-4db4-4b06-83c7-7b5f778290c3 has parent 4027f457-a9bb-499a-8844-79fc67f11377 _get_vhd_parent_uuid /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2006 2015-08-07 17:46:17.231 DEBUG nova.virt.xenapi.vm_utils [req-1775fc0d-c79e-4ebe-9495-c9fcf5679951 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] VHD f6006734-242b-4374-b2ac-eac20be70bf3 has parent 4027f457-a9bb-499a-8844-79fc67f11377 _get_vhd_parent_uuid /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2006 2015-08-07 17:46:17.240 DEBUG nova.virt.xenapi.vm_utils [req-1775fc0d-c79e-4ebe-9495-c9fcf5679951 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] VHD adc68f5c-f9ed-4dfe-862f-8738fb519f84 has parent 4027f457-a9bb-499a-8844-79fc67f11377 _get_vhd_parent_uuid /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2006 2015-08-07 17:46:17.248 DEBUG nova.virt.xenapi.vm_utils [req-1775fc0d-c79e-4ebe-9495-c9fcf5679951 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 3d063a71-d4d9-49d1-ae99-79d2dc71c6d3] No snapshots to remove. 
_delete_snapshots_in_vdi_chain /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:677 2015-08-07 17:46:17.656 DEBUG keystoneclient.session [req-70a58d5e-f11a-442d-837a-6055a34fcc4e tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] RESP: [200] content-length: 449 x-compute-request-id: req-7ce1be63-5430-4799-b6a6-2787d1ad7c11 connection: keep-alive date: Fri, 07 Aug 2015 17:46:17 GMT content-type: application/json x-openstack-request-id: req-7ce1be63-5430-4799-b6a6-2787d1ad7c11 RESP BODY: {"connection_info": {"driver_volume_type": "iscsi", "data": {"auth_password": "QKTEi7pqc5BYbohh", "target_discovered": false, "encrypted": false, "qos_specs": null, "target_iqn": "iqn.2010-10.org.openstack:volume-a01057b3-070e-4f81-8c61-20ad9d422110", "target_portal": "104.130.119.114:3260", "volume_id": "a01057b3-070e-4f81-8c61-20ad9d422110", "target_lun": 1, "access_mode": "rw", "auth_username": "yN74JEhUn4pHQvohpsKJ", "auth_method": "CHAP"}}} _http_log_response /usr/local/lib/python2.7/dist-packages/keystoneclient/session.py:224 2015-08-07 17:46:17.659 DEBUG nova.virt.xenapi.volume_utils [req-70a58d5e-f11a-442d-837a-6055a34fcc4e tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] (vol_id,host,port,iqn): (a01057b3-070e-4f81-8c61-20ad9d422110,104.130.119.114,3260,iqn.2010-10.org.openstack:volume-a01057b3-070e-4f81-8c61-20ad9d422110) _parse_volume_info /opt/stack/new/nova/nova/virt/xenapi/volume_utils.py:80 2015-08-07 17:46:17.664 DEBUG nova.virt.xenapi.volume_utils [req-70a58d5e-f11a-442d-837a-6055a34fcc4e tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Introducing SR tempSR-a01057b3-070e-4f81-8c61-20ad9d422110 introduce_sr /opt/stack/new/nova/nova/virt/xenapi/volume_utils.py:119 2015-08-07 17:46:17.671 DEBUG nova.virt.xenapi.volume_utils [req-70a58d5e-f11a-442d-837a-6055a34fcc4e tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Creating PBD for SR introduce_sr /opt/stack/new/nova/nova/virt/xenapi/volume_utils.py:126 2015-08-07 17:46:17.686 DEBUG nova.virt.xenapi.volume_utils [req-70a58d5e-f11a-442d-837a-6055a34fcc4e tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Plugging SR introduce_sr /opt/stack/new/nova/nova/virt/xenapi/volume_utils.py:129 2015-08-07 17:46:18.495 DEBUG nova.virt.xenapi.vm_utils [req-1775fc0d-c79e-4ebe-9495-c9fcf5679951 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 3d063a71-d4d9-49d1-ae99-79d2dc71c6d3] Parent has other children, coalesce is unlikely. 
_wait_for_vhd_coalesce /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2092 2015-08-07 17:46:18.501 DEBUG oslo_concurrency.lockutils [req-1775fc0d-c79e-4ebe-9495-c9fcf5679951 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" acquired by "do_scan" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:46:18.502 DEBUG nova.virt.xenapi.vm_utils [req-1775fc0d-c79e-4ebe-9495-c9fcf5679951 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Scanning SR OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99 do_scan /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1839 2015-08-07 17:46:18.874 DEBUG oslo_concurrency.lockutils [req-1775fc0d-c79e-4ebe-9495-c9fcf5679951 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" released by "do_scan" :: held 0.373s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:46:18.879 DEBUG nova.virt.xenapi.vm_utils [req-1775fc0d-c79e-4ebe-9495-c9fcf5679951 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] VHD c5c1876b-559f-40de-aaeb-35b6f91ecbf3 has parent 16ba0512-dc90-407d-930e-391943efa733 _get_vhd_parent_uuid /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2006 2015-08-07 17:46:18.885 DEBUG nova.virt.xenapi.vm_utils [req-1775fc0d-c79e-4ebe-9495-c9fcf5679951 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] VHD 16ba0512-dc90-407d-930e-391943efa733 has parent 4027f457-a9bb-499a-8844-79fc67f11377 _get_vhd_parent_uuid /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2006 2015-08-07 17:46:18.893 DEBUG nova.virt.xenapi.vmops [req-1775fc0d-c79e-4ebe-9495-c9fcf5679951 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 3d063a71-d4d9-49d1-ae99-79d2dc71c6d3] Updating progress to 20 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:46:19.044 DEBUG nova.virt.xenapi.vm_utils [req-1775fc0d-c79e-4ebe-9495-c9fcf5679951 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 3d063a71-d4d9-49d1-ae99-79d2dc71c6d3] Migrating VHD '16ba0512-dc90-407d-930e-391943efa733' with seq_num 1 migrate_vhd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2559 2015-08-07 17:46:19.325 DEBUG nova.virt.xenapi.vm_utils [req-1775fc0d-c79e-4ebe-9495-c9fcf5679951 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 3d063a71-d4d9-49d1-ae99-79d2dc71c6d3] Migrating VHD '4027f457-a9bb-499a-8844-79fc67f11377' with seq_num 2 migrate_vhd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2559 2015-08-07 17:46:20.428 DEBUG nova.virt.xenapi.vmops [req-1775fc0d-c79e-4ebe-9495-c9fcf5679951 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 3d063a71-d4d9-49d1-ae99-79d2dc71c6d3] Migrated root base vhds transfer_immutable_vhds /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1036 2015-08-07 17:46:20.429 DEBUG nova.virt.xenapi.vmops [req-1775fc0d-c79e-4ebe-9495-c9fcf5679951 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 3d063a71-d4d9-49d1-ae99-79d2dc71c6d3] Updating progress to 40 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:46:20.613 DEBUG nova.virt.xenapi.vmops 
[req-1775fc0d-c79e-4ebe-9495-c9fcf5679951 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 3d063a71-d4d9-49d1-ae99-79d2dc71c6d3] Migrated all base vhds. _process_ephemeral_chain_recursive /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1051 2015-08-07 17:46:20.621 DEBUG nova.virt.xenapi.vm_utils [req-1775fc0d-c79e-4ebe-9495-c9fcf5679951 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 3d063a71-d4d9-49d1-ae99-79d2dc71c6d3] Shutting down VM (cleanly) clean_shutdown_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:333 2015-08-07 17:46:21.524 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:46:21.525 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Cleaning up deleted instances _run_pending_deletes /opt/stack/new/nova/nova/compute/manager.py:6276 2015-08-07 17:46:21.710 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] There are 13 instances to clean _run_pending_deletes /opt/stack/new/nova/nova/compute/manager.py:6285 2015-08-07 17:46:21.711 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: 1d45573b-0e05-4067-ab3e-56616334b7d9] Instance has had 0 of 5 cleanup attempts _run_pending_deletes /opt/stack/new/nova/nova/compute/manager.py:6293 2015-08-07 17:46:21.847 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: 127dae8a-26f4-4d23-862d-d23bbd64ca16] Instance has had 0 of 5 cleanup attempts _run_pending_deletes /opt/stack/new/nova/nova/compute/manager.py:6293 2015-08-07 17:46:21.994 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: 420208c3-bf05-4834-98b5-1c8afda79f97] Instance has had 0 of 5 cleanup attempts _run_pending_deletes /opt/stack/new/nova/nova/compute/manager.py:6293 2015-08-07 17:46:22.105 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: 6c2e3b38-f3d2-472d-a728-f41167dcf407] Instance has had 0 of 5 cleanup attempts _run_pending_deletes /opt/stack/new/nova/nova/compute/manager.py:6293 2015-08-07 17:46:22.266 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: 115845a8-7055-44f5-a3f5-222bd027f1ed] Instance has had 0 of 5 cleanup attempts _run_pending_deletes /opt/stack/new/nova/nova/compute/manager.py:6293 2015-08-07 17:46:22.591 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: 4443fc1d-35f7-46c6-8139-e22c74dd9d86] Instance has had 0 of 5 cleanup attempts _run_pending_deletes /opt/stack/new/nova/nova/compute/manager.py:6293 2015-08-07 17:46:22.601 DEBUG nova.virt.xenapi.vmops [req-db4bf694-e571-4a3a-bd4c-0043a9735783 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 5efe11cd-8475-4ce1-b0c5-86c4d930d74f] Waiting for instance state to become running _wait_for_instance_to_start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:777 2015-08-07 17:46:22.616 DEBUG nova.virt.xenapi.vmops [req-db4bf694-e571-4a3a-bd4c-0043a9735783 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 5efe11cd-8475-4ce1-b0c5-86c4d930d74f] Updating progress to 80 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:46:22.765 DEBUG 
nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: 65774979-e5fe-4d9f-9f8c-48214aed768d] Instance has had 0 of 5 cleanup attempts _run_pending_deletes /opt/stack/new/nova/nova/compute/manager.py:6293 2015-08-07 17:46:22.782 DEBUG nova.virt.xenapi.vmops [req-db4bf694-e571-4a3a-bd4c-0043a9735783 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 5efe11cd-8475-4ce1-b0c5-86c4d930d74f] Skip agent setup, not enabled. _configure_new_instance_with_agent /opt/stack/new/nova/nova/virt/xenapi/vmops.py:788 2015-08-07 17:46:22.782 DEBUG nova.virt.xenapi.vmops [req-db4bf694-e571-4a3a-bd4c-0043a9735783 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 5efe11cd-8475-4ce1-b0c5-86c4d930d74f] Removing hostname from xenstore _remove_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1949 2015-08-07 17:46:22.783 DEBUG oslo_concurrency.lockutils [req-db4bf694-e571-4a3a-bd4c-0043a9735783 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "xenstore-5efe11cd-8475-4ce1-b0c5-86c4d930d74f" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:46:22.788 DEBUG oslo_concurrency.lockutils [req-db4bf694-e571-4a3a-bd4c-0043a9735783 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "xenstore-5efe11cd-8475-4ce1-b0c5-86c4d930d74f" released by "update_hostname" :: held 0.005s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:46:22.788 DEBUG nova.virt.xenapi.vmops [req-db4bf694-e571-4a3a-bd4c-0043a9735783 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 5efe11cd-8475-4ce1-b0c5-86c4d930d74f] Updating progress to 90 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:46:22.913 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: e7e452a5-f0b3-4f79-9f03-57be081501e5] Instance has had 0 of 5 cleanup attempts _run_pending_deletes /opt/stack/new/nova/nova/compute/manager.py:6293 2015-08-07 17:46:22.964 DEBUG nova.virt.xenapi.vmops [req-db4bf694-e571-4a3a-bd4c-0043a9735783 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 5efe11cd-8475-4ce1-b0c5-86c4d930d74f] Updating progress to 100 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:46:23.035 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: 7aa9beb6-cabe-4ae3-a172-162c757bd718] Instance has had 0 of 5 cleanup attempts _run_pending_deletes /opt/stack/new/nova/nova/compute/manager.py:6293 2015-08-07 17:46:23.145 DEBUG nova.compute.manager [req-db4bf694-e571-4a3a-bd4c-0043a9735783 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 5efe11cd-8475-4ce1-b0c5-86c4d930d74f] Checking state _get_power_state /opt/stack/new/nova/nova/compute/manager.py:1296 2015-08-07 17:46:23.189 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: 393149d8-bc0d-4f72-afbc-954d8344f5e5] Instance has had 0 of 5 cleanup attempts _run_pending_deletes /opt/stack/new/nova/nova/compute/manager.py:6293 2015-08-07 17:46:23.344 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: c1501aed-5580-4a88-bf3b-0761bd0b186e] Instance has had 0 of 5 cleanup attempts _run_pending_deletes 
/opt/stack/new/nova/nova/compute/manager.py:6293 2015-08-07 17:46:23.437 DEBUG oslo_concurrency.lockutils [req-db4bf694-e571-4a3a-bd4c-0043a9735783 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "5efe11cd-8475-4ce1-b0c5-86c4d930d74f" released by "_locked_do_build_and_run_instance" :: held 18.549s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:46:23.505 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: 31a2fd34-bbcb-4b50-83e0-dc6c7369b479] Instance has had 0 of 5 cleanup attempts _run_pending_deletes /opt/stack/new/nova/nova/compute/manager.py:6293 2015-08-07 17:46:23.652 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: aa0c0819-08a6-4a79-93eb-f58885697f5b] Instance has had 0 of 5 cleanup attempts _run_pending_deletes /opt/stack/new/nova/nova/compute/manager.py:6293 2015-08-07 17:46:23.787 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 22.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:46:24.579 DEBUG nova.virt.xenapi.vm_utils [req-1775fc0d-c79e-4ebe-9495-c9fcf5679951 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 3d063a71-d4d9-49d1-ae99-79d2dc71c6d3] Migrating VHD '0b7e67ef-6bcc-475b-aade-ac5ee0d57f7c' with seq_num 0 migrate_vhd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2559 2015-08-07 17:46:25.067 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.95 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:46:25.270 DEBUG nova.virt.xenapi.vmops [req-1775fc0d-c79e-4ebe-9495-c9fcf5679951 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 3d063a71-d4d9-49d1-ae99-79d2dc71c6d3] Updating progress to 60 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:46:25.438 DEBUG nova.virt.xenapi.vmops [req-1775fc0d-c79e-4ebe-9495-c9fcf5679951 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 3d063a71-d4d9-49d1-ae99-79d2dc71c6d3] Updating progress to 80 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:46:26.118 DEBUG oslo_concurrency.lockutils [req-ed4546b6-5167-4346-8e41-423b889756c7 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "5efe11cd-8475-4ce1-b0c5-86c4d930d74f" acquired by "do_terminate_instance" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:46:26.119 DEBUG oslo_concurrency.lockutils [req-ed4546b6-5167-4346-8e41-423b889756c7 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "5efe11cd-8475-4ce1-b0c5-86c4d930d74f-events" acquired by "_clear_events" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:46:26.119 DEBUG oslo_concurrency.lockutils [req-ed4546b6-5167-4346-8e41-423b889756c7 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "5efe11cd-8475-4ce1-b0c5-86c4d930d74f-events" released by "_clear_events" :: held 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:46:26.122 INFO nova.compute.manager [req-ed4546b6-5167-4346-8e41-423b889756c7 
tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 5efe11cd-8475-4ce1-b0c5-86c4d930d74f] Terminating instance 2015-08-07 17:46:26.124 INFO nova.virt.xenapi.vmops [req-ed4546b6-5167-4346-8e41-423b889756c7 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 5efe11cd-8475-4ce1-b0c5-86c4d930d74f] Destroying VM 2015-08-07 17:46:26.131 DEBUG nova.virt.xenapi.vm_utils [req-ed4546b6-5167-4346-8e41-423b889756c7 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 5efe11cd-8475-4ce1-b0c5-86c4d930d74f] Shutting down VM (hard) hard_shutdown_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:348 2015-08-07 17:46:26.179 DEBUG oslo_concurrency.lockutils [req-1775fc0d-c79e-4ebe-9495-c9fcf5679951 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "3d063a71-d4d9-49d1-ae99-79d2dc71c6d3-events" acquired by "_clear_events" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:46:26.179 DEBUG oslo_concurrency.lockutils [req-1775fc0d-c79e-4ebe-9495-c9fcf5679951 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "3d063a71-d4d9-49d1-ae99-79d2dc71c6d3-events" released by "_clear_events" :: held 0.001s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:46:27.630 DEBUG oslo_concurrency.lockutils [req-1775fc0d-c79e-4ebe-9495-c9fcf5679951 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Acquired semaphore "refresh_cache-3d063a71-d4d9-49d1-ae99-79d2dc71c6d3" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:197 2015-08-07 17:46:27.753 DEBUG nova.network.base_api [req-1775fc0d-c79e-4ebe-9495-c9fcf5679951 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 3d063a71-d4d9-49d1-ae99-79d2dc71c6d3] Updating instance_info_cache with network_info: [VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.2'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:ba:d2:8b', 'active': False, 'type': u'bridge', 'id': u'fba2136d-b403-4996-9701-9a9aeef179dd', 'qbg_params': None})] update_instance_cache_with_nw_info /opt/stack/new/nova/nova/network/base_api.py:43 2015-08-07 17:46:27.779 DEBUG oslo_concurrency.lockutils [req-1775fc0d-c79e-4ebe-9495-c9fcf5679951 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Releasing semaphore "refresh_cache-3d063a71-d4d9-49d1-ae99-79d2dc71c6d3" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:210 2015-08-07 17:46:27.824 
DEBUG nova.virt.xenapi.vmops [req-ed4546b6-5167-4346-8e41-423b889756c7 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 5efe11cd-8475-4ce1-b0c5-86c4d930d74f] Destroying VDIs _destroy_vdis /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1428 2015-08-07 17:46:27.837 DEBUG nova.virt.xenapi.vm_utils [req-ed4546b6-5167-4346-8e41-423b889756c7 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] VDI e6c9cc27-9f53-4eb3-b93e-15c0626b66bb is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:46:27.844 DEBUG nova.virt.xenapi.vm_utils [req-ed4546b6-5167-4346-8e41-423b889756c7 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] VDI 9009681e-32db-49bf-9716-e0c1b839902b is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:46:27.976 DEBUG nova.virt.xenapi.vm_utils [req-1775fc0d-c79e-4ebe-9495-c9fcf5679951 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Detected vhd format for image None determine_disk_image_type /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1633 2015-08-07 17:46:27.976 DEBUG nova.virt.xenapi.vmops [req-1775fc0d-c79e-4ebe-9495-c9fcf5679951 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 3d063a71-d4d9-49d1-ae99-79d2dc71c6d3] Block device information present: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} create_disks_step /opt/stack/new/nova/nova/virt/xenapi/vmops.py:278 2015-08-07 17:46:28.445 DEBUG nova.virt.xenapi.vmops [req-ed4546b6-5167-4346-8e41-423b889756c7 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 5efe11cd-8475-4ce1-b0c5-86c4d930d74f] Using RAW or VHD, skipping kernel and ramdisk deletion _destroy_kernel_ramdisk /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1456 2015-08-07 17:46:28.457 DEBUG nova.virt.xenapi.vm_utils [req-ed4546b6-5167-4346-8e41-423b889756c7 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 5efe11cd-8475-4ce1-b0c5-86c4d930d74f] VM destroyed destroy_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:324 2015-08-07 17:46:28.458 DEBUG nova.compute.manager [req-ed4546b6-5167-4346-8e41-423b889756c7 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 5efe11cd-8475-4ce1-b0c5-86c4d930d74f] Deallocating network for instance _deallocate_network /opt/stack/new/nova/nova/compute/manager.py:1792 2015-08-07 17:46:28.481 DEBUG oslo_concurrency.lockutils [req-1775fc0d-c79e-4ebe-9495-c9fcf5679951 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" acquired by "do_scan" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:46:28.481 DEBUG nova.virt.xenapi.vm_utils [req-1775fc0d-c79e-4ebe-9495-c9fcf5679951 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Scanning SR OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99 do_scan /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1839 2015-08-07 17:46:28.845 DEBUG oslo_concurrency.lockutils [req-1775fc0d-c79e-4ebe-9495-c9fcf5679951 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" released by "do_scan" :: held 0.364s inner 
/usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:46:29.332 DEBUG nova.virt.xenapi.vmops [req-1775fc0d-c79e-4ebe-9495-c9fcf5679951 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 3d063a71-d4d9-49d1-ae99-79d2dc71c6d3] Using PV kernel: True _create_vm_record /opt/stack/new/nova/nova/virt/xenapi/vmops.py:679 2015-08-07 17:46:29.341 DEBUG nova.virt.xenapi.vm_utils [req-1775fc0d-c79e-4ebe-9495-c9fcf5679951 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 3d063a71-d4d9-49d1-ae99-79d2dc71c6d3] Created VM create_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:312 2015-08-07 17:46:29.342 DEBUG nova.virt.xenapi.vm_utils [req-1775fc0d-c79e-4ebe-9495-c9fcf5679951 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Creating disk-type VBD for VM OpaqueRef:f3baa5fc-ce32-fc8d-980e-20dd67927a4e, VDI OpaqueRef:d8a1ffca-fe0e-4fdd-c1f8-8e83874cf5d0 ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:46:29.348 DEBUG nova.virt.xenapi.vm_utils [req-1775fc0d-c79e-4ebe-9495-c9fcf5679951 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Created VBD OpaqueRef:37848727-31c6-e15f-49e7-e40a3ee86430 for VM OpaqueRef:f3baa5fc-ce32-fc8d-980e-20dd67927a4e, VDI OpaqueRef:d8a1ffca-fe0e-4fdd-c1f8-8e83874cf5d0. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:46:29.606 DEBUG nova.virt.xenapi.vm_utils [req-1775fc0d-c79e-4ebe-9495-c9fcf5679951 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Created VDI OpaqueRef:9a6c6524-7105-3a5f-d616-cb74a5fc4f54 (config-2, 67108864, False) on OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99. create_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:509 2015-08-07 17:46:29.609 DEBUG nova.virt.xenapi.vm_utils [req-1775fc0d-c79e-4ebe-9495-c9fcf5679951 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Creating disk-type VBD for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:9a6c6524-7105-3a5f-d616-cb74a5fc4f54 ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:46:29.620 DEBUG nova.virt.xenapi.vm_utils [req-1775fc0d-c79e-4ebe-9495-c9fcf5679951 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Created VBD OpaqueRef:dfc8148a-167d-cf11-25bb-6666071f1e11 for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:9a6c6524-7105-3a5f-d616-cb74a5fc4f54. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:46:29.621 DEBUG nova.virt.xenapi.vm_utils [req-1775fc0d-c79e-4ebe-9495-c9fcf5679951 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Plugging VBD OpaqueRef:dfc8148a-167d-cf11-25bb-6666071f1e11 ... 
vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2191 2015-08-07 17:46:29.621 DEBUG oslo_concurrency.lockutils [req-1775fc0d-c79e-4ebe-9495-c9fcf5679951 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_plug" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:46:29.676 DEBUG nova.compute.manager [req-ed4546b6-5167-4346-8e41-423b889756c7 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 5efe11cd-8475-4ce1-b0c5-86c4d930d74f] terminating bdm BlockDeviceMapping(boot_index=0,connection_info=None,created_at=2015-08-07T17:46:04Z,delete_on_termination=True,deleted=False,deleted_at=None,destination_type='local',device_name='/dev/xvda',device_type='disk',disk_bus=None,guest_format=None,id=70,image_id='a69d8d55-1745-492f-be26-cc75d64fc94d',instance=,instance_uuid=5efe11cd-8475-4ce1-b0c5-86c4d930d74f,no_device=False,snapshot_id=None,source_type='image',updated_at=2015-08-07T17:46:05Z,volume_id=None,volume_size=None) _cleanup_volumes /opt/stack/new/nova/nova/compute/manager.py:2272 2015-08-07 17:46:29.811 DEBUG oslo_concurrency.lockutils [req-ed4546b6-5167-4346-8e41-423b889756c7 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "compute_resources" acquired by "update_usage" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:46:29.813 DEBUG nova.objects.instance [req-ed4546b6-5167-4346-8e41-423b889756c7 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lazy-loading `numa_topology' on Instance uuid 5efe11cd-8475-4ce1-b0c5-86c4d930d74f obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:46:29.874 DEBUG oslo_concurrency.lockutils [req-ed4546b6-5167-4346-8e41-423b889756c7 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "compute_resources" released by "update_usage" :: held 0.063s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:46:30.051 DEBUG oslo_concurrency.lockutils [req-ed4546b6-5167-4346-8e41-423b889756c7 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "5efe11cd-8475-4ce1-b0c5-86c4d930d74f" released by "do_terminate_instance" :: held 3.933s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:46:30.576 DEBUG oslo_concurrency.lockutils [req-1775fc0d-c79e-4ebe-9495-c9fcf5679951 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_plug" :: held 0.955s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:46:30.577 DEBUG nova.virt.xenapi.vm_utils [req-1775fc0d-c79e-4ebe-9495-c9fcf5679951 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Plugging VBD OpaqueRef:dfc8148a-167d-cf11-25bb-6666071f1e11 done. 
vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2194 2015-08-07 17:46:30.578 DEBUG nova.virt.xenapi.vm_utils [req-1775fc0d-c79e-4ebe-9495-c9fcf5679951 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] VBD OpaqueRef:dfc8148a-167d-cf11-25bb-6666071f1e11 plugged as xvdc vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2197 2015-08-07 17:46:30.639 WARNING nova.virt.configdrive [req-1775fc0d-c79e-4ebe-9495-c9fcf5679951 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] The setting "always" will be deprecated in the Liberty version. Please use "True" instead 2015-08-07 17:46:30.639 DEBUG nova.objects.instance [req-1775fc0d-c79e-4ebe-9495-c9fcf5679951 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lazy-loading `ec2_ids' on Instance uuid 3d063a71-d4d9-49d1-ae99-79d2dc71c6d3 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:46:30.660 DEBUG oslo_concurrency.processutils [req-1775fc0d-c79e-4ebe-9495-c9fcf5679951 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Running cmd (subprocess): genisoimage -o /tmp/tmpPbyY7j/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmpXDINau execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:46:30.746 DEBUG oslo_concurrency.processutils [req-1775fc0d-c79e-4ebe-9495-c9fcf5679951 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] CMD "genisoimage -o /tmp/tmpPbyY7j/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmpXDINau" returned: 0 in 0.085s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:46:30.753 DEBUG oslo_concurrency.processutils [req-1775fc0d-c79e-4ebe-9495-c9fcf5679951 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmpPbyY7j/configdrive of=/dev/xvdc oflag=direct,sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:46:31.263 DEBUG oslo_concurrency.lockutils [req-efd0a2bd-ab3d-47db-bef6-f5e6e93eceea tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "535c4da4-7eef-4c5e-b79f-4647f6857432" acquired by "_locked_do_build_and_run_instance" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:46:31.407 INFO nova.compute.manager [req-efd0a2bd-ab3d-47db-bef6-f5e6e93eceea tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 535c4da4-7eef-4c5e-b79f-4647f6857432] Starting instance... 
2015-08-07 17:46:31.645 DEBUG oslo_concurrency.lockutils [req-efd0a2bd-ab3d-47db-bef6-f5e6e93eceea tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "compute_resources" acquired by "instance_claim" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:46:31.646 DEBUG nova.compute.resource_tracker [req-efd0a2bd-ab3d-47db-bef6-f5e6e93eceea tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Memory overhead for 64 MB instance; 5 MB instance_claim /opt/stack/new/nova/nova/compute/resource_tracker.py:126 2015-08-07 17:46:31.652 INFO nova.compute.claims [req-efd0a2bd-ab3d-47db-bef6-f5e6e93eceea tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 535c4da4-7eef-4c5e-b79f-4647f6857432] Attempting claim: memory 69 MB, disk 0 GB 2015-08-07 17:46:31.653 INFO nova.compute.claims [req-efd0a2bd-ab3d-47db-bef6-f5e6e93eceea tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 535c4da4-7eef-4c5e-b79f-4647f6857432] Total memory: 8187 MB, used: 991.00 MB 2015-08-07 17:46:31.653 INFO nova.compute.claims [req-efd0a2bd-ab3d-47db-bef6-f5e6e93eceea tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 535c4da4-7eef-4c5e-b79f-4647f6857432] memory limit: 12280.50 MB, free: 11289.50 MB 2015-08-07 17:46:31.653 INFO nova.compute.claims [req-efd0a2bd-ab3d-47db-bef6-f5e6e93eceea tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 535c4da4-7eef-4c5e-b79f-4647f6857432] Total disk: 27 GB, used: 0.00 GB 2015-08-07 17:46:31.653 INFO nova.compute.claims [req-efd0a2bd-ab3d-47db-bef6-f5e6e93eceea tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 535c4da4-7eef-4c5e-b79f-4647f6857432] disk limit not specified, defaulting to unlimited 2015-08-07 17:46:31.675 DEBUG nova.compute.resources.vcpu [req-efd0a2bd-ab3d-47db-bef6-f5e6e93eceea tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Total CPUs: 8 VCPUs, used: 6.00 VCPUs test /opt/stack/new/nova/nova/compute/resources/vcpu.py:52 2015-08-07 17:46:31.676 DEBUG nova.compute.resources.vcpu [req-efd0a2bd-ab3d-47db-bef6-f5e6e93eceea tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] CPUs limit not specified, defaulting to unlimited test /opt/stack/new/nova/nova/compute/resources/vcpu.py:56 2015-08-07 17:46:31.676 INFO nova.compute.claims [req-efd0a2bd-ab3d-47db-bef6-f5e6e93eceea tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 535c4da4-7eef-4c5e-b79f-4647f6857432] Claim successful 2015-08-07 17:46:32.024 DEBUG oslo_concurrency.lockutils [req-efd0a2bd-ab3d-47db-bef6-f5e6e93eceea tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "compute_resources" released by "instance_claim" :: held 0.379s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:46:32.353 DEBUG oslo_concurrency.lockutils [req-efd0a2bd-ab3d-47db-bef6-f5e6e93eceea tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "compute_resources" acquired by "update_usage" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:46:32.516 DEBUG oslo_concurrency.lockutils [req-efd0a2bd-ab3d-47db-bef6-f5e6e93eceea tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "compute_resources" released by "update_usage" :: held 0.163s inner 
/usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:46:32.518 DEBUG nova.compute.utils [req-efd0a2bd-ab3d-47db-bef6-f5e6e93eceea tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Using /dev/xvd instead of None get_next_device_name /opt/stack/new/nova/nova/compute/utils.py:163 2015-08-07 17:46:32.522 13318 DEBUG nova.compute.manager [-] [instance: 535c4da4-7eef-4c5e-b79f-4647f6857432] Allocating IP information in the background. _allocate_network_async /opt/stack/new/nova/nova/compute/manager.py:1513 2015-08-07 17:46:32.524 13318 DEBUG oslo_concurrency.lockutils [-] Acquired semaphore "refresh_cache-535c4da4-7eef-4c5e-b79f-4647f6857432" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:197 2015-08-07 17:46:33.807 DEBUG nova.virt.xenapi.vmops [req-efd0a2bd-ab3d-47db-bef6-f5e6e93eceea tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 535c4da4-7eef-4c5e-b79f-4647f6857432] Block device information present: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} spawn /opt/stack/new/nova/nova/virt/xenapi/vmops.py:408 2015-08-07 17:46:33.824 DEBUG nova.virt.xenapi.vm_utils [req-efd0a2bd-ab3d-47db-bef6-f5e6e93eceea tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Detected vhd format for image a69d8d55-1745-492f-be26-cc75d64fc94d determine_disk_image_type /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1633 2015-08-07 17:46:33.824 DEBUG nova.virt.xenapi.vmops [req-efd0a2bd-ab3d-47db-bef6-f5e6e93eceea tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 535c4da4-7eef-4c5e-b79f-4647f6857432] Updating progress to 10 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:46:34.021 DEBUG nova.virt.xenapi.vmops [req-efd0a2bd-ab3d-47db-bef6-f5e6e93eceea tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 535c4da4-7eef-4c5e-b79f-4647f6857432] block device info: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} _connect_cinder_volumes /opt/stack/new/nova/nova/virt/xenapi/vmops.py:376 2015-08-07 17:46:34.028 DEBUG oslo_concurrency.lockutils [req-efd0a2bd-ab3d-47db-bef6-f5e6e93eceea tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" acquired by "_create_cached_image_impl" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:46:34.096 DEBUG oslo_concurrency.processutils [req-1775fc0d-c79e-4ebe-9495-c9fcf5679951 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmpPbyY7j/configdrive of=/dev/xvdc oflag=direct,sync" returned: 0 in 3.344s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:46:34.098 DEBUG oslo_concurrency.processutils [req-1775fc0d-c79e-4ebe-9495-c9fcf5679951 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:46:34.429 DEBUG oslo_concurrency.processutils [req-1775fc0d-c79e-4ebe-9495-c9fcf5679951 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] CMD "sudo nova-rootwrap 
/etc/nova/rootwrap.conf sync" returned: 0 in 0.331s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:46:34.433 DEBUG nova.virt.xenapi.vm_utils [req-1775fc0d-c79e-4ebe-9495-c9fcf5679951 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Destroying VBD for VDI OpaqueRef:9a6c6524-7105-3a5f-d616-cb74a5fc4f54 ... vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2207 2015-08-07 17:46:34.435 DEBUG oslo_concurrency.lockutils [req-1775fc0d-c79e-4ebe-9495-c9fcf5679951 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_unplug" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:46:34.651 DEBUG nova.virt.xenapi.volumeops [req-70a58d5e-f11a-442d-837a-6055a34fcc4e tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Connect volume to hypervisor: {u'access_mode': u'rw', u'target_discovered': False, u'encrypted': False, u'qos_specs': None, u'target_iqn': u'iqn.2010-10.org.openstack:volume-a01057b3-070e-4f81-8c61-20ad9d422110', u'target_portal': u'104.130.119.114:3260', u'volume_id': u'a01057b3-070e-4f81-8c61-20ad9d422110', u'target_lun': 1, u'auth_password': u'QKTEi7pqc5BYbohh', u'auth_username': u'yN74JEhUn4pHQvohpsKJ', u'auth_method': u'CHAP'} _connect_hypervisor_to_volume /opt/stack/new/nova/nova/virt/xenapi/volumeops.py:94 2015-08-07 17:46:34.667 DEBUG nova.virt.xenapi.volume_utils [req-70a58d5e-f11a-442d-837a-6055a34fcc4e tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] {'sm_config': {'LUNid': '1', 'SCSIid': '33000000100000001'}, 'managed': False, 'snapshots': [], 'allowed_operations': ['forget', 'destroy', 'copy', 'snapshot'], 'on_boot': 'persist', 'name_description': '', 'read_only': False, 'uuid': 'e55110f1-56b1-d008-537d-c3081511f123', 'storage_lock': False, 'name_label': '', 'tags': [], 'location': 'e55110f1-56b1-d008-537d-c3081511f123', 'metadata_of_pool': 'OpaqueRef:NULL', 'type': 'user', 'sharable': False, 'snapshot_time': , 'parent': 'OpaqueRef:NULL', 'missing': False, 'xenstore_data': {}, 'crash_dumps': [], 'virtual_size': '1073741824', 'is_a_snapshot': False, 'current_operations': {}, 'snapshot_of': 'OpaqueRef:NULL', 'SR': 'OpaqueRef:74e79c37-53e3-4e3a-c569-69c76b9a76d5', 'other_config': {}, 'physical_utilisation': '0', 'allow_caching': False, 'metadata_latest': False, 'VBDs': []} introduce_vdi /opt/stack/new/nova/nova/virt/xenapi/volume_utils.py:176 2015-08-07 17:46:35.004 DEBUG nova.virt.xenapi.vm_utils [req-efd0a2bd-ab3d-47db-bef6-f5e6e93eceea tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Cloned VDI OpaqueRef:5697b965-ad67-029e-7cc0-bdad94b3812f from VDI OpaqueRef:f392c64e-7886-a0cc-d3a0-780674167307 _clone_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:571 2015-08-07 17:46:35.064 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.95 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:46:35.084 13318 DEBUG nova.network.base_api [-] [instance: 535c4da4-7eef-4c5e-b79f-4647f6857432] Updating instance_info_cache with network_info: [VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 
'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.5'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:64:a6:a9', 'active': False, 'type': u'bridge', 'id': u'7a44855e-3ada-4fa6-b855-a716c735fa19', 'qbg_params': None})] update_instance_cache_with_nw_info /opt/stack/new/nova/nova/network/base_api.py:43 2015-08-07 17:46:35.112 13318 DEBUG oslo_concurrency.lockutils [-] Releasing semaphore "refresh_cache-535c4da4-7eef-4c5e-b79f-4647f6857432" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:210 2015-08-07 17:46:35.113 13318 DEBUG nova.compute.manager [-] [instance: 535c4da4-7eef-4c5e-b79f-4647f6857432] Instance network_info: |[VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.5'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:64:a6:a9', 'active': False, 'type': u'bridge', 'id': u'7a44855e-3ada-4fa6-b855-a716c735fa19', 'qbg_params': None})]| _allocate_network_async /opt/stack/new/nova/nova/compute/manager.py:1531 2015-08-07 17:46:35.227 DEBUG oslo_concurrency.lockutils [req-1775fc0d-c79e-4ebe-9495-c9fcf5679951 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_unplug" :: held 0.792s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:46:35.237 DEBUG nova.virt.xenapi.vm_utils [req-1775fc0d-c79e-4ebe-9495-c9fcf5679951 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Destroying VBD for VDI OpaqueRef:9a6c6524-7105-3a5f-d616-cb74a5fc4f54 done. 
vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2215 2015-08-07 17:46:35.238 DEBUG nova.virt.xenapi.vm_utils [req-1775fc0d-c79e-4ebe-9495-c9fcf5679951 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Creating disk-type VBD for VM OpaqueRef:f3baa5fc-ce32-fc8d-980e-20dd67927a4e, VDI OpaqueRef:9a6c6524-7105-3a5f-d616-cb74a5fc4f54 ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:46:35.239 INFO nova.virt.xenapi.volumeops [req-70a58d5e-f11a-442d-837a-6055a34fcc4e tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Connected volume (vdi_uuid): e55110f1-56b1-d008-537d-c3081511f123 2015-08-07 17:46:35.240 DEBUG nova.virt.xenapi.volumeops [req-70a58d5e-f11a-442d-837a-6055a34fcc4e tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Attach_volume vdi: OpaqueRef:c4dc2d8e-c605-bc2b-f037-5a47d70a7420 vm: OpaqueRef:09d9be4d-5d0c-9bac-a66d-c351fd83e376 _attach_volume_to_vm /opt/stack/new/nova/nova/virt/xenapi/volumeops.py:114 2015-08-07 17:46:35.240 DEBUG nova.virt.xenapi.vm_utils [req-70a58d5e-f11a-442d-837a-6055a34fcc4e tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Creating disk-type VBD for VM OpaqueRef:09d9be4d-5d0c-9bac-a66d-c351fd83e376, VDI OpaqueRef:c4dc2d8e-c605-bc2b-f037-5a47d70a7420 ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:46:35.247 DEBUG nova.virt.xenapi.vm_utils [req-1775fc0d-c79e-4ebe-9495-c9fcf5679951 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Created VBD OpaqueRef:42ef393a-6022-0d77-0b89-e442fd023fac for VM OpaqueRef:f3baa5fc-ce32-fc8d-980e-20dd67927a4e, VDI OpaqueRef:9a6c6524-7105-3a5f-d616-cb74a5fc4f54. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:46:35.248 DEBUG nova.objects.instance [req-1775fc0d-c79e-4ebe-9495-c9fcf5679951 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lazy-loading `pci_devices' on Instance uuid 3d063a71-d4d9-49d1-ae99-79d2dc71c6d3 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:46:35.256 DEBUG nova.virt.xenapi.vm_utils [req-70a58d5e-f11a-442d-837a-6055a34fcc4e tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Created VBD OpaqueRef:8861ab0a-645c-1b22-16f0-4da88fd8d15f for VM OpaqueRef:09d9be4d-5d0c-9bac-a66d-c351fd83e376, VDI OpaqueRef:c4dc2d8e-c605-bc2b-f037-5a47d70a7420. 
create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:46:35.262 DEBUG nova.virt.xenapi.volumeops [req-70a58d5e-f11a-442d-837a-6055a34fcc4e tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Plugging VBD: OpaqueRef:8861ab0a-645c-1b22-16f0-4da88fd8d15f _attach_volume_to_vm /opt/stack/new/nova/nova/virt/xenapi/volumeops.py:124 2015-08-07 17:46:35.263 DEBUG oslo_concurrency.lockutils [req-70a58d5e-f11a-442d-837a-6055a34fcc4e tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Lock "xenapi-vbd-OpaqueRef:09d9be4d-5d0c-9bac-a66d-c351fd83e376" acquired by "synchronized_plug" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:46:35.343 DEBUG oslo_concurrency.lockutils [req-1775fc0d-c79e-4ebe-9495-c9fcf5679951 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "xenstore-3d063a71-d4d9-49d1-ae99-79d2dc71c6d3" acquired by "store_meta" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:46:35.344 DEBUG oslo_concurrency.lockutils [req-1775fc0d-c79e-4ebe-9495-c9fcf5679951 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "xenstore-3d063a71-d4d9-49d1-ae99-79d2dc71c6d3" released by "store_meta" :: held 0.001s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:46:35.345 DEBUG oslo_concurrency.lockutils [req-1775fc0d-c79e-4ebe-9495-c9fcf5679951 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "xenstore-3d063a71-d4d9-49d1-ae99-79d2dc71c6d3" acquired by "store_auto_disk_config" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:46:35.360 DEBUG oslo_concurrency.lockutils [req-1775fc0d-c79e-4ebe-9495-c9fcf5679951 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "xenstore-3d063a71-d4d9-49d1-ae99-79d2dc71c6d3" released by "store_auto_disk_config" :: held 0.016s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:46:35.361 DEBUG nova.virt.xenapi.vmops [req-1775fc0d-c79e-4ebe-9495-c9fcf5679951 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 3d063a71-d4d9-49d1-ae99-79d2dc71c6d3] Injecting network info to xenstore inject_network_info /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1869 2015-08-07 17:46:35.361 DEBUG oslo_concurrency.lockutils [req-1775fc0d-c79e-4ebe-9495-c9fcf5679951 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "xenstore-3d063a71-d4d9-49d1-ae99-79d2dc71c6d3" acquired by "update_nwinfo" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:46:35.536 DEBUG oslo_concurrency.lockutils [req-1775fc0d-c79e-4ebe-9495-c9fcf5679951 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "xenstore-3d063a71-d4d9-49d1-ae99-79d2dc71c6d3" released by "update_nwinfo" :: held 0.175s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:46:35.537 DEBUG nova.virt.xenapi.vmops [req-1775fc0d-c79e-4ebe-9495-c9fcf5679951 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 3d063a71-d4d9-49d1-ae99-79d2dc71c6d3] Creating vifs _create_vifs 
/opt/stack/new/nova/nova/virt/xenapi/vmops.py:1892 2015-08-07 17:46:35.544 DEBUG nova.virt.xenapi.vmops [req-1775fc0d-c79e-4ebe-9495-c9fcf5679951 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 3d063a71-d4d9-49d1-ae99-79d2dc71c6d3] Creating VIF for network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1902 2015-08-07 17:46:35.551 DEBUG nova.virt.xenapi.vmops [req-1775fc0d-c79e-4ebe-9495-c9fcf5679951 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 3d063a71-d4d9-49d1-ae99-79d2dc71c6d3] Created VIF OpaqueRef:3f28664b-c2f8-e2b0-d27a-bd551edb88fe, network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1906 2015-08-07 17:46:35.552 DEBUG nova.virt.xenapi.vmops [req-1775fc0d-c79e-4ebe-9495-c9fcf5679951 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 3d063a71-d4d9-49d1-ae99-79d2dc71c6d3] Starting instance _start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:327 2015-08-07 17:46:35.666 DEBUG oslo_concurrency.lockutils [req-efd0a2bd-ab3d-47db-bef6-f5e6e93eceea tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" released by "_create_cached_image_impl" :: held 1.638s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:46:35.666 INFO nova.virt.xenapi.vm_utils [req-efd0a2bd-ab3d-47db-bef6-f5e6e93eceea tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Image creation data, cacheable: True, downloaded: False duration: 1.64 secs for image a69d8d55-1745-492f-be26-cc75d64fc94d 2015-08-07 17:46:36.288 DEBUG nova.virt.xenapi.vmops [req-efd0a2bd-ab3d-47db-bef6-f5e6e93eceea tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 535c4da4-7eef-4c5e-b79f-4647f6857432] Updating progress to 20 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:46:36.459 DEBUG nova.virt.xenapi.vmops [req-efd0a2bd-ab3d-47db-bef6-f5e6e93eceea tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 535c4da4-7eef-4c5e-b79f-4647f6857432] Updating progress to 30 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:46:36.492 DEBUG oslo_concurrency.lockutils [req-70a58d5e-f11a-442d-837a-6055a34fcc4e tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Lock "xenapi-vbd-OpaqueRef:09d9be4d-5d0c-9bac-a66d-c351fd83e376" released by "synchronized_plug" :: held 1.228s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:46:36.492 INFO nova.virt.xenapi.volumeops [req-70a58d5e-f11a-442d-837a-6055a34fcc4e tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Dev 1 attached to instance instance-0000003f 2015-08-07 17:46:36.530 DEBUG keystoneclient.session [req-70a58d5e-f11a-442d-837a-6055a34fcc4e tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] REQ: curl -g -i -X POST http://192.168.33.1:8776/v2/540c7ca9d6cc474f9ee1a73634b4d224/volumes/a01057b3-070e-4f81-8c61-20ad9d422110/action -H "User-Agent: python-cinderclient" -H "Content-Type: application/json" -H "Accept: application/json" -H "X-Auth-Token: {SHA1}4e8cb0df11d791ca7f3b97b007f8e2c8eed87494" -d '{"os-attach": 
{"instance_uuid": "b197c990-eecd-403b-b8d7-9e57e7053a16", "mountpoint": "/dev/xvdb", "mode": "rw"}}' _http_log_request /usr/local/lib/python2.7/dist-packages/keystoneclient/session.py:195 2015-08-07 17:46:36.609 DEBUG nova.virt.xenapi.vmops [req-efd0a2bd-ab3d-47db-bef6-f5e6e93eceea tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 535c4da4-7eef-4c5e-b79f-4647f6857432] Using PV kernel: True _create_vm_record /opt/stack/new/nova/nova/virt/xenapi/vmops.py:679 2015-08-07 17:46:36.623 DEBUG nova.virt.xenapi.vm_utils [req-efd0a2bd-ab3d-47db-bef6-f5e6e93eceea tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 535c4da4-7eef-4c5e-b79f-4647f6857432] Created VM create_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:312 2015-08-07 17:46:36.623 DEBUG nova.virt.xenapi.vmops [req-efd0a2bd-ab3d-47db-bef6-f5e6e93eceea tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 535c4da4-7eef-4c5e-b79f-4647f6857432] Updating progress to 40 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:46:36.808 DEBUG nova.virt.xenapi.vm_utils [req-efd0a2bd-ab3d-47db-bef6-f5e6e93eceea tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Creating disk-type VBD for VM OpaqueRef:2ffefee9-f3cd-bb66-6684-1c4705cf89ba, VDI OpaqueRef:5697b965-ad67-029e-7cc0-bdad94b3812f ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:46:36.816 DEBUG nova.virt.xenapi.vm_utils [req-efd0a2bd-ab3d-47db-bef6-f5e6e93eceea tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Created VBD OpaqueRef:0f597729-5ac2-4f9e-f3a3-85a9c3f0b8bb for VM OpaqueRef:2ffefee9-f3cd-bb66-6684-1c4705cf89ba, VDI OpaqueRef:5697b965-ad67-029e-7cc0-bdad94b3812f. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:46:37.105 DEBUG nova.virt.xenapi.vm_utils [req-efd0a2bd-ab3d-47db-bef6-f5e6e93eceea tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Created VDI OpaqueRef:3a713756-5764-0967-dfd6-add764c13655 (config-2, 67108864, False) on OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99. create_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:509 2015-08-07 17:46:37.108 DEBUG nova.virt.xenapi.vm_utils [req-efd0a2bd-ab3d-47db-bef6-f5e6e93eceea tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Creating disk-type VBD for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:3a713756-5764-0967-dfd6-add764c13655 ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:46:37.116 DEBUG nova.virt.xenapi.vm_utils [req-efd0a2bd-ab3d-47db-bef6-f5e6e93eceea tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Created VBD OpaqueRef:aee40d8f-93e1-d9dc-1ed1-db17335fe6e6 for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:3a713756-5764-0967-dfd6-add764c13655. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:46:37.117 DEBUG nova.virt.xenapi.vm_utils [req-efd0a2bd-ab3d-47db-bef6-f5e6e93eceea tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Plugging VBD OpaqueRef:aee40d8f-93e1-d9dc-1ed1-db17335fe6e6 ... 
vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2191 2015-08-07 17:46:37.118 DEBUG oslo_concurrency.lockutils [req-efd0a2bd-ab3d-47db-bef6-f5e6e93eceea tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_plug" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:46:38.321 DEBUG oslo_concurrency.lockutils [req-efd0a2bd-ab3d-47db-bef6-f5e6e93eceea tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_plug" :: held 1.203s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:46:38.322 DEBUG nova.virt.xenapi.vm_utils [req-efd0a2bd-ab3d-47db-bef6-f5e6e93eceea tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Plugging VBD OpaqueRef:aee40d8f-93e1-d9dc-1ed1-db17335fe6e6 done. vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2194 2015-08-07 17:46:38.325 DEBUG nova.virt.xenapi.vm_utils [req-efd0a2bd-ab3d-47db-bef6-f5e6e93eceea tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] VBD OpaqueRef:aee40d8f-93e1-d9dc-1ed1-db17335fe6e6 plugged as xvdc vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2197 2015-08-07 17:46:38.405 WARNING nova.virt.configdrive [req-efd0a2bd-ab3d-47db-bef6-f5e6e93eceea tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] The setting "always" will be deprecated in the Liberty version. Please use "True" instead 2015-08-07 17:46:38.406 DEBUG nova.objects.instance [req-efd0a2bd-ab3d-47db-bef6-f5e6e93eceea tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lazy-loading `ec2_ids' on Instance uuid 535c4da4-7eef-4c5e-b79f-4647f6857432 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:46:38.443 DEBUG oslo_concurrency.processutils [req-efd0a2bd-ab3d-47db-bef6-f5e6e93eceea tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Running cmd (subprocess): genisoimage -o /tmp/tmpxMGvNi/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmpIXBND3 execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:46:38.554 DEBUG oslo_concurrency.processutils [req-efd0a2bd-ab3d-47db-bef6-f5e6e93eceea tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] CMD "genisoimage -o /tmp/tmpxMGvNi/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmpIXBND3" returned: 0 in 0.111s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:46:38.559 DEBUG oslo_concurrency.processutils [req-efd0a2bd-ab3d-47db-bef6-f5e6e93eceea tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmpxMGvNi/configdrive of=/dev/xvdc oflag=direct,sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:46:38.645 DEBUG keystoneclient.session [req-70a58d5e-f11a-442d-837a-6055a34fcc4e tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] RESP: [202] date: Fri, 07 Aug 2015 17:46:38 GMT connection: keep-alive content-type: text/html; charset=UTF-8 
content-length: 0 x-openstack-request-id: req-e276364e-01fc-4ac0-8026-de1ac90facb3 _http_log_response /usr/local/lib/python2.7/dist-packages/keystoneclient/session.py:224 2015-08-07 17:46:38.730 DEBUG oslo_concurrency.lockutils [req-70a58d5e-f11a-442d-837a-6055a34fcc4e tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Lock "b197c990-eecd-403b-b8d7-9e57e7053a16" released by "do_attach_volume" :: held 23.365s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:46:39.577 INFO nova.compute.manager [req-70e0238d-f069-42b0-b2e2-10d86cae67b2 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] [instance: b197c990-eecd-403b-b8d7-9e57e7053a16] Rescuing 2015-08-07 17:46:39.580 DEBUG oslo_concurrency.lockutils [req-70e0238d-f069-42b0-b2e2-10d86cae67b2 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Acquired semaphore "refresh_cache-b197c990-eecd-403b-b8d7-9e57e7053a16" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:197 2015-08-07 17:46:39.708 DEBUG nova.network.base_api [req-70e0238d-f069-42b0-b2e2-10d86cae67b2 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] [instance: b197c990-eecd-403b-b8d7-9e57e7053a16] Updating instance_info_cache with network_info: [VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.7'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:be:cf:fd', 'active': False, 'type': u'bridge', 'id': u'65310656-cc34-4b05-84ec-9c385b0c36a8', 'qbg_params': None})] update_instance_cache_with_nw_info /opt/stack/new/nova/nova/network/base_api.py:43 2015-08-07 17:46:39.731 DEBUG oslo_concurrency.lockutils [req-70e0238d-f069-42b0-b2e2-10d86cae67b2 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Releasing semaphore "refresh_cache-b197c990-eecd-403b-b8d7-9e57e7053a16" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:210 2015-08-07 17:46:39.791 DEBUG nova.virt.xenapi.vm_utils [req-70e0238d-f069-42b0-b2e2-10d86cae67b2 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] [instance: b197c990-eecd-403b-b8d7-9e57e7053a16] Shutting down VM (hard) hard_shutdown_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:348 2015-08-07 17:46:41.581 DEBUG nova.virt.xenapi.vmops [req-1775fc0d-c79e-4ebe-9495-c9fcf5679951 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 3d063a71-d4d9-49d1-ae99-79d2dc71c6d3] Waiting for instance state 
to become running _wait_for_instance_to_start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:777 2015-08-07 17:46:41.595 DEBUG nova.virt.xenapi.vmops [req-1775fc0d-c79e-4ebe-9495-c9fcf5679951 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 3d063a71-d4d9-49d1-ae99-79d2dc71c6d3] Updating progress to 100 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:46:42.037 DEBUG oslo_concurrency.processutils [req-efd0a2bd-ab3d-47db-bef6-f5e6e93eceea tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmpxMGvNi/configdrive of=/dev/xvdc oflag=direct,sync" returned: 0 in 3.478s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:46:42.039 DEBUG oslo_concurrency.processutils [req-efd0a2bd-ab3d-47db-bef6-f5e6e93eceea tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:46:42.299 DEBUG oslo_concurrency.processutils [req-efd0a2bd-ab3d-47db-bef6-f5e6e93eceea tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf sync" returned: 0 in 0.259s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:46:42.301 DEBUG nova.virt.xenapi.vm_utils [req-efd0a2bd-ab3d-47db-bef6-f5e6e93eceea tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Destroying VBD for VDI OpaqueRef:3a713756-5764-0967-dfd6-add764c13655 ... vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2207 2015-08-07 17:46:42.303 DEBUG oslo_concurrency.lockutils [req-efd0a2bd-ab3d-47db-bef6-f5e6e93eceea tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_unplug" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:46:43.103 DEBUG oslo_concurrency.lockutils [req-efd0a2bd-ab3d-47db-bef6-f5e6e93eceea tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_unplug" :: held 0.800s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:46:43.110 DEBUG nova.virt.xenapi.vm_utils [req-efd0a2bd-ab3d-47db-bef6-f5e6e93eceea tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Destroying VBD for VDI OpaqueRef:3a713756-5764-0967-dfd6-add764c13655 done. vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2215 2015-08-07 17:46:43.112 DEBUG nova.virt.xenapi.vm_utils [req-efd0a2bd-ab3d-47db-bef6-f5e6e93eceea tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Creating disk-type VBD for VM OpaqueRef:2ffefee9-f3cd-bb66-6684-1c4705cf89ba, VDI OpaqueRef:3a713756-5764-0967-dfd6-add764c13655 ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:46:43.120 DEBUG nova.virt.xenapi.vm_utils [req-efd0a2bd-ab3d-47db-bef6-f5e6e93eceea tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Created VBD OpaqueRef:d0272490-2c8c-8d21-d4d4-8670a528d4cd for VM OpaqueRef:2ffefee9-f3cd-bb66-6684-1c4705cf89ba, VDI OpaqueRef:3a713756-5764-0967-dfd6-add764c13655. 
create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:46:43.121 DEBUG nova.objects.instance [req-efd0a2bd-ab3d-47db-bef6-f5e6e93eceea tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lazy-loading `pci_devices' on Instance uuid 535c4da4-7eef-4c5e-b79f-4647f6857432 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:46:43.225 DEBUG nova.virt.xenapi.vmops [req-efd0a2bd-ab3d-47db-bef6-f5e6e93eceea tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 535c4da4-7eef-4c5e-b79f-4647f6857432] Updating progress to 50 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:46:43.413 DEBUG oslo_concurrency.lockutils [req-efd0a2bd-ab3d-47db-bef6-f5e6e93eceea tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "xenstore-535c4da4-7eef-4c5e-b79f-4647f6857432" acquired by "store_meta" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:46:43.414 DEBUG oslo_concurrency.lockutils [req-efd0a2bd-ab3d-47db-bef6-f5e6e93eceea tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "xenstore-535c4da4-7eef-4c5e-b79f-4647f6857432" released by "store_meta" :: held 0.002s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:46:43.415 DEBUG oslo_concurrency.lockutils [req-efd0a2bd-ab3d-47db-bef6-f5e6e93eceea tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "xenstore-535c4da4-7eef-4c5e-b79f-4647f6857432" acquired by "store_auto_disk_config" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:46:43.423 DEBUG oslo_concurrency.lockutils [req-efd0a2bd-ab3d-47db-bef6-f5e6e93eceea tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "xenstore-535c4da4-7eef-4c5e-b79f-4647f6857432" released by "store_auto_disk_config" :: held 0.008s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:46:43.424 DEBUG nova.virt.xenapi.vmops [req-efd0a2bd-ab3d-47db-bef6-f5e6e93eceea tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 535c4da4-7eef-4c5e-b79f-4647f6857432] Injecting hostname (tempest.common.compute-instance-549268954) into xenstore _inject_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1940 2015-08-07 17:46:43.425 DEBUG oslo_concurrency.lockutils [req-efd0a2bd-ab3d-47db-bef6-f5e6e93eceea tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "xenstore-535c4da4-7eef-4c5e-b79f-4647f6857432" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:46:43.434 DEBUG oslo_concurrency.lockutils [req-efd0a2bd-ab3d-47db-bef6-f5e6e93eceea tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "xenstore-535c4da4-7eef-4c5e-b79f-4647f6857432" released by "update_hostname" :: held 0.009s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:46:43.435 DEBUG nova.virt.xenapi.vmops [req-efd0a2bd-ab3d-47db-bef6-f5e6e93eceea tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 535c4da4-7eef-4c5e-b79f-4647f6857432] Injecting network info to xenstore inject_network_info /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1869 2015-08-07 17:46:43.437 DEBUG oslo_concurrency.lockutils 
[req-efd0a2bd-ab3d-47db-bef6-f5e6e93eceea tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "xenstore-535c4da4-7eef-4c5e-b79f-4647f6857432" acquired by "update_nwinfo" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:46:43.603 DEBUG oslo_concurrency.lockutils [req-efd0a2bd-ab3d-47db-bef6-f5e6e93eceea tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "xenstore-535c4da4-7eef-4c5e-b79f-4647f6857432" released by "update_nwinfo" :: held 0.166s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:46:43.604 DEBUG nova.virt.xenapi.vmops [req-efd0a2bd-ab3d-47db-bef6-f5e6e93eceea tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 535c4da4-7eef-4c5e-b79f-4647f6857432] Updating progress to 60 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:46:43.711 WARNING nova.virt.xenapi.vm_utils [req-70e0238d-f069-42b0-b2e2-10d86cae67b2 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] [instance: b197c990-eecd-403b-b8d7-9e57e7053a16] VM already halted, skipping shutdown... 2015-08-07 17:46:43.735 DEBUG nova.virt.xenapi.vm_utils [req-70e0238d-f069-42b0-b2e2-10d86cae67b2 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Detected vhd format for image a69d8d55-1745-492f-be26-cc75d64fc94d determine_disk_image_type /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1633 2015-08-07 17:46:43.760 DEBUG nova.virt.xenapi.vmops [req-70e0238d-f069-42b0-b2e2-10d86cae67b2 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] [instance: b197c990-eecd-403b-b8d7-9e57e7053a16] Updating progress to 9 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:46:43.895 DEBUG nova.virt.xenapi.vmops [req-efd0a2bd-ab3d-47db-bef6-f5e6e93eceea tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 535c4da4-7eef-4c5e-b79f-4647f6857432] Creating vifs _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1892 2015-08-07 17:46:43.902 DEBUG nova.virt.xenapi.vmops [req-efd0a2bd-ab3d-47db-bef6-f5e6e93eceea tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 535c4da4-7eef-4c5e-b79f-4647f6857432] Creating VIF for network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1902 2015-08-07 17:46:43.910 DEBUG nova.virt.xenapi.vmops [req-efd0a2bd-ab3d-47db-bef6-f5e6e93eceea tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 535c4da4-7eef-4c5e-b79f-4647f6857432] Created VIF OpaqueRef:6837e828-cdad-8137-8a56-855ad66e4424, network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1906 2015-08-07 17:46:43.933 DEBUG nova.virt.xenapi.vmops [req-efd0a2bd-ab3d-47db-bef6-f5e6e93eceea tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 535c4da4-7eef-4c5e-b79f-4647f6857432] Updating progress to 70 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:46:43.976 DEBUG oslo_concurrency.lockutils [req-70e0238d-f069-42b0-b2e2-10d86cae67b2 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" acquired by 
"_create_cached_image_impl" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:46:44.122 DEBUG nova.virt.xenapi.vmops [req-efd0a2bd-ab3d-47db-bef6-f5e6e93eceea tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 535c4da4-7eef-4c5e-b79f-4647f6857432] Starting instance _start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:327 2015-08-07 17:46:44.635 DEBUG oslo_concurrency.lockutils [req-5c7f9184-55d8-40d4-b928-f125bdac9cbe tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Acquired semaphore "refresh_cache-3d063a71-d4d9-49d1-ae99-79d2dc71c6d3" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:197 2015-08-07 17:46:44.775 DEBUG nova.network.base_api [req-5c7f9184-55d8-40d4-b928-f125bdac9cbe tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 3d063a71-d4d9-49d1-ae99-79d2dc71c6d3] Updating instance_info_cache with network_info: [VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.2'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:ba:d2:8b', 'active': False, 'type': u'bridge', 'id': u'fba2136d-b403-4996-9701-9a9aeef179dd', 'qbg_params': None})] update_instance_cache_with_nw_info /opt/stack/new/nova/nova/network/base_api.py:43 2015-08-07 17:46:44.811 DEBUG oslo_concurrency.lockutils [req-5c7f9184-55d8-40d4-b928-f125bdac9cbe tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Releasing semaphore "refresh_cache-3d063a71-d4d9-49d1-ae99-79d2dc71c6d3" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:210 2015-08-07 17:46:44.825 DEBUG nova.virt.xenapi.vm_utils [req-70e0238d-f069-42b0-b2e2-10d86cae67b2 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Cloned VDI OpaqueRef:da10b432-2569-4618-2c1c-32076e3d6a87 from VDI OpaqueRef:f392c64e-7886-a0cc-d3a0-780674167307 _clone_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:571 2015-08-07 17:46:44.839 DEBUG nova.compute.manager [req-5c7f9184-55d8-40d4-b928-f125bdac9cbe tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 3d063a71-d4d9-49d1-ae99-79d2dc71c6d3] Hypervisor driver does not support instance shared storage check, assuming it's not on shared storage _is_instance_storage_shared /opt/stack/new/nova/nova/compute/manager.py:845 2015-08-07 17:46:44.839 INFO nova.virt.xenapi.vmops [req-5c7f9184-55d8-40d4-b928-f125bdac9cbe tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 3d063a71-d4d9-49d1-ae99-79d2dc71c6d3] 
Destroying VM 2015-08-07 17:46:44.851 DEBUG nova.virt.xenapi.vm_utils [req-5c7f9184-55d8-40d4-b928-f125bdac9cbe tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 3d063a71-d4d9-49d1-ae99-79d2dc71c6d3] Shutting down VM (hard) hard_shutdown_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:348 2015-08-07 17:46:45.059 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.96 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:46:45.392 DEBUG oslo_concurrency.lockutils [req-70e0238d-f069-42b0-b2e2-10d86cae67b2 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" released by "_create_cached_image_impl" :: held 1.416s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:46:45.393 INFO nova.virt.xenapi.vm_utils [req-70e0238d-f069-42b0-b2e2-10d86cae67b2 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Image creation data, cacheable: True, downloaded: False duration: 1.43 secs for image a69d8d55-1745-492f-be26-cc75d64fc94d 2015-08-07 17:46:45.785 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:46:45.785 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:46:45.786 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 0.73 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:46:45.978 DEBUG nova.virt.xenapi.vmops [req-70e0238d-f069-42b0-b2e2-10d86cae67b2 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] [instance: b197c990-eecd-403b-b8d7-9e57e7053a16] Updating progress to 18 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:46:46.133 DEBUG nova.virt.xenapi.vmops [req-70e0238d-f069-42b0-b2e2-10d86cae67b2 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] [instance: b197c990-eecd-403b-b8d7-9e57e7053a16] Updating progress to 27 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:46:46.259 DEBUG nova.virt.xenapi.vmops [req-70e0238d-f069-42b0-b2e2-10d86cae67b2 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] [instance: b197c990-eecd-403b-b8d7-9e57e7053a16] Using PV kernel: True _create_vm_record /opt/stack/new/nova/nova/virt/xenapi/vmops.py:679 2015-08-07 17:46:46.275 DEBUG nova.virt.xenapi.vm_utils [req-70e0238d-f069-42b0-b2e2-10d86cae67b2 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] [instance: b197c990-eecd-403b-b8d7-9e57e7053a16] Created VM create_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:312 2015-08-07 17:46:46.276 DEBUG nova.virt.xenapi.vmops [req-70e0238d-f069-42b0-b2e2-10d86cae67b2 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] [instance: 
b197c990-eecd-403b-b8d7-9e57e7053a16] Updating progress to 36 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:46:46.425 DEBUG nova.virt.xenapi.vmops [req-70e0238d-f069-42b0-b2e2-10d86cae67b2 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] [instance: b197c990-eecd-403b-b8d7-9e57e7053a16] auto_disk_config value not found inrescue image_properties. Setting value to False _attach_disks /opt/stack/new/nova/nova/virt/xenapi/vmops.py:714 2015-08-07 17:46:46.426 DEBUG nova.virt.xenapi.vm_utils [req-70e0238d-f069-42b0-b2e2-10d86cae67b2 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Creating disk-type VBD for VM OpaqueRef:031344b8-3c74-ceb0-5646-abc89375d223, VDI OpaqueRef:da10b432-2569-4618-2c1c-32076e3d6a87 ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:46:46.434 DEBUG nova.virt.xenapi.vm_utils [req-70e0238d-f069-42b0-b2e2-10d86cae67b2 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Created VBD OpaqueRef:b2bb02fe-cff1-642b-a788-7b8f790a4621 for VM OpaqueRef:031344b8-3c74-ceb0-5646-abc89375d223, VDI OpaqueRef:da10b432-2569-4618-2c1c-32076e3d6a87. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:46:46.513 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:46:46.514 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 2.01 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:46:46.751 DEBUG nova.virt.xenapi.vm_utils [req-70e0238d-f069-42b0-b2e2-10d86cae67b2 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Created VDI OpaqueRef:6a74a61c-7972-8606-d46f-806e3636982b (config-2, 67108864, False) on OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99. create_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:509 2015-08-07 17:46:46.755 DEBUG nova.virt.xenapi.vm_utils [req-70e0238d-f069-42b0-b2e2-10d86cae67b2 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Creating disk-type VBD for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:6a74a61c-7972-8606-d46f-806e3636982b ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:46:46.763 DEBUG nova.virt.xenapi.vm_utils [req-70e0238d-f069-42b0-b2e2-10d86cae67b2 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Created VBD OpaqueRef:d168fdd4-4f38-8365-dca6-b48e6c44ade7 for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:6a74a61c-7972-8606-d46f-806e3636982b. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:46:46.765 DEBUG nova.virt.xenapi.vm_utils [req-70e0238d-f069-42b0-b2e2-10d86cae67b2 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Plugging VBD OpaqueRef:d168fdd4-4f38-8365-dca6-b48e6c44ade7 ... 
vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2191 2015-08-07 17:46:46.765 DEBUG oslo_concurrency.lockutils [req-70e0238d-f069-42b0-b2e2-10d86cae67b2 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_plug" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:46:47.196 DEBUG nova.virt.xenapi.vmops [req-5c7f9184-55d8-40d4-b928-f125bdac9cbe tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 3d063a71-d4d9-49d1-ae99-79d2dc71c6d3] Destroying VDIs _destroy_vdis /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1428 2015-08-07 17:46:47.211 DEBUG nova.virt.xenapi.vm_utils [req-5c7f9184-55d8-40d4-b928-f125bdac9cbe tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] VDI fb56cb25-aa32-444c-97ba-25047410a535 is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:46:47.222 DEBUG nova.virt.xenapi.vm_utils [req-5c7f9184-55d8-40d4-b928-f125bdac9cbe tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] VDI b1b0ab26-260f-4de1-99bb-22a4805092ef is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:46:47.978 DEBUG nova.virt.xenapi.vmops [req-5c7f9184-55d8-40d4-b928-f125bdac9cbe tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 3d063a71-d4d9-49d1-ae99-79d2dc71c6d3] Using RAW or VHD, skipping kernel and ramdisk deletion _destroy_kernel_ramdisk /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1456 2015-08-07 17:46:47.999 DEBUG nova.virt.xenapi.vm_utils [req-5c7f9184-55d8-40d4-b928-f125bdac9cbe tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 3d063a71-d4d9-49d1-ae99-79d2dc71c6d3] VM destroyed destroy_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:324 2015-08-07 17:46:48.051 DEBUG oslo_concurrency.lockutils [req-5c7f9184-55d8-40d4-b928-f125bdac9cbe tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "compute_resources" acquired by "drop_move_claim" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:46:48.099 DEBUG oslo_concurrency.lockutils [req-5c7f9184-55d8-40d4-b928-f125bdac9cbe tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "compute_resources" released by "drop_move_claim" :: held 0.047s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:46:48.111 DEBUG oslo_concurrency.lockutils [req-70e0238d-f069-42b0-b2e2-10d86cae67b2 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_plug" :: held 1.345s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:46:48.111 DEBUG nova.virt.xenapi.vm_utils [req-70e0238d-f069-42b0-b2e2-10d86cae67b2 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Plugging VBD OpaqueRef:d168fdd4-4f38-8365-dca6-b48e6c44ade7 done. 
vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2194 2015-08-07 17:46:48.126 DEBUG nova.virt.xenapi.vm_utils [req-70e0238d-f069-42b0-b2e2-10d86cae67b2 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] VBD OpaqueRef:d168fdd4-4f38-8365-dca6-b48e6c44ade7 plugged as xvdc vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2197 2015-08-07 17:46:48.163 DEBUG oslo_concurrency.lockutils [req-5c7f9184-55d8-40d4-b928-f125bdac9cbe tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Acquired semaphore "refresh_cache-3d063a71-d4d9-49d1-ae99-79d2dc71c6d3" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:197 2015-08-07 17:46:48.209 WARNING nova.virt.configdrive [req-70e0238d-f069-42b0-b2e2-10d86cae67b2 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] The setting "always" will be deprecated in the Liberty version. Please use "True" instead 2015-08-07 17:46:48.210 DEBUG nova.objects.instance [req-70e0238d-f069-42b0-b2e2-10d86cae67b2 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Lazy-loading `ec2_ids' on Instance uuid b197c990-eecd-403b-b8d7-9e57e7053a16 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:46:48.242 DEBUG oslo_concurrency.processutils [req-70e0238d-f069-42b0-b2e2-10d86cae67b2 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Running cmd (subprocess): genisoimage -o /tmp/tmpXWH3qE/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmpmUrz3Z execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:46:48.324 DEBUG nova.network.base_api [req-5c7f9184-55d8-40d4-b928-f125bdac9cbe tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 3d063a71-d4d9-49d1-ae99-79d2dc71c6d3] Updating instance_info_cache with network_info: [VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.2'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:ba:d2:8b', 'active': False, 'type': u'bridge', 'id': u'fba2136d-b403-4996-9701-9a9aeef179dd', 'qbg_params': None})] update_instance_cache_with_nw_info /opt/stack/new/nova/nova/network/base_api.py:43 2015-08-07 17:46:48.332 DEBUG oslo_concurrency.processutils [req-70e0238d-f069-42b0-b2e2-10d86cae67b2 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] CMD "genisoimage -o /tmp/tmpXWH3qE/configdrive -ldots 
-allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmpmUrz3Z" returned: 0 in 0.090s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:46:48.337 DEBUG oslo_concurrency.processutils [req-70e0238d-f069-42b0-b2e2-10d86cae67b2 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmpXWH3qE/configdrive of=/dev/xvdc oflag=direct,sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:46:48.439 DEBUG oslo_concurrency.lockutils [req-5c7f9184-55d8-40d4-b928-f125bdac9cbe tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Releasing semaphore "refresh_cache-3d063a71-d4d9-49d1-ae99-79d2dc71c6d3" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:210 2015-08-07 17:46:48.521 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:46:48.522 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Starting heal instance info cache _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5315 2015-08-07 17:46:48.622 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Didn't find any instances for network info cache update. _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5386 2015-08-07 17:46:48.622 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 2.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:46:49.963 DEBUG nova.virt.xenapi.vmops [req-efd0a2bd-ab3d-47db-bef6-f5e6e93eceea tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 535c4da4-7eef-4c5e-b79f-4647f6857432] Waiting for instance state to become running _wait_for_instance_to_start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:777 2015-08-07 17:46:49.988 DEBUG nova.virt.xenapi.vmops [req-efd0a2bd-ab3d-47db-bef6-f5e6e93eceea tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 535c4da4-7eef-4c5e-b79f-4647f6857432] Updating progress to 80 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:46:50.182 DEBUG nova.virt.xenapi.vmops [req-5c7f9184-55d8-40d4-b928-f125bdac9cbe tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 3d063a71-d4d9-49d1-ae99-79d2dc71c6d3] Starting instance _start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:327 2015-08-07 17:46:50.326 DEBUG nova.virt.xenapi.vmops [req-efd0a2bd-ab3d-47db-bef6-f5e6e93eceea tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 535c4da4-7eef-4c5e-b79f-4647f6857432] Skip agent setup, not enabled. 
_configure_new_instance_with_agent /opt/stack/new/nova/nova/virt/xenapi/vmops.py:788 2015-08-07 17:46:50.327 DEBUG nova.virt.xenapi.vmops [req-efd0a2bd-ab3d-47db-bef6-f5e6e93eceea tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 535c4da4-7eef-4c5e-b79f-4647f6857432] Removing hostname from xenstore _remove_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1949 2015-08-07 17:46:50.328 DEBUG oslo_concurrency.lockutils [req-efd0a2bd-ab3d-47db-bef6-f5e6e93eceea tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "xenstore-535c4da4-7eef-4c5e-b79f-4647f6857432" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:46:50.332 DEBUG oslo_concurrency.lockutils [req-efd0a2bd-ab3d-47db-bef6-f5e6e93eceea tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "xenstore-535c4da4-7eef-4c5e-b79f-4647f6857432" released by "update_hostname" :: held 0.004s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:46:50.333 DEBUG nova.virt.xenapi.vmops [req-efd0a2bd-ab3d-47db-bef6-f5e6e93eceea tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 535c4da4-7eef-4c5e-b79f-4647f6857432] Updating progress to 90 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:46:50.623 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:46:50.623 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /opt/stack/new/nova/nova/compute/manager.py:5976 2015-08-07 17:46:50.624 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 3.90 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:46:50.693 DEBUG nova.virt.xenapi.vmops [req-efd0a2bd-ab3d-47db-bef6-f5e6e93eceea tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 535c4da4-7eef-4c5e-b79f-4647f6857432] Updating progress to 100 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:46:51.007 DEBUG nova.compute.manager [req-efd0a2bd-ab3d-47db-bef6-f5e6e93eceea tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 535c4da4-7eef-4c5e-b79f-4647f6857432] Checking state _get_power_state /opt/stack/new/nova/nova/compute/manager.py:1296 2015-08-07 17:46:51.382 DEBUG oslo_concurrency.lockutils [req-efd0a2bd-ab3d-47db-bef6-f5e6e93eceea tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "535c4da4-7eef-4c5e-b79f-4647f6857432" released by "_locked_do_build_and_run_instance" :: held 20.119s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:46:51.655 DEBUG oslo_concurrency.processutils [req-70e0238d-f069-42b0-b2e2-10d86cae67b2 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmpXWH3qE/configdrive of=/dev/xvdc oflag=direct,sync" returned: 0 in 3.318s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:46:51.657 DEBUG oslo_concurrency.processutils [req-70e0238d-f069-42b0-b2e2-10d86cae67b2 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:46:51.957 DEBUG oslo_concurrency.processutils [req-70e0238d-f069-42b0-b2e2-10d86cae67b2 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf sync" returned: 0 in 0.300s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:46:51.966 DEBUG nova.virt.xenapi.vm_utils [req-70e0238d-f069-42b0-b2e2-10d86cae67b2 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Destroying VBD for VDI OpaqueRef:6a74a61c-7972-8606-d46f-806e3636982b ... 
vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2207 2015-08-07 17:46:51.967 DEBUG oslo_concurrency.lockutils [req-70e0238d-f069-42b0-b2e2-10d86cae67b2 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_unplug" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:46:52.307 DEBUG oslo_concurrency.lockutils [req-1ae17224-109b-4ea0-a4e5-d71bc7452499 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "535c4da4-7eef-4c5e-b79f-4647f6857432" acquired by "do_stop_instance" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:46:52.307 DEBUG nova.compute.manager [req-1ae17224-109b-4ea0-a4e5-d71bc7452499 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 535c4da4-7eef-4c5e-b79f-4647f6857432] Checking state _get_power_state /opt/stack/new/nova/nova/compute/manager.py:1296 2015-08-07 17:46:52.332 DEBUG nova.compute.manager [req-1ae17224-109b-4ea0-a4e5-d71bc7452499 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 535c4da4-7eef-4c5e-b79f-4647f6857432] Stopping instance; current vm_state: active, current task_state: powering-off, current DB power_state: 1, current VM power_state: 1 do_stop_instance /opt/stack/new/nova/nova/compute/manager.py:2404 2015-08-07 17:46:52.342 DEBUG nova.virt.xenapi.vm_utils [req-1ae17224-109b-4ea0-a4e5-d71bc7452499 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 535c4da4-7eef-4c5e-b79f-4647f6857432] Shutting down VM (hard) hard_shutdown_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:348 2015-08-07 17:46:52.805 DEBUG oslo_concurrency.lockutils [req-70e0238d-f069-42b0-b2e2-10d86cae67b2 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_unplug" :: held 0.838s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:46:52.814 DEBUG nova.virt.xenapi.vm_utils [req-70e0238d-f069-42b0-b2e2-10d86cae67b2 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Destroying VBD for VDI OpaqueRef:6a74a61c-7972-8606-d46f-806e3636982b done. vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2215 2015-08-07 17:46:52.815 DEBUG nova.virt.xenapi.vm_utils [req-70e0238d-f069-42b0-b2e2-10d86cae67b2 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Creating disk-type VBD for VM OpaqueRef:031344b8-3c74-ceb0-5646-abc89375d223, VDI OpaqueRef:6a74a61c-7972-8606-d46f-806e3636982b ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:46:52.824 DEBUG nova.virt.xenapi.vm_utils [req-70e0238d-f069-42b0-b2e2-10d86cae67b2 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Created VBD OpaqueRef:90bc8b84-8fbf-f4ce-1239-460c0a77633f for VM OpaqueRef:031344b8-3c74-ceb0-5646-abc89375d223, VDI OpaqueRef:6a74a61c-7972-8606-d46f-806e3636982b. 
create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:46:52.825 DEBUG nova.objects.instance [req-70e0238d-f069-42b0-b2e2-10d86cae67b2 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Lazy-loading `pci_devices' on Instance uuid b197c990-eecd-403b-b8d7-9e57e7053a16 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:46:52.915 DEBUG nova.virt.xenapi.vmops [req-70e0238d-f069-42b0-b2e2-10d86cae67b2 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] [instance: b197c990-eecd-403b-b8d7-9e57e7053a16] Updating progress to 45 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:46:53.061 DEBUG oslo_concurrency.lockutils [req-70e0238d-f069-42b0-b2e2-10d86cae67b2 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Lock "xenstore-b197c990-eecd-403b-b8d7-9e57e7053a16" acquired by "store_meta" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:46:53.061 DEBUG oslo_concurrency.lockutils [req-70e0238d-f069-42b0-b2e2-10d86cae67b2 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Lock "xenstore-b197c990-eecd-403b-b8d7-9e57e7053a16" released by "store_meta" :: held 0.001s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:46:53.062 DEBUG oslo_concurrency.lockutils [req-70e0238d-f069-42b0-b2e2-10d86cae67b2 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Lock "xenstore-b197c990-eecd-403b-b8d7-9e57e7053a16" acquired by "store_auto_disk_config" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:46:53.080 DEBUG oslo_concurrency.lockutils [req-70e0238d-f069-42b0-b2e2-10d86cae67b2 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Lock "xenstore-b197c990-eecd-403b-b8d7-9e57e7053a16" released by "store_auto_disk_config" :: held 0.018s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:46:53.081 DEBUG nova.virt.xenapi.vmops [req-70e0238d-f069-42b0-b2e2-10d86cae67b2 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] [instance: b197c990-eecd-403b-b8d7-9e57e7053a16] Injecting hostname (RESCUE-tempest.common.compute-instance-1031488919) into xenstore _inject_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1940 2015-08-07 17:46:53.081 DEBUG oslo_concurrency.lockutils [req-70e0238d-f069-42b0-b2e2-10d86cae67b2 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Lock "xenstore-b197c990-eecd-403b-b8d7-9e57e7053a16" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:46:53.094 DEBUG oslo_concurrency.lockutils [req-70e0238d-f069-42b0-b2e2-10d86cae67b2 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Lock "xenstore-b197c990-eecd-403b-b8d7-9e57e7053a16" released by "update_hostname" :: held 0.013s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:46:53.095 DEBUG nova.virt.xenapi.vmops [req-70e0238d-f069-42b0-b2e2-10d86cae67b2 tempest-ServerRescueNegativeTestJSON-1228315677 
tempest-ServerRescueNegativeTestJSON-2125869541] [instance: b197c990-eecd-403b-b8d7-9e57e7053a16] Injecting network info to xenstore inject_network_info /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1869 2015-08-07 17:46:53.095 DEBUG oslo_concurrency.lockutils [req-70e0238d-f069-42b0-b2e2-10d86cae67b2 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Lock "xenstore-b197c990-eecd-403b-b8d7-9e57e7053a16" acquired by "update_nwinfo" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:46:53.286 DEBUG oslo_concurrency.lockutils [req-70e0238d-f069-42b0-b2e2-10d86cae67b2 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Lock "xenstore-b197c990-eecd-403b-b8d7-9e57e7053a16" released by "update_nwinfo" :: held 0.191s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:46:53.287 DEBUG nova.virt.xenapi.vmops [req-70e0238d-f069-42b0-b2e2-10d86cae67b2 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] [instance: b197c990-eecd-403b-b8d7-9e57e7053a16] Updating progress to 55 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:46:53.477 DEBUG nova.virt.xenapi.vmops [req-70e0238d-f069-42b0-b2e2-10d86cae67b2 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] [instance: b197c990-eecd-403b-b8d7-9e57e7053a16] Creating vifs _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1892 2015-08-07 17:46:53.483 DEBUG nova.virt.xenapi.vmops [req-70e0238d-f069-42b0-b2e2-10d86cae67b2 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] [instance: b197c990-eecd-403b-b8d7-9e57e7053a16] Creating VIF for network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1902 2015-08-07 17:46:53.493 DEBUG nova.virt.xenapi.vmops [req-70e0238d-f069-42b0-b2e2-10d86cae67b2 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] [instance: b197c990-eecd-403b-b8d7-9e57e7053a16] Created VIF OpaqueRef:96a89edb-f237-95f5-2dd0-ea59086688c2, network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1906 2015-08-07 17:46:53.494 DEBUG nova.virt.xenapi.vmops [req-70e0238d-f069-42b0-b2e2-10d86cae67b2 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] [instance: b197c990-eecd-403b-b8d7-9e57e7053a16] Updating progress to 64 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:46:53.671 DEBUG nova.virt.xenapi.vm_utils [req-70e0238d-f069-42b0-b2e2-10d86cae67b2 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Creating disk-type VBD for VM OpaqueRef:031344b8-3c74-ceb0-5646-abc89375d223, VDI OpaqueRef:7c84b050-5c44-b399-8b30-5a24de924861 ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:46:53.679 DEBUG nova.virt.xenapi.vm_utils [req-70e0238d-f069-42b0-b2e2-10d86cae67b2 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Created VBD OpaqueRef:f1027975-e73f-a1f8-0f52-df5273605ae8 for VM OpaqueRef:031344b8-3c74-ceb0-5646-abc89375d223, VDI OpaqueRef:7c84b050-5c44-b399-8b30-5a24de924861. 
create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:46:53.680 DEBUG nova.virt.xenapi.vmops [req-70e0238d-f069-42b0-b2e2-10d86cae67b2 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] [instance: b197c990-eecd-403b-b8d7-9e57e7053a16] Updating progress to 73 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:46:53.832 DEBUG nova.virt.xenapi.vmops [req-70e0238d-f069-42b0-b2e2-10d86cae67b2 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] [instance: b197c990-eecd-403b-b8d7-9e57e7053a16] Starting instance _start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:327 2015-08-07 17:46:54.521 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:46:54.523 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:46:54.523 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 1.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:46:54.732 DEBUG nova.compute.manager [req-1ae17224-109b-4ea0-a4e5-d71bc7452499 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 535c4da4-7eef-4c5e-b79f-4647f6857432] Checking state _get_power_state /opt/stack/new/nova/nova/compute/manager.py:1296 2015-08-07 17:46:54.909 DEBUG oslo_concurrency.lockutils [req-1ae17224-109b-4ea0-a4e5-d71bc7452499 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "535c4da4-7eef-4c5e-b79f-4647f6857432" released by "do_stop_instance" :: held 2.603s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:46:55.053 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.97 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:46:55.523 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:46:55.562 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Auditing locally available compute resources for node localhost.localdomain 2015-08-07 17:46:55.563 DEBUG nova.virt.xenapi.host [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Updating host stats update_status /opt/stack/new/nova/nova/virt/xenapi/host.py:234 2015-08-07 17:46:55.776 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" acquired by "do_scan" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:46:55.777 DEBUG nova.virt.xenapi.vm_utils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Scanning SR OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99 do_scan /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1839 2015-08-07 17:46:56.426 DEBUG oslo_concurrency.lockutils 
[req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" released by "do_scan" :: held 0.650s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:46:56.658 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor: free VCPUs: -1 _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:524 2015-08-07 17:46:56.658 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor: assignable PCI devices: [] _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:532 2015-08-07 17:46:56.659 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor/Node resource view: name=localhost.localdomain free_ram=721MB free_disk=16GB free_vcpus=-1 pci_devices=[] _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:546 2015-08-07 17:46:56.659 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "compute_resources" acquired by "_update_available_resource" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:46:57.099 INFO nova.compute.manager [req-5c7f9184-55d8-40d4-b928-f125bdac9cbe tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Updating instance to original state: 'active' 2015-08-07 17:46:57.135 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Total usable vcpus: 8, total allocated vcpus: 9 2015-08-07 17:46:57.136 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Final resource view: name=localhost.localdomain phys_ram=8187MB used_ram=926MB phys_disk=27GB used_disk=0GB total_vcpus=8 used_vcpus=9 pci_stats=None 2015-08-07 17:46:57.197 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Compute_service record updated for devstack:localhost.localdomain 2015-08-07 17:46:57.197 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "compute_resources" released by "_update_available_resource" :: held 0.538s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:46:57.198 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 9.30 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:46:57.589 DEBUG oslo_concurrency.lockutils [req-76020520-c3db-4ff5-8b77-7893e09c6d19 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "535c4da4-7eef-4c5e-b79f-4647f6857432" acquired by "do_terminate_instance" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:46:57.591 DEBUG oslo_concurrency.lockutils [req-76020520-c3db-4ff5-8b77-7893e09c6d19 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "535c4da4-7eef-4c5e-b79f-4647f6857432-events" acquired by "_clear_events" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:46:57.591 DEBUG oslo_concurrency.lockutils [req-76020520-c3db-4ff5-8b77-7893e09c6d19 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "535c4da4-7eef-4c5e-b79f-4647f6857432-events" released 
by "_clear_events" :: held 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:46:57.593 INFO nova.compute.manager [req-76020520-c3db-4ff5-8b77-7893e09c6d19 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 535c4da4-7eef-4c5e-b79f-4647f6857432] Terminating instance 2015-08-07 17:46:57.596 INFO nova.virt.xenapi.vmops [req-76020520-c3db-4ff5-8b77-7893e09c6d19 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 535c4da4-7eef-4c5e-b79f-4647f6857432] Destroying VM 2015-08-07 17:46:57.609 WARNING nova.virt.xenapi.vm_utils [req-76020520-c3db-4ff5-8b77-7893e09c6d19 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 535c4da4-7eef-4c5e-b79f-4647f6857432] VM already halted, skipping shutdown... 2015-08-07 17:46:57.621 DEBUG nova.virt.xenapi.vmops [req-76020520-c3db-4ff5-8b77-7893e09c6d19 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 535c4da4-7eef-4c5e-b79f-4647f6857432] Destroying VDIs _destroy_vdis /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1428 2015-08-07 17:46:57.632 DEBUG nova.virt.xenapi.vm_utils [req-76020520-c3db-4ff5-8b77-7893e09c6d19 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] VDI ec4cae55-8466-413a-97ba-f5002092d96c is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:46:57.641 DEBUG nova.virt.xenapi.vm_utils [req-76020520-c3db-4ff5-8b77-7893e09c6d19 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] VDI f75c27d1-dd54-4be5-8fbb-5b0cf35f623e is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:46:58.385 DEBUG nova.virt.xenapi.vmops [req-76020520-c3db-4ff5-8b77-7893e09c6d19 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 535c4da4-7eef-4c5e-b79f-4647f6857432] Using RAW or VHD, skipping kernel and ramdisk deletion _destroy_kernel_ramdisk /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1456 2015-08-07 17:46:58.400 DEBUG nova.virt.xenapi.vm_utils [req-76020520-c3db-4ff5-8b77-7893e09c6d19 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 535c4da4-7eef-4c5e-b79f-4647f6857432] VM destroyed destroy_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:324 2015-08-07 17:46:58.402 DEBUG nova.compute.manager [req-76020520-c3db-4ff5-8b77-7893e09c6d19 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 535c4da4-7eef-4c5e-b79f-4647f6857432] Deallocating network for instance _deallocate_network /opt/stack/new/nova/nova/compute/manager.py:1792 2015-08-07 17:46:59.460 DEBUG nova.virt.xenapi.vm_utils [req-1301c743-912d-4757-833a-afbd511efac0 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 3d063a71-d4d9-49d1-ae99-79d2dc71c6d3] Shutting down VM (hard) hard_shutdown_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:348 2015-08-07 17:47:00.095 DEBUG nova.compute.manager [req-76020520-c3db-4ff5-8b77-7893e09c6d19 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] [instance: 535c4da4-7eef-4c5e-b79f-4647f6857432] terminating bdm 
BlockDeviceMapping(boot_index=0,connection_info=None,created_at=2015-08-07T17:46:30Z,delete_on_termination=True,deleted=False,deleted_at=None,destination_type='local',device_name='/dev/xvda',device_type='disk',disk_bus=None,guest_format=None,id=72,image_id='a69d8d55-1745-492f-be26-cc75d64fc94d',instance=,instance_uuid=535c4da4-7eef-4c5e-b79f-4647f6857432,no_device=False,snapshot_id=None,source_type='image',updated_at=2015-08-07T17:46:32Z,volume_id=None,volume_size=None) _cleanup_volumes /opt/stack/new/nova/nova/compute/manager.py:2272 2015-08-07 17:47:00.316 DEBUG oslo_concurrency.lockutils [req-76020520-c3db-4ff5-8b77-7893e09c6d19 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "compute_resources" acquired by "update_usage" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:47:00.318 DEBUG nova.objects.instance [req-76020520-c3db-4ff5-8b77-7893e09c6d19 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lazy-loading `numa_topology' on Instance uuid 535c4da4-7eef-4c5e-b79f-4647f6857432 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:47:00.407 DEBUG oslo_concurrency.lockutils [req-76020520-c3db-4ff5-8b77-7893e09c6d19 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "compute_resources" released by "update_usage" :: held 0.090s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:47:00.768 DEBUG oslo_concurrency.lockutils [req-76020520-c3db-4ff5-8b77-7893e09c6d19 tempest-ServersTestJSON-1516276945 tempest-ServersTestJSON-1601395983] Lock "535c4da4-7eef-4c5e-b79f-4647f6857432" released by "do_terminate_instance" :: held 3.179s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:47:03.266 DEBUG nova.virt.xenapi.vmops [req-70e0238d-f069-42b0-b2e2-10d86cae67b2 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] [instance: b197c990-eecd-403b-b8d7-9e57e7053a16] Waiting for instance state to become running _wait_for_instance_to_start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:777 2015-08-07 17:47:03.376 DEBUG nova.virt.xenapi.vmops [req-70e0238d-f069-42b0-b2e2-10d86cae67b2 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] [instance: b197c990-eecd-403b-b8d7-9e57e7053a16] Updating progress to 82 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:47:04.048 DEBUG nova.virt.xenapi.vmops [req-70e0238d-f069-42b0-b2e2-10d86cae67b2 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] [instance: b197c990-eecd-403b-b8d7-9e57e7053a16] Skip agent setup, not enabled. 
_configure_new_instance_with_agent /opt/stack/new/nova/nova/virt/xenapi/vmops.py:788 2015-08-07 17:47:04.050 DEBUG nova.virt.xenapi.vmops [req-70e0238d-f069-42b0-b2e2-10d86cae67b2 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] [instance: b197c990-eecd-403b-b8d7-9e57e7053a16] Removing hostname from xenstore _remove_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1949 2015-08-07 17:47:04.050 DEBUG oslo_concurrency.lockutils [req-70e0238d-f069-42b0-b2e2-10d86cae67b2 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Lock "xenstore-b197c990-eecd-403b-b8d7-9e57e7053a16" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:47:04.068 DEBUG oslo_concurrency.lockutils [req-70e0238d-f069-42b0-b2e2-10d86cae67b2 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Lock "xenstore-b197c990-eecd-403b-b8d7-9e57e7053a16" released by "update_hostname" :: held 0.018s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:47:04.069 DEBUG nova.virt.xenapi.vmops [req-70e0238d-f069-42b0-b2e2-10d86cae67b2 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] [instance: b197c990-eecd-403b-b8d7-9e57e7053a16] Updating progress to 91 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:47:04.471 DEBUG nova.virt.xenapi.vmops [req-70e0238d-f069-42b0-b2e2-10d86cae67b2 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] [instance: b197c990-eecd-403b-b8d7-9e57e7053a16] Updating progress to 100 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:47:05.024 DEBUG nova.virt.xenapi.vm_utils [req-1301c743-912d-4757-833a-afbd511efac0 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 3d063a71-d4d9-49d1-ae99-79d2dc71c6d3] Starting snapshot for VM _snapshot_attached_here_impl /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:714 2015-08-07 17:47:05.060 DEBUG nova.compute.manager [req-70e0238d-f069-42b0-b2e2-10d86cae67b2 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] [instance: b197c990-eecd-403b-b8d7-9e57e7053a16] Checking state _get_power_state /opt/stack/new/nova/nova/compute/manager.py:1296 2015-08-07 17:47:05.079 DEBUG oslo_concurrency.lockutils [req-1301c743-912d-4757-833a-afbd511efac0 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" acquired by "do_scan" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:47:05.080 DEBUG nova.virt.xenapi.vm_utils [req-1301c743-912d-4757-833a-afbd511efac0 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Scanning SR OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99 do_scan /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1839 2015-08-07 17:47:05.132 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.89 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:47:05.812 DEBUG oslo_concurrency.lockutils [req-1301c743-912d-4757-833a-afbd511efac0 tempest-ServerActionsTestJSON-2080221373 
tempest-ServerActionsTestJSON-2017217826] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" released by "do_scan" :: held 0.733s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:47:05.833 DEBUG nova.virt.xenapi.vm_utils [req-1301c743-912d-4757-833a-afbd511efac0 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] VHD 0b7e67ef-6bcc-475b-aade-ac5ee0d57f7c has parent 16ba0512-dc90-407d-930e-391943efa733 _get_vhd_parent_uuid /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2006 2015-08-07 17:47:05.848 DEBUG nova.virt.xenapi.vm_utils [req-1301c743-912d-4757-833a-afbd511efac0 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] VHD 16ba0512-dc90-407d-930e-391943efa733 has parent 4027f457-a9bb-499a-8844-79fc67f11377 _get_vhd_parent_uuid /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2006 2015-08-07 17:47:05.881 DEBUG nova.virt.xenapi.vm_utils [req-1301c743-912d-4757-833a-afbd511efac0 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] VHD aa5e0a11-aa40-4aee-bd3e-129a97e05e7d has parent 4027f457-a9bb-499a-8844-79fc67f11377 _get_vhd_parent_uuid /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2006 2015-08-07 17:47:05.906 DEBUG nova.virt.xenapi.vm_utils [req-1301c743-912d-4757-833a-afbd511efac0 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] VHD 0b7e67ef-6bcc-475b-aade-ac5ee0d57f7c has parent 16ba0512-dc90-407d-930e-391943efa733 _get_vhd_parent_uuid /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2006 2015-08-07 17:47:05.918 DEBUG nova.virt.xenapi.vm_utils [req-1301c743-912d-4757-833a-afbd511efac0 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] VHD 13edb23e-e06e-4bba-9ec5-747d004b90e0 has parent 4027f457-a9bb-499a-8844-79fc67f11377 _get_vhd_parent_uuid /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2006 2015-08-07 17:47:05.944 DEBUG nova.virt.xenapi.vm_utils [req-1301c743-912d-4757-833a-afbd511efac0 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] VHD 8bbe08a0-c32e-40eb-a34d-97ec9902b7af has parent 4027f457-a9bb-499a-8844-79fc67f11377 _get_vhd_parent_uuid /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2006 2015-08-07 17:47:05.950 DEBUG nova.virt.xenapi.vm_utils [req-1301c743-912d-4757-833a-afbd511efac0 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] VHD 45f0444b-41c6-401c-93d6-0e9b8dcc5f76 has parent 4027f457-a9bb-499a-8844-79fc67f11377 _get_vhd_parent_uuid /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2006 2015-08-07 17:47:05.961 DEBUG nova.virt.xenapi.vm_utils [req-1301c743-912d-4757-833a-afbd511efac0 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] VHD 60495381-4db4-4b06-83c7-7b5f778290c3 has parent 4027f457-a9bb-499a-8844-79fc67f11377 _get_vhd_parent_uuid /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2006 2015-08-07 17:47:05.974 DEBUG nova.virt.xenapi.vm_utils [req-1301c743-912d-4757-833a-afbd511efac0 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] VHD f6006734-242b-4374-b2ac-eac20be70bf3 has parent 4027f457-a9bb-499a-8844-79fc67f11377 _get_vhd_parent_uuid /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2006 2015-08-07 17:47:05.986 DEBUG nova.virt.xenapi.vm_utils [req-1301c743-912d-4757-833a-afbd511efac0 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] VHD adc68f5c-f9ed-4dfe-862f-8738fb519f84 has 
parent 4027f457-a9bb-499a-8844-79fc67f11377 _get_vhd_parent_uuid /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2006 2015-08-07 17:47:05.999 DEBUG nova.virt.xenapi.vm_utils [req-1301c743-912d-4757-833a-afbd511efac0 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 3d063a71-d4d9-49d1-ae99-79d2dc71c6d3] No snapshots to remove. _delete_snapshots_in_vdi_chain /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:677 2015-08-07 17:47:06.495 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:47:06.496 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:47:06.649 WARNING nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] While synchronizing instance power states, found 5 instances in the database and 4 instances on the hypervisor. 2015-08-07 17:47:06.651 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Triggering sync for uuid 69bd2eeb-96e0-42ba-9643-c7c085279a18 _sync_power_states /opt/stack/new/nova/nova/compute/manager.py:5780 2015-08-07 17:47:06.652 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Triggering sync for uuid af2ef72d-4895-4de0-bd40-aaa2ac498091 _sync_power_states /opt/stack/new/nova/nova/compute/manager.py:5780 2015-08-07 17:47:06.652 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Triggering sync for uuid b197c990-eecd-403b-b8d7-9e57e7053a16 _sync_power_states /opt/stack/new/nova/nova/compute/manager.py:5780 2015-08-07 17:47:06.653 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Triggering sync for uuid b4eac7df-4935-4b77-8307-8c8cabe2c038 _sync_power_states /opt/stack/new/nova/nova/compute/manager.py:5780 2015-08-07 17:47:06.653 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Triggering sync for uuid 3d063a71-d4d9-49d1-ae99-79d2dc71c6d3 _sync_power_states /opt/stack/new/nova/nova/compute/manager.py:5780 2015-08-07 17:47:06.654 13318 DEBUG oslo_concurrency.lockutils [-] Lock "69bd2eeb-96e0-42ba-9643-c7c085279a18" acquired by "query_driver_power_state_and_sync" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:47:06.655 13318 DEBUG oslo_concurrency.lockutils [-] Lock "af2ef72d-4895-4de0-bd40-aaa2ac498091" acquired by "query_driver_power_state_and_sync" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:47:06.656 13318 DEBUG oslo_concurrency.lockutils [-] Lock "b197c990-eecd-403b-b8d7-9e57e7053a16" acquired by "query_driver_power_state_and_sync" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:47:06.656 13318 INFO nova.compute.manager [-] [instance: b197c990-eecd-403b-b8d7-9e57e7053a16] During sync_power_state the instance has a pending task (unrescuing). Skip. 
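The entries just above come from the ComputeManager._sync_power_states periodic task: it finds 5 instances in the database but only 4 on the hypervisor, then checks each UUID behind its own lock ("acquired by query_driver_power_state_and_sync") and skips any instance that still has a task in flight (unrescuing, shelving). A minimal sketch of that per-instance locking pattern — assuming oslo_concurrency's lockutils.synchronized decorator and a generic driver.get_info() call, not Nova's exact helpers:

    from oslo_concurrency import lockutils

    def sync_power_states(db_instances, driver):
        # Walk the instances the DB knows about, mirroring the per-UUID
        # locking visible in the log entries above.
        for db_instance in db_instances:
            @lockutils.synchronized(db_instance.uuid)   # -> Lock "<uuid>" acquired by ...
            def query_driver_power_state_and_sync():
                if db_instance.task_state is not None:
                    # -> "During sync_power_state the instance has a pending task (...). Skip."
                    return
                vm_info = driver.get_info(db_instance)  # assumed virt-driver call
                if vm_info.state != db_instance.power_state:
                    print('power state mismatch for %s' % db_instance.uuid)
            query_driver_power_state_and_sync()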
2015-08-07 17:47:06.656 13318 DEBUG oslo_concurrency.lockutils [-] Lock "b197c990-eecd-403b-b8d7-9e57e7053a16" released by "query_driver_power_state_and_sync" :: held 0.001s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:47:06.657 13318 DEBUG oslo_concurrency.lockutils [-] Lock "b4eac7df-4935-4b77-8307-8c8cabe2c038" acquired by "query_driver_power_state_and_sync" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:47:06.658 13318 DEBUG oslo_concurrency.lockutils [-] Lock "3d063a71-d4d9-49d1-ae99-79d2dc71c6d3" acquired by "query_driver_power_state_and_sync" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:47:06.658 13318 INFO nova.compute.manager [-] [instance: 3d063a71-d4d9-49d1-ae99-79d2dc71c6d3] During sync_power_state the instance has a pending task (shelving). Skip. 2015-08-07 17:47:06.658 13318 DEBUG oslo_concurrency.lockutils [-] Lock "3d063a71-d4d9-49d1-ae99-79d2dc71c6d3" released by "query_driver_power_state_and_sync" :: held 0.001s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:47:06.659 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 37.02 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:47:06.731 INFO nova.compute.manager [req-58925c09-ccc2-4b26-a6e7-ea3df813fec8 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] [instance: b197c990-eecd-403b-b8d7-9e57e7053a16] Unrescuing 2015-08-07 17:47:06.732 DEBUG oslo_concurrency.lockutils [req-58925c09-ccc2-4b26-a6e7-ea3df813fec8 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Acquired semaphore "refresh_cache-b197c990-eecd-403b-b8d7-9e57e7053a16" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:197 2015-08-07 17:47:06.844 13318 DEBUG oslo_concurrency.lockutils [-] Lock "69bd2eeb-96e0-42ba-9643-c7c085279a18" released by "query_driver_power_state_and_sync" :: held 0.190s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:47:06.919 13318 DEBUG oslo_concurrency.lockutils [-] Lock "af2ef72d-4895-4de0-bd40-aaa2ac498091" released by "query_driver_power_state_and_sync" :: held 0.264s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:47:06.979 13318 DEBUG oslo_concurrency.lockutils [-] Lock "b4eac7df-4935-4b77-8307-8c8cabe2c038" released by "query_driver_power_state_and_sync" :: held 0.322s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:47:07.061 DEBUG nova.network.base_api [req-58925c09-ccc2-4b26-a6e7-ea3df813fec8 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] [instance: b197c990-eecd-403b-b8d7-9e57e7053a16] Updating instance_info_cache with network_info: [VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.7'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': 
u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:be:cf:fd', 'active': False, 'type': u'bridge', 'id': u'65310656-cc34-4b05-84ec-9c385b0c36a8', 'qbg_params': None})] update_instance_cache_with_nw_info /opt/stack/new/nova/nova/network/base_api.py:43 2015-08-07 17:47:07.107 DEBUG oslo_concurrency.lockutils [req-58925c09-ccc2-4b26-a6e7-ea3df813fec8 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Releasing semaphore "refresh_cache-b197c990-eecd-403b-b8d7-9e57e7053a16" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:210 2015-08-07 17:47:08.162 DEBUG oslo_concurrency.lockutils [req-1301c743-912d-4757-833a-afbd511efac0 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" acquired by "do_scan" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:47:08.163 DEBUG nova.virt.xenapi.vm_utils [req-1301c743-912d-4757-833a-afbd511efac0 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Scanning SR OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99 do_scan /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1839 2015-08-07 17:47:10.099 DEBUG oslo_concurrency.lockutils [req-1301c743-912d-4757-833a-afbd511efac0 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" released by "do_scan" :: held 1.937s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:47:10.107 DEBUG nova.virt.xenapi.vm_utils [req-1301c743-912d-4757-833a-afbd511efac0 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] VHD 0b7e67ef-6bcc-475b-aade-ac5ee0d57f7c has parent 9473e243-0ab9-48a6-9428-d92dbbd75b9b _get_vhd_parent_uuid /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2006 2015-08-07 17:47:10.108 DEBUG nova.virt.xenapi.vm_utils [req-1301c743-912d-4757-833a-afbd511efac0 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 3d063a71-d4d9-49d1-ae99-79d2dc71c6d3] Parent 9473e243-0ab9-48a6-9428-d92dbbd75b9b not yet in parent list ['16ba0512-dc90-407d-930e-391943efa733', '4027f457-a9bb-499a-8844-79fc67f11377'], waiting for coalesce... 
_wait_for_vhd_coalesce /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2115 2015-08-07 17:47:12.746 DEBUG nova.virt.xenapi.vm_utils [req-58925c09-ccc2-4b26-a6e7-ea3df813fec8 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] VDI aa5e0a11-aa40-4aee-bd3e-129a97e05e7d is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:47:12.764 DEBUG nova.virt.xenapi.vm_utils [req-58925c09-ccc2-4b26-a6e7-ea3df813fec8 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] VDI b5c1f1cb-b240-4dd4-b4db-b3f1e2cc9223 is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:47:12.782 DEBUG nova.virt.xenapi.vm_utils [req-58925c09-ccc2-4b26-a6e7-ea3df813fec8 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] VDI 45f0444b-41c6-401c-93d6-0e9b8dcc5f76 is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:47:13.683 DEBUG nova.virt.xenapi.vmops [req-58925c09-ccc2-4b26-a6e7-ea3df813fec8 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] [instance: b197c990-eecd-403b-b8d7-9e57e7053a16] Starting instance _start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:327 2015-08-07 17:47:15.110 DEBUG oslo_concurrency.lockutils [req-1301c743-912d-4757-833a-afbd511efac0 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" acquired by "do_scan" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:47:15.111 DEBUG nova.virt.xenapi.vm_utils [req-1301c743-912d-4757-833a-afbd511efac0 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Scanning SR OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99 do_scan /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1839 2015-08-07 17:47:15.113 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.91 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:47:16.032 DEBUG oslo_concurrency.lockutils [req-1301c743-912d-4757-833a-afbd511efac0 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" released by "do_scan" :: held 0.921s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:47:16.042 DEBUG nova.virt.xenapi.vm_utils [req-1301c743-912d-4757-833a-afbd511efac0 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] VHD 0b7e67ef-6bcc-475b-aade-ac5ee0d57f7c has parent 16ba0512-dc90-407d-930e-391943efa733 _get_vhd_parent_uuid /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2006 2015-08-07 17:47:16.042 DEBUG nova.virt.xenapi.vm_utils [req-1301c743-912d-4757-833a-afbd511efac0 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 3d063a71-d4d9-49d1-ae99-79d2dc71c6d3] Coalesce detected, because parent is: 16ba0512-dc90-407d-930e-391943efa733 _wait_for_vhd_coalesce /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2118 2015-08-07 17:47:16.058 DEBUG oslo_concurrency.lockutils [req-1301c743-912d-4757-833a-afbd511efac0 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock 
"sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" acquired by "do_scan" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:47:16.059 DEBUG nova.virt.xenapi.vm_utils [req-1301c743-912d-4757-833a-afbd511efac0 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Scanning SR OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99 do_scan /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1839 2015-08-07 17:47:16.597 DEBUG oslo_concurrency.lockutils [req-1301c743-912d-4757-833a-afbd511efac0 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" released by "do_scan" :: held 0.538s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:47:16.612 DEBUG nova.virt.xenapi.vm_utils [req-1301c743-912d-4757-833a-afbd511efac0 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] VHD 7e24dc43-8027-46c8-85d8-ba0048b095c4 has parent 16ba0512-dc90-407d-930e-391943efa733 _get_vhd_parent_uuid /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2006 2015-08-07 17:47:16.621 DEBUG nova.virt.xenapi.vm_utils [req-1301c743-912d-4757-833a-afbd511efac0 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] VHD 16ba0512-dc90-407d-930e-391943efa733 has parent 4027f457-a9bb-499a-8844-79fc67f11377 _get_vhd_parent_uuid /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2006 2015-08-07 17:47:16.814 DEBUG oslo_concurrency.lockutils [req-4019a4bf-f3e2-41c1-b750-dfc670de7a86 tempest-AuthorizationTestJSON-667947607 tempest-AuthorizationTestJSON-2076087517] Lock "8709c80f-ab24-4aa9-ad2c-64ea4c51ff51" acquired by "_locked_do_build_and_run_instance" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:47:16.869 DEBUG nova.virt.xenapi.client.session [req-1301c743-912d-4757-833a-afbd511efac0 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] glance.upload_vhd attempt 1/1, callback_result: 192.168.33.1 call_plugin_serialized_with_retry /opt/stack/new/nova/nova/virt/xenapi/client/session.py:245 2015-08-07 17:47:16.878 INFO nova.compute.manager [req-4019a4bf-f3e2-41c1-b750-dfc670de7a86 tempest-AuthorizationTestJSON-667947607 tempest-AuthorizationTestJSON-2076087517] [instance: 8709c80f-ab24-4aa9-ad2c-64ea4c51ff51] Starting instance... 
2015-08-07 17:47:17.120 DEBUG oslo_concurrency.lockutils [req-4019a4bf-f3e2-41c1-b750-dfc670de7a86 tempest-AuthorizationTestJSON-667947607 tempest-AuthorizationTestJSON-2076087517] Lock "compute_resources" acquired by "instance_claim" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:47:17.121 DEBUG nova.compute.resource_tracker [req-4019a4bf-f3e2-41c1-b750-dfc670de7a86 tempest-AuthorizationTestJSON-667947607 tempest-AuthorizationTestJSON-2076087517] Memory overhead for 64 MB instance; 5 MB instance_claim /opt/stack/new/nova/nova/compute/resource_tracker.py:126 2015-08-07 17:47:17.130 INFO nova.compute.claims [req-4019a4bf-f3e2-41c1-b750-dfc670de7a86 tempest-AuthorizationTestJSON-667947607 tempest-AuthorizationTestJSON-2076087517] [instance: 8709c80f-ab24-4aa9-ad2c-64ea4c51ff51] Attempting claim: memory 69 MB, disk 0 GB 2015-08-07 17:47:17.130 INFO nova.compute.claims [req-4019a4bf-f3e2-41c1-b750-dfc670de7a86 tempest-AuthorizationTestJSON-667947607 tempest-AuthorizationTestJSON-2076087517] [instance: 8709c80f-ab24-4aa9-ad2c-64ea4c51ff51] Total memory: 8187 MB, used: 857.00 MB 2015-08-07 17:47:17.131 INFO nova.compute.claims [req-4019a4bf-f3e2-41c1-b750-dfc670de7a86 tempest-AuthorizationTestJSON-667947607 tempest-AuthorizationTestJSON-2076087517] [instance: 8709c80f-ab24-4aa9-ad2c-64ea4c51ff51] memory limit: 12280.50 MB, free: 11423.50 MB 2015-08-07 17:47:17.131 INFO nova.compute.claims [req-4019a4bf-f3e2-41c1-b750-dfc670de7a86 tempest-AuthorizationTestJSON-667947607 tempest-AuthorizationTestJSON-2076087517] [instance: 8709c80f-ab24-4aa9-ad2c-64ea4c51ff51] Total disk: 27 GB, used: 0.00 GB 2015-08-07 17:47:17.132 INFO nova.compute.claims [req-4019a4bf-f3e2-41c1-b750-dfc670de7a86 tempest-AuthorizationTestJSON-667947607 tempest-AuthorizationTestJSON-2076087517] [instance: 8709c80f-ab24-4aa9-ad2c-64ea4c51ff51] disk limit not specified, defaulting to unlimited 2015-08-07 17:47:17.164 DEBUG nova.compute.resources.vcpu [req-4019a4bf-f3e2-41c1-b750-dfc670de7a86 tempest-AuthorizationTestJSON-667947607 tempest-AuthorizationTestJSON-2076087517] Total CPUs: 8 VCPUs, used: 5.00 VCPUs test /opt/stack/new/nova/nova/compute/resources/vcpu.py:52 2015-08-07 17:47:17.165 DEBUG nova.compute.resources.vcpu [req-4019a4bf-f3e2-41c1-b750-dfc670de7a86 tempest-AuthorizationTestJSON-667947607 tempest-AuthorizationTestJSON-2076087517] CPUs limit not specified, defaulting to unlimited test /opt/stack/new/nova/nova/compute/resources/vcpu.py:56 2015-08-07 17:47:17.166 INFO nova.compute.claims [req-4019a4bf-f3e2-41c1-b750-dfc670de7a86 tempest-AuthorizationTestJSON-667947607 tempest-AuthorizationTestJSON-2076087517] [instance: 8709c80f-ab24-4aa9-ad2c-64ea4c51ff51] Claim successful 2015-08-07 17:47:17.603 DEBUG oslo_concurrency.lockutils [req-4019a4bf-f3e2-41c1-b750-dfc670de7a86 tempest-AuthorizationTestJSON-667947607 tempest-AuthorizationTestJSON-2076087517] Lock "compute_resources" released by "instance_claim" :: held 0.483s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:47:18.034 DEBUG oslo_concurrency.lockutils [req-4019a4bf-f3e2-41c1-b750-dfc670de7a86 tempest-AuthorizationTestJSON-667947607 tempest-AuthorizationTestJSON-2076087517] Lock "compute_resources" acquired by "update_usage" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:47:18.281 DEBUG oslo_concurrency.lockutils [req-4019a4bf-f3e2-41c1-b750-dfc670de7a86 tempest-AuthorizationTestJSON-667947607 
tempest-AuthorizationTestJSON-2076087517] Lock "compute_resources" released by "update_usage" :: held 0.247s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:47:18.282 DEBUG nova.compute.utils [req-4019a4bf-f3e2-41c1-b750-dfc670de7a86 tempest-AuthorizationTestJSON-667947607 tempest-AuthorizationTestJSON-2076087517] Using /dev/xvd instead of None get_next_device_name /opt/stack/new/nova/nova/compute/utils.py:163 2015-08-07 17:47:18.286 13318 DEBUG nova.compute.manager [-] [instance: 8709c80f-ab24-4aa9-ad2c-64ea4c51ff51] Allocating IP information in the background. _allocate_network_async /opt/stack/new/nova/nova/compute/manager.py:1513 2015-08-07 17:47:18.288 13318 DEBUG oslo_concurrency.lockutils [-] Acquired semaphore "refresh_cache-8709c80f-ab24-4aa9-ad2c-64ea4c51ff51" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:197 2015-08-07 17:47:19.220 DEBUG nova.virt.xenapi.vmops [req-4019a4bf-f3e2-41c1-b750-dfc670de7a86 tempest-AuthorizationTestJSON-667947607 tempest-AuthorizationTestJSON-2076087517] [instance: 8709c80f-ab24-4aa9-ad2c-64ea4c51ff51] Block device information present: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} spawn /opt/stack/new/nova/nova/virt/xenapi/vmops.py:408 2015-08-07 17:47:19.239 DEBUG nova.virt.xenapi.vm_utils [req-4019a4bf-f3e2-41c1-b750-dfc670de7a86 tempest-AuthorizationTestJSON-667947607 tempest-AuthorizationTestJSON-2076087517] Detected vhd format for image a69d8d55-1745-492f-be26-cc75d64fc94d determine_disk_image_type /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1633 2015-08-07 17:47:19.240 DEBUG nova.virt.xenapi.vmops [req-4019a4bf-f3e2-41c1-b750-dfc670de7a86 tempest-AuthorizationTestJSON-667947607 tempest-AuthorizationTestJSON-2076087517] [instance: 8709c80f-ab24-4aa9-ad2c-64ea4c51ff51] Updating progress to 10 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:47:19.581 DEBUG nova.virt.xenapi.vmops [req-4019a4bf-f3e2-41c1-b750-dfc670de7a86 tempest-AuthorizationTestJSON-667947607 tempest-AuthorizationTestJSON-2076087517] [instance: 8709c80f-ab24-4aa9-ad2c-64ea4c51ff51] block device info: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} _connect_cinder_volumes /opt/stack/new/nova/nova/virt/xenapi/vmops.py:376 2015-08-07 17:47:19.593 DEBUG oslo_concurrency.lockutils [req-4019a4bf-f3e2-41c1-b750-dfc670de7a86 tempest-AuthorizationTestJSON-667947607 tempest-AuthorizationTestJSON-2076087517] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" acquired by "_create_cached_image_impl" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:47:21.273 DEBUG nova.virt.xenapi.vm_utils [req-4019a4bf-f3e2-41c1-b750-dfc670de7a86 tempest-AuthorizationTestJSON-667947607 tempest-AuthorizationTestJSON-2076087517] Cloned VDI OpaqueRef:723e4b38-0fb9-b054-cec7-1bfe2620179d from VDI OpaqueRef:f392c64e-7886-a0cc-d3a0-780674167307 _clone_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:571 2015-08-07 17:47:21.924 13318 DEBUG nova.network.base_api [-] [instance: 8709c80f-ab24-4aa9-ad2c-64ea4c51ff51] Updating instance_info_cache with network_info: [VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.5'})], 'version': 4, 'meta': 
{u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:43:1d:93', 'active': False, 'type': u'bridge', 'id': u'15cc4f13-3ff6-4a03-80a9-716ed1e3e959', 'qbg_params': None})] update_instance_cache_with_nw_info /opt/stack/new/nova/nova/network/base_api.py:43 2015-08-07 17:47:21.965 13318 DEBUG oslo_concurrency.lockutils [-] Releasing semaphore "refresh_cache-8709c80f-ab24-4aa9-ad2c-64ea4c51ff51" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:210 2015-08-07 17:47:21.966 13318 DEBUG nova.compute.manager [-] [instance: 8709c80f-ab24-4aa9-ad2c-64ea4c51ff51] Instance network_info: |[VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.5'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:43:1d:93', 'active': False, 'type': u'bridge', 'id': u'15cc4f13-3ff6-4a03-80a9-716ed1e3e959', 'qbg_params': None})]| _allocate_network_async /opt/stack/new/nova/nova/compute/manager.py:1531 2015-08-07 17:47:22.458 DEBUG oslo_concurrency.lockutils [req-4019a4bf-f3e2-41c1-b750-dfc670de7a86 tempest-AuthorizationTestJSON-667947607 tempest-AuthorizationTestJSON-2076087517] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" released by "_create_cached_image_impl" :: held 2.865s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:47:22.459 INFO nova.virt.xenapi.vm_utils [req-4019a4bf-f3e2-41c1-b750-dfc670de7a86 tempest-AuthorizationTestJSON-667947607 tempest-AuthorizationTestJSON-2076087517] Image creation data, cacheable: True, downloaded: False duration: 2.88 secs for image a69d8d55-1745-492f-be26-cc75d64fc94d 2015-08-07 17:47:22.893 DEBUG nova.virt.xenapi.vmops [req-1301c743-912d-4757-833a-afbd511efac0 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 3d063a71-d4d9-49d1-ae99-79d2dc71c6d3] Finished snapshot and upload for VM, duration: 17.87 secs for image a97cb416-6d80-4939-be35-09f427a551a8 snapshot /opt/stack/new/nova/nova/virt/xenapi/vmops.py:894 2015-08-07 
17:47:22.894 DEBUG nova.compute.manager [req-1301c743-912d-4757-833a-afbd511efac0 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 3d063a71-d4d9-49d1-ae99-79d2dc71c6d3] Checking state _get_power_state /opt/stack/new/nova/nova/compute/manager.py:1296 2015-08-07 17:47:23.167 WARNING nova.virt.xenapi.vm_utils [req-1301c743-912d-4757-833a-afbd511efac0 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 3d063a71-d4d9-49d1-ae99-79d2dc71c6d3] VM already halted, skipping shutdown... 2015-08-07 17:47:23.167 DEBUG nova.compute.manager [req-1301c743-912d-4757-833a-afbd511efac0 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 3d063a71-d4d9-49d1-ae99-79d2dc71c6d3] Checking state _get_power_state /opt/stack/new/nova/nova/compute/manager.py:1296 2015-08-07 17:47:23.320 DEBUG oslo_concurrency.lockutils [req-1301c743-912d-4757-833a-afbd511efac0 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Acquired semaphore "refresh_cache-3d063a71-d4d9-49d1-ae99-79d2dc71c6d3" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:197 2015-08-07 17:47:23.567 DEBUG nova.virt.xenapi.vmops [req-4019a4bf-f3e2-41c1-b750-dfc670de7a86 tempest-AuthorizationTestJSON-667947607 tempest-AuthorizationTestJSON-2076087517] [instance: 8709c80f-ab24-4aa9-ad2c-64ea4c51ff51] Updating progress to 20 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:47:23.617 DEBUG nova.network.base_api [req-1301c743-912d-4757-833a-afbd511efac0 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 3d063a71-d4d9-49d1-ae99-79d2dc71c6d3] Updating instance_info_cache with network_info: [VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.2'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:ba:d2:8b', 'active': False, 'type': u'bridge', 'id': u'fba2136d-b403-4996-9701-9a9aeef179dd', 'qbg_params': None})] update_instance_cache_with_nw_info /opt/stack/new/nova/nova/network/base_api.py:43 2015-08-07 17:47:23.663 DEBUG oslo_concurrency.lockutils [req-1301c743-912d-4757-833a-afbd511efac0 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Releasing semaphore "refresh_cache-3d063a71-d4d9-49d1-ae99-79d2dc71c6d3" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:210 2015-08-07 17:47:23.692 INFO nova.virt.xenapi.vmops [req-1301c743-912d-4757-833a-afbd511efac0 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 
3d063a71-d4d9-49d1-ae99-79d2dc71c6d3] Destroying VM 2015-08-07 17:47:23.707 WARNING nova.virt.xenapi.vm_utils [req-1301c743-912d-4757-833a-afbd511efac0 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 3d063a71-d4d9-49d1-ae99-79d2dc71c6d3] VM already halted, skipping shutdown... 2015-08-07 17:47:23.740 DEBUG nova.virt.xenapi.vmops [req-1301c743-912d-4757-833a-afbd511efac0 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 3d063a71-d4d9-49d1-ae99-79d2dc71c6d3] Destroying VDIs _destroy_vdis /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1428 2015-08-07 17:47:23.749 DEBUG nova.virt.xenapi.vm_utils [req-1301c743-912d-4757-833a-afbd511efac0 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] VDI c5793b3b-3c0a-4f8a-bddf-a4e8d2fc281f is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:47:23.759 DEBUG nova.virt.xenapi.vm_utils [req-1301c743-912d-4757-833a-afbd511efac0 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] VDI 0b7e67ef-6bcc-475b-aade-ac5ee0d57f7c is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:47:24.017 DEBUG nova.virt.xenapi.vmops [req-4019a4bf-f3e2-41c1-b750-dfc670de7a86 tempest-AuthorizationTestJSON-667947607 tempest-AuthorizationTestJSON-2076087517] [instance: 8709c80f-ab24-4aa9-ad2c-64ea4c51ff51] Updating progress to 30 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:47:24.364 DEBUG nova.virt.xenapi.vmops [req-4019a4bf-f3e2-41c1-b750-dfc670de7a86 tempest-AuthorizationTestJSON-667947607 tempest-AuthorizationTestJSON-2076087517] [instance: 8709c80f-ab24-4aa9-ad2c-64ea4c51ff51] Using PV kernel: True _create_vm_record /opt/stack/new/nova/nova/virt/xenapi/vmops.py:679 2015-08-07 17:47:24.378 DEBUG nova.virt.xenapi.vm_utils [req-4019a4bf-f3e2-41c1-b750-dfc670de7a86 tempest-AuthorizationTestJSON-667947607 tempest-AuthorizationTestJSON-2076087517] [instance: 8709c80f-ab24-4aa9-ad2c-64ea4c51ff51] Created VM create_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:312 2015-08-07 17:47:24.378 DEBUG nova.virt.xenapi.vmops [req-4019a4bf-f3e2-41c1-b750-dfc670de7a86 tempest-AuthorizationTestJSON-667947607 tempest-AuthorizationTestJSON-2076087517] [instance: 8709c80f-ab24-4aa9-ad2c-64ea4c51ff51] Updating progress to 40 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:47:24.631 DEBUG nova.virt.xenapi.vm_utils [req-4019a4bf-f3e2-41c1-b750-dfc670de7a86 tempest-AuthorizationTestJSON-667947607 tempest-AuthorizationTestJSON-2076087517] Creating disk-type VBD for VM OpaqueRef:fc5f6a77-9240-695b-e26c-384ea046de99, VDI OpaqueRef:723e4b38-0fb9-b054-cec7-1bfe2620179d ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:47:24.640 DEBUG nova.virt.xenapi.vm_utils [req-4019a4bf-f3e2-41c1-b750-dfc670de7a86 tempest-AuthorizationTestJSON-667947607 tempest-AuthorizationTestJSON-2076087517] Created VBD OpaqueRef:1ebb373f-a649-9261-65a5-96ee7ace3499 for VM OpaqueRef:fc5f6a77-9240-695b-e26c-384ea046de99, VDI OpaqueRef:723e4b38-0fb9-b054-cec7-1bfe2620179d. 
create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:47:24.866 DEBUG nova.virt.xenapi.vmops [req-1301c743-912d-4757-833a-afbd511efac0 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 3d063a71-d4d9-49d1-ae99-79d2dc71c6d3] Using RAW or VHD, skipping kernel and ramdisk deletion _destroy_kernel_ramdisk /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1456 2015-08-07 17:47:24.882 DEBUG nova.virt.xenapi.vm_utils [req-1301c743-912d-4757-833a-afbd511efac0 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 3d063a71-d4d9-49d1-ae99-79d2dc71c6d3] VM destroyed destroy_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:324 2015-08-07 17:47:25.068 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.95 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:47:25.761 DEBUG nova.compute.manager [req-58925c09-ccc2-4b26-a6e7-ea3df813fec8 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] [instance: b197c990-eecd-403b-b8d7-9e57e7053a16] Checking state _get_power_state /opt/stack/new/nova/nova/compute/manager.py:1296 2015-08-07 17:47:27.962 INFO nova.compute.manager [req-b36cc91d-24c1-4d83-a743-d4ada9c26ad1 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] [instance: b197c990-eecd-403b-b8d7-9e57e7053a16] Detach volume a01057b3-070e-4f81-8c61-20ad9d422110 from mountpoint /dev/xvdb 2015-08-07 17:47:27.970 DEBUG nova.virt.xenapi.volumeops [req-b36cc91d-24c1-4d83-a743-d4ada9c26ad1 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Detach_volume: instance-0000003f, /dev/xvdb detach_volume /opt/stack/new/nova/nova/virt/xenapi/volumeops.py:134 2015-08-07 17:47:27.999 DEBUG oslo_concurrency.lockutils [req-b36cc91d-24c1-4d83-a743-d4ada9c26ad1 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Lock "xenapi-vbd-OpaqueRef:09d9be4d-5d0c-9bac-a66d-c351fd83e376" acquired by "synchronized_unplug" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:47:28.179 DEBUG oslo_concurrency.lockutils [req-20a6d985-70d3-4902-ba0b-b389e6ebc731 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "3d063a71-d4d9-49d1-ae99-79d2dc71c6d3" acquired by "do_unshelve_instance" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:47:28.573 DEBUG oslo_concurrency.lockutils [req-20a6d985-70d3-4902-ba0b-b389e6ebc731 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Acquired semaphore "refresh_cache-3d063a71-d4d9-49d1-ae99-79d2dc71c6d3" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:197 2015-08-07 17:47:28.726 DEBUG nova.network.base_api [req-20a6d985-70d3-4902-ba0b-b389e6ebc731 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 3d063a71-d4d9-49d1-ae99-79d2dc71c6d3] Updating instance_info_cache with network_info: [VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.2'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 
'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:ba:d2:8b', 'active': False, 'type': u'bridge', 'id': u'fba2136d-b403-4996-9701-9a9aeef179dd', 'qbg_params': None})] update_instance_cache_with_nw_info /opt/stack/new/nova/nova/network/base_api.py:43 2015-08-07 17:47:28.770 DEBUG oslo_concurrency.lockutils [req-20a6d985-70d3-4902-ba0b-b389e6ebc731 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Releasing semaphore "refresh_cache-3d063a71-d4d9-49d1-ae99-79d2dc71c6d3" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:210 2015-08-07 17:47:28.771 DEBUG oslo_concurrency.lockutils [req-20a6d985-70d3-4902-ba0b-b389e6ebc731 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "compute_resources" acquired by "instance_claim" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:47:28.772 DEBUG nova.compute.resource_tracker [req-20a6d985-70d3-4902-ba0b-b389e6ebc731 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Memory overhead for 64 MB instance; 5 MB instance_claim /opt/stack/new/nova/nova/compute/resource_tracker.py:126 2015-08-07 17:47:28.780 INFO nova.compute.claims [req-20a6d985-70d3-4902-ba0b-b389e6ebc731 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 3d063a71-d4d9-49d1-ae99-79d2dc71c6d3] Attempting claim: memory 69 MB, disk 0 GB 2015-08-07 17:47:28.781 INFO nova.compute.claims [req-20a6d985-70d3-4902-ba0b-b389e6ebc731 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 3d063a71-d4d9-49d1-ae99-79d2dc71c6d3] Total memory: 8187 MB, used: 926.00 MB 2015-08-07 17:47:28.781 INFO nova.compute.claims [req-20a6d985-70d3-4902-ba0b-b389e6ebc731 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 3d063a71-d4d9-49d1-ae99-79d2dc71c6d3] memory limit: 12280.50 MB, free: 11354.50 MB 2015-08-07 17:47:28.782 INFO nova.compute.claims [req-20a6d985-70d3-4902-ba0b-b389e6ebc731 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 3d063a71-d4d9-49d1-ae99-79d2dc71c6d3] Total disk: 27 GB, used: 0.00 GB 2015-08-07 17:47:28.782 INFO nova.compute.claims [req-20a6d985-70d3-4902-ba0b-b389e6ebc731 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 3d063a71-d4d9-49d1-ae99-79d2dc71c6d3] disk limit not specified, defaulting to unlimited 2015-08-07 17:47:28.783 DEBUG nova.objects.instance [req-20a6d985-70d3-4902-ba0b-b389e6ebc731 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lazy-loading `numa_topology' on Instance uuid 3d063a71-d4d9-49d1-ae99-79d2dc71c6d3 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:47:28.846 DEBUG nova.compute.resources.vcpu 
[req-20a6d985-70d3-4902-ba0b-b389e6ebc731 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Total CPUs: 8 VCPUs, used: 6.00 VCPUs test /opt/stack/new/nova/nova/compute/resources/vcpu.py:52 2015-08-07 17:47:28.847 DEBUG nova.compute.resources.vcpu [req-20a6d985-70d3-4902-ba0b-b389e6ebc731 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] CPUs limit not specified, defaulting to unlimited test /opt/stack/new/nova/nova/compute/resources/vcpu.py:56 2015-08-07 17:47:28.847 INFO nova.compute.claims [req-20a6d985-70d3-4902-ba0b-b389e6ebc731 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 3d063a71-d4d9-49d1-ae99-79d2dc71c6d3] Claim successful 2015-08-07 17:47:29.132 DEBUG oslo_concurrency.lockutils [req-b36cc91d-24c1-4d83-a743-d4ada9c26ad1 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Lock "xenapi-vbd-OpaqueRef:09d9be4d-5d0c-9bac-a66d-c351fd83e376" released by "synchronized_unplug" :: held 1.134s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:47:29.144 DEBUG nova.virt.xenapi.volume_utils [req-b36cc91d-24c1-4d83-a743-d4ada9c26ad1 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Forgetting SR... forget_sr /opt/stack/new/nova/nova/virt/xenapi/volume_utils.py:237 2015-08-07 17:47:29.224 DEBUG oslo_concurrency.lockutils [req-20a6d985-70d3-4902-ba0b-b389e6ebc731 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "compute_resources" released by "instance_claim" :: held 0.453s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:47:29.225 DEBUG nova.virt.xenapi.vmops [req-20a6d985-70d3-4902-ba0b-b389e6ebc731 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 3d063a71-d4d9-49d1-ae99-79d2dc71c6d3] Block device information present: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} spawn /opt/stack/new/nova/nova/virt/xenapi/vmops.py:408 2015-08-07 17:47:29.242 DEBUG nova.virt.xenapi.vm_utils [req-20a6d985-70d3-4902-ba0b-b389e6ebc731 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Detected vhd format for image a97cb416-6d80-4939-be35-09f427a551a8 determine_disk_image_type /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1633 2015-08-07 17:47:29.243 DEBUG nova.virt.xenapi.vmops [req-20a6d985-70d3-4902-ba0b-b389e6ebc731 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 3d063a71-d4d9-49d1-ae99-79d2dc71c6d3] Updating progress to 10 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:47:29.491 DEBUG nova.virt.xenapi.vmops [req-20a6d985-70d3-4902-ba0b-b389e6ebc731 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 3d063a71-d4d9-49d1-ae99-79d2dc71c6d3] block device info: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} _connect_cinder_volumes /opt/stack/new/nova/nova/virt/xenapi/vmops.py:376 2015-08-07 17:47:29.501 DEBUG oslo_concurrency.lockutils [req-20a6d985-70d3-4902-ba0b-b389e6ebc731 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "xenapi-image-cachea97cb416-6d80-4939-be35-09f427a551a8" acquired by "_create_cached_image_impl" :: waited 0.000s inner 
/usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:47:29.506 DEBUG nova.virt.xenapi.vm_utils [req-20a6d985-70d3-4902-ba0b-b389e6ebc731 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 3d063a71-d4d9-49d1-ae99-79d2dc71c6d3] Asking xapi to fetch vhd image a97cb416-6d80-4939-be35-09f427a551a8 _fetch_vhd_image /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1425 2015-08-07 17:47:29.527 DEBUG nova.virt.xenapi.client.session [req-20a6d985-70d3-4902-ba0b-b389e6ebc731 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] glance.download_vhd attempt 1/1, callback_result: 192.168.33.1 call_plugin_serialized_with_retry /opt/stack/new/nova/nova/virt/xenapi/client/session.py:245 2015-08-07 17:47:30.312 INFO nova.virt.xenapi.volumeops [req-b36cc91d-24c1-4d83-a743-d4ada9c26ad1 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Mountpoint /dev/xvdb detached from instance instance-0000003f 2015-08-07 17:47:30.314 DEBUG keystoneclient.session [req-b36cc91d-24c1-4d83-a743-d4ada9c26ad1 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] REQ: curl -g -i -X POST http://192.168.33.1:8776/v2/540c7ca9d6cc474f9ee1a73634b4d224/volumes/a01057b3-070e-4f81-8c61-20ad9d422110/action -H "User-Agent: python-cinderclient" -H "Content-Type: application/json" -H "Accept: application/json" -H "X-Auth-Token: {SHA1}4e8cb0df11d791ca7f3b97b007f8e2c8eed87494" -d '{"os-terminate_connection": {"connector": {"ip": "192.168.33.2", "initiator": "iqn.2015-08.com.example:3b035c79", "host": "localhost.localdomain"}}}' _http_log_request /usr/local/lib/python2.7/dist-packages/keystoneclient/session.py:195 2015-08-07 17:47:30.659 DEBUG keystoneclient.session [req-b36cc91d-24c1-4d83-a743-d4ada9c26ad1 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] RESP: [202] date: Fri, 07 Aug 2015 17:47:30 GMT connection: keep-alive content-type: text/html; charset=UTF-8 content-length: 0 x-openstack-request-id: req-c3b013d9-36ea-442a-b7ba-2e0db513a051 _http_log_response /usr/local/lib/python2.7/dist-packages/keystoneclient/session.py:224 2015-08-07 17:47:30.824 DEBUG keystoneclient.session [req-b36cc91d-24c1-4d83-a743-d4ada9c26ad1 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] REQ: curl -g -i -X POST http://192.168.33.1:8776/v2/540c7ca9d6cc474f9ee1a73634b4d224/volumes/a01057b3-070e-4f81-8c61-20ad9d422110/action -H "User-Agent: python-cinderclient" -H "Content-Type: application/json" -H "Accept: application/json" -H "X-Auth-Token: {SHA1}4e8cb0df11d791ca7f3b97b007f8e2c8eed87494" -d '{"os-detach": {"attachment_id": null}}' _http_log_request /usr/local/lib/python2.7/dist-packages/keystoneclient/session.py:195 2015-08-07 17:47:32.786 DEBUG nova.virt.xenapi.vm_utils [req-4019a4bf-f3e2-41c1-b750-dfc670de7a86 tempest-AuthorizationTestJSON-667947607 tempest-AuthorizationTestJSON-2076087517] Created VDI OpaqueRef:3705596a-01ed-6e1c-583c-b8ebbdef9b2a (config-2, 67108864, False) on OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99. 
create_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:509 2015-08-07 17:47:32.797 DEBUG nova.virt.xenapi.vm_utils [req-4019a4bf-f3e2-41c1-b750-dfc670de7a86 tempest-AuthorizationTestJSON-667947607 tempest-AuthorizationTestJSON-2076087517] Creating disk-type VBD for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:3705596a-01ed-6e1c-583c-b8ebbdef9b2a ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:47:32.814 DEBUG nova.virt.xenapi.vm_utils [req-4019a4bf-f3e2-41c1-b750-dfc670de7a86 tempest-AuthorizationTestJSON-667947607 tempest-AuthorizationTestJSON-2076087517] Created VBD OpaqueRef:0d47e062-5f7e-0126-12ee-5e91a3181928 for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:3705596a-01ed-6e1c-583c-b8ebbdef9b2a. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:47:32.815 DEBUG nova.virt.xenapi.vm_utils [req-4019a4bf-f3e2-41c1-b750-dfc670de7a86 tempest-AuthorizationTestJSON-667947607 tempest-AuthorizationTestJSON-2076087517] Plugging VBD OpaqueRef:0d47e062-5f7e-0126-12ee-5e91a3181928 ... vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2191 2015-08-07 17:47:32.816 DEBUG oslo_concurrency.lockutils [req-4019a4bf-f3e2-41c1-b750-dfc670de7a86 tempest-AuthorizationTestJSON-667947607 tempest-AuthorizationTestJSON-2076087517] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_plug" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:47:33.459 DEBUG oslo_concurrency.lockutils [req-20a6d985-70d3-4902-ba0b-b389e6ebc731 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" acquired by "do_scan" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:47:33.459 DEBUG nova.virt.xenapi.vm_utils [req-20a6d985-70d3-4902-ba0b-b389e6ebc731 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Scanning SR OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99 do_scan /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1839 2015-08-07 17:47:34.363 DEBUG oslo_concurrency.lockutils [req-20a6d985-70d3-4902-ba0b-b389e6ebc731 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" released by "do_scan" :: held 0.904s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:47:34.363 DEBUG nova.virt.xenapi.vm_utils [req-20a6d985-70d3-4902-ba0b-b389e6ebc731 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 3d063a71-d4d9-49d1-ae99-79d2dc71c6d3] Fetched VDIs of type 'root' with UUID '88050801-2d71-415c-83d9-dd9349e75f0c' _fetch_image /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1361 2015-08-07 17:47:35.268 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.76 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:47:36.131 DEBUG oslo_concurrency.lockutils [req-4019a4bf-f3e2-41c1-b750-dfc670de7a86 tempest-AuthorizationTestJSON-667947607 tempest-AuthorizationTestJSON-2076087517] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_plug" :: held 3.315s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:47:36.132 DEBUG 
nova.virt.xenapi.vm_utils [req-4019a4bf-f3e2-41c1-b750-dfc670de7a86 tempest-AuthorizationTestJSON-667947607 tempest-AuthorizationTestJSON-2076087517] Plugging VBD OpaqueRef:0d47e062-5f7e-0126-12ee-5e91a3181928 done. vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2194 2015-08-07 17:47:36.137 DEBUG nova.virt.xenapi.vm_utils [req-4019a4bf-f3e2-41c1-b750-dfc670de7a86 tempest-AuthorizationTestJSON-667947607 tempest-AuthorizationTestJSON-2076087517] VBD OpaqueRef:0d47e062-5f7e-0126-12ee-5e91a3181928 plugged as xvdc vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2197 2015-08-07 17:47:36.146 DEBUG keystoneclient.session [req-b36cc91d-24c1-4d83-a743-d4ada9c26ad1 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] RESP: [202] date: Fri, 07 Aug 2015 17:47:36 GMT connection: keep-alive content-type: text/html; charset=UTF-8 content-length: 0 x-openstack-request-id: req-7105ddde-97fe-44d2-87c4-dffaa8d94e45 _http_log_response /usr/local/lib/python2.7/dist-packages/keystoneclient/session.py:224 2015-08-07 17:47:36.255 WARNING nova.virt.configdrive [req-4019a4bf-f3e2-41c1-b750-dfc670de7a86 tempest-AuthorizationTestJSON-667947607 tempest-AuthorizationTestJSON-2076087517] The setting "always" will be deprecated in the Liberty version. Please use "True" instead 2015-08-07 17:47:36.256 DEBUG nova.objects.instance [req-4019a4bf-f3e2-41c1-b750-dfc670de7a86 tempest-AuthorizationTestJSON-667947607 tempest-AuthorizationTestJSON-2076087517] Lazy-loading `ec2_ids' on Instance uuid 8709c80f-ab24-4aa9-ad2c-64ea4c51ff51 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:47:36.305 DEBUG oslo_concurrency.processutils [req-4019a4bf-f3e2-41c1-b750-dfc670de7a86 tempest-AuthorizationTestJSON-667947607 tempest-AuthorizationTestJSON-2076087517] Running cmd (subprocess): genisoimage -o /tmp/tmpnrEh_N/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmpyrSWZj execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:47:36.480 DEBUG oslo_concurrency.processutils [req-4019a4bf-f3e2-41c1-b750-dfc670de7a86 tempest-AuthorizationTestJSON-667947607 tempest-AuthorizationTestJSON-2076087517] CMD "genisoimage -o /tmp/tmpnrEh_N/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmpyrSWZj" returned: 0 in 0.175s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:47:36.489 DEBUG oslo_concurrency.processutils [req-4019a4bf-f3e2-41c1-b750-dfc670de7a86 tempest-AuthorizationTestJSON-667947607 tempest-AuthorizationTestJSON-2076087517] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmpnrEh_N/configdrive of=/dev/xvdc oflag=direct,sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:47:38.500 DEBUG nova.virt.xenapi.vm_utils [req-20a6d985-70d3-4902-ba0b-b389e6ebc731 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Cloned VDI OpaqueRef:f7b531a6-47a2-d772-0c7c-fd0d98af82b9 from VDI OpaqueRef:fc6d75cb-4d89-e7b0-ff72-bdd493e551df _clone_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:571 2015-08-07 17:47:39.335 DEBUG oslo_concurrency.lockutils [req-46069aba-b19a-4d28-a2fb-27a7c2044588 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Lock 
"b197c990-eecd-403b-b8d7-9e57e7053a16" acquired by "do_terminate_instance" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:47:39.336 DEBUG oslo_concurrency.lockutils [req-46069aba-b19a-4d28-a2fb-27a7c2044588 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Lock "b197c990-eecd-403b-b8d7-9e57e7053a16-events" acquired by "_clear_events" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:47:39.336 DEBUG oslo_concurrency.lockutils [req-46069aba-b19a-4d28-a2fb-27a7c2044588 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Lock "b197c990-eecd-403b-b8d7-9e57e7053a16-events" released by "_clear_events" :: held 0.001s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:47:39.340 INFO nova.compute.manager [req-46069aba-b19a-4d28-a2fb-27a7c2044588 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] [instance: b197c990-eecd-403b-b8d7-9e57e7053a16] Terminating instance 2015-08-07 17:47:39.343 INFO nova.virt.xenapi.vmops [req-46069aba-b19a-4d28-a2fb-27a7c2044588 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] [instance: b197c990-eecd-403b-b8d7-9e57e7053a16] Destroying VM 2015-08-07 17:47:39.359 DEBUG nova.virt.xenapi.vm_utils [req-46069aba-b19a-4d28-a2fb-27a7c2044588 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] [instance: b197c990-eecd-403b-b8d7-9e57e7053a16] Shutting down VM (hard) hard_shutdown_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:348 2015-08-07 17:47:39.788 DEBUG oslo_concurrency.lockutils [req-20a6d985-70d3-4902-ba0b-b389e6ebc731 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "xenapi-image-cachea97cb416-6d80-4939-be35-09f427a551a8" released by "_create_cached_image_impl" :: held 10.286s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:47:39.788 INFO nova.virt.xenapi.vm_utils [req-20a6d985-70d3-4902-ba0b-b389e6ebc731 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Image creation data, cacheable: True, downloaded: True duration: 10.30 secs for image a97cb416-6d80-4939-be35-09f427a551a8 2015-08-07 17:47:40.040 DEBUG oslo_concurrency.lockutils [req-459b67b9-6d76-4851-a7c7-1f722050d4b4 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Lock "b4eac7df-4935-4b77-8307-8c8cabe2c038" acquired by "do_terminate_instance" :: waited 0.002s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:47:40.041 DEBUG oslo_concurrency.lockutils [req-459b67b9-6d76-4851-a7c7-1f722050d4b4 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Lock "b4eac7df-4935-4b77-8307-8c8cabe2c038-events" acquired by "_clear_events" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:47:40.041 DEBUG oslo_concurrency.lockutils [req-459b67b9-6d76-4851-a7c7-1f722050d4b4 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Lock "b4eac7df-4935-4b77-8307-8c8cabe2c038-events" released by "_clear_events" :: held 0.001s inner 
/usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:47:40.044 INFO nova.compute.manager [req-459b67b9-6d76-4851-a7c7-1f722050d4b4 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] [instance: b4eac7df-4935-4b77-8307-8c8cabe2c038] Terminating instance 2015-08-07 17:47:40.046 INFO nova.virt.xenapi.vmops [req-459b67b9-6d76-4851-a7c7-1f722050d4b4 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] [instance: b4eac7df-4935-4b77-8307-8c8cabe2c038] Destroying VM 2015-08-07 17:47:41.028 DEBUG nova.virt.xenapi.vmops [req-20a6d985-70d3-4902-ba0b-b389e6ebc731 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 3d063a71-d4d9-49d1-ae99-79d2dc71c6d3] Updating progress to 20 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:47:41.478 DEBUG nova.virt.xenapi.vmops [req-20a6d985-70d3-4902-ba0b-b389e6ebc731 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 3d063a71-d4d9-49d1-ae99-79d2dc71c6d3] Updating progress to 30 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:47:41.804 DEBUG nova.virt.xenapi.vmops [req-20a6d985-70d3-4902-ba0b-b389e6ebc731 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 3d063a71-d4d9-49d1-ae99-79d2dc71c6d3] Using PV kernel: True _create_vm_record /opt/stack/new/nova/nova/virt/xenapi/vmops.py:679 2015-08-07 17:47:41.823 DEBUG nova.virt.xenapi.vm_utils [req-20a6d985-70d3-4902-ba0b-b389e6ebc731 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 3d063a71-d4d9-49d1-ae99-79d2dc71c6d3] Created VM create_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:312 2015-08-07 17:47:41.824 DEBUG nova.virt.xenapi.vmops [req-20a6d985-70d3-4902-ba0b-b389e6ebc731 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 3d063a71-d4d9-49d1-ae99-79d2dc71c6d3] Updating progress to 40 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:47:42.100 DEBUG nova.virt.xenapi.vm_utils [req-20a6d985-70d3-4902-ba0b-b389e6ebc731 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Creating disk-type VBD for VM OpaqueRef:bc55cbc9-3a30-12f0-a50a-cd4a1bb7f97b, VDI OpaqueRef:f7b531a6-47a2-d772-0c7c-fd0d98af82b9 ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:47:42.111 DEBUG nova.virt.xenapi.vm_utils [req-20a6d985-70d3-4902-ba0b-b389e6ebc731 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Created VBD OpaqueRef:3c417bc1-35e3-245a-5c8a-75d937d45396 for VM OpaqueRef:bc55cbc9-3a30-12f0-a50a-cd4a1bb7f97b, VDI OpaqueRef:f7b531a6-47a2-d772-0c7c-fd0d98af82b9. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:47:42.875 DEBUG nova.virt.xenapi.vm_utils [req-20a6d985-70d3-4902-ba0b-b389e6ebc731 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Created VDI OpaqueRef:e5e73cda-4cc8-de53-acc5-4f0b82f00e03 (config-2, 67108864, False) on OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99. 
create_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:509 2015-08-07 17:47:42.879 DEBUG nova.virt.xenapi.vm_utils [req-20a6d985-70d3-4902-ba0b-b389e6ebc731 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Creating disk-type VBD for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:e5e73cda-4cc8-de53-acc5-4f0b82f00e03 ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:47:42.892 DEBUG nova.virt.xenapi.vm_utils [req-20a6d985-70d3-4902-ba0b-b389e6ebc731 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Created VBD OpaqueRef:047c2c46-3c86-73fe-3586-21dc2a1b6377 for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:e5e73cda-4cc8-de53-acc5-4f0b82f00e03. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:47:42.893 DEBUG nova.virt.xenapi.vm_utils [req-20a6d985-70d3-4902-ba0b-b389e6ebc731 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Plugging VBD OpaqueRef:047c2c46-3c86-73fe-3586-21dc2a1b6377 ... vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2191 2015-08-07 17:47:42.894 DEBUG oslo_concurrency.lockutils [req-20a6d985-70d3-4902-ba0b-b389e6ebc731 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_plug" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:47:43.684 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:47:43.686 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 1.83 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:47:45.321 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.71 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:47:45.354 DEBUG nova.virt.xenapi.vmops [req-46069aba-b19a-4d28-a2fb-27a7c2044588 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] [instance: b197c990-eecd-403b-b8d7-9e57e7053a16] Destroying VDIs _destroy_vdis /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1428 2015-08-07 17:47:45.364 DEBUG nova.virt.xenapi.vm_utils [req-46069aba-b19a-4d28-a2fb-27a7c2044588 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] VDI 45f0444b-41c6-401c-93d6-0e9b8dcc5f76 is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:47:45.373 DEBUG nova.virt.xenapi.vm_utils [req-46069aba-b19a-4d28-a2fb-27a7c2044588 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] VDI a03f441c-957b-4bed-a270-b598855813f2 is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:47:45.522 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:47:45.523 DEBUG oslo_service.loopingcall 
[req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 0.99 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:47:46.515 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:47:46.516 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 2.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:47:46.815 DEBUG oslo_concurrency.lockutils [req-20a6d985-70d3-4902-ba0b-b389e6ebc731 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_plug" :: held 3.921s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:47:46.815 DEBUG nova.virt.xenapi.vm_utils [req-20a6d985-70d3-4902-ba0b-b389e6ebc731 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Plugging VBD OpaqueRef:047c2c46-3c86-73fe-3586-21dc2a1b6377 done. vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2194 2015-08-07 17:47:46.819 DEBUG nova.virt.xenapi.vm_utils [req-20a6d985-70d3-4902-ba0b-b389e6ebc731 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] VBD OpaqueRef:047c2c46-3c86-73fe-3586-21dc2a1b6377 plugged as xvdd vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2197 2015-08-07 17:47:46.929 WARNING nova.virt.configdrive [req-20a6d985-70d3-4902-ba0b-b389e6ebc731 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] The setting "always" will be deprecated in the Liberty version. 
Please use "True" instead 2015-08-07 17:47:46.930 DEBUG nova.objects.instance [req-20a6d985-70d3-4902-ba0b-b389e6ebc731 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lazy-loading `ec2_ids' on Instance uuid 3d063a71-d4d9-49d1-ae99-79d2dc71c6d3 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:47:47.048 DEBUG oslo_concurrency.processutils [req-20a6d985-70d3-4902-ba0b-b389e6ebc731 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Running cmd (subprocess): genisoimage -o /tmp/tmpJ4JE_e/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmpciVp3t execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:47:47.179 DEBUG oslo_concurrency.processutils [req-20a6d985-70d3-4902-ba0b-b389e6ebc731 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] CMD "genisoimage -o /tmp/tmpJ4JE_e/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmpciVp3t" returned: 0 in 0.131s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:47:47.186 DEBUG oslo_concurrency.processutils [req-20a6d985-70d3-4902-ba0b-b389e6ebc731 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmpJ4JE_e/configdrive of=/dev/xvdd oflag=direct,sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:47:47.281 DEBUG oslo_concurrency.processutils [req-4019a4bf-f3e2-41c1-b750-dfc670de7a86 tempest-AuthorizationTestJSON-667947607 tempest-AuthorizationTestJSON-2076087517] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmpnrEh_N/configdrive of=/dev/xvdc oflag=direct,sync" returned: 0 in 10.791s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:47:47.285 DEBUG oslo_concurrency.processutils [req-4019a4bf-f3e2-41c1-b750-dfc670de7a86 tempest-AuthorizationTestJSON-667947607 tempest-AuthorizationTestJSON-2076087517] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:47:47.421 DEBUG nova.virt.xenapi.vmops [req-46069aba-b19a-4d28-a2fb-27a7c2044588 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] [instance: b197c990-eecd-403b-b8d7-9e57e7053a16] Using RAW or VHD, skipping kernel and ramdisk deletion _destroy_kernel_ramdisk /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1456 2015-08-07 17:47:47.439 DEBUG nova.virt.xenapi.vm_utils [req-46069aba-b19a-4d28-a2fb-27a7c2044588 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] [instance: b197c990-eecd-403b-b8d7-9e57e7053a16] VM destroyed destroy_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:324 2015-08-07 17:47:47.441 DEBUG nova.compute.manager [req-46069aba-b19a-4d28-a2fb-27a7c2044588 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] [instance: b197c990-eecd-403b-b8d7-9e57e7053a16] Deallocating network for instance _deallocate_network /opt/stack/new/nova/nova/compute/manager.py:1792 2015-08-07 17:47:48.141 DEBUG oslo_concurrency.processutils [req-4019a4bf-f3e2-41c1-b750-dfc670de7a86 
tempest-AuthorizationTestJSON-667947607 tempest-AuthorizationTestJSON-2076087517] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf sync" returned: 0 in 0.856s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:47:48.143 DEBUG nova.virt.xenapi.vm_utils [req-4019a4bf-f3e2-41c1-b750-dfc670de7a86 tempest-AuthorizationTestJSON-667947607 tempest-AuthorizationTestJSON-2076087517] Destroying VBD for VDI OpaqueRef:3705596a-01ed-6e1c-583c-b8ebbdef9b2a ... vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2207 2015-08-07 17:47:48.144 DEBUG oslo_concurrency.lockutils [req-4019a4bf-f3e2-41c1-b750-dfc670de7a86 tempest-AuthorizationTestJSON-667947607 tempest-AuthorizationTestJSON-2076087517] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_unplug" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:47:48.522 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:47:48.523 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Starting heal instance info cache _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5315 2015-08-07 17:47:48.523 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Rebuilding the list of instances to heal _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5319 2015-08-07 17:47:48.614 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: b197c990-eecd-403b-b8d7-9e57e7053a16] Skipping network cache update for instance because it is being deleted. _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5333 2015-08-07 17:47:48.614 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: b4eac7df-4935-4b77-8307-8c8cabe2c038] Skipping network cache update for instance because it is being deleted. _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5333 2015-08-07 17:47:48.615 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: 8709c80f-ab24-4aa9-ad2c-64ea4c51ff51] Skipping network cache update for instance because it is Building. 
_heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5329 2015-08-07 17:47:48.616 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Acquired semaphore "refresh_cache-69bd2eeb-96e0-42ba-9643-c7c085279a18" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:197 2015-08-07 17:47:48.616 DEBUG nova.objects.instance [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lazy-loading `flavor' on Instance uuid 69bd2eeb-96e0-42ba-9643-c7c085279a18 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:47:49.035 DEBUG nova.network.base_api [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: 69bd2eeb-96e0-42ba-9643-c7c085279a18] Updating instance_info_cache with network_info: [VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.6'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:5a:40:04', 'active': False, 'type': u'bridge', 'id': u'05bc5db0-7a1d-4a7a-afed-5f08489f238b', 'qbg_params': None})] update_instance_cache_with_nw_info /opt/stack/new/nova/nova/network/base_api.py:43 2015-08-07 17:47:49.068 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Releasing semaphore "refresh_cache-69bd2eeb-96e0-42ba-9643-c7c085279a18" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:210 2015-08-07 17:47:49.069 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: 69bd2eeb-96e0-42ba-9643-c7c085279a18] Updated the network info_cache for instance _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5376 2015-08-07 17:47:49.070 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 3.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:47:50.321 DEBUG nova.compute.manager [req-46069aba-b19a-4d28-a2fb-27a7c2044588 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] [instance: b197c990-eecd-403b-b8d7-9e57e7053a16] terminating bdm BlockDeviceMapping(boot_index=0,connection_info=None,created_at=2015-08-07T17:44:24Z,delete_on_termination=True,deleted=False,deleted_at=None,destination_type='local',device_name='/dev/xvda',device_type='disk',disk_bus=None,guest_format=None,id=64,image_id='a69d8d55-1745-492f-be26-cc75d64fc94d',instance=,instance_uuid=b197c990-eecd-403b-b8d7-9e57e7053a16,no_device=False,snapshot_id=None,source_type='image',updated_at=2015-08-07T17:44:26Z,volume_id=None,volume_size=None) _cleanup_volumes /opt/stack/new/nova/nova/compute/manager.py:2272 
2015-08-07 17:47:50.720 DEBUG oslo_concurrency.lockutils [req-46069aba-b19a-4d28-a2fb-27a7c2044588 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Lock "compute_resources" acquired by "update_usage" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:47:50.722 DEBUG nova.objects.instance [req-46069aba-b19a-4d28-a2fb-27a7c2044588 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Lazy-loading `numa_topology' on Instance uuid b197c990-eecd-403b-b8d7-9e57e7053a16 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:47:51.115 DEBUG oslo_concurrency.lockutils [req-46069aba-b19a-4d28-a2fb-27a7c2044588 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Lock "compute_resources" released by "update_usage" :: held 0.394s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:47:51.321 DEBUG oslo_concurrency.lockutils [req-4019a4bf-f3e2-41c1-b750-dfc670de7a86 tempest-AuthorizationTestJSON-667947607 tempest-AuthorizationTestJSON-2076087517] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_unplug" :: held 3.178s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:47:51.328 DEBUG nova.virt.xenapi.vm_utils [req-4019a4bf-f3e2-41c1-b750-dfc670de7a86 tempest-AuthorizationTestJSON-667947607 tempest-AuthorizationTestJSON-2076087517] Destroying VBD for VDI OpaqueRef:3705596a-01ed-6e1c-583c-b8ebbdef9b2a done. vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2215 2015-08-07 17:47:51.329 DEBUG nova.virt.xenapi.vm_utils [req-4019a4bf-f3e2-41c1-b750-dfc670de7a86 tempest-AuthorizationTestJSON-667947607 tempest-AuthorizationTestJSON-2076087517] Creating disk-type VBD for VM OpaqueRef:fc5f6a77-9240-695b-e26c-384ea046de99, VDI OpaqueRef:3705596a-01ed-6e1c-583c-b8ebbdef9b2a ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:47:51.337 DEBUG nova.virt.xenapi.vm_utils [req-4019a4bf-f3e2-41c1-b750-dfc670de7a86 tempest-AuthorizationTestJSON-667947607 tempest-AuthorizationTestJSON-2076087517] Created VBD OpaqueRef:1c7c813c-ae92-7215-465c-33afc0cdf033 for VM OpaqueRef:fc5f6a77-9240-695b-e26c-384ea046de99, VDI OpaqueRef:3705596a-01ed-6e1c-583c-b8ebbdef9b2a. 
create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:47:51.339 DEBUG nova.objects.instance [req-4019a4bf-f3e2-41c1-b750-dfc670de7a86 tempest-AuthorizationTestJSON-667947607 tempest-AuthorizationTestJSON-2076087517] Lazy-loading `pci_devices' on Instance uuid 8709c80f-ab24-4aa9-ad2c-64ea4c51ff51 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:47:51.464 DEBUG nova.virt.xenapi.vmops [req-4019a4bf-f3e2-41c1-b750-dfc670de7a86 tempest-AuthorizationTestJSON-667947607 tempest-AuthorizationTestJSON-2076087517] [instance: 8709c80f-ab24-4aa9-ad2c-64ea4c51ff51] Updating progress to 50 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:47:51.580 DEBUG nova.virt.xenapi.vm_utils [req-459b67b9-6d76-4851-a7c7-1f722050d4b4 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] VDI 13edb23e-e06e-4bba-9ec5-747d004b90e0 is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:47:51.593 DEBUG nova.virt.xenapi.vm_utils [req-459b67b9-6d76-4851-a7c7-1f722050d4b4 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] VDI c647e329-34e7-4219-95ba-606f0977ad19 is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:47:51.606 DEBUG nova.virt.xenapi.vm_utils [req-459b67b9-6d76-4851-a7c7-1f722050d4b4 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] VDI 8bbe08a0-c32e-40eb-a34d-97ec9902b7af is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:47:51.821 DEBUG oslo_concurrency.lockutils [req-46069aba-b19a-4d28-a2fb-27a7c2044588 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Lock "b197c990-eecd-403b-b8d7-9e57e7053a16" released by "do_terminate_instance" :: held 12.486s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:47:51.946 DEBUG oslo_concurrency.lockutils [req-4019a4bf-f3e2-41c1-b750-dfc670de7a86 tempest-AuthorizationTestJSON-667947607 tempest-AuthorizationTestJSON-2076087517] Lock "xenstore-8709c80f-ab24-4aa9-ad2c-64ea4c51ff51" acquired by "store_meta" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:47:51.947 DEBUG oslo_concurrency.lockutils [req-4019a4bf-f3e2-41c1-b750-dfc670de7a86 tempest-AuthorizationTestJSON-667947607 tempest-AuthorizationTestJSON-2076087517] Lock "xenstore-8709c80f-ab24-4aa9-ad2c-64ea4c51ff51" released by "store_meta" :: held 0.001s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:47:51.948 DEBUG oslo_concurrency.lockutils [req-4019a4bf-f3e2-41c1-b750-dfc670de7a86 tempest-AuthorizationTestJSON-667947607 tempest-AuthorizationTestJSON-2076087517] Lock "xenstore-8709c80f-ab24-4aa9-ad2c-64ea4c51ff51" acquired by "store_auto_disk_config" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:47:51.959 DEBUG oslo_concurrency.lockutils [req-4019a4bf-f3e2-41c1-b750-dfc670de7a86 tempest-AuthorizationTestJSON-667947607 tempest-AuthorizationTestJSON-2076087517] Lock "xenstore-8709c80f-ab24-4aa9-ad2c-64ea4c51ff51" released by "store_auto_disk_config" :: held 0.010s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:47:51.960 DEBUG nova.virt.xenapi.vmops 
[req-4019a4bf-f3e2-41c1-b750-dfc670de7a86 tempest-AuthorizationTestJSON-667947607 tempest-AuthorizationTestJSON-2076087517] [instance: 8709c80f-ab24-4aa9-ad2c-64ea4c51ff51] Injecting hostname (tempest.common.compute-instance-2125870706) into xenstore _inject_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1940 2015-08-07 17:47:51.960 DEBUG oslo_concurrency.lockutils [req-4019a4bf-f3e2-41c1-b750-dfc670de7a86 tempest-AuthorizationTestJSON-667947607 tempest-AuthorizationTestJSON-2076087517] Lock "xenstore-8709c80f-ab24-4aa9-ad2c-64ea4c51ff51" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:47:51.973 DEBUG oslo_concurrency.lockutils [req-4019a4bf-f3e2-41c1-b750-dfc670de7a86 tempest-AuthorizationTestJSON-667947607 tempest-AuthorizationTestJSON-2076087517] Lock "xenstore-8709c80f-ab24-4aa9-ad2c-64ea4c51ff51" released by "update_hostname" :: held 0.012s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:47:51.973 DEBUG nova.virt.xenapi.vmops [req-4019a4bf-f3e2-41c1-b750-dfc670de7a86 tempest-AuthorizationTestJSON-667947607 tempest-AuthorizationTestJSON-2076087517] [instance: 8709c80f-ab24-4aa9-ad2c-64ea4c51ff51] Injecting network info to xenstore inject_network_info /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1869 2015-08-07 17:47:51.974 DEBUG oslo_concurrency.lockutils [req-4019a4bf-f3e2-41c1-b750-dfc670de7a86 tempest-AuthorizationTestJSON-667947607 tempest-AuthorizationTestJSON-2076087517] Lock "xenstore-8709c80f-ab24-4aa9-ad2c-64ea4c51ff51" acquired by "update_nwinfo" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:47:52.071 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:47:52.071 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /opt/stack/new/nova/nova/compute/manager.py:5976 2015-08-07 17:47:52.072 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 0.44 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:47:52.201 DEBUG oslo_concurrency.lockutils [req-4019a4bf-f3e2-41c1-b750-dfc670de7a86 tempest-AuthorizationTestJSON-667947607 tempest-AuthorizationTestJSON-2076087517] Lock "xenstore-8709c80f-ab24-4aa9-ad2c-64ea4c51ff51" released by "update_nwinfo" :: held 0.227s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:47:52.202 DEBUG nova.virt.xenapi.vmops [req-4019a4bf-f3e2-41c1-b750-dfc670de7a86 tempest-AuthorizationTestJSON-667947607 tempest-AuthorizationTestJSON-2076087517] [instance: 8709c80f-ab24-4aa9-ad2c-64ea4c51ff51] Updating progress to 60 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:47:52.514 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:47:52.626 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 2.01 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:47:52.771 DEBUG nova.virt.xenapi.vmops [req-4019a4bf-f3e2-41c1-b750-dfc670de7a86 tempest-AuthorizationTestJSON-667947607 tempest-AuthorizationTestJSON-2076087517] [instance: 8709c80f-ab24-4aa9-ad2c-64ea4c51ff51] Creating vifs _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1892 2015-08-07 17:47:52.783 DEBUG nova.virt.xenapi.vmops [req-4019a4bf-f3e2-41c1-b750-dfc670de7a86 tempest-AuthorizationTestJSON-667947607 tempest-AuthorizationTestJSON-2076087517] [instance: 8709c80f-ab24-4aa9-ad2c-64ea4c51ff51] Creating VIF for network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1902 2015-08-07 17:47:52.792 DEBUG nova.virt.xenapi.vmops [req-4019a4bf-f3e2-41c1-b750-dfc670de7a86 tempest-AuthorizationTestJSON-667947607 tempest-AuthorizationTestJSON-2076087517] [instance: 8709c80f-ab24-4aa9-ad2c-64ea4c51ff51] Created VIF OpaqueRef:3c5aa89e-e9ad-8646-532b-4b59356cd28e, network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1906 2015-08-07 17:47:52.793 DEBUG nova.virt.xenapi.vmops [req-4019a4bf-f3e2-41c1-b750-dfc670de7a86 tempest-AuthorizationTestJSON-667947607 tempest-AuthorizationTestJSON-2076087517] [instance: 8709c80f-ab24-4aa9-ad2c-64ea4c51ff51] Updating progress to 70 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:47:53.276 DEBUG nova.virt.xenapi.vmops [req-4019a4bf-f3e2-41c1-b750-dfc670de7a86 tempest-AuthorizationTestJSON-667947607 tempest-AuthorizationTestJSON-2076087517] [instance: 8709c80f-ab24-4aa9-ad2c-64ea4c51ff51] Starting instance _start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:327 2015-08-07 17:47:53.773 WARNING nova.virt.xenapi.vm_utils [req-459b67b9-6d76-4851-a7c7-1f722050d4b4 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] [instance: b4eac7df-4935-4b77-8307-8c8cabe2c038] VM already halted, skipping shutdown... 
2015-08-07 17:47:53.784 DEBUG nova.virt.xenapi.vmops [req-459b67b9-6d76-4851-a7c7-1f722050d4b4 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] [instance: b4eac7df-4935-4b77-8307-8c8cabe2c038] Destroying VDIs _destroy_vdis /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1428 2015-08-07 17:47:53.802 DEBUG nova.virt.xenapi.vm_utils [req-459b67b9-6d76-4851-a7c7-1f722050d4b4 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] VDI 8bbe08a0-c32e-40eb-a34d-97ec9902b7af is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:47:53.815 DEBUG nova.virt.xenapi.vm_utils [req-459b67b9-6d76-4851-a7c7-1f722050d4b4 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] VDI faa9d75a-48ed-4f83-8972-03996d7ddbc7 is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:47:54.634 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:47:54.635 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 0.89 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:47:55.284 DEBUG nova.virt.xenapi.vmops [req-459b67b9-6d76-4851-a7c7-1f722050d4b4 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] [instance: b4eac7df-4935-4b77-8307-8c8cabe2c038] Using RAW or VHD, skipping kernel and ramdisk deletion _destroy_kernel_ramdisk /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1456 2015-08-07 17:47:55.299 DEBUG nova.virt.xenapi.vm_utils [req-459b67b9-6d76-4851-a7c7-1f722050d4b4 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] [instance: b4eac7df-4935-4b77-8307-8c8cabe2c038] VM destroyed destroy_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:324 2015-08-07 17:47:55.300 DEBUG nova.compute.manager [req-459b67b9-6d76-4851-a7c7-1f722050d4b4 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] [instance: b4eac7df-4935-4b77-8307-8c8cabe2c038] Deallocating network for instance _deallocate_network /opt/stack/new/nova/nova/compute/manager.py:1792 2015-08-07 17:47:55.438 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.59 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:47:55.526 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:47:55.561 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Auditing locally available compute resources for node localhost.localdomain 2015-08-07 17:47:55.562 DEBUG nova.virt.xenapi.host [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Updating host stats update_status /opt/stack/new/nova/nova/virt/xenapi/host.py:234 2015-08-07 17:47:55.858 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" acquired by "do_scan" :: 
waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:47:55.859 DEBUG nova.virt.xenapi.vm_utils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Scanning SR OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99 do_scan /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1839 2015-08-07 17:47:57.020 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" released by "do_scan" :: held 1.162s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:47:57.381 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor: free VCPUs: 1 _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:524 2015-08-07 17:47:57.382 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor: assignable PCI devices: [] _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:532 2015-08-07 17:47:57.382 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor/Node resource view: name=localhost.localdomain free_ram=855MB free_disk=16GB free_vcpus=1 pci_devices=[] _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:546 2015-08-07 17:47:57.383 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "compute_resources" acquired by "_update_available_resource" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:47:57.750 DEBUG nova.compute.manager [req-459b67b9-6d76-4851-a7c7-1f722050d4b4 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] [instance: b4eac7df-4935-4b77-8307-8c8cabe2c038] terminating bdm BlockDeviceMapping(boot_index=0,connection_info=None,created_at=2015-08-07T17:44:27Z,delete_on_termination=True,deleted=False,deleted_at=None,destination_type='local',device_name='/dev/xvda',device_type='disk',disk_bus=None,guest_format=None,id=65,image_id='a69d8d55-1745-492f-be26-cc75d64fc94d',instance=,instance_uuid=b4eac7df-4935-4b77-8307-8c8cabe2c038,no_device=False,snapshot_id=None,source_type='image',updated_at=2015-08-07T17:44:29Z,volume_id=None,volume_size=None) _cleanup_volumes /opt/stack/new/nova/nova/compute/manager.py:2272 2015-08-07 17:47:57.780 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Total usable vcpus: 8, total allocated vcpus: 7 2015-08-07 17:47:57.781 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Final resource view: name=localhost.localdomain phys_ram=8187MB used_ram=857MB phys_disk=27GB used_disk=0GB total_vcpus=8 used_vcpus=7 pci_stats=None 2015-08-07 17:47:57.886 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Compute_service record updated for devstack:localhost.localdomain 2015-08-07 17:47:57.887 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "compute_resources" released by "_update_available_resource" :: held 0.504s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:47:57.887 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks 
/usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:47:57.888 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 9.99 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:47:57.982 DEBUG oslo_concurrency.processutils [req-20a6d985-70d3-4902-ba0b-b389e6ebc731 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmpJ4JE_e/configdrive of=/dev/xvdd oflag=direct,sync" returned: 0 in 10.797s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:47:57.984 DEBUG oslo_concurrency.processutils [req-20a6d985-70d3-4902-ba0b-b389e6ebc731 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:47:58.082 DEBUG oslo_concurrency.lockutils [req-459b67b9-6d76-4851-a7c7-1f722050d4b4 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Lock "compute_resources" acquired by "update_usage" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:47:58.086 DEBUG nova.objects.instance [req-459b67b9-6d76-4851-a7c7-1f722050d4b4 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Lazy-loading `numa_topology' on Instance uuid b4eac7df-4935-4b77-8307-8c8cabe2c038 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:47:58.229 DEBUG oslo_concurrency.lockutils [req-459b67b9-6d76-4851-a7c7-1f722050d4b4 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Lock "compute_resources" released by "update_usage" :: held 0.147s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:47:58.469 DEBUG oslo_concurrency.processutils [req-20a6d985-70d3-4902-ba0b-b389e6ebc731 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf sync" returned: 0 in 0.484s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:47:58.470 DEBUG nova.virt.xenapi.vm_utils [req-20a6d985-70d3-4902-ba0b-b389e6ebc731 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Destroying VBD for VDI OpaqueRef:e5e73cda-4cc8-de53-acc5-4f0b82f00e03 ... 
vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2207 2015-08-07 17:47:58.471 DEBUG oslo_concurrency.lockutils [req-20a6d985-70d3-4902-ba0b-b389e6ebc731 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_unplug" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:47:58.671 DEBUG oslo_concurrency.lockutils [req-459b67b9-6d76-4851-a7c7-1f722050d4b4 tempest-ServerRescueNegativeTestJSON-1228315677 tempest-ServerRescueNegativeTestJSON-2125869541] Lock "b4eac7df-4935-4b77-8307-8c8cabe2c038" released by "do_terminate_instance" :: held 18.631s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:47:59.431 DEBUG oslo_concurrency.lockutils [req-20a6d985-70d3-4902-ba0b-b389e6ebc731 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_unplug" :: held 0.960s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:47:59.439 DEBUG nova.virt.xenapi.vm_utils [req-20a6d985-70d3-4902-ba0b-b389e6ebc731 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Destroying VBD for VDI OpaqueRef:e5e73cda-4cc8-de53-acc5-4f0b82f00e03 done. vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2215 2015-08-07 17:47:59.440 DEBUG nova.virt.xenapi.vm_utils [req-20a6d985-70d3-4902-ba0b-b389e6ebc731 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Creating disk-type VBD for VM OpaqueRef:bc55cbc9-3a30-12f0-a50a-cd4a1bb7f97b, VDI OpaqueRef:e5e73cda-4cc8-de53-acc5-4f0b82f00e03 ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:47:59.450 DEBUG nova.virt.xenapi.vm_utils [req-20a6d985-70d3-4902-ba0b-b389e6ebc731 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Created VBD OpaqueRef:20306013-8241-2ae4-8a2d-642bbbd64093 for VM OpaqueRef:bc55cbc9-3a30-12f0-a50a-cd4a1bb7f97b, VDI OpaqueRef:e5e73cda-4cc8-de53-acc5-4f0b82f00e03. 
create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:47:59.451 DEBUG nova.objects.instance [req-20a6d985-70d3-4902-ba0b-b389e6ebc731 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lazy-loading `pci_devices' on Instance uuid 3d063a71-d4d9-49d1-ae99-79d2dc71c6d3 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:47:59.583 DEBUG nova.virt.xenapi.vmops [req-20a6d985-70d3-4902-ba0b-b389e6ebc731 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 3d063a71-d4d9-49d1-ae99-79d2dc71c6d3] Updating progress to 50 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:48:00.078 DEBUG oslo_concurrency.lockutils [req-20a6d985-70d3-4902-ba0b-b389e6ebc731 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "xenstore-3d063a71-d4d9-49d1-ae99-79d2dc71c6d3" acquired by "store_meta" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:48:00.079 DEBUG oslo_concurrency.lockutils [req-20a6d985-70d3-4902-ba0b-b389e6ebc731 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "xenstore-3d063a71-d4d9-49d1-ae99-79d2dc71c6d3" released by "store_meta" :: held 0.001s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:48:00.079 DEBUG oslo_concurrency.lockutils [req-20a6d985-70d3-4902-ba0b-b389e6ebc731 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "xenstore-3d063a71-d4d9-49d1-ae99-79d2dc71c6d3" acquired by "store_auto_disk_config" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:48:00.088 DEBUG oslo_concurrency.lockutils [req-20a6d985-70d3-4902-ba0b-b389e6ebc731 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "xenstore-3d063a71-d4d9-49d1-ae99-79d2dc71c6d3" released by "store_auto_disk_config" :: held 0.008s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:48:00.088 DEBUG nova.virt.xenapi.vmops [req-20a6d985-70d3-4902-ba0b-b389e6ebc731 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 3d063a71-d4d9-49d1-ae99-79d2dc71c6d3] Injecting hostname (tempest.common.compute-instance-1517593106) into xenstore _inject_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1940 2015-08-07 17:48:00.089 DEBUG oslo_concurrency.lockutils [req-20a6d985-70d3-4902-ba0b-b389e6ebc731 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "xenstore-3d063a71-d4d9-49d1-ae99-79d2dc71c6d3" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:48:00.097 DEBUG oslo_concurrency.lockutils [req-20a6d985-70d3-4902-ba0b-b389e6ebc731 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "xenstore-3d063a71-d4d9-49d1-ae99-79d2dc71c6d3" released by "update_hostname" :: held 0.008s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:48:00.098 DEBUG nova.virt.xenapi.vmops [req-20a6d985-70d3-4902-ba0b-b389e6ebc731 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 3d063a71-d4d9-49d1-ae99-79d2dc71c6d3] Injecting network info to xenstore inject_network_info 
/opt/stack/new/nova/nova/virt/xenapi/vmops.py:1869 2015-08-07 17:48:00.099 DEBUG oslo_concurrency.lockutils [req-20a6d985-70d3-4902-ba0b-b389e6ebc731 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "xenstore-3d063a71-d4d9-49d1-ae99-79d2dc71c6d3" acquired by "update_nwinfo" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:48:00.278 DEBUG oslo_concurrency.lockutils [req-20a6d985-70d3-4902-ba0b-b389e6ebc731 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "xenstore-3d063a71-d4d9-49d1-ae99-79d2dc71c6d3" released by "update_nwinfo" :: held 0.179s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:48:00.279 DEBUG nova.virt.xenapi.vmops [req-20a6d985-70d3-4902-ba0b-b389e6ebc731 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 3d063a71-d4d9-49d1-ae99-79d2dc71c6d3] Updating progress to 60 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:48:00.548 DEBUG nova.virt.xenapi.vmops [req-20a6d985-70d3-4902-ba0b-b389e6ebc731 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 3d063a71-d4d9-49d1-ae99-79d2dc71c6d3] Creating vifs _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1892 2015-08-07 17:48:00.557 DEBUG nova.virt.xenapi.vmops [req-20a6d985-70d3-4902-ba0b-b389e6ebc731 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 3d063a71-d4d9-49d1-ae99-79d2dc71c6d3] Creating VIF for network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1902 2015-08-07 17:48:00.567 DEBUG nova.virt.xenapi.vmops [req-20a6d985-70d3-4902-ba0b-b389e6ebc731 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 3d063a71-d4d9-49d1-ae99-79d2dc71c6d3] Created VIF OpaqueRef:01517782-b6ec-c47e-4bd9-c050c6dcb7cd, network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1906 2015-08-07 17:48:00.568 DEBUG nova.virt.xenapi.vmops [req-20a6d985-70d3-4902-ba0b-b389e6ebc731 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 3d063a71-d4d9-49d1-ae99-79d2dc71c6d3] Updating progress to 70 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:48:00.796 DEBUG nova.virt.xenapi.vmops [req-20a6d985-70d3-4902-ba0b-b389e6ebc731 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 3d063a71-d4d9-49d1-ae99-79d2dc71c6d3] Starting instance _start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:327 2015-08-07 17:48:02.161 DEBUG nova.virt.xenapi.vmops [req-4019a4bf-f3e2-41c1-b750-dfc670de7a86 tempest-AuthorizationTestJSON-667947607 tempest-AuthorizationTestJSON-2076087517] [instance: 8709c80f-ab24-4aa9-ad2c-64ea4c51ff51] Waiting for instance state to become running _wait_for_instance_to_start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:777 2015-08-07 17:48:02.182 DEBUG nova.virt.xenapi.vmops [req-4019a4bf-f3e2-41c1-b750-dfc670de7a86 tempest-AuthorizationTestJSON-667947607 tempest-AuthorizationTestJSON-2076087517] [instance: 8709c80f-ab24-4aa9-ad2c-64ea4c51ff51] Updating progress to 80 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:48:02.424 DEBUG nova.virt.xenapi.vmops 
[req-4019a4bf-f3e2-41c1-b750-dfc670de7a86 tempest-AuthorizationTestJSON-667947607 tempest-AuthorizationTestJSON-2076087517] [instance: 8709c80f-ab24-4aa9-ad2c-64ea4c51ff51] Skip agent setup, not enabled. _configure_new_instance_with_agent /opt/stack/new/nova/nova/virt/xenapi/vmops.py:788 2015-08-07 17:48:02.426 DEBUG nova.virt.xenapi.vmops [req-4019a4bf-f3e2-41c1-b750-dfc670de7a86 tempest-AuthorizationTestJSON-667947607 tempest-AuthorizationTestJSON-2076087517] [instance: 8709c80f-ab24-4aa9-ad2c-64ea4c51ff51] Removing hostname from xenstore _remove_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1949 2015-08-07 17:48:02.427 DEBUG oslo_concurrency.lockutils [req-4019a4bf-f3e2-41c1-b750-dfc670de7a86 tempest-AuthorizationTestJSON-667947607 tempest-AuthorizationTestJSON-2076087517] Lock "xenstore-8709c80f-ab24-4aa9-ad2c-64ea4c51ff51" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:48:02.432 DEBUG oslo_concurrency.lockutils [req-4019a4bf-f3e2-41c1-b750-dfc670de7a86 tempest-AuthorizationTestJSON-667947607 tempest-AuthorizationTestJSON-2076087517] Lock "xenstore-8709c80f-ab24-4aa9-ad2c-64ea4c51ff51" released by "update_hostname" :: held 0.005s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:48:02.433 DEBUG nova.virt.xenapi.vmops [req-4019a4bf-f3e2-41c1-b750-dfc670de7a86 tempest-AuthorizationTestJSON-667947607 tempest-AuthorizationTestJSON-2076087517] [instance: 8709c80f-ab24-4aa9-ad2c-64ea4c51ff51] Updating progress to 90 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:48:02.675 DEBUG nova.virt.xenapi.vmops [req-4019a4bf-f3e2-41c1-b750-dfc670de7a86 tempest-AuthorizationTestJSON-667947607 tempest-AuthorizationTestJSON-2076087517] [instance: 8709c80f-ab24-4aa9-ad2c-64ea4c51ff51] Updating progress to 100 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:48:02.902 DEBUG nova.compute.manager [req-4019a4bf-f3e2-41c1-b750-dfc670de7a86 tempest-AuthorizationTestJSON-667947607 tempest-AuthorizationTestJSON-2076087517] [instance: 8709c80f-ab24-4aa9-ad2c-64ea4c51ff51] Checking state _get_power_state /opt/stack/new/nova/nova/compute/manager.py:1296 2015-08-07 17:48:03.241 DEBUG oslo_concurrency.lockutils [req-4019a4bf-f3e2-41c1-b750-dfc670de7a86 tempest-AuthorizationTestJSON-667947607 tempest-AuthorizationTestJSON-2076087517] Lock "8709c80f-ab24-4aa9-ad2c-64ea4c51ff51" released by "_locked_do_build_and_run_instance" :: held 46.427s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:48:05.078 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.96 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:48:06.710 DEBUG oslo_concurrency.lockutils [req-eb9cd386-f2dd-452b-8700-fcb0f6df412c tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] Lock "6f6a5a93-63ed-4d42-be4c-e81077af680b" acquired by "_locked_do_build_and_run_instance" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:48:06.843 INFO nova.compute.manager [req-eb9cd386-f2dd-452b-8700-fcb0f6df412c tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] [instance: 6f6a5a93-63ed-4d42-be4c-e81077af680b] Starting instance... 
2015-08-07 17:48:07.042 DEBUG oslo_concurrency.lockutils [req-eb9cd386-f2dd-452b-8700-fcb0f6df412c tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] Lock "compute_resources" acquired by "instance_claim" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:48:07.042 DEBUG nova.compute.resource_tracker [req-eb9cd386-f2dd-452b-8700-fcb0f6df412c tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] Memory overhead for 64 MB instance; 5 MB instance_claim /opt/stack/new/nova/nova/compute/resource_tracker.py:126 2015-08-07 17:48:07.050 INFO nova.compute.claims [req-eb9cd386-f2dd-452b-8700-fcb0f6df412c tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] [instance: 6f6a5a93-63ed-4d42-be4c-e81077af680b] Attempting claim: memory 69 MB, disk 0 GB 2015-08-07 17:48:07.050 INFO nova.compute.claims [req-eb9cd386-f2dd-452b-8700-fcb0f6df412c tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] [instance: 6f6a5a93-63ed-4d42-be4c-e81077af680b] Total memory: 8187 MB, used: 788.00 MB 2015-08-07 17:48:07.050 INFO nova.compute.claims [req-eb9cd386-f2dd-452b-8700-fcb0f6df412c tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] [instance: 6f6a5a93-63ed-4d42-be4c-e81077af680b] memory limit: 12280.50 MB, free: 11492.50 MB 2015-08-07 17:48:07.051 INFO nova.compute.claims [req-eb9cd386-f2dd-452b-8700-fcb0f6df412c tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] [instance: 6f6a5a93-63ed-4d42-be4c-e81077af680b] Total disk: 27 GB, used: 0.00 GB 2015-08-07 17:48:07.051 INFO nova.compute.claims [req-eb9cd386-f2dd-452b-8700-fcb0f6df412c tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] [instance: 6f6a5a93-63ed-4d42-be4c-e81077af680b] disk limit not specified, defaulting to unlimited 2015-08-07 17:48:07.075 DEBUG nova.compute.resources.vcpu [req-eb9cd386-f2dd-452b-8700-fcb0f6df412c tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] Total CPUs: 8 VCPUs, used: 4.00 VCPUs test /opt/stack/new/nova/nova/compute/resources/vcpu.py:52 2015-08-07 17:48:07.076 DEBUG nova.compute.resources.vcpu [req-eb9cd386-f2dd-452b-8700-fcb0f6df412c tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] CPUs limit not specified, defaulting to unlimited test /opt/stack/new/nova/nova/compute/resources/vcpu.py:56 2015-08-07 17:48:07.077 INFO nova.compute.claims [req-eb9cd386-f2dd-452b-8700-fcb0f6df412c tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] [instance: 6f6a5a93-63ed-4d42-be4c-e81077af680b] Claim successful 2015-08-07 17:48:07.121 DEBUG nova.virt.xenapi.vmops [req-20a6d985-70d3-4902-ba0b-b389e6ebc731 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 3d063a71-d4d9-49d1-ae99-79d2dc71c6d3] Waiting for instance state to become running _wait_for_instance_to_start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:777 2015-08-07 17:48:07.139 DEBUG nova.virt.xenapi.vmops [req-20a6d985-70d3-4902-ba0b-b389e6ebc731 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 3d063a71-d4d9-49d1-ae99-79d2dc71c6d3] Updating progress to 80 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:48:07.342 DEBUG nova.virt.xenapi.vmops 
[req-20a6d985-70d3-4902-ba0b-b389e6ebc731 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 3d063a71-d4d9-49d1-ae99-79d2dc71c6d3] Skip agent setup, not enabled. _configure_new_instance_with_agent /opt/stack/new/nova/nova/virt/xenapi/vmops.py:788 2015-08-07 17:48:07.343 DEBUG nova.virt.xenapi.vmops [req-20a6d985-70d3-4902-ba0b-b389e6ebc731 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 3d063a71-d4d9-49d1-ae99-79d2dc71c6d3] Removing hostname from xenstore _remove_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1949 2015-08-07 17:48:07.344 DEBUG oslo_concurrency.lockutils [req-20a6d985-70d3-4902-ba0b-b389e6ebc731 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "xenstore-3d063a71-d4d9-49d1-ae99-79d2dc71c6d3" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:48:07.349 DEBUG oslo_concurrency.lockutils [req-20a6d985-70d3-4902-ba0b-b389e6ebc731 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "xenstore-3d063a71-d4d9-49d1-ae99-79d2dc71c6d3" released by "update_hostname" :: held 0.005s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:48:07.350 DEBUG nova.virt.xenapi.vmops [req-20a6d985-70d3-4902-ba0b-b389e6ebc731 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 3d063a71-d4d9-49d1-ae99-79d2dc71c6d3] Updating progress to 90 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:48:07.402 DEBUG oslo_concurrency.lockutils [req-eb9cd386-f2dd-452b-8700-fcb0f6df412c tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] Lock "compute_resources" released by "instance_claim" :: held 0.361s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:48:07.581 DEBUG nova.virt.xenapi.vmops [req-20a6d985-70d3-4902-ba0b-b389e6ebc731 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 3d063a71-d4d9-49d1-ae99-79d2dc71c6d3] Updating progress to 100 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:48:07.668 DEBUG oslo_concurrency.lockutils [req-eb9cd386-f2dd-452b-8700-fcb0f6df412c tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] Lock "compute_resources" acquired by "update_usage" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:48:07.766 DEBUG oslo_concurrency.lockutils [req-eb9cd386-f2dd-452b-8700-fcb0f6df412c tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] Lock "compute_resources" released by "update_usage" :: held 0.098s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:48:07.767 DEBUG nova.compute.utils [req-eb9cd386-f2dd-452b-8700-fcb0f6df412c tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] Using /dev/xvd instead of None get_next_device_name /opt/stack/new/nova/nova/compute/utils.py:163 2015-08-07 17:48:07.771 13318 DEBUG nova.compute.manager [-] [instance: 6f6a5a93-63ed-4d42-be4c-e81077af680b] Allocating IP information in the background. 
_allocate_network_async /opt/stack/new/nova/nova/compute/manager.py:1513 2015-08-07 17:48:07.773 13318 DEBUG oslo_concurrency.lockutils [-] Acquired semaphore "refresh_cache-6f6a5a93-63ed-4d42-be4c-e81077af680b" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:197 2015-08-07 17:48:07.883 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:48:07.884 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 37.64 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:48:08.617 DEBUG nova.virt.xenapi.vmops [req-eb9cd386-f2dd-452b-8700-fcb0f6df412c tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] [instance: 6f6a5a93-63ed-4d42-be4c-e81077af680b] Block device information present: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} spawn /opt/stack/new/nova/nova/virt/xenapi/vmops.py:408 2015-08-07 17:48:08.636 DEBUG nova.virt.xenapi.vm_utils [req-eb9cd386-f2dd-452b-8700-fcb0f6df412c tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] Detected vhd format for image a69d8d55-1745-492f-be26-cc75d64fc94d determine_disk_image_type /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1633 2015-08-07 17:48:08.636 DEBUG nova.virt.xenapi.vmops [req-eb9cd386-f2dd-452b-8700-fcb0f6df412c tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] [instance: 6f6a5a93-63ed-4d42-be4c-e81077af680b] Updating progress to 10 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:48:09.064 DEBUG nova.virt.xenapi.vmops [req-eb9cd386-f2dd-452b-8700-fcb0f6df412c tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] [instance: 6f6a5a93-63ed-4d42-be4c-e81077af680b] block device info: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} _connect_cinder_volumes /opt/stack/new/nova/nova/virt/xenapi/vmops.py:376 2015-08-07 17:48:09.083 DEBUG oslo_concurrency.lockutils [req-eb9cd386-f2dd-452b-8700-fcb0f6df412c tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" acquired by "_create_cached_image_impl" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:48:09.652 DEBUG nova.compute.manager [req-20a6d985-70d3-4902-ba0b-b389e6ebc731 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 3d063a71-d4d9-49d1-ae99-79d2dc71c6d3] Checking state _get_power_state /opt/stack/new/nova/nova/compute/manager.py:1296 2015-08-07 17:48:09.996 DEBUG oslo_concurrency.lockutils [req-20a6d985-70d3-4902-ba0b-b389e6ebc731 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "3d063a71-d4d9-49d1-ae99-79d2dc71c6d3" released by "do_unshelve_instance" :: held 41.816s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:48:10.635 13318 DEBUG nova.network.base_api [-] [instance: 6f6a5a93-63ed-4d42-be4c-e81077af680b] Updating instance_info_cache with network_info: [VIF({'profile': None, 'ovs_interfaceid': None, 
'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.7'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:57:45:34', 'active': False, 'type': u'bridge', 'id': u'df69d282-ed60-4e5f-a4f6-4c3a49a16e1c', 'qbg_params': None})] update_instance_cache_with_nw_info /opt/stack/new/nova/nova/network/base_api.py:43 2015-08-07 17:48:10.698 13318 DEBUG oslo_concurrency.lockutils [-] Releasing semaphore "refresh_cache-6f6a5a93-63ed-4d42-be4c-e81077af680b" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:210 2015-08-07 17:48:10.699 13318 DEBUG nova.compute.manager [-] [instance: 6f6a5a93-63ed-4d42-be4c-e81077af680b] Instance network_info: |[VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.7'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:57:45:34', 'active': False, 'type': u'bridge', 'id': u'df69d282-ed60-4e5f-a4f6-4c3a49a16e1c', 'qbg_params': None})]| _allocate_network_async /opt/stack/new/nova/nova/compute/manager.py:1531 2015-08-07 17:48:11.165 DEBUG nova.virt.xenapi.vm_utils [req-eb9cd386-f2dd-452b-8700-fcb0f6df412c tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] Cloned VDI OpaqueRef:9623a447-68b3-a1a8-d5c2-0cb918353805 from VDI OpaqueRef:f392c64e-7886-a0cc-d3a0-780674167307 _clone_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:571 2015-08-07 17:48:11.354 DEBUG oslo_concurrency.lockutils [req-ce4b1dde-88cf-42b4-8c80-8b81cc60c679 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "3d063a71-d4d9-49d1-ae99-79d2dc71c6d3" acquired by "do_stop_instance" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:48:11.355 DEBUG nova.compute.manager [req-ce4b1dde-88cf-42b4-8c80-8b81cc60c679 tempest-ServerActionsTestJSON-2080221373 
tempest-ServerActionsTestJSON-2017217826] [instance: 3d063a71-d4d9-49d1-ae99-79d2dc71c6d3] Checking state _get_power_state /opt/stack/new/nova/nova/compute/manager.py:1296 2015-08-07 17:48:11.371 DEBUG nova.compute.manager [req-ce4b1dde-88cf-42b4-8c80-8b81cc60c679 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 3d063a71-d4d9-49d1-ae99-79d2dc71c6d3] Stopping instance; current vm_state: active, current task_state: powering-off, current DB power_state: 1, current VM power_state: 1 do_stop_instance /opt/stack/new/nova/nova/compute/manager.py:2404 2015-08-07 17:48:11.379 DEBUG nova.virt.xenapi.vm_utils [req-ce4b1dde-88cf-42b4-8c80-8b81cc60c679 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 3d063a71-d4d9-49d1-ae99-79d2dc71c6d3] Shutting down VM (hard) hard_shutdown_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:348 2015-08-07 17:48:12.051 DEBUG oslo_concurrency.lockutils [req-eb9cd386-f2dd-452b-8700-fcb0f6df412c tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" released by "_create_cached_image_impl" :: held 2.968s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:48:12.052 INFO nova.virt.xenapi.vm_utils [req-eb9cd386-f2dd-452b-8700-fcb0f6df412c tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] Image creation data, cacheable: True, downloaded: False duration: 2.99 secs for image a69d8d55-1745-492f-be26-cc75d64fc94d 2015-08-07 17:48:12.728 DEBUG nova.virt.xenapi.vmops [req-eb9cd386-f2dd-452b-8700-fcb0f6df412c tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] [instance: 6f6a5a93-63ed-4d42-be4c-e81077af680b] Updating progress to 20 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:48:12.985 DEBUG nova.virt.xenapi.vmops [req-eb9cd386-f2dd-452b-8700-fcb0f6df412c tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] [instance: 6f6a5a93-63ed-4d42-be4c-e81077af680b] Updating progress to 30 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:48:13.228 DEBUG nova.virt.xenapi.vmops [req-eb9cd386-f2dd-452b-8700-fcb0f6df412c tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] [instance: 6f6a5a93-63ed-4d42-be4c-e81077af680b] Using PV kernel: True _create_vm_record /opt/stack/new/nova/nova/virt/xenapi/vmops.py:679 2015-08-07 17:48:13.240 DEBUG nova.virt.xenapi.vm_utils [req-eb9cd386-f2dd-452b-8700-fcb0f6df412c tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] [instance: 6f6a5a93-63ed-4d42-be4c-e81077af680b] Created VM create_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:312 2015-08-07 17:48:13.241 DEBUG nova.virt.xenapi.vmops [req-eb9cd386-f2dd-452b-8700-fcb0f6df412c tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] [instance: 6f6a5a93-63ed-4d42-be4c-e81077af680b] Updating progress to 40 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:48:13.450 DEBUG nova.virt.xenapi.vm_utils [req-eb9cd386-f2dd-452b-8700-fcb0f6df412c tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] Creating disk-type VBD for VM OpaqueRef:b09a92fe-bff6-956f-5649-e9882b52ddc3, VDI OpaqueRef:9623a447-68b3-a1a8-d5c2-0cb918353805 ... 
create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:48:13.459 DEBUG nova.virt.xenapi.vm_utils [req-eb9cd386-f2dd-452b-8700-fcb0f6df412c tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] Created VBD OpaqueRef:1051038c-2e2b-33a8-2ddd-a4c971c8376e for VM OpaqueRef:b09a92fe-bff6-956f-5649-e9882b52ddc3, VDI OpaqueRef:9623a447-68b3-a1a8-d5c2-0cb918353805. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:48:13.726 DEBUG nova.compute.manager [req-ce4b1dde-88cf-42b4-8c80-8b81cc60c679 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 3d063a71-d4d9-49d1-ae99-79d2dc71c6d3] Checking state _get_power_state /opt/stack/new/nova/nova/compute/manager.py:1296 2015-08-07 17:48:13.848 DEBUG nova.virt.xenapi.vm_utils [req-eb9cd386-f2dd-452b-8700-fcb0f6df412c tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] Created VDI OpaqueRef:7bf39ecf-e46a-c7ab-b95b-1d7d7163c80e (config-2, 67108864, False) on OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99. create_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:509 2015-08-07 17:48:13.852 DEBUG nova.virt.xenapi.vm_utils [req-eb9cd386-f2dd-452b-8700-fcb0f6df412c tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] Creating disk-type VBD for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:7bf39ecf-e46a-c7ab-b95b-1d7d7163c80e ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:48:13.866 DEBUG nova.virt.xenapi.vm_utils [req-eb9cd386-f2dd-452b-8700-fcb0f6df412c tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] Created VBD OpaqueRef:07b6f60c-de7f-7d8b-4aad-9c8b27bb3158 for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:7bf39ecf-e46a-c7ab-b95b-1d7d7163c80e. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:48:13.867 DEBUG nova.virt.xenapi.vm_utils [req-eb9cd386-f2dd-452b-8700-fcb0f6df412c tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] Plugging VBD OpaqueRef:07b6f60c-de7f-7d8b-4aad-9c8b27bb3158 ... 
vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2191 2015-08-07 17:48:13.868 DEBUG oslo_concurrency.lockutils [req-eb9cd386-f2dd-452b-8700-fcb0f6df412c tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_plug" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:48:13.960 DEBUG oslo_concurrency.lockutils [req-ce4b1dde-88cf-42b4-8c80-8b81cc60c679 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "3d063a71-d4d9-49d1-ae99-79d2dc71c6d3" released by "do_stop_instance" :: held 2.606s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:48:15.078 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.96 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:48:15.091 DEBUG oslo_concurrency.lockutils [req-eb9cd386-f2dd-452b-8700-fcb0f6df412c tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_plug" :: held 1.224s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:48:15.092 DEBUG nova.virt.xenapi.vm_utils [req-eb9cd386-f2dd-452b-8700-fcb0f6df412c tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] Plugging VBD OpaqueRef:07b6f60c-de7f-7d8b-4aad-9c8b27bb3158 done. vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2194 2015-08-07 17:48:15.095 DEBUG nova.virt.xenapi.vm_utils [req-eb9cd386-f2dd-452b-8700-fcb0f6df412c tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] VBD OpaqueRef:07b6f60c-de7f-7d8b-4aad-9c8b27bb3158 plugged as xvdc vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2197 2015-08-07 17:48:15.175 WARNING nova.virt.configdrive [req-eb9cd386-f2dd-452b-8700-fcb0f6df412c tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] The setting "always" will be deprecated in the Liberty version. 
Please use "True" instead 2015-08-07 17:48:15.176 DEBUG nova.objects.instance [req-eb9cd386-f2dd-452b-8700-fcb0f6df412c tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] Lazy-loading `ec2_ids' on Instance uuid 6f6a5a93-63ed-4d42-be4c-e81077af680b obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:48:15.207 DEBUG oslo_concurrency.processutils [req-eb9cd386-f2dd-452b-8700-fcb0f6df412c tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] Running cmd (subprocess): genisoimage -o /tmp/tmpB_th5A/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmp9hpTjM execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:48:15.298 DEBUG nova.compute.manager [req-8f533de1-c9d4-4f53-a516-694ba4c1886d tempest-AuthorizationTestJSON-667947607 tempest-AuthorizationTestJSON-2076087517] [instance: 8709c80f-ab24-4aa9-ad2c-64ea4c51ff51] Changing instance metadata according to {u'meta1': [u'+', u'data1']} change_instance_metadata /opt/stack/new/nova/nova/compute/manager.py:3203 2015-08-07 17:48:15.301 DEBUG oslo_concurrency.processutils [req-eb9cd386-f2dd-452b-8700-fcb0f6df412c tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] CMD "genisoimage -o /tmp/tmpB_th5A/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmp9hpTjM" returned: 0 in 0.094s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:48:15.307 DEBUG oslo_concurrency.processutils [req-eb9cd386-f2dd-452b-8700-fcb0f6df412c tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmpB_th5A/configdrive of=/dev/xvdc oflag=direct,sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:48:15.381 DEBUG oslo_concurrency.lockutils [req-8f533de1-c9d4-4f53-a516-694ba4c1886d tempest-AuthorizationTestJSON-667947607 tempest-AuthorizationTestJSON-2076087517] Lock "xenstore-8709c80f-ab24-4aa9-ad2c-64ea4c51ff51" acquired by "update_meta" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:48:15.450 DEBUG oslo_concurrency.lockutils [req-4c26dcd0-7bde-4da8-8640-5901bfd1dcef tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Acquired semaphore "refresh_cache-3d063a71-d4d9-49d1-ae99-79d2dc71c6d3" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:197 2015-08-07 17:48:15.566 DEBUG oslo_concurrency.lockutils [req-8f533de1-c9d4-4f53-a516-694ba4c1886d tempest-AuthorizationTestJSON-667947607 tempest-AuthorizationTestJSON-2076087517] Lock "xenstore-8709c80f-ab24-4aa9-ad2c-64ea4c51ff51" released by "update_meta" :: held 0.185s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:48:15.588 DEBUG nova.network.base_api [req-4c26dcd0-7bde-4da8-8640-5901bfd1dcef tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 3d063a71-d4d9-49d1-ae99-79d2dc71c6d3] Updating instance_info_cache with network_info: [VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 
'floating_ips': [], 'address': u'10.1.0.2'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:ba:d2:8b', 'active': False, 'type': u'bridge', 'id': u'fba2136d-b403-4996-9701-9a9aeef179dd', 'qbg_params': None})] update_instance_cache_with_nw_info /opt/stack/new/nova/nova/network/base_api.py:43 2015-08-07 17:48:15.612 DEBUG oslo_concurrency.lockutils [req-4c26dcd0-7bde-4da8-8640-5901bfd1dcef tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Releasing semaphore "refresh_cache-3d063a71-d4d9-49d1-ae99-79d2dc71c6d3" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:210 2015-08-07 17:48:15.631 DEBUG nova.compute.manager [req-88f135d2-97ea-49f5-81a1-ee14650b0e5d tempest-AuthorizationTestJSON-667947607 tempest-AuthorizationTestJSON-2076087517] [instance: 8709c80f-ab24-4aa9-ad2c-64ea4c51ff51] Changing instance metadata according to {u'meta1': [u'-']} change_instance_metadata /opt/stack/new/nova/nova/compute/manager.py:3203 2015-08-07 17:48:15.637 DEBUG oslo_concurrency.lockutils [req-88f135d2-97ea-49f5-81a1-ee14650b0e5d tempest-AuthorizationTestJSON-667947607 tempest-AuthorizationTestJSON-2076087517] Lock "xenstore-8709c80f-ab24-4aa9-ad2c-64ea4c51ff51" acquired by "update_meta" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:48:15.646 DEBUG nova.virt.xenapi.vmops [req-4c26dcd0-7bde-4da8-8640-5901bfd1dcef tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 3d063a71-d4d9-49d1-ae99-79d2dc71c6d3] Starting instance _start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:327 2015-08-07 17:48:15.813 DEBUG oslo_concurrency.lockutils [req-88f135d2-97ea-49f5-81a1-ee14650b0e5d tempest-AuthorizationTestJSON-667947607 tempest-AuthorizationTestJSON-2076087517] Lock "xenstore-8709c80f-ab24-4aa9-ad2c-64ea4c51ff51" released by "update_meta" :: held 0.176s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:48:18.011 DEBUG nova.compute.manager [req-a39bd7ae-116d-4e10-bc81-16a9a5a73bd8 tempest-AuthorizationTestJSON-667947607 tempest-AuthorizationTestJSON-2076087517] [instance: 8709c80f-ab24-4aa9-ad2c-64ea4c51ff51] Changing instance metadata according to {u'meta1': [u'+', u'data1']} change_instance_metadata /opt/stack/new/nova/nova/compute/manager.py:3203 2015-08-07 17:48:18.015 DEBUG oslo_concurrency.lockutils [req-a39bd7ae-116d-4e10-bc81-16a9a5a73bd8 tempest-AuthorizationTestJSON-667947607 tempest-AuthorizationTestJSON-2076087517] Lock "xenstore-8709c80f-ab24-4aa9-ad2c-64ea4c51ff51" acquired by "update_meta" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:48:18.180 DEBUG oslo_concurrency.lockutils [req-a39bd7ae-116d-4e10-bc81-16a9a5a73bd8 
tempest-AuthorizationTestJSON-667947607 tempest-AuthorizationTestJSON-2076087517] Lock "xenstore-8709c80f-ab24-4aa9-ad2c-64ea4c51ff51" released by "update_meta" :: held 0.165s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:48:18.475 DEBUG nova.compute.manager [req-48401c48-b475-49a8-83c5-d0dfc25829f5 tempest-AuthorizationTestJSON-667947607 tempest-AuthorizationTestJSON-2076087517] [instance: 8709c80f-ab24-4aa9-ad2c-64ea4c51ff51] Changing instance metadata according to {u'meta1': [u'-']} change_instance_metadata /opt/stack/new/nova/nova/compute/manager.py:3203 2015-08-07 17:48:18.480 DEBUG oslo_concurrency.lockutils [req-48401c48-b475-49a8-83c5-d0dfc25829f5 tempest-AuthorizationTestJSON-667947607 tempest-AuthorizationTestJSON-2076087517] Lock "xenstore-8709c80f-ab24-4aa9-ad2c-64ea4c51ff51" acquired by "update_meta" :: waited 0.001s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:48:18.669 DEBUG oslo_concurrency.lockutils [req-48401c48-b475-49a8-83c5-d0dfc25829f5 tempest-AuthorizationTestJSON-667947607 tempest-AuthorizationTestJSON-2076087517] Lock "xenstore-8709c80f-ab24-4aa9-ad2c-64ea4c51ff51" released by "update_meta" :: held 0.188s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:48:18.749 DEBUG oslo_concurrency.processutils [req-eb9cd386-f2dd-452b-8700-fcb0f6df412c tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmpB_th5A/configdrive of=/dev/xvdc oflag=direct,sync" returned: 0 in 3.442s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:48:18.751 DEBUG oslo_concurrency.processutils [req-eb9cd386-f2dd-452b-8700-fcb0f6df412c tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:48:19.096 DEBUG oslo_concurrency.processutils [req-eb9cd386-f2dd-452b-8700-fcb0f6df412c tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf sync" returned: 0 in 0.344s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:48:19.100 DEBUG nova.virt.xenapi.vm_utils [req-eb9cd386-f2dd-452b-8700-fcb0f6df412c tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] Destroying VBD for VDI OpaqueRef:7bf39ecf-e46a-c7ab-b95b-1d7d7163c80e ... 
vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2207 2015-08-07 17:48:19.101 DEBUG oslo_concurrency.lockutils [req-eb9cd386-f2dd-452b-8700-fcb0f6df412c tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_unplug" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:48:19.781 DEBUG oslo_concurrency.lockutils [req-eb9cd386-f2dd-452b-8700-fcb0f6df412c tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_unplug" :: held 0.681s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:48:19.789 DEBUG nova.virt.xenapi.vm_utils [req-eb9cd386-f2dd-452b-8700-fcb0f6df412c tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] Destroying VBD for VDI OpaqueRef:7bf39ecf-e46a-c7ab-b95b-1d7d7163c80e done. vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2215 2015-08-07 17:48:19.790 DEBUG nova.virt.xenapi.vm_utils [req-eb9cd386-f2dd-452b-8700-fcb0f6df412c tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] Creating disk-type VBD for VM OpaqueRef:b09a92fe-bff6-956f-5649-e9882b52ddc3, VDI OpaqueRef:7bf39ecf-e46a-c7ab-b95b-1d7d7163c80e ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:48:19.802 DEBUG nova.virt.xenapi.vm_utils [req-eb9cd386-f2dd-452b-8700-fcb0f6df412c tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] Created VBD OpaqueRef:3954f535-2e5e-d402-0b0f-433bcee5dcb1 for VM OpaqueRef:b09a92fe-bff6-956f-5649-e9882b52ddc3, VDI OpaqueRef:7bf39ecf-e46a-c7ab-b95b-1d7d7163c80e. 
create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:48:19.803 DEBUG nova.objects.instance [req-eb9cd386-f2dd-452b-8700-fcb0f6df412c tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] Lazy-loading `pci_devices' on Instance uuid 6f6a5a93-63ed-4d42-be4c-e81077af680b obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:48:19.895 DEBUG nova.virt.xenapi.vmops [req-eb9cd386-f2dd-452b-8700-fcb0f6df412c tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] [instance: 6f6a5a93-63ed-4d42-be4c-e81077af680b] Updating progress to 50 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:48:20.135 DEBUG oslo_concurrency.lockutils [req-eb9cd386-f2dd-452b-8700-fcb0f6df412c tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] Lock "xenstore-6f6a5a93-63ed-4d42-be4c-e81077af680b" acquired by "store_meta" :: waited 0.001s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:48:20.136 DEBUG oslo_concurrency.lockutils [req-eb9cd386-f2dd-452b-8700-fcb0f6df412c tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] Lock "xenstore-6f6a5a93-63ed-4d42-be4c-e81077af680b" released by "store_meta" :: held 0.001s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:48:20.145 DEBUG oslo_concurrency.lockutils [req-eb9cd386-f2dd-452b-8700-fcb0f6df412c tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] Lock "xenstore-6f6a5a93-63ed-4d42-be4c-e81077af680b" acquired by "store_auto_disk_config" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:48:20.153 DEBUG oslo_concurrency.lockutils [req-eb9cd386-f2dd-452b-8700-fcb0f6df412c tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] Lock "xenstore-6f6a5a93-63ed-4d42-be4c-e81077af680b" released by "store_auto_disk_config" :: held 0.008s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:48:20.154 DEBUG nova.virt.xenapi.vmops [req-eb9cd386-f2dd-452b-8700-fcb0f6df412c tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] [instance: 6f6a5a93-63ed-4d42-be4c-e81077af680b] Injecting hostname (tempest.common.compute-instance-1822861565) into xenstore _inject_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1940 2015-08-07 17:48:20.154 DEBUG oslo_concurrency.lockutils [req-eb9cd386-f2dd-452b-8700-fcb0f6df412c tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] Lock "xenstore-6f6a5a93-63ed-4d42-be4c-e81077af680b" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:48:20.161 DEBUG oslo_concurrency.lockutils [req-eb9cd386-f2dd-452b-8700-fcb0f6df412c tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] Lock "xenstore-6f6a5a93-63ed-4d42-be4c-e81077af680b" released by "update_hostname" :: held 0.006s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:48:20.162 DEBUG nova.virt.xenapi.vmops [req-eb9cd386-f2dd-452b-8700-fcb0f6df412c tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] [instance: 6f6a5a93-63ed-4d42-be4c-e81077af680b] Injecting network info to xenstore inject_network_info 
/opt/stack/new/nova/nova/virt/xenapi/vmops.py:1869 2015-08-07 17:48:20.162 DEBUG oslo_concurrency.lockutils [req-eb9cd386-f2dd-452b-8700-fcb0f6df412c tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] Lock "xenstore-6f6a5a93-63ed-4d42-be4c-e81077af680b" acquired by "update_nwinfo" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:48:20.317 DEBUG oslo_concurrency.lockutils [req-eb9cd386-f2dd-452b-8700-fcb0f6df412c tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] Lock "xenstore-6f6a5a93-63ed-4d42-be4c-e81077af680b" released by "update_nwinfo" :: held 0.155s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:48:20.318 DEBUG nova.virt.xenapi.vmops [req-eb9cd386-f2dd-452b-8700-fcb0f6df412c tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] [instance: 6f6a5a93-63ed-4d42-be4c-e81077af680b] Updating progress to 60 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:48:20.483 DEBUG nova.virt.xenapi.vmops [req-eb9cd386-f2dd-452b-8700-fcb0f6df412c tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] [instance: 6f6a5a93-63ed-4d42-be4c-e81077af680b] Creating vifs _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1892 2015-08-07 17:48:20.490 DEBUG nova.virt.xenapi.vmops [req-eb9cd386-f2dd-452b-8700-fcb0f6df412c tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] [instance: 6f6a5a93-63ed-4d42-be4c-e81077af680b] Creating VIF for network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1902 2015-08-07 17:48:20.496 DEBUG nova.virt.xenapi.vmops [req-eb9cd386-f2dd-452b-8700-fcb0f6df412c tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] [instance: 6f6a5a93-63ed-4d42-be4c-e81077af680b] Created VIF OpaqueRef:5187dacc-2c80-738c-55ca-f43059bdd80c, network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1906 2015-08-07 17:48:20.497 DEBUG nova.virt.xenapi.vmops [req-eb9cd386-f2dd-452b-8700-fcb0f6df412c tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] [instance: 6f6a5a93-63ed-4d42-be4c-e81077af680b] Updating progress to 70 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:48:20.650 DEBUG nova.virt.xenapi.vmops [req-eb9cd386-f2dd-452b-8700-fcb0f6df412c tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] [instance: 6f6a5a93-63ed-4d42-be4c-e81077af680b] Starting instance _start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:327 2015-08-07 17:48:21.178 DEBUG oslo_concurrency.lockutils [req-af9e6ee3-672e-450e-8182-e0ae47b07441 tempest-AuthorizationTestJSON-667947607 tempest-AuthorizationTestJSON-2076087517] Lock "8709c80f-ab24-4aa9-ad2c-64ea4c51ff51" acquired by "do_terminate_instance" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:48:21.181 DEBUG oslo_concurrency.lockutils [req-af9e6ee3-672e-450e-8182-e0ae47b07441 tempest-AuthorizationTestJSON-667947607 tempest-AuthorizationTestJSON-2076087517] Lock "8709c80f-ab24-4aa9-ad2c-64ea4c51ff51-events" acquired by "_clear_events" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:48:21.181 
DEBUG oslo_concurrency.lockutils [req-af9e6ee3-672e-450e-8182-e0ae47b07441 tempest-AuthorizationTestJSON-667947607 tempest-AuthorizationTestJSON-2076087517] Lock "8709c80f-ab24-4aa9-ad2c-64ea4c51ff51-events" released by "_clear_events" :: held 0.001s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:48:21.184 INFO nova.compute.manager [req-af9e6ee3-672e-450e-8182-e0ae47b07441 tempest-AuthorizationTestJSON-667947607 tempest-AuthorizationTestJSON-2076087517] [instance: 8709c80f-ab24-4aa9-ad2c-64ea4c51ff51] Terminating instance 2015-08-07 17:48:21.188 INFO nova.virt.xenapi.vmops [req-af9e6ee3-672e-450e-8182-e0ae47b07441 tempest-AuthorizationTestJSON-667947607 tempest-AuthorizationTestJSON-2076087517] [instance: 8709c80f-ab24-4aa9-ad2c-64ea4c51ff51] Destroying VM 2015-08-07 17:48:21.199 DEBUG nova.virt.xenapi.vm_utils [req-af9e6ee3-672e-450e-8182-e0ae47b07441 tempest-AuthorizationTestJSON-667947607 tempest-AuthorizationTestJSON-2076087517] [instance: 8709c80f-ab24-4aa9-ad2c-64ea4c51ff51] Shutting down VM (hard) hard_shutdown_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:348 2015-08-07 17:48:22.097 DEBUG nova.compute.manager [req-4c26dcd0-7bde-4da8-8640-5901bfd1dcef tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 3d063a71-d4d9-49d1-ae99-79d2dc71c6d3] Checking state _get_power_state /opt/stack/new/nova/nova/compute/manager.py:1296 2015-08-07 17:48:23.280 DEBUG nova.virt.xenapi.vmops [req-af9e6ee3-672e-450e-8182-e0ae47b07441 tempest-AuthorizationTestJSON-667947607 tempest-AuthorizationTestJSON-2076087517] [instance: 8709c80f-ab24-4aa9-ad2c-64ea4c51ff51] Destroying VDIs _destroy_vdis /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1428 2015-08-07 17:48:23.293 DEBUG nova.virt.xenapi.vm_utils [req-af9e6ee3-672e-450e-8182-e0ae47b07441 tempest-AuthorizationTestJSON-667947607 tempest-AuthorizationTestJSON-2076087517] VDI 7bb45ee7-b82d-4a96-a263-daab9d9ef8a0 is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:48:23.302 DEBUG nova.virt.xenapi.vm_utils [req-af9e6ee3-672e-450e-8182-e0ae47b07441 tempest-AuthorizationTestJSON-667947607 tempest-AuthorizationTestJSON-2076087517] VDI c78f5327-54ee-492d-9a9d-cb9c3074bd04 is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:48:25.069 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.97 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:48:26.589 DEBUG nova.virt.xenapi.vmops [req-eb9cd386-f2dd-452b-8700-fcb0f6df412c tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] [instance: 6f6a5a93-63ed-4d42-be4c-e81077af680b] Waiting for instance state to become running _wait_for_instance_to_start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:777 2015-08-07 17:48:26.606 DEBUG nova.virt.xenapi.vmops [req-eb9cd386-f2dd-452b-8700-fcb0f6df412c tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] [instance: 6f6a5a93-63ed-4d42-be4c-e81077af680b] Updating progress to 80 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:48:26.780 DEBUG nova.virt.xenapi.vmops [req-eb9cd386-f2dd-452b-8700-fcb0f6df412c tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] [instance: 6f6a5a93-63ed-4d42-be4c-e81077af680b] Skip agent setup, not enabled. 
_configure_new_instance_with_agent /opt/stack/new/nova/nova/virt/xenapi/vmops.py:788 2015-08-07 17:48:26.781 DEBUG nova.virt.xenapi.vmops [req-eb9cd386-f2dd-452b-8700-fcb0f6df412c tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] [instance: 6f6a5a93-63ed-4d42-be4c-e81077af680b] Removing hostname from xenstore _remove_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1949 2015-08-07 17:48:26.781 DEBUG oslo_concurrency.lockutils [req-eb9cd386-f2dd-452b-8700-fcb0f6df412c tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] Lock "xenstore-6f6a5a93-63ed-4d42-be4c-e81077af680b" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:48:26.786 DEBUG oslo_concurrency.lockutils [req-eb9cd386-f2dd-452b-8700-fcb0f6df412c tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] Lock "xenstore-6f6a5a93-63ed-4d42-be4c-e81077af680b" released by "update_hostname" :: held 0.004s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:48:26.786 DEBUG nova.virt.xenapi.vmops [req-eb9cd386-f2dd-452b-8700-fcb0f6df412c tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] [instance: 6f6a5a93-63ed-4d42-be4c-e81077af680b] Updating progress to 90 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:48:27.145 DEBUG nova.virt.xenapi.vmops [req-eb9cd386-f2dd-452b-8700-fcb0f6df412c tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] [instance: 6f6a5a93-63ed-4d42-be4c-e81077af680b] Updating progress to 100 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:48:27.553 DEBUG nova.compute.manager [req-eb9cd386-f2dd-452b-8700-fcb0f6df412c tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] [instance: 6f6a5a93-63ed-4d42-be4c-e81077af680b] Checking state _get_power_state /opt/stack/new/nova/nova/compute/manager.py:1296 2015-08-07 17:48:27.796 DEBUG oslo_concurrency.lockutils [req-eb9cd386-f2dd-452b-8700-fcb0f6df412c tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] Lock "6f6a5a93-63ed-4d42-be4c-e81077af680b" released by "_locked_do_build_and_run_instance" :: held 21.086s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:48:28.312 DEBUG nova.compute.manager [req-1b6344bc-4140-4e39-982a-013d4ac54b2b tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 3d063a71-d4d9-49d1-ae99-79d2dc71c6d3] Checking state _get_power_state /opt/stack/new/nova/nova/compute/manager.py:1296 2015-08-07 17:48:29.344 INFO nova.compute.manager [req-13b1912f-f3a8-46b9-bb2f-ec2bb394bcc0 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 3d063a71-d4d9-49d1-ae99-79d2dc71c6d3] Resuming 2015-08-07 17:48:29.348 DEBUG oslo_concurrency.lockutils [req-13b1912f-f3a8-46b9-bb2f-ec2bb394bcc0 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Acquired semaphore "refresh_cache-3d063a71-d4d9-49d1-ae99-79d2dc71c6d3" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:197 2015-08-07 17:48:29.612 DEBUG nova.network.base_api [req-13b1912f-f3a8-46b9-bb2f-ec2bb394bcc0 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 
3d063a71-d4d9-49d1-ae99-79d2dc71c6d3] Updating instance_info_cache with network_info: [VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.2'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:ba:d2:8b', 'active': False, 'type': u'bridge', 'id': u'fba2136d-b403-4996-9701-9a9aeef179dd', 'qbg_params': None})] update_instance_cache_with_nw_info /opt/stack/new/nova/nova/network/base_api.py:43 2015-08-07 17:48:29.635 DEBUG oslo_concurrency.lockutils [req-13b1912f-f3a8-46b9-bb2f-ec2bb394bcc0 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Releasing semaphore "refresh_cache-3d063a71-d4d9-49d1-ae99-79d2dc71c6d3" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:210 2015-08-07 17:48:34.594 DEBUG nova.virt.xenapi.vmops [req-af9e6ee3-672e-450e-8182-e0ae47b07441 tempest-AuthorizationTestJSON-667947607 tempest-AuthorizationTestJSON-2076087517] [instance: 8709c80f-ab24-4aa9-ad2c-64ea4c51ff51] Using RAW or VHD, skipping kernel and ramdisk deletion _destroy_kernel_ramdisk /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1456 2015-08-07 17:48:34.606 DEBUG nova.virt.xenapi.vm_utils [req-af9e6ee3-672e-450e-8182-e0ae47b07441 tempest-AuthorizationTestJSON-667947607 tempest-AuthorizationTestJSON-2076087517] [instance: 8709c80f-ab24-4aa9-ad2c-64ea4c51ff51] VM destroyed destroy_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:324 2015-08-07 17:48:34.607 DEBUG nova.compute.manager [req-af9e6ee3-672e-450e-8182-e0ae47b07441 tempest-AuthorizationTestJSON-667947607 tempest-AuthorizationTestJSON-2076087517] [instance: 8709c80f-ab24-4aa9-ad2c-64ea4c51ff51] Deallocating network for instance _deallocate_network /opt/stack/new/nova/nova/compute/manager.py:1792 2015-08-07 17:48:35.077 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.96 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:48:35.882 DEBUG nova.compute.manager [req-af9e6ee3-672e-450e-8182-e0ae47b07441 tempest-AuthorizationTestJSON-667947607 tempest-AuthorizationTestJSON-2076087517] [instance: 8709c80f-ab24-4aa9-ad2c-64ea4c51ff51] terminating bdm BlockDeviceMapping(boot_index=0,connection_info=None,created_at=2015-08-07T17:47:16Z,delete_on_termination=True,deleted=False,deleted_at=None,destination_type='local',device_name='/dev/xvda',device_type='disk',disk_bus=None,guest_format=None,id=73,image_id='a69d8d55-1745-492f-be26-cc75d64fc94d',instance=,instance_uuid=8709c80f-ab24-4aa9-ad2c-64ea4c51ff51,no_device=False,snapshot_id=None,source_type='image',updated_at=2015-08-07T17:47:18Z,volume_id=None,volume_size=None) _cleanup_volumes 
/opt/stack/new/nova/nova/compute/manager.py:2272 2015-08-07 17:48:36.026 DEBUG oslo_concurrency.lockutils [req-af9e6ee3-672e-450e-8182-e0ae47b07441 tempest-AuthorizationTestJSON-667947607 tempest-AuthorizationTestJSON-2076087517] Lock "compute_resources" acquired by "update_usage" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:48:36.028 DEBUG nova.objects.instance [req-af9e6ee3-672e-450e-8182-e0ae47b07441 tempest-AuthorizationTestJSON-667947607 tempest-AuthorizationTestJSON-2076087517] Lazy-loading `numa_topology' on Instance uuid 8709c80f-ab24-4aa9-ad2c-64ea4c51ff51 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:48:36.097 DEBUG oslo_concurrency.lockutils [req-af9e6ee3-672e-450e-8182-e0ae47b07441 tempest-AuthorizationTestJSON-667947607 tempest-AuthorizationTestJSON-2076087517] Lock "compute_resources" released by "update_usage" :: held 0.071s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:48:36.310 DEBUG oslo_concurrency.lockutils [req-af9e6ee3-672e-450e-8182-e0ae47b07441 tempest-AuthorizationTestJSON-667947607 tempest-AuthorizationTestJSON-2076087517] Lock "8709c80f-ab24-4aa9-ad2c-64ea4c51ff51" released by "do_terminate_instance" :: held 15.132s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:48:41.280 INFO nova.compute.manager [req-91450dc0-789e-40ab-b380-b77df073dc2b tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] [instance: 6f6a5a93-63ed-4d42-be4c-e81077af680b] Pausing 2015-08-07 17:48:41.331 DEBUG nova.compute.manager [req-91450dc0-789e-40ab-b380-b77df073dc2b tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] [instance: 6f6a5a93-63ed-4d42-be4c-e81077af680b] Checking state _get_power_state /opt/stack/new/nova/nova/compute/manager.py:1296 2015-08-07 17:48:42.951 INFO nova.compute.manager [req-116fc956-2e50-429c-bec0-356240244e13 tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] [instance: 6f6a5a93-63ed-4d42-be4c-e81077af680b] Unpausing 2015-08-07 17:48:43.068 DEBUG nova.compute.manager [req-116fc956-2e50-429c-bec0-356240244e13 tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] [instance: 6f6a5a93-63ed-4d42-be4c-e81077af680b] Checking state _get_power_state /opt/stack/new/nova/nova/compute/manager.py:1296 2015-08-07 17:48:44.201 DEBUG nova.compute.manager [req-13b1912f-f3a8-46b9-bb2f-ec2bb394bcc0 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 3d063a71-d4d9-49d1-ae99-79d2dc71c6d3] Checking state _get_power_state /opt/stack/new/nova/nova/compute/manager.py:1296 2015-08-07 17:48:45.083 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.96 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:48:45.521 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:48:45.522 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 0.99 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:48:46.009 DEBUG oslo_concurrency.lockutils 
[req-fd2b83c1-a551-49b7-8925-d9bc60292940 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "69bd2eeb-96e0-42ba-9643-c7c085279a18" acquired by "do_terminate_instance" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:48:46.011 DEBUG oslo_concurrency.lockutils [req-fd2b83c1-a551-49b7-8925-d9bc60292940 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "69bd2eeb-96e0-42ba-9643-c7c085279a18-events" acquired by "_clear_events" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:48:46.011 DEBUG oslo_concurrency.lockutils [req-fd2b83c1-a551-49b7-8925-d9bc60292940 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "69bd2eeb-96e0-42ba-9643-c7c085279a18-events" released by "_clear_events" :: held 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:48:46.012 INFO nova.compute.manager [req-fd2b83c1-a551-49b7-8925-d9bc60292940 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 69bd2eeb-96e0-42ba-9643-c7c085279a18] Terminating instance 2015-08-07 17:48:46.013 INFO nova.virt.xenapi.vmops [req-fd2b83c1-a551-49b7-8925-d9bc60292940 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 69bd2eeb-96e0-42ba-9643-c7c085279a18] Destroying VM 2015-08-07 17:48:46.019 WARNING nova.virt.xenapi.vm_utils [req-fd2b83c1-a551-49b7-8925-d9bc60292940 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 69bd2eeb-96e0-42ba-9643-c7c085279a18] VM already halted, skipping shutdown... 2015-08-07 17:48:46.025 DEBUG nova.virt.xenapi.vmops [req-fd2b83c1-a551-49b7-8925-d9bc60292940 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 69bd2eeb-96e0-42ba-9643-c7c085279a18] Destroying VDIs _destroy_vdis /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1428 2015-08-07 17:48:46.030 DEBUG nova.virt.xenapi.vm_utils [req-fd2b83c1-a551-49b7-8925-d9bc60292940 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] VDI f6006734-242b-4374-b2ac-eac20be70bf3 is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:48:46.036 DEBUG nova.virt.xenapi.vm_utils [req-fd2b83c1-a551-49b7-8925-d9bc60292940 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] VDI 71ae5cc1-f713-41b5-8a9a-95bd0dae9ba6 is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:48:46.293 DEBUG oslo_concurrency.lockutils [req-0f8b5138-101d-46dd-8e09-8b601ad2ca5a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "af2ef72d-4895-4de0-bd40-aaa2ac498091" acquired by "do_terminate_instance" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:48:46.294 DEBUG oslo_concurrency.lockutils [req-0f8b5138-101d-46dd-8e09-8b601ad2ca5a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "af2ef72d-4895-4de0-bd40-aaa2ac498091-events" acquired by "_clear_events" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:48:46.294 DEBUG oslo_concurrency.lockutils [req-0f8b5138-101d-46dd-8e09-8b601ad2ca5a tempest-ServerActionsTestJSON-2080221373 
tempest-ServerActionsTestJSON-2017217826] Lock "af2ef72d-4895-4de0-bd40-aaa2ac498091-events" released by "_clear_events" :: held 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:48:46.295 INFO nova.compute.manager [req-0f8b5138-101d-46dd-8e09-8b601ad2ca5a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: af2ef72d-4895-4de0-bd40-aaa2ac498091] Terminating instance 2015-08-07 17:48:46.296 INFO nova.virt.xenapi.vmops [req-0f8b5138-101d-46dd-8e09-8b601ad2ca5a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: af2ef72d-4895-4de0-bd40-aaa2ac498091] Destroying VM 2015-08-07 17:48:46.302 DEBUG nova.virt.xenapi.vm_utils [req-0f8b5138-101d-46dd-8e09-8b601ad2ca5a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: af2ef72d-4895-4de0-bd40-aaa2ac498091] Shutting down VM (hard) hard_shutdown_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:348 2015-08-07 17:48:46.514 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:48:46.515 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 0.01 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:48:46.522 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:48:46.523 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 2.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:48:46.566 DEBUG nova.virt.xenapi.vmops [req-fd2b83c1-a551-49b7-8925-d9bc60292940 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 69bd2eeb-96e0-42ba-9643-c7c085279a18] Using RAW or VHD, skipping kernel and ramdisk deletion _destroy_kernel_ramdisk /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1456 2015-08-07 17:48:46.577 DEBUG nova.virt.xenapi.vm_utils [req-fd2b83c1-a551-49b7-8925-d9bc60292940 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 69bd2eeb-96e0-42ba-9643-c7c085279a18] VM destroyed destroy_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:324 2015-08-07 17:48:46.578 DEBUG nova.compute.manager [req-fd2b83c1-a551-49b7-8925-d9bc60292940 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 69bd2eeb-96e0-42ba-9643-c7c085279a18] Deallocating network for instance _deallocate_network /opt/stack/new/nova/nova/compute/manager.py:1792 2015-08-07 17:48:46.685 DEBUG oslo_concurrency.lockutils [req-bfa2d08c-4b38-4cbb-90f2-b01de06b4aec tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "3d063a71-d4d9-49d1-ae99-79d2dc71c6d3" acquired by "do_terminate_instance" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:48:46.685 DEBUG oslo_concurrency.lockutils [req-bfa2d08c-4b38-4cbb-90f2-b01de06b4aec tempest-ServerActionsTestJSON-2080221373 
tempest-ServerActionsTestJSON-2017217826] Lock "3d063a71-d4d9-49d1-ae99-79d2dc71c6d3-events" acquired by "_clear_events" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:48:46.685 DEBUG oslo_concurrency.lockutils [req-bfa2d08c-4b38-4cbb-90f2-b01de06b4aec tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "3d063a71-d4d9-49d1-ae99-79d2dc71c6d3-events" released by "_clear_events" :: held 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:48:46.687 INFO nova.compute.manager [req-bfa2d08c-4b38-4cbb-90f2-b01de06b4aec tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 3d063a71-d4d9-49d1-ae99-79d2dc71c6d3] Terminating instance 2015-08-07 17:48:46.688 INFO nova.virt.xenapi.vmops [req-bfa2d08c-4b38-4cbb-90f2-b01de06b4aec tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 3d063a71-d4d9-49d1-ae99-79d2dc71c6d3] Destroying VM 2015-08-07 17:48:46.694 DEBUG nova.virt.xenapi.vm_utils [req-bfa2d08c-4b38-4cbb-90f2-b01de06b4aec tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 3d063a71-d4d9-49d1-ae99-79d2dc71c6d3] Shutting down VM (hard) hard_shutdown_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:348 2015-08-07 17:48:47.053 DEBUG oslo_concurrency.lockutils [req-c5b129d8-6bbf-47a9-a16c-fb51fb71bcc2 tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] Lock "d184bcb6-3324-4ae5-8544-52067dce2444" acquired by "_locked_do_build_and_run_instance" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:48:47.088 INFO nova.compute.manager [req-c5b129d8-6bbf-47a9-a16c-fb51fb71bcc2 tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] [instance: d184bcb6-3324-4ae5-8544-52067dce2444] Starting instance... 
2015-08-07 17:48:47.244 DEBUG nova.compute.manager [req-c5b129d8-6bbf-47a9-a16c-fb51fb71bcc2 tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] [instance: d184bcb6-3324-4ae5-8544-52067dce2444] Unexpected task state: expecting [u'scheduling', None] but the actual state is deleting _do_build_and_run_instance /opt/stack/new/nova/nova/compute/manager.py:1871 2015-08-07 17:48:47.284 DEBUG oslo_concurrency.lockutils [req-c5b129d8-6bbf-47a9-a16c-fb51fb71bcc2 tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] Lock "d184bcb6-3324-4ae5-8544-52067dce2444" released by "_locked_do_build_and_run_instance" :: held 0.231s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:48:47.965 DEBUG nova.compute.manager [req-fd2b83c1-a551-49b7-8925-d9bc60292940 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 69bd2eeb-96e0-42ba-9643-c7c085279a18] terminating bdm BlockDeviceMapping(boot_index=0,connection_info=None,created_at=2015-08-07T17:39:50Z,delete_on_termination=True,deleted=False,deleted_at=None,destination_type='local',device_name='/dev/xvda',device_type='disk',disk_bus=None,guest_format=None,id=53,image_id='a69d8d55-1745-492f-be26-cc75d64fc94d',instance=,instance_uuid=69bd2eeb-96e0-42ba-9643-c7c085279a18,no_device=False,snapshot_id=None,source_type='image',updated_at=2015-08-07T17:39:52Z,volume_id=None,volume_size=None) _cleanup_volumes /opt/stack/new/nova/nova/compute/manager.py:2272 2015-08-07 17:48:48.091 DEBUG oslo_concurrency.lockutils [req-fd2b83c1-a551-49b7-8925-d9bc60292940 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "compute_resources" acquired by "update_usage" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:48:48.092 DEBUG nova.objects.instance [req-fd2b83c1-a551-49b7-8925-d9bc60292940 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lazy-loading `numa_topology' on Instance uuid 69bd2eeb-96e0-42ba-9643-c7c085279a18 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:48:48.162 DEBUG oslo_concurrency.lockutils [req-fd2b83c1-a551-49b7-8925-d9bc60292940 tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "compute_resources" released by "update_usage" :: held 0.071s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:48:48.232 DEBUG nova.virt.xenapi.vmops [req-0f8b5138-101d-46dd-8e09-8b601ad2ca5a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: af2ef72d-4895-4de0-bd40-aaa2ac498091] Destroying VDIs _destroy_vdis /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1428 2015-08-07 17:48:48.246 DEBUG nova.virt.xenapi.vm_utils [req-0f8b5138-101d-46dd-8e09-8b601ad2ca5a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] VDI 60495381-4db4-4b06-83c7-7b5f778290c3 is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:48:48.255 DEBUG nova.virt.xenapi.vm_utils [req-0f8b5138-101d-46dd-8e09-8b601ad2ca5a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] VDI 8f4b9096-aabb-411b-8fda-a208c74e5a95 is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:48:48.387 DEBUG oslo_concurrency.lockutils [req-fd2b83c1-a551-49b7-8925-d9bc60292940 
tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "69bd2eeb-96e0-42ba-9643-c7c085279a18" released by "do_terminate_instance" :: held 2.377s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:48:48.521 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:48:48.522 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Starting heal instance info cache _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5315 2015-08-07 17:48:48.626 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: af2ef72d-4895-4de0-bd40-aaa2ac498091] Skipping network cache update for instance because it is being deleted. _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5364 2015-08-07 17:48:48.706 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: 3d063a71-d4d9-49d1-ae99-79d2dc71c6d3] Skipping network cache update for instance because it is being deleted. _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5364 2015-08-07 17:48:48.707 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Didn't find any instances for network info cache update. _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5386 2015-08-07 17:48:48.707 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 4.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:48:48.814 DEBUG nova.virt.xenapi.vmops [req-0f8b5138-101d-46dd-8e09-8b601ad2ca5a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: af2ef72d-4895-4de0-bd40-aaa2ac498091] Using RAW or VHD, skipping kernel and ramdisk deletion _destroy_kernel_ramdisk /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1456 2015-08-07 17:48:48.825 DEBUG nova.virt.xenapi.vm_utils [req-0f8b5138-101d-46dd-8e09-8b601ad2ca5a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: af2ef72d-4895-4de0-bd40-aaa2ac498091] VM destroyed destroy_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:324 2015-08-07 17:48:48.825 DEBUG nova.compute.manager [req-0f8b5138-101d-46dd-8e09-8b601ad2ca5a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: af2ef72d-4895-4de0-bd40-aaa2ac498091] Deallocating network for instance _deallocate_network /opt/stack/new/nova/nova/compute/manager.py:1792 2015-08-07 17:48:48.899 DEBUG nova.virt.xenapi.vmops [req-bfa2d08c-4b38-4cbb-90f2-b01de06b4aec tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 3d063a71-d4d9-49d1-ae99-79d2dc71c6d3] Destroying VDIs _destroy_vdis /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1428 2015-08-07 17:48:48.906 DEBUG nova.virt.xenapi.vm_utils [req-bfa2d08c-4b38-4cbb-90f2-b01de06b4aec tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] VDI 14eea257-22ad-4a41-99b6-527478579377 is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:48:48.916 DEBUG nova.virt.xenapi.vm_utils [req-bfa2d08c-4b38-4cbb-90f2-b01de06b4aec 
tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] VDI 5dccc3ca-59a4-42bd-8c41-3ea04967b6a1 is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:48:49.405 DEBUG nova.virt.xenapi.vmops [req-bfa2d08c-4b38-4cbb-90f2-b01de06b4aec tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 3d063a71-d4d9-49d1-ae99-79d2dc71c6d3] Using RAW or VHD, skipping kernel and ramdisk deletion _destroy_kernel_ramdisk /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1456 2015-08-07 17:48:49.415 DEBUG nova.virt.xenapi.vm_utils [req-bfa2d08c-4b38-4cbb-90f2-b01de06b4aec tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 3d063a71-d4d9-49d1-ae99-79d2dc71c6d3] VM destroyed destroy_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:324 2015-08-07 17:48:49.416 DEBUG nova.compute.manager [req-bfa2d08c-4b38-4cbb-90f2-b01de06b4aec tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 3d063a71-d4d9-49d1-ae99-79d2dc71c6d3] Deallocating network for instance _deallocate_network /opt/stack/new/nova/nova/compute/manager.py:1792 2015-08-07 17:48:49.987 DEBUG nova.compute.manager [req-0f8b5138-101d-46dd-8e09-8b601ad2ca5a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: af2ef72d-4895-4de0-bd40-aaa2ac498091] terminating bdm BlockDeviceMapping(boot_index=0,connection_info=None,created_at=2015-08-07T17:40:31Z,delete_on_termination=True,deleted=False,deleted_at=None,destination_type='local',device_name='/dev/xvda',device_type='disk',disk_bus=None,guest_format=None,id=54,image_id='a69d8d55-1745-492f-be26-cc75d64fc94d',instance=,instance_uuid=af2ef72d-4895-4de0-bd40-aaa2ac498091,no_device=False,snapshot_id=None,source_type='image',updated_at=2015-08-07T17:40:33Z,volume_id=None,volume_size=None) _cleanup_volumes /opt/stack/new/nova/nova/compute/manager.py:2272 2015-08-07 17:48:50.154 DEBUG oslo_concurrency.lockutils [req-0f8b5138-101d-46dd-8e09-8b601ad2ca5a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "compute_resources" acquired by "update_usage" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:48:50.155 DEBUG nova.objects.instance [req-0f8b5138-101d-46dd-8e09-8b601ad2ca5a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lazy-loading `numa_topology' on Instance uuid af2ef72d-4895-4de0-bd40-aaa2ac498091 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:48:50.232 DEBUG oslo_concurrency.lockutils [req-0f8b5138-101d-46dd-8e09-8b601ad2ca5a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "compute_resources" released by "update_usage" :: held 0.078s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:48:50.459 DEBUG oslo_concurrency.lockutils [req-0f8b5138-101d-46dd-8e09-8b601ad2ca5a tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "af2ef72d-4895-4de0-bd40-aaa2ac498091" released by "do_terminate_instance" :: held 4.166s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:48:50.725 DEBUG nova.compute.manager [req-bfa2d08c-4b38-4cbb-90f2-b01de06b4aec tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] [instance: 3d063a71-d4d9-49d1-ae99-79d2dc71c6d3] 
terminating bdm BlockDeviceMapping(boot_index=0,connection_info=None,created_at=2015-08-07T17:45:38Z,delete_on_termination=True,deleted=False,deleted_at=None,destination_type='local',device_name='/dev/xvda',device_type='disk',disk_bus=None,guest_format=None,id=69,image_id='a69d8d55-1745-492f-be26-cc75d64fc94d',instance=,instance_uuid=3d063a71-d4d9-49d1-ae99-79d2dc71c6d3,no_device=False,snapshot_id=None,source_type='image',updated_at=2015-08-07T17:45:39Z,volume_id=None,volume_size=None) _cleanup_volumes /opt/stack/new/nova/nova/compute/manager.py:2272 2015-08-07 17:48:50.895 DEBUG oslo_concurrency.lockutils [req-bfa2d08c-4b38-4cbb-90f2-b01de06b4aec tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "compute_resources" acquired by "update_usage" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:48:50.896 DEBUG nova.objects.instance [req-bfa2d08c-4b38-4cbb-90f2-b01de06b4aec tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lazy-loading `numa_topology' on Instance uuid 3d063a71-d4d9-49d1-ae99-79d2dc71c6d3 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:48:50.963 DEBUG oslo_concurrency.lockutils [req-bfa2d08c-4b38-4cbb-90f2-b01de06b4aec tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "compute_resources" released by "update_usage" :: held 0.067s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:48:51.153 DEBUG oslo_concurrency.lockutils [req-bfa2d08c-4b38-4cbb-90f2-b01de06b4aec tempest-ServerActionsTestJSON-2080221373 tempest-ServerActionsTestJSON-2017217826] Lock "3d063a71-d4d9-49d1-ae99-79d2dc71c6d3" released by "do_terminate_instance" :: held 4.469s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:48:52.036 DEBUG nova.virt.xenapi.vm_utils [req-b86f3dd0-f097-4c00-81e2-5226aae71c90 tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] [instance: 6f6a5a93-63ed-4d42-be4c-e81077af680b] Shutting down VM (hard) hard_shutdown_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:348 2015-08-07 17:48:52.707 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:48:52.708 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /opt/stack/new/nova/nova/compute/manager.py:5976 2015-08-07 17:48:52.708 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 2.81 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:48:53.580 DEBUG nova.virt.xenapi.vm_utils [req-b86f3dd0-f097-4c00-81e2-5226aae71c90 tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] [instance: 6f6a5a93-63ed-4d42-be4c-e81077af680b] Starting snapshot for VM _snapshot_attached_here_impl /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:714 2015-08-07 17:48:53.600 DEBUG oslo_concurrency.lockutils [req-b86f3dd0-f097-4c00-81e2-5226aae71c90 tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" acquired by "do_scan" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:48:53.608 DEBUG nova.virt.xenapi.vm_utils [req-b86f3dd0-f097-4c00-81e2-5226aae71c90 tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] Scanning SR OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99 do_scan /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1839 2015-08-07 17:48:53.906 DEBUG oslo_concurrency.lockutils [req-b86f3dd0-f097-4c00-81e2-5226aae71c90 tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" released by "do_scan" :: held 0.306s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:48:53.912 DEBUG nova.virt.xenapi.vm_utils [req-b86f3dd0-f097-4c00-81e2-5226aae71c90 tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] VHD 71d5ce64-e2df-4f93-90c3-24a236638912 has parent 4027f457-a9bb-499a-8844-79fc67f11377 _get_vhd_parent_uuid /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2006 2015-08-07 17:48:53.929 DEBUG nova.virt.xenapi.vm_utils [req-b86f3dd0-f097-4c00-81e2-5226aae71c90 tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] VHD 71d5ce64-e2df-4f93-90c3-24a236638912 has parent 4027f457-a9bb-499a-8844-79fc67f11377 _get_vhd_parent_uuid /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2006 2015-08-07 17:48:53.938 DEBUG nova.virt.xenapi.vm_utils [req-b86f3dd0-f097-4c00-81e2-5226aae71c90 tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] VHD 88050801-2d71-415c-83d9-dd9349e75f0c has parent eef80aa6-f16e-4dc7-9b05-68cae1b6f304 _get_vhd_parent_uuid /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2006 2015-08-07 17:48:53.945 DEBUG nova.virt.xenapi.vm_utils [req-b86f3dd0-f097-4c00-81e2-5226aae71c90 tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] VHD adc68f5c-f9ed-4dfe-862f-8738fb519f84 has parent 4027f457-a9bb-499a-8844-79fc67f11377 _get_vhd_parent_uuid /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2006 2015-08-07 17:48:53.954 DEBUG nova.virt.xenapi.vm_utils [req-b86f3dd0-f097-4c00-81e2-5226aae71c90 tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] [instance: 6f6a5a93-63ed-4d42-be4c-e81077af680b] No snapshots to remove. 
_delete_snapshots_in_vdi_chain /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:677 2015-08-07 17:48:54.841 DEBUG nova.virt.xenapi.vm_utils [req-b86f3dd0-f097-4c00-81e2-5226aae71c90 tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] [instance: 6f6a5a93-63ed-4d42-be4c-e81077af680b] Parent has other children, coalesce is unlikely. _wait_for_vhd_coalesce /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2092 2015-08-07 17:48:54.846 DEBUG oslo_concurrency.lockutils [req-b86f3dd0-f097-4c00-81e2-5226aae71c90 tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" acquired by "do_scan" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:48:54.848 DEBUG nova.virt.xenapi.vm_utils [req-b86f3dd0-f097-4c00-81e2-5226aae71c90 tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] Scanning SR OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99 do_scan /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1839 2015-08-07 17:48:55.083 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.96 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:48:55.171 DEBUG oslo_concurrency.lockutils [req-8af9fcde-135d-4574-9022-8e02c6cbac50 tempest-ServerAddressesNegativeTestJSON-1212877604 tempest-ServerAddressesNegativeTestJSON-1725097141] Lock "e0cc4887-0c12-4012-a49e-55fdf90fda04" acquired by "_locked_do_build_and_run_instance" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:48:55.188 DEBUG oslo_concurrency.lockutils [req-b86f3dd0-f097-4c00-81e2-5226aae71c90 tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" released by "do_scan" :: held 0.342s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:48:55.194 DEBUG nova.virt.xenapi.vm_utils [req-b86f3dd0-f097-4c00-81e2-5226aae71c90 tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] VHD 62126dde-4191-43e4-9de0-3ed802d37a61 has parent e9146aeb-79a2-402a-b874-b50ecca0851f _get_vhd_parent_uuid /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2006 2015-08-07 17:48:55.207 DEBUG nova.virt.xenapi.vm_utils [req-b86f3dd0-f097-4c00-81e2-5226aae71c90 tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] VHD e9146aeb-79a2-402a-b874-b50ecca0851f has parent 4027f457-a9bb-499a-8844-79fc67f11377 _get_vhd_parent_uuid /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2006 2015-08-07 17:48:55.211 INFO nova.compute.manager [req-8af9fcde-135d-4574-9022-8e02c6cbac50 tempest-ServerAddressesNegativeTestJSON-1212877604 tempest-ServerAddressesNegativeTestJSON-1725097141] [instance: e0cc4887-0c12-4012-a49e-55fdf90fda04] Starting instance... 
2015-08-07 17:48:55.354 DEBUG nova.virt.xenapi.client.session [req-b86f3dd0-f097-4c00-81e2-5226aae71c90 tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] glance.upload_vhd attempt 1/1, callback_result: 192.168.33.1 call_plugin_serialized_with_retry /opt/stack/new/nova/nova/virt/xenapi/client/session.py:245 2015-08-07 17:48:55.386 DEBUG oslo_concurrency.lockutils [req-8af9fcde-135d-4574-9022-8e02c6cbac50 tempest-ServerAddressesNegativeTestJSON-1212877604 tempest-ServerAddressesNegativeTestJSON-1725097141] Lock "compute_resources" acquired by "instance_claim" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:48:55.389 DEBUG nova.compute.resource_tracker [req-8af9fcde-135d-4574-9022-8e02c6cbac50 tempest-ServerAddressesNegativeTestJSON-1212877604 tempest-ServerAddressesNegativeTestJSON-1725097141] Memory overhead for 64 MB instance; 5 MB instance_claim /opt/stack/new/nova/nova/compute/resource_tracker.py:126 2015-08-07 17:48:55.395 INFO nova.compute.claims [req-8af9fcde-135d-4574-9022-8e02c6cbac50 tempest-ServerAddressesNegativeTestJSON-1212877604 tempest-ServerAddressesNegativeTestJSON-1725097141] [instance: e0cc4887-0c12-4012-a49e-55fdf90fda04] Attempting claim: memory 69 MB, disk 0 GB 2015-08-07 17:48:55.396 INFO nova.compute.claims [req-8af9fcde-135d-4574-9022-8e02c6cbac50 tempest-ServerAddressesNegativeTestJSON-1212877604 tempest-ServerAddressesNegativeTestJSON-1725097141] [instance: e0cc4887-0c12-4012-a49e-55fdf90fda04] Total memory: 8187 MB, used: 581.00 MB 2015-08-07 17:48:55.397 INFO nova.compute.claims [req-8af9fcde-135d-4574-9022-8e02c6cbac50 tempest-ServerAddressesNegativeTestJSON-1212877604 tempest-ServerAddressesNegativeTestJSON-1725097141] [instance: e0cc4887-0c12-4012-a49e-55fdf90fda04] memory limit: 12280.50 MB, free: 11699.50 MB 2015-08-07 17:48:55.397 INFO nova.compute.claims [req-8af9fcde-135d-4574-9022-8e02c6cbac50 tempest-ServerAddressesNegativeTestJSON-1212877604 tempest-ServerAddressesNegativeTestJSON-1725097141] [instance: e0cc4887-0c12-4012-a49e-55fdf90fda04] Total disk: 27 GB, used: 0.00 GB 2015-08-07 17:48:55.398 INFO nova.compute.claims [req-8af9fcde-135d-4574-9022-8e02c6cbac50 tempest-ServerAddressesNegativeTestJSON-1212877604 tempest-ServerAddressesNegativeTestJSON-1725097141] [instance: e0cc4887-0c12-4012-a49e-55fdf90fda04] disk limit not specified, defaulting to unlimited 2015-08-07 17:48:55.412 DEBUG nova.compute.resources.vcpu [req-8af9fcde-135d-4574-9022-8e02c6cbac50 tempest-ServerAddressesNegativeTestJSON-1212877604 tempest-ServerAddressesNegativeTestJSON-1725097141] Total CPUs: 8 VCPUs, used: 1.00 VCPUs test /opt/stack/new/nova/nova/compute/resources/vcpu.py:52 2015-08-07 17:48:55.413 DEBUG nova.compute.resources.vcpu [req-8af9fcde-135d-4574-9022-8e02c6cbac50 tempest-ServerAddressesNegativeTestJSON-1212877604 tempest-ServerAddressesNegativeTestJSON-1725097141] CPUs limit not specified, defaulting to unlimited test /opt/stack/new/nova/nova/compute/resources/vcpu.py:56 2015-08-07 17:48:55.413 INFO nova.compute.claims [req-8af9fcde-135d-4574-9022-8e02c6cbac50 tempest-ServerAddressesNegativeTestJSON-1212877604 tempest-ServerAddressesNegativeTestJSON-1725097141] [instance: e0cc4887-0c12-4012-a49e-55fdf90fda04] Claim successful 2015-08-07 17:48:55.521 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks 
/usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:48:55.522 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 1.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:48:55.720 DEBUG oslo_concurrency.lockutils [req-8af9fcde-135d-4574-9022-8e02c6cbac50 tempest-ServerAddressesNegativeTestJSON-1212877604 tempest-ServerAddressesNegativeTestJSON-1725097141] Lock "compute_resources" released by "instance_claim" :: held 0.335s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:48:55.971 DEBUG oslo_concurrency.lockutils [req-8af9fcde-135d-4574-9022-8e02c6cbac50 tempest-ServerAddressesNegativeTestJSON-1212877604 tempest-ServerAddressesNegativeTestJSON-1725097141] Lock "compute_resources" acquired by "update_usage" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:48:56.049 DEBUG oslo_concurrency.lockutils [req-8af9fcde-135d-4574-9022-8e02c6cbac50 tempest-ServerAddressesNegativeTestJSON-1212877604 tempest-ServerAddressesNegativeTestJSON-1725097141] Lock "compute_resources" released by "update_usage" :: held 0.078s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:48:56.050 DEBUG nova.compute.utils [req-8af9fcde-135d-4574-9022-8e02c6cbac50 tempest-ServerAddressesNegativeTestJSON-1212877604 tempest-ServerAddressesNegativeTestJSON-1725097141] Using /dev/xvd instead of None get_next_device_name /opt/stack/new/nova/nova/compute/utils.py:163 2015-08-07 17:48:56.054 13318 DEBUG nova.compute.manager [-] [instance: e0cc4887-0c12-4012-a49e-55fdf90fda04] Allocating IP information in the background. 
_allocate_network_async /opt/stack/new/nova/nova/compute/manager.py:1513 2015-08-07 17:48:56.055 13318 DEBUG oslo_concurrency.lockutils [-] Acquired semaphore "refresh_cache-e0cc4887-0c12-4012-a49e-55fdf90fda04" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:197 2015-08-07 17:48:56.443 DEBUG nova.virt.xenapi.vmops [req-8af9fcde-135d-4574-9022-8e02c6cbac50 tempest-ServerAddressesNegativeTestJSON-1212877604 tempest-ServerAddressesNegativeTestJSON-1725097141] [instance: e0cc4887-0c12-4012-a49e-55fdf90fda04] Block device information present: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} spawn /opt/stack/new/nova/nova/virt/xenapi/vmops.py:408 2015-08-07 17:48:56.456 DEBUG nova.virt.xenapi.vm_utils [req-8af9fcde-135d-4574-9022-8e02c6cbac50 tempest-ServerAddressesNegativeTestJSON-1212877604 tempest-ServerAddressesNegativeTestJSON-1725097141] Detected vhd format for image a69d8d55-1745-492f-be26-cc75d64fc94d determine_disk_image_type /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1633 2015-08-07 17:48:56.457 DEBUG nova.virt.xenapi.vmops [req-8af9fcde-135d-4574-9022-8e02c6cbac50 tempest-ServerAddressesNegativeTestJSON-1212877604 tempest-ServerAddressesNegativeTestJSON-1725097141] [instance: e0cc4887-0c12-4012-a49e-55fdf90fda04] Updating progress to 10 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:48:56.522 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:48:56.523 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 1.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:48:56.660 DEBUG nova.virt.xenapi.vmops [req-8af9fcde-135d-4574-9022-8e02c6cbac50 tempest-ServerAddressesNegativeTestJSON-1212877604 tempest-ServerAddressesNegativeTestJSON-1725097141] [instance: e0cc4887-0c12-4012-a49e-55fdf90fda04] block device info: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} _connect_cinder_volumes /opt/stack/new/nova/nova/virt/xenapi/vmops.py:376 2015-08-07 17:48:56.667 DEBUG oslo_concurrency.lockutils [req-8af9fcde-135d-4574-9022-8e02c6cbac50 tempest-ServerAddressesNegativeTestJSON-1212877604 tempest-ServerAddressesNegativeTestJSON-1725097141] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" acquired by "_create_cached_image_impl" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:48:57.448 DEBUG nova.virt.xenapi.vm_utils [req-8af9fcde-135d-4574-9022-8e02c6cbac50 tempest-ServerAddressesNegativeTestJSON-1212877604 tempest-ServerAddressesNegativeTestJSON-1725097141] Cloned VDI OpaqueRef:3562d90c-e28f-eaf3-33f7-9db794803682 from VDI OpaqueRef:f392c64e-7886-a0cc-d3a0-780674167307 _clone_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:571 2015-08-07 17:48:57.523 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:48:57.554 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Auditing locally available compute 
resources for node localhost.localdomain 2015-08-07 17:48:57.554 DEBUG nova.virt.xenapi.host [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Updating host stats update_status /opt/stack/new/nova/nova/virt/xenapi/host.py:234 2015-08-07 17:48:57.754 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" acquired by "do_scan" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:48:57.754 DEBUG nova.virt.xenapi.vm_utils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Scanning SR OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99 do_scan /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1839 2015-08-07 17:48:57.792 13318 DEBUG nova.network.base_api [-] [instance: e0cc4887-0c12-4012-a49e-55fdf90fda04] Updating instance_info_cache with network_info: [VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.2'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:cd:bf:c5', 'active': False, 'type': u'bridge', 'id': u'9ed8adcf-1224-4956-9db3-a179702fb0df', 'qbg_params': None})] update_instance_cache_with_nw_info /opt/stack/new/nova/nova/network/base_api.py:43 2015-08-07 17:48:57.820 13318 DEBUG oslo_concurrency.lockutils [-] Releasing semaphore "refresh_cache-e0cc4887-0c12-4012-a49e-55fdf90fda04" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:210 2015-08-07 17:48:57.821 13318 DEBUG nova.compute.manager [-] [instance: e0cc4887-0c12-4012-a49e-55fdf90fda04] Instance network_info: |[VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.2'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:cd:bf:c5', 'active': False, 'type': u'bridge', 'id': u'9ed8adcf-1224-4956-9db3-a179702fb0df', 
'qbg_params': None})]| _allocate_network_async /opt/stack/new/nova/nova/compute/manager.py:1531 2015-08-07 17:48:57.998 DEBUG oslo_concurrency.lockutils [req-8af9fcde-135d-4574-9022-8e02c6cbac50 tempest-ServerAddressesNegativeTestJSON-1212877604 tempest-ServerAddressesNegativeTestJSON-1725097141] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" released by "_create_cached_image_impl" :: held 1.331s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:48:57.999 INFO nova.virt.xenapi.vm_utils [req-8af9fcde-135d-4574-9022-8e02c6cbac50 tempest-ServerAddressesNegativeTestJSON-1212877604 tempest-ServerAddressesNegativeTestJSON-1725097141] Image creation data, cacheable: True, downloaded: False duration: 1.34 secs for image a69d8d55-1745-492f-be26-cc75d64fc94d 2015-08-07 17:48:58.192 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" released by "do_scan" :: held 0.438s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:48:58.366 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor: free VCPUs: 2 _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:524 2015-08-07 17:48:58.367 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor: assignable PCI devices: [] _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:532 2015-08-07 17:48:58.367 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor/Node resource view: name=localhost.localdomain free_ram=989MB free_disk=16GB free_vcpus=2 pci_devices=[] _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:546 2015-08-07 17:48:58.368 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "compute_resources" acquired by "_update_available_resource" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:48:58.544 DEBUG nova.virt.xenapi.vmops [req-b86f3dd0-f097-4c00-81e2-5226aae71c90 tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] [instance: 6f6a5a93-63ed-4d42-be4c-e81077af680b] Finished snapshot and upload for VM, duration: 4.96 secs for image 7d0cecc5-2bd7-4c90-9b67-c82bed263a09 snapshot /opt/stack/new/nova/nova/virt/xenapi/vmops.py:894 2015-08-07 17:48:58.545 DEBUG nova.compute.manager [req-b86f3dd0-f097-4c00-81e2-5226aae71c90 tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] [instance: 6f6a5a93-63ed-4d42-be4c-e81077af680b] Checking state _get_power_state /opt/stack/new/nova/nova/compute/manager.py:1296 2015-08-07 17:48:58.559 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Total usable vcpus: 8, total allocated vcpus: 6 2015-08-07 17:48:58.559 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Final resource view: name=localhost.localdomain phys_ram=8187MB used_ram=650MB phys_disk=27GB used_disk=0GB total_vcpus=8 used_vcpus=6 pci_stats=None 2015-08-07 17:48:58.610 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Compute_service record updated for devstack:localhost.localdomain 2015-08-07 17:48:58.611 DEBUG oslo_concurrency.lockutils 
[req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "compute_resources" released by "_update_available_resource" :: held 0.243s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:48:58.611 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 10.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:48:58.649 DEBUG nova.virt.xenapi.vmops [req-8af9fcde-135d-4574-9022-8e02c6cbac50 tempest-ServerAddressesNegativeTestJSON-1212877604 tempest-ServerAddressesNegativeTestJSON-1725097141] [instance: e0cc4887-0c12-4012-a49e-55fdf90fda04] Updating progress to 20 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:48:58.768 WARNING nova.virt.xenapi.vm_utils [req-b86f3dd0-f097-4c00-81e2-5226aae71c90 tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] [instance: 6f6a5a93-63ed-4d42-be4c-e81077af680b] VM already halted, skipping shutdown... 2015-08-07 17:48:58.769 DEBUG nova.compute.manager [req-b86f3dd0-f097-4c00-81e2-5226aae71c90 tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] [instance: 6f6a5a93-63ed-4d42-be4c-e81077af680b] Checking state _get_power_state /opt/stack/new/nova/nova/compute/manager.py:1296 2015-08-07 17:48:58.815 DEBUG nova.virt.xenapi.vmops [req-8af9fcde-135d-4574-9022-8e02c6cbac50 tempest-ServerAddressesNegativeTestJSON-1212877604 tempest-ServerAddressesNegativeTestJSON-1725097141] [instance: e0cc4887-0c12-4012-a49e-55fdf90fda04] Updating progress to 30 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:48:58.849 DEBUG oslo_concurrency.lockutils [req-b86f3dd0-f097-4c00-81e2-5226aae71c90 tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] Acquired semaphore "refresh_cache-6f6a5a93-63ed-4d42-be4c-e81077af680b" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:197 2015-08-07 17:48:58.953 DEBUG nova.network.base_api [req-b86f3dd0-f097-4c00-81e2-5226aae71c90 tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] [instance: 6f6a5a93-63ed-4d42-be4c-e81077af680b] Updating instance_info_cache with network_info: [VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.7'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:57:45:34', 'active': False, 'type': u'bridge', 'id': u'df69d282-ed60-4e5f-a4f6-4c3a49a16e1c', 'qbg_params': None})] update_instance_cache_with_nw_info /opt/stack/new/nova/nova/network/base_api.py:43 2015-08-07 
17:48:58.979 DEBUG nova.virt.xenapi.vmops [req-8af9fcde-135d-4574-9022-8e02c6cbac50 tempest-ServerAddressesNegativeTestJSON-1212877604 tempest-ServerAddressesNegativeTestJSON-1725097141] [instance: e0cc4887-0c12-4012-a49e-55fdf90fda04] Using PV kernel: True _create_vm_record /opt/stack/new/nova/nova/virt/xenapi/vmops.py:679 2015-08-07 17:48:58.986 DEBUG oslo_concurrency.lockutils [req-b86f3dd0-f097-4c00-81e2-5226aae71c90 tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] Releasing semaphore "refresh_cache-6f6a5a93-63ed-4d42-be4c-e81077af680b" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:210 2015-08-07 17:48:58.988 DEBUG nova.virt.xenapi.vm_utils [req-8af9fcde-135d-4574-9022-8e02c6cbac50 tempest-ServerAddressesNegativeTestJSON-1212877604 tempest-ServerAddressesNegativeTestJSON-1725097141] [instance: e0cc4887-0c12-4012-a49e-55fdf90fda04] Created VM create_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:312 2015-08-07 17:48:58.988 DEBUG nova.virt.xenapi.vmops [req-8af9fcde-135d-4574-9022-8e02c6cbac50 tempest-ServerAddressesNegativeTestJSON-1212877604 tempest-ServerAddressesNegativeTestJSON-1725097141] [instance: e0cc4887-0c12-4012-a49e-55fdf90fda04] Updating progress to 40 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:48:59.015 INFO nova.virt.xenapi.vmops [req-b86f3dd0-f097-4c00-81e2-5226aae71c90 tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] [instance: 6f6a5a93-63ed-4d42-be4c-e81077af680b] Destroying VM 2015-08-07 17:48:59.024 WARNING nova.virt.xenapi.vm_utils [req-b86f3dd0-f097-4c00-81e2-5226aae71c90 tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] [instance: 6f6a5a93-63ed-4d42-be4c-e81077af680b] VM already halted, skipping shutdown... 2015-08-07 17:48:59.036 DEBUG nova.virt.xenapi.vmops [req-b86f3dd0-f097-4c00-81e2-5226aae71c90 tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] [instance: 6f6a5a93-63ed-4d42-be4c-e81077af680b] Destroying VDIs _destroy_vdis /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1428 2015-08-07 17:48:59.041 DEBUG nova.virt.xenapi.vm_utils [req-b86f3dd0-f097-4c00-81e2-5226aae71c90 tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] VDI 71d5ce64-e2df-4f93-90c3-24a236638912 is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:48:59.046 DEBUG nova.virt.xenapi.vm_utils [req-b86f3dd0-f097-4c00-81e2-5226aae71c90 tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] VDI e05240ed-1164-4419-8b6f-d3a11076f710 is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:48:59.301 DEBUG nova.virt.xenapi.vm_utils [req-8af9fcde-135d-4574-9022-8e02c6cbac50 tempest-ServerAddressesNegativeTestJSON-1212877604 tempest-ServerAddressesNegativeTestJSON-1725097141] Creating disk-type VBD for VM OpaqueRef:018321c0-77a0-2bd6-9235-a40064b3da61, VDI OpaqueRef:3562d90c-e28f-eaf3-33f7-9db794803682 ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:48:59.312 DEBUG nova.virt.xenapi.vm_utils [req-8af9fcde-135d-4574-9022-8e02c6cbac50 tempest-ServerAddressesNegativeTestJSON-1212877604 tempest-ServerAddressesNegativeTestJSON-1725097141] Created VBD OpaqueRef:cecc6210-02ae-5f2a-aafe-3be38066fb3a for VM OpaqueRef:018321c0-77a0-2bd6-9235-a40064b3da61, VDI OpaqueRef:3562d90c-e28f-eaf3-33f7-9db794803682. 
create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:48:59.580 DEBUG nova.virt.xenapi.vmops [req-b86f3dd0-f097-4c00-81e2-5226aae71c90 tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] [instance: 6f6a5a93-63ed-4d42-be4c-e81077af680b] Using RAW or VHD, skipping kernel and ramdisk deletion _destroy_kernel_ramdisk /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1456 2015-08-07 17:48:59.592 DEBUG nova.virt.xenapi.vm_utils [req-b86f3dd0-f097-4c00-81e2-5226aae71c90 tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] [instance: 6f6a5a93-63ed-4d42-be4c-e81077af680b] VM destroyed destroy_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:324 2015-08-07 17:49:00.833 DEBUG oslo_concurrency.lockutils [req-3bb6d6b8-15e5-4751-b77b-31ebeb2c9ddc tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] Lock "6f6a5a93-63ed-4d42-be4c-e81077af680b" acquired by "do_unshelve_instance" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:49:01.077 DEBUG oslo_concurrency.lockutils [req-3bb6d6b8-15e5-4751-b77b-31ebeb2c9ddc tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] Acquired semaphore "refresh_cache-6f6a5a93-63ed-4d42-be4c-e81077af680b" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:197 2015-08-07 17:49:01.179 DEBUG nova.network.base_api [req-3bb6d6b8-15e5-4751-b77b-31ebeb2c9ddc tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] [instance: 6f6a5a93-63ed-4d42-be4c-e81077af680b] Updating instance_info_cache with network_info: [VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.7'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:57:45:34', 'active': False, 'type': u'bridge', 'id': u'df69d282-ed60-4e5f-a4f6-4c3a49a16e1c', 'qbg_params': None})] update_instance_cache_with_nw_info /opt/stack/new/nova/nova/network/base_api.py:43 2015-08-07 17:49:01.204 DEBUG oslo_concurrency.lockutils [req-3bb6d6b8-15e5-4751-b77b-31ebeb2c9ddc tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] Releasing semaphore "refresh_cache-6f6a5a93-63ed-4d42-be4c-e81077af680b" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:210 2015-08-07 17:49:01.205 DEBUG oslo_concurrency.lockutils [req-3bb6d6b8-15e5-4751-b77b-31ebeb2c9ddc tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] Lock "compute_resources" acquired by "instance_claim" :: waited 0.000s inner 
/usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:49:01.205 DEBUG nova.compute.resource_tracker [req-3bb6d6b8-15e5-4751-b77b-31ebeb2c9ddc tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] Memory overhead for 64 MB instance; 5 MB instance_claim /opt/stack/new/nova/nova/compute/resource_tracker.py:126 2015-08-07 17:49:01.212 INFO nova.compute.claims [req-3bb6d6b8-15e5-4751-b77b-31ebeb2c9ddc tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] [instance: 6f6a5a93-63ed-4d42-be4c-e81077af680b] Attempting claim: memory 69 MB, disk 0 GB 2015-08-07 17:49:01.212 INFO nova.compute.claims [req-3bb6d6b8-15e5-4751-b77b-31ebeb2c9ddc tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] [instance: 6f6a5a93-63ed-4d42-be4c-e81077af680b] Total memory: 8187 MB, used: 650.00 MB 2015-08-07 17:49:01.213 INFO nova.compute.claims [req-3bb6d6b8-15e5-4751-b77b-31ebeb2c9ddc tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] [instance: 6f6a5a93-63ed-4d42-be4c-e81077af680b] memory limit: 12280.50 MB, free: 11630.50 MB 2015-08-07 17:49:01.213 INFO nova.compute.claims [req-3bb6d6b8-15e5-4751-b77b-31ebeb2c9ddc tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] [instance: 6f6a5a93-63ed-4d42-be4c-e81077af680b] Total disk: 27 GB, used: 0.00 GB 2015-08-07 17:49:01.213 INFO nova.compute.claims [req-3bb6d6b8-15e5-4751-b77b-31ebeb2c9ddc tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] [instance: 6f6a5a93-63ed-4d42-be4c-e81077af680b] disk limit not specified, defaulting to unlimited 2015-08-07 17:49:01.213 DEBUG nova.objects.instance [req-3bb6d6b8-15e5-4751-b77b-31ebeb2c9ddc tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] Lazy-loading `numa_topology' on Instance uuid 6f6a5a93-63ed-4d42-be4c-e81077af680b obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:49:01.259 DEBUG nova.compute.resources.vcpu [req-3bb6d6b8-15e5-4751-b77b-31ebeb2c9ddc tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] Total CPUs: 8 VCPUs, used: 2.00 VCPUs test /opt/stack/new/nova/nova/compute/resources/vcpu.py:52 2015-08-07 17:49:01.259 DEBUG nova.compute.resources.vcpu [req-3bb6d6b8-15e5-4751-b77b-31ebeb2c9ddc tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] CPUs limit not specified, defaulting to unlimited test /opt/stack/new/nova/nova/compute/resources/vcpu.py:56 2015-08-07 17:49:01.259 INFO nova.compute.claims [req-3bb6d6b8-15e5-4751-b77b-31ebeb2c9ddc tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] [instance: 6f6a5a93-63ed-4d42-be4c-e81077af680b] Claim successful 2015-08-07 17:49:01.515 DEBUG oslo_concurrency.lockutils [req-3bb6d6b8-15e5-4751-b77b-31ebeb2c9ddc tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] Lock "compute_resources" released by "instance_claim" :: held 0.310s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:49:01.516 DEBUG nova.virt.xenapi.vmops [req-3bb6d6b8-15e5-4751-b77b-31ebeb2c9ddc tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] [instance: 6f6a5a93-63ed-4d42-be4c-e81077af680b] Block device information present: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} spawn 
/opt/stack/new/nova/nova/virt/xenapi/vmops.py:408 2015-08-07 17:49:01.527 DEBUG nova.virt.xenapi.vm_utils [req-3bb6d6b8-15e5-4751-b77b-31ebeb2c9ddc tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] Detected vhd format for image 7d0cecc5-2bd7-4c90-9b67-c82bed263a09 determine_disk_image_type /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1633 2015-08-07 17:49:01.528 DEBUG nova.virt.xenapi.vmops [req-3bb6d6b8-15e5-4751-b77b-31ebeb2c9ddc tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] [instance: 6f6a5a93-63ed-4d42-be4c-e81077af680b] Updating progress to 10 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:49:01.700 DEBUG nova.virt.xenapi.vmops [req-3bb6d6b8-15e5-4751-b77b-31ebeb2c9ddc tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] [instance: 6f6a5a93-63ed-4d42-be4c-e81077af680b] block device info: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} _connect_cinder_volumes /opt/stack/new/nova/nova/virt/xenapi/vmops.py:376 2015-08-07 17:49:01.749 DEBUG oslo_concurrency.lockutils [req-3bb6d6b8-15e5-4751-b77b-31ebeb2c9ddc tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] Lock "xenapi-image-cache7d0cecc5-2bd7-4c90-9b67-c82bed263a09" acquired by "_create_cached_image_impl" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:49:01.751 DEBUG nova.virt.xenapi.vm_utils [req-3bb6d6b8-15e5-4751-b77b-31ebeb2c9ddc tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] [instance: 6f6a5a93-63ed-4d42-be4c-e81077af680b] Asking xapi to fetch vhd image 7d0cecc5-2bd7-4c90-9b67-c82bed263a09 _fetch_vhd_image /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1425 2015-08-07 17:49:01.768 DEBUG nova.virt.xenapi.client.session [req-3bb6d6b8-15e5-4751-b77b-31ebeb2c9ddc tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] glance.download_vhd attempt 1/1, callback_result: 192.168.33.1 call_plugin_serialized_with_retry /opt/stack/new/nova/nova/virt/xenapi/client/session.py:245 2015-08-07 17:49:04.006 DEBUG oslo_concurrency.lockutils [req-3bb6d6b8-15e5-4751-b77b-31ebeb2c9ddc tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" acquired by "do_scan" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:49:04.006 DEBUG nova.virt.xenapi.vm_utils [req-3bb6d6b8-15e5-4751-b77b-31ebeb2c9ddc tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] Scanning SR OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99 do_scan /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1839 2015-08-07 17:49:04.340 DEBUG oslo_concurrency.lockutils [req-3bb6d6b8-15e5-4751-b77b-31ebeb2c9ddc tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" released by "do_scan" :: held 0.334s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:49:04.340 DEBUG nova.virt.xenapi.vm_utils [req-3bb6d6b8-15e5-4751-b77b-31ebeb2c9ddc tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] [instance: 6f6a5a93-63ed-4d42-be4c-e81077af680b] Fetched VDIs of type 'root' with UUID 
'fee7cc36-2e44-486b-ba4a-d8bb59b182ea' _fetch_image /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1361 2015-08-07 17:49:05.073 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.97 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:49:05.602 DEBUG nova.virt.xenapi.vm_utils [req-3bb6d6b8-15e5-4751-b77b-31ebeb2c9ddc tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] Cloned VDI OpaqueRef:e98d24da-5aa2-4c22-ff91-87fef85d3419 from VDI OpaqueRef:89573797-f625-0f89-a3f1-a7a43bff3379 _clone_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:571 2015-08-07 17:49:05.952 DEBUG nova.virt.xenapi.vm_utils [req-8af9fcde-135d-4574-9022-8e02c6cbac50 tempest-ServerAddressesNegativeTestJSON-1212877604 tempest-ServerAddressesNegativeTestJSON-1725097141] Created VDI OpaqueRef:a29ca896-dd99-ee5b-9cf9-1424afbee895 (config-2, 67108864, False) on OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99. create_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:509 2015-08-07 17:49:05.955 DEBUG nova.virt.xenapi.vm_utils [req-8af9fcde-135d-4574-9022-8e02c6cbac50 tempest-ServerAddressesNegativeTestJSON-1212877604 tempest-ServerAddressesNegativeTestJSON-1725097141] Creating disk-type VBD for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:a29ca896-dd99-ee5b-9cf9-1424afbee895 ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:49:05.965 DEBUG nova.virt.xenapi.vm_utils [req-8af9fcde-135d-4574-9022-8e02c6cbac50 tempest-ServerAddressesNegativeTestJSON-1212877604 tempest-ServerAddressesNegativeTestJSON-1725097141] Created VBD OpaqueRef:ce6b3da2-23ed-2f72-b933-e663c85ffa3f for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:a29ca896-dd99-ee5b-9cf9-1424afbee895. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:49:05.966 DEBUG nova.virt.xenapi.vm_utils [req-8af9fcde-135d-4574-9022-8e02c6cbac50 tempest-ServerAddressesNegativeTestJSON-1212877604 tempest-ServerAddressesNegativeTestJSON-1725097141] Plugging VBD OpaqueRef:ce6b3da2-23ed-2f72-b933-e663c85ffa3f ... 
vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2191 2015-08-07 17:49:05.967 DEBUG oslo_concurrency.lockutils [req-8af9fcde-135d-4574-9022-8e02c6cbac50 tempest-ServerAddressesNegativeTestJSON-1212877604 tempest-ServerAddressesNegativeTestJSON-1725097141] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_plug" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:49:06.106 DEBUG oslo_concurrency.lockutils [req-3bb6d6b8-15e5-4751-b77b-31ebeb2c9ddc tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] Lock "xenapi-image-cache7d0cecc5-2bd7-4c90-9b67-c82bed263a09" released by "_create_cached_image_impl" :: held 4.357s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:49:06.106 INFO nova.virt.xenapi.vm_utils [req-3bb6d6b8-15e5-4751-b77b-31ebeb2c9ddc tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] Image creation data, cacheable: True, downloaded: True duration: 4.41 secs for image 7d0cecc5-2bd7-4c90-9b67-c82bed263a09 2015-08-07 17:49:06.641 DEBUG nova.virt.xenapi.vmops [req-3bb6d6b8-15e5-4751-b77b-31ebeb2c9ddc tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] [instance: 6f6a5a93-63ed-4d42-be4c-e81077af680b] Updating progress to 20 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:49:06.869 DEBUG nova.virt.xenapi.vmops [req-3bb6d6b8-15e5-4751-b77b-31ebeb2c9ddc tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] [instance: 6f6a5a93-63ed-4d42-be4c-e81077af680b] Updating progress to 30 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:49:06.945 DEBUG oslo_concurrency.lockutils [req-8af9fcde-135d-4574-9022-8e02c6cbac50 tempest-ServerAddressesNegativeTestJSON-1212877604 tempest-ServerAddressesNegativeTestJSON-1725097141] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_plug" :: held 0.978s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:49:06.945 DEBUG nova.virt.xenapi.vm_utils [req-8af9fcde-135d-4574-9022-8e02c6cbac50 tempest-ServerAddressesNegativeTestJSON-1212877604 tempest-ServerAddressesNegativeTestJSON-1725097141] Plugging VBD OpaqueRef:ce6b3da2-23ed-2f72-b933-e663c85ffa3f done. vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2194 2015-08-07 17:49:06.948 DEBUG nova.virt.xenapi.vm_utils [req-8af9fcde-135d-4574-9022-8e02c6cbac50 tempest-ServerAddressesNegativeTestJSON-1212877604 tempest-ServerAddressesNegativeTestJSON-1725097141] VBD OpaqueRef:ce6b3da2-23ed-2f72-b933-e663c85ffa3f plugged as xvdc vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2197 2015-08-07 17:49:07.011 WARNING nova.virt.configdrive [req-8af9fcde-135d-4574-9022-8e02c6cbac50 tempest-ServerAddressesNegativeTestJSON-1212877604 tempest-ServerAddressesNegativeTestJSON-1725097141] The setting "always" will be deprecated in the Liberty version. 
Please use "True" instead 2015-08-07 17:49:07.012 DEBUG nova.objects.instance [req-8af9fcde-135d-4574-9022-8e02c6cbac50 tempest-ServerAddressesNegativeTestJSON-1212877604 tempest-ServerAddressesNegativeTestJSON-1725097141] Lazy-loading `ec2_ids' on Instance uuid e0cc4887-0c12-4012-a49e-55fdf90fda04 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:49:07.045 DEBUG oslo_concurrency.processutils [req-8af9fcde-135d-4574-9022-8e02c6cbac50 tempest-ServerAddressesNegativeTestJSON-1212877604 tempest-ServerAddressesNegativeTestJSON-1725097141] Running cmd (subprocess): genisoimage -o /tmp/tmpSsIyjd/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmp_nRppz execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:49:07.126 DEBUG nova.virt.xenapi.vmops [req-3bb6d6b8-15e5-4751-b77b-31ebeb2c9ddc tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] [instance: 6f6a5a93-63ed-4d42-be4c-e81077af680b] Using PV kernel: True _create_vm_record /opt/stack/new/nova/nova/virt/xenapi/vmops.py:679 2015-08-07 17:49:07.130 DEBUG oslo_concurrency.processutils [req-8af9fcde-135d-4574-9022-8e02c6cbac50 tempest-ServerAddressesNegativeTestJSON-1212877604 tempest-ServerAddressesNegativeTestJSON-1725097141] CMD "genisoimage -o /tmp/tmpSsIyjd/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmp_nRppz" returned: 0 in 0.085s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:49:07.132 DEBUG oslo_concurrency.processutils [req-8af9fcde-135d-4574-9022-8e02c6cbac50 tempest-ServerAddressesNegativeTestJSON-1212877604 tempest-ServerAddressesNegativeTestJSON-1725097141] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmpSsIyjd/configdrive of=/dev/xvdc oflag=direct,sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:49:07.193 DEBUG nova.virt.xenapi.vm_utils [req-3bb6d6b8-15e5-4751-b77b-31ebeb2c9ddc tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] [instance: 6f6a5a93-63ed-4d42-be4c-e81077af680b] Created VM create_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:312 2015-08-07 17:49:07.195 DEBUG nova.virt.xenapi.vmops [req-3bb6d6b8-15e5-4751-b77b-31ebeb2c9ddc tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] [instance: 6f6a5a93-63ed-4d42-be4c-e81077af680b] Updating progress to 40 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:49:07.632 DEBUG nova.virt.xenapi.vm_utils [req-3bb6d6b8-15e5-4751-b77b-31ebeb2c9ddc tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] Creating disk-type VBD for VM OpaqueRef:ba8430e3-8980-ea19-f213-bfdf1f24e457, VDI OpaqueRef:e98d24da-5aa2-4c22-ff91-87fef85d3419 ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:49:07.639 DEBUG nova.virt.xenapi.vm_utils [req-3bb6d6b8-15e5-4751-b77b-31ebeb2c9ddc tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] Created VBD OpaqueRef:dae9f833-5184-5ccc-917c-f0cd24700a96 for VM OpaqueRef:ba8430e3-8980-ea19-f213-bfdf1f24e457, VDI OpaqueRef:e98d24da-5aa2-4c22-ff91-87fef85d3419. 
create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:49:08.434 DEBUG nova.virt.xenapi.vm_utils [req-3bb6d6b8-15e5-4751-b77b-31ebeb2c9ddc tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] Created VDI OpaqueRef:aa113079-239e-682c-1a42-25885c666676 (config-2, 67108864, False) on OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99. create_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:509 2015-08-07 17:49:08.438 DEBUG nova.virt.xenapi.vm_utils [req-3bb6d6b8-15e5-4751-b77b-31ebeb2c9ddc tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] Creating disk-type VBD for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:aa113079-239e-682c-1a42-25885c666676 ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:49:08.449 DEBUG nova.virt.xenapi.vm_utils [req-3bb6d6b8-15e5-4751-b77b-31ebeb2c9ddc tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] Created VBD OpaqueRef:925c0e46-6115-3071-70a3-2723fe1a95e9 for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:aa113079-239e-682c-1a42-25885c666676. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:49:08.450 DEBUG nova.virt.xenapi.vm_utils [req-3bb6d6b8-15e5-4751-b77b-31ebeb2c9ddc tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] Plugging VBD OpaqueRef:925c0e46-6115-3071-70a3-2723fe1a95e9 ... vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2191 2015-08-07 17:49:08.451 DEBUG oslo_concurrency.lockutils [req-3bb6d6b8-15e5-4751-b77b-31ebeb2c9ddc tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_plug" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:49:08.610 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:49:08.612 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 37.91 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:49:10.837 DEBUG oslo_concurrency.processutils [req-8af9fcde-135d-4574-9022-8e02c6cbac50 tempest-ServerAddressesNegativeTestJSON-1212877604 tempest-ServerAddressesNegativeTestJSON-1725097141] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmpSsIyjd/configdrive of=/dev/xvdc oflag=direct,sync" returned: 0 in 3.705s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:49:10.842 DEBUG oslo_concurrency.processutils [req-8af9fcde-135d-4574-9022-8e02c6cbac50 tempest-ServerAddressesNegativeTestJSON-1212877604 tempest-ServerAddressesNegativeTestJSON-1725097141] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:49:11.205 DEBUG oslo_concurrency.processutils [req-8af9fcde-135d-4574-9022-8e02c6cbac50 tempest-ServerAddressesNegativeTestJSON-1212877604 tempest-ServerAddressesNegativeTestJSON-1725097141] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf sync" returned: 0 in 0.363s execute 
/usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:49:11.208 DEBUG nova.virt.xenapi.vm_utils [req-8af9fcde-135d-4574-9022-8e02c6cbac50 tempest-ServerAddressesNegativeTestJSON-1212877604 tempest-ServerAddressesNegativeTestJSON-1725097141] Destroying VBD for VDI OpaqueRef:a29ca896-dd99-ee5b-9cf9-1424afbee895 ... vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2207 2015-08-07 17:49:11.280 DEBUG oslo_concurrency.lockutils [req-3bb6d6b8-15e5-4751-b77b-31ebeb2c9ddc tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_plug" :: held 2.829s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:49:11.280 DEBUG nova.virt.xenapi.vm_utils [req-3bb6d6b8-15e5-4751-b77b-31ebeb2c9ddc tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] Plugging VBD OpaqueRef:925c0e46-6115-3071-70a3-2723fe1a95e9 done. vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2194 2015-08-07 17:49:11.282 DEBUG oslo_concurrency.lockutils [req-8af9fcde-135d-4574-9022-8e02c6cbac50 tempest-ServerAddressesNegativeTestJSON-1212877604 tempest-ServerAddressesNegativeTestJSON-1725097141] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_unplug" :: waited 0.073s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:49:11.284 DEBUG nova.virt.xenapi.vm_utils [req-3bb6d6b8-15e5-4751-b77b-31ebeb2c9ddc tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] VBD OpaqueRef:925c0e46-6115-3071-70a3-2723fe1a95e9 plugged as xvdd vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2197 2015-08-07 17:49:11.385 WARNING nova.virt.configdrive [req-3bb6d6b8-15e5-4751-b77b-31ebeb2c9ddc tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] The setting "always" will be deprecated in the Liberty version. 
Please use "True" instead 2015-08-07 17:49:11.386 DEBUG nova.objects.instance [req-3bb6d6b8-15e5-4751-b77b-31ebeb2c9ddc tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] Lazy-loading `ec2_ids' on Instance uuid 6f6a5a93-63ed-4d42-be4c-e81077af680b obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:49:11.451 DEBUG oslo_concurrency.processutils [req-3bb6d6b8-15e5-4751-b77b-31ebeb2c9ddc tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] Running cmd (subprocess): genisoimage -o /tmp/tmp9Sr7Ib/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmpD7ANwo execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:49:11.542 DEBUG oslo_concurrency.processutils [req-3bb6d6b8-15e5-4751-b77b-31ebeb2c9ddc tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] CMD "genisoimage -o /tmp/tmp9Sr7Ib/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmpD7ANwo" returned: 0 in 0.091s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:49:11.547 DEBUG oslo_concurrency.processutils [req-3bb6d6b8-15e5-4751-b77b-31ebeb2c9ddc tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmp9Sr7Ib/configdrive of=/dev/xvdd oflag=direct,sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:49:14.199 DEBUG oslo_concurrency.lockutils [req-8af9fcde-135d-4574-9022-8e02c6cbac50 tempest-ServerAddressesNegativeTestJSON-1212877604 tempest-ServerAddressesNegativeTestJSON-1725097141] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_unplug" :: held 2.917s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:49:14.208 DEBUG nova.virt.xenapi.vm_utils [req-8af9fcde-135d-4574-9022-8e02c6cbac50 tempest-ServerAddressesNegativeTestJSON-1212877604 tempest-ServerAddressesNegativeTestJSON-1725097141] Destroying VBD for VDI OpaqueRef:a29ca896-dd99-ee5b-9cf9-1424afbee895 done. vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2215 2015-08-07 17:49:14.209 DEBUG nova.virt.xenapi.vm_utils [req-8af9fcde-135d-4574-9022-8e02c6cbac50 tempest-ServerAddressesNegativeTestJSON-1212877604 tempest-ServerAddressesNegativeTestJSON-1725097141] Creating disk-type VBD for VM OpaqueRef:018321c0-77a0-2bd6-9235-a40064b3da61, VDI OpaqueRef:a29ca896-dd99-ee5b-9cf9-1424afbee895 ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:49:14.215 DEBUG nova.virt.xenapi.vm_utils [req-8af9fcde-135d-4574-9022-8e02c6cbac50 tempest-ServerAddressesNegativeTestJSON-1212877604 tempest-ServerAddressesNegativeTestJSON-1725097141] Created VBD OpaqueRef:4a021d7d-092d-0cbe-16ca-1a276c3d4068 for VM OpaqueRef:018321c0-77a0-2bd6-9235-a40064b3da61, VDI OpaqueRef:a29ca896-dd99-ee5b-9cf9-1424afbee895. 
create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:49:14.216 DEBUG nova.objects.instance [req-8af9fcde-135d-4574-9022-8e02c6cbac50 tempest-ServerAddressesNegativeTestJSON-1212877604 tempest-ServerAddressesNegativeTestJSON-1725097141] Lazy-loading `pci_devices' on Instance uuid e0cc4887-0c12-4012-a49e-55fdf90fda04 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:49:14.312 DEBUG nova.virt.xenapi.vmops [req-8af9fcde-135d-4574-9022-8e02c6cbac50 tempest-ServerAddressesNegativeTestJSON-1212877604 tempest-ServerAddressesNegativeTestJSON-1725097141] [instance: e0cc4887-0c12-4012-a49e-55fdf90fda04] Updating progress to 50 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:49:14.486 DEBUG oslo_concurrency.processutils [req-3bb6d6b8-15e5-4751-b77b-31ebeb2c9ddc tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmp9Sr7Ib/configdrive of=/dev/xvdd oflag=direct,sync" returned: 0 in 2.939s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:49:14.487 DEBUG oslo_concurrency.processutils [req-3bb6d6b8-15e5-4751-b77b-31ebeb2c9ddc tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:49:14.563 DEBUG oslo_concurrency.lockutils [req-8af9fcde-135d-4574-9022-8e02c6cbac50 tempest-ServerAddressesNegativeTestJSON-1212877604 tempest-ServerAddressesNegativeTestJSON-1725097141] Lock "xenstore-e0cc4887-0c12-4012-a49e-55fdf90fda04" acquired by "store_meta" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:49:14.565 DEBUG oslo_concurrency.lockutils [req-8af9fcde-135d-4574-9022-8e02c6cbac50 tempest-ServerAddressesNegativeTestJSON-1212877604 tempest-ServerAddressesNegativeTestJSON-1725097141] Lock "xenstore-e0cc4887-0c12-4012-a49e-55fdf90fda04" released by "store_meta" :: held 0.003s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:49:14.566 DEBUG oslo_concurrency.lockutils [req-8af9fcde-135d-4574-9022-8e02c6cbac50 tempest-ServerAddressesNegativeTestJSON-1212877604 tempest-ServerAddressesNegativeTestJSON-1725097141] Lock "xenstore-e0cc4887-0c12-4012-a49e-55fdf90fda04" acquired by "store_auto_disk_config" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:49:14.576 DEBUG oslo_concurrency.lockutils [req-8af9fcde-135d-4574-9022-8e02c6cbac50 tempest-ServerAddressesNegativeTestJSON-1212877604 tempest-ServerAddressesNegativeTestJSON-1725097141] Lock "xenstore-e0cc4887-0c12-4012-a49e-55fdf90fda04" released by "store_auto_disk_config" :: held 0.010s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:49:14.577 DEBUG nova.virt.xenapi.vmops [req-8af9fcde-135d-4574-9022-8e02c6cbac50 tempest-ServerAddressesNegativeTestJSON-1212877604 tempest-ServerAddressesNegativeTestJSON-1725097141] [instance: e0cc4887-0c12-4012-a49e-55fdf90fda04] Injecting hostname (tempest.common.compute-instance-1489036770) into xenstore _inject_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1940 2015-08-07 17:49:14.577 DEBUG oslo_concurrency.lockutils [req-8af9fcde-135d-4574-9022-8e02c6cbac50 
tempest-ServerAddressesNegativeTestJSON-1212877604 tempest-ServerAddressesNegativeTestJSON-1725097141] Lock "xenstore-e0cc4887-0c12-4012-a49e-55fdf90fda04" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:49:14.589 DEBUG oslo_concurrency.lockutils [req-8af9fcde-135d-4574-9022-8e02c6cbac50 tempest-ServerAddressesNegativeTestJSON-1212877604 tempest-ServerAddressesNegativeTestJSON-1725097141] Lock "xenstore-e0cc4887-0c12-4012-a49e-55fdf90fda04" released by "update_hostname" :: held 0.011s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:49:14.590 DEBUG nova.virt.xenapi.vmops [req-8af9fcde-135d-4574-9022-8e02c6cbac50 tempest-ServerAddressesNegativeTestJSON-1212877604 tempest-ServerAddressesNegativeTestJSON-1725097141] [instance: e0cc4887-0c12-4012-a49e-55fdf90fda04] Injecting network info to xenstore inject_network_info /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1869 2015-08-07 17:49:14.590 DEBUG oslo_concurrency.lockutils [req-8af9fcde-135d-4574-9022-8e02c6cbac50 tempest-ServerAddressesNegativeTestJSON-1212877604 tempest-ServerAddressesNegativeTestJSON-1725097141] Lock "xenstore-e0cc4887-0c12-4012-a49e-55fdf90fda04" acquired by "update_nwinfo" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:49:14.751 DEBUG oslo_concurrency.lockutils [req-8af9fcde-135d-4574-9022-8e02c6cbac50 tempest-ServerAddressesNegativeTestJSON-1212877604 tempest-ServerAddressesNegativeTestJSON-1725097141] Lock "xenstore-e0cc4887-0c12-4012-a49e-55fdf90fda04" released by "update_nwinfo" :: held 0.160s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:49:14.751 DEBUG nova.virt.xenapi.vmops [req-8af9fcde-135d-4574-9022-8e02c6cbac50 tempest-ServerAddressesNegativeTestJSON-1212877604 tempest-ServerAddressesNegativeTestJSON-1725097141] [instance: e0cc4887-0c12-4012-a49e-55fdf90fda04] Updating progress to 60 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:49:14.766 DEBUG oslo_concurrency.processutils [req-3bb6d6b8-15e5-4751-b77b-31ebeb2c9ddc tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf sync" returned: 0 in 0.279s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:49:14.767 DEBUG nova.virt.xenapi.vm_utils [req-3bb6d6b8-15e5-4751-b77b-31ebeb2c9ddc tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] Destroying VBD for VDI OpaqueRef:aa113079-239e-682c-1a42-25885c666676 ... 
vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2207 2015-08-07 17:49:14.767 DEBUG oslo_concurrency.lockutils [req-3bb6d6b8-15e5-4751-b77b-31ebeb2c9ddc tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_unplug" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:49:14.919 DEBUG nova.virt.xenapi.vmops [req-8af9fcde-135d-4574-9022-8e02c6cbac50 tempest-ServerAddressesNegativeTestJSON-1212877604 tempest-ServerAddressesNegativeTestJSON-1725097141] [instance: e0cc4887-0c12-4012-a49e-55fdf90fda04] Creating vifs _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1892 2015-08-07 17:49:14.924 DEBUG nova.virt.xenapi.vmops [req-8af9fcde-135d-4574-9022-8e02c6cbac50 tempest-ServerAddressesNegativeTestJSON-1212877604 tempest-ServerAddressesNegativeTestJSON-1725097141] [instance: e0cc4887-0c12-4012-a49e-55fdf90fda04] Creating VIF for network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1902 2015-08-07 17:49:14.930 DEBUG nova.virt.xenapi.vmops [req-8af9fcde-135d-4574-9022-8e02c6cbac50 tempest-ServerAddressesNegativeTestJSON-1212877604 tempest-ServerAddressesNegativeTestJSON-1725097141] [instance: e0cc4887-0c12-4012-a49e-55fdf90fda04] Created VIF OpaqueRef:99590bd9-ebc5-8800-b9e0-22f4aaee8075, network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1906 2015-08-07 17:49:14.931 DEBUG nova.virt.xenapi.vmops [req-8af9fcde-135d-4574-9022-8e02c6cbac50 tempest-ServerAddressesNegativeTestJSON-1212877604 tempest-ServerAddressesNegativeTestJSON-1725097141] [instance: e0cc4887-0c12-4012-a49e-55fdf90fda04] Updating progress to 70 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:49:15.071 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.97 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:49:15.088 DEBUG nova.virt.xenapi.vmops [req-8af9fcde-135d-4574-9022-8e02c6cbac50 tempest-ServerAddressesNegativeTestJSON-1212877604 tempest-ServerAddressesNegativeTestJSON-1725097141] [instance: e0cc4887-0c12-4012-a49e-55fdf90fda04] Starting instance _start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:327 2015-08-07 17:49:15.308 DEBUG oslo_concurrency.lockutils [req-3bb6d6b8-15e5-4751-b77b-31ebeb2c9ddc tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_unplug" :: held 0.541s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:49:15.316 DEBUG nova.virt.xenapi.vm_utils [req-3bb6d6b8-15e5-4751-b77b-31ebeb2c9ddc tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] Destroying VBD for VDI OpaqueRef:aa113079-239e-682c-1a42-25885c666676 done. vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2215 2015-08-07 17:49:15.316 DEBUG nova.virt.xenapi.vm_utils [req-3bb6d6b8-15e5-4751-b77b-31ebeb2c9ddc tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] Creating disk-type VBD for VM OpaqueRef:ba8430e3-8980-ea19-f213-bfdf1f24e457, VDI OpaqueRef:aa113079-239e-682c-1a42-25885c666676 ... 
create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:49:15.322 DEBUG nova.virt.xenapi.vm_utils [req-3bb6d6b8-15e5-4751-b77b-31ebeb2c9ddc tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] Created VBD OpaqueRef:148989b0-e6cd-a0ea-8646-770e7326d596 for VM OpaqueRef:ba8430e3-8980-ea19-f213-bfdf1f24e457, VDI OpaqueRef:aa113079-239e-682c-1a42-25885c666676. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:49:15.322 DEBUG nova.objects.instance [req-3bb6d6b8-15e5-4751-b77b-31ebeb2c9ddc tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] Lazy-loading `pci_devices' on Instance uuid 6f6a5a93-63ed-4d42-be4c-e81077af680b obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:49:15.402 DEBUG nova.virt.xenapi.vmops [req-3bb6d6b8-15e5-4751-b77b-31ebeb2c9ddc tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] [instance: 6f6a5a93-63ed-4d42-be4c-e81077af680b] Updating progress to 50 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:49:15.530 DEBUG oslo_concurrency.lockutils [req-3bb6d6b8-15e5-4751-b77b-31ebeb2c9ddc tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] Lock "xenstore-6f6a5a93-63ed-4d42-be4c-e81077af680b" acquired by "store_meta" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:49:15.531 DEBUG oslo_concurrency.lockutils [req-3bb6d6b8-15e5-4751-b77b-31ebeb2c9ddc tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] Lock "xenstore-6f6a5a93-63ed-4d42-be4c-e81077af680b" released by "store_meta" :: held 0.001s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:49:15.532 DEBUG oslo_concurrency.lockutils [req-3bb6d6b8-15e5-4751-b77b-31ebeb2c9ddc tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] Lock "xenstore-6f6a5a93-63ed-4d42-be4c-e81077af680b" acquired by "store_auto_disk_config" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:49:15.540 DEBUG oslo_concurrency.lockutils [req-3bb6d6b8-15e5-4751-b77b-31ebeb2c9ddc tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] Lock "xenstore-6f6a5a93-63ed-4d42-be4c-e81077af680b" released by "store_auto_disk_config" :: held 0.008s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:49:15.541 DEBUG nova.virt.xenapi.vmops [req-3bb6d6b8-15e5-4751-b77b-31ebeb2c9ddc tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] [instance: 6f6a5a93-63ed-4d42-be4c-e81077af680b] Injecting hostname (tempest.common.compute-instance-1822861565) into xenstore _inject_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1940 2015-08-07 17:49:15.542 DEBUG oslo_concurrency.lockutils [req-3bb6d6b8-15e5-4751-b77b-31ebeb2c9ddc tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] Lock "xenstore-6f6a5a93-63ed-4d42-be4c-e81077af680b" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:49:15.549 DEBUG oslo_concurrency.lockutils [req-3bb6d6b8-15e5-4751-b77b-31ebeb2c9ddc tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] Lock 
"xenstore-6f6a5a93-63ed-4d42-be4c-e81077af680b" released by "update_hostname" :: held 0.007s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:49:15.550 DEBUG nova.virt.xenapi.vmops [req-3bb6d6b8-15e5-4751-b77b-31ebeb2c9ddc tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] [instance: 6f6a5a93-63ed-4d42-be4c-e81077af680b] Injecting network info to xenstore inject_network_info /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1869 2015-08-07 17:49:15.550 DEBUG oslo_concurrency.lockutils [req-3bb6d6b8-15e5-4751-b77b-31ebeb2c9ddc tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] Lock "xenstore-6f6a5a93-63ed-4d42-be4c-e81077af680b" acquired by "update_nwinfo" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:49:15.687 DEBUG oslo_concurrency.lockutils [req-3bb6d6b8-15e5-4751-b77b-31ebeb2c9ddc tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] Lock "xenstore-6f6a5a93-63ed-4d42-be4c-e81077af680b" released by "update_nwinfo" :: held 0.136s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:49:15.687 DEBUG nova.virt.xenapi.vmops [req-3bb6d6b8-15e5-4751-b77b-31ebeb2c9ddc tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] [instance: 6f6a5a93-63ed-4d42-be4c-e81077af680b] Updating progress to 60 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:49:15.897 DEBUG nova.virt.xenapi.vmops [req-3bb6d6b8-15e5-4751-b77b-31ebeb2c9ddc tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] [instance: 6f6a5a93-63ed-4d42-be4c-e81077af680b] Creating vifs _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1892 2015-08-07 17:49:15.901 DEBUG nova.virt.xenapi.vmops [req-3bb6d6b8-15e5-4751-b77b-31ebeb2c9ddc tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] [instance: 6f6a5a93-63ed-4d42-be4c-e81077af680b] Creating VIF for network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1902 2015-08-07 17:49:15.905 DEBUG nova.virt.xenapi.vmops [req-3bb6d6b8-15e5-4751-b77b-31ebeb2c9ddc tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] [instance: 6f6a5a93-63ed-4d42-be4c-e81077af680b] Created VIF OpaqueRef:78ea66f5-da91-e75f-fbd3-84cb48b7a889, network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1906 2015-08-07 17:49:15.906 DEBUG nova.virt.xenapi.vmops [req-3bb6d6b8-15e5-4751-b77b-31ebeb2c9ddc tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] [instance: 6f6a5a93-63ed-4d42-be4c-e81077af680b] Updating progress to 70 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:49:16.068 DEBUG nova.virt.xenapi.vmops [req-3bb6d6b8-15e5-4751-b77b-31ebeb2c9ddc tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] [instance: 6f6a5a93-63ed-4d42-be4c-e81077af680b] Starting instance _start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:327 2015-08-07 17:49:19.594 DEBUG nova.virt.xenapi.vmops [req-8af9fcde-135d-4574-9022-8e02c6cbac50 tempest-ServerAddressesNegativeTestJSON-1212877604 tempest-ServerAddressesNegativeTestJSON-1725097141] [instance: e0cc4887-0c12-4012-a49e-55fdf90fda04] Waiting for instance state to become 
running _wait_for_instance_to_start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:777 2015-08-07 17:49:19.609 DEBUG nova.virt.xenapi.vmops [req-8af9fcde-135d-4574-9022-8e02c6cbac50 tempest-ServerAddressesNegativeTestJSON-1212877604 tempest-ServerAddressesNegativeTestJSON-1725097141] [instance: e0cc4887-0c12-4012-a49e-55fdf90fda04] Updating progress to 80 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:49:19.756 DEBUG nova.virt.xenapi.vmops [req-8af9fcde-135d-4574-9022-8e02c6cbac50 tempest-ServerAddressesNegativeTestJSON-1212877604 tempest-ServerAddressesNegativeTestJSON-1725097141] [instance: e0cc4887-0c12-4012-a49e-55fdf90fda04] Skip agent setup, not enabled. _configure_new_instance_with_agent /opt/stack/new/nova/nova/virt/xenapi/vmops.py:788 2015-08-07 17:49:19.757 DEBUG nova.virt.xenapi.vmops [req-8af9fcde-135d-4574-9022-8e02c6cbac50 tempest-ServerAddressesNegativeTestJSON-1212877604 tempest-ServerAddressesNegativeTestJSON-1725097141] [instance: e0cc4887-0c12-4012-a49e-55fdf90fda04] Removing hostname from xenstore _remove_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1949 2015-08-07 17:49:19.759 DEBUG oslo_concurrency.lockutils [req-8af9fcde-135d-4574-9022-8e02c6cbac50 tempest-ServerAddressesNegativeTestJSON-1212877604 tempest-ServerAddressesNegativeTestJSON-1725097141] Lock "xenstore-e0cc4887-0c12-4012-a49e-55fdf90fda04" acquired by "update_hostname" :: waited 0.001s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:49:19.763 DEBUG oslo_concurrency.lockutils [req-8af9fcde-135d-4574-9022-8e02c6cbac50 tempest-ServerAddressesNegativeTestJSON-1212877604 tempest-ServerAddressesNegativeTestJSON-1725097141] Lock "xenstore-e0cc4887-0c12-4012-a49e-55fdf90fda04" released by "update_hostname" :: held 0.004s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:49:19.763 DEBUG nova.virt.xenapi.vmops [req-8af9fcde-135d-4574-9022-8e02c6cbac50 tempest-ServerAddressesNegativeTestJSON-1212877604 tempest-ServerAddressesNegativeTestJSON-1725097141] [instance: e0cc4887-0c12-4012-a49e-55fdf90fda04] Updating progress to 90 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:49:19.909 DEBUG nova.virt.xenapi.vmops [req-8af9fcde-135d-4574-9022-8e02c6cbac50 tempest-ServerAddressesNegativeTestJSON-1212877604 tempest-ServerAddressesNegativeTestJSON-1725097141] [instance: e0cc4887-0c12-4012-a49e-55fdf90fda04] Updating progress to 100 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:49:20.036 DEBUG nova.compute.manager [req-8af9fcde-135d-4574-9022-8e02c6cbac50 tempest-ServerAddressesNegativeTestJSON-1212877604 tempest-ServerAddressesNegativeTestJSON-1725097141] [instance: e0cc4887-0c12-4012-a49e-55fdf90fda04] Checking state _get_power_state /opt/stack/new/nova/nova/compute/manager.py:1296 2015-08-07 17:49:20.257 DEBUG oslo_concurrency.lockutils [req-8af9fcde-135d-4574-9022-8e02c6cbac50 tempest-ServerAddressesNegativeTestJSON-1212877604 tempest-ServerAddressesNegativeTestJSON-1725097141] Lock "e0cc4887-0c12-4012-a49e-55fdf90fda04" released by "_locked_do_build_and_run_instance" :: held 25.086s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:49:20.361 DEBUG nova.virt.xenapi.vmops [req-3bb6d6b8-15e5-4751-b77b-31ebeb2c9ddc tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] [instance: 6f6a5a93-63ed-4d42-be4c-e81077af680b] Waiting for 
instance state to become running _wait_for_instance_to_start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:777 2015-08-07 17:49:20.378 DEBUG nova.virt.xenapi.vmops [req-3bb6d6b8-15e5-4751-b77b-31ebeb2c9ddc tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] [instance: 6f6a5a93-63ed-4d42-be4c-e81077af680b] Updating progress to 80 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:49:20.527 DEBUG nova.virt.xenapi.vmops [req-3bb6d6b8-15e5-4751-b77b-31ebeb2c9ddc tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] [instance: 6f6a5a93-63ed-4d42-be4c-e81077af680b] Skip agent setup, not enabled. _configure_new_instance_with_agent /opt/stack/new/nova/nova/virt/xenapi/vmops.py:788 2015-08-07 17:49:20.527 DEBUG nova.virt.xenapi.vmops [req-3bb6d6b8-15e5-4751-b77b-31ebeb2c9ddc tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] [instance: 6f6a5a93-63ed-4d42-be4c-e81077af680b] Removing hostname from xenstore _remove_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1949 2015-08-07 17:49:20.527 DEBUG oslo_concurrency.lockutils [req-3bb6d6b8-15e5-4751-b77b-31ebeb2c9ddc tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] Lock "xenstore-6f6a5a93-63ed-4d42-be4c-e81077af680b" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:49:20.530 DEBUG oslo_concurrency.lockutils [req-3bb6d6b8-15e5-4751-b77b-31ebeb2c9ddc tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] Lock "xenstore-6f6a5a93-63ed-4d42-be4c-e81077af680b" released by "update_hostname" :: held 0.003s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:49:20.531 DEBUG nova.virt.xenapi.vmops [req-3bb6d6b8-15e5-4751-b77b-31ebeb2c9ddc tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] [instance: 6f6a5a93-63ed-4d42-be4c-e81077af680b] Updating progress to 90 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:49:20.695 DEBUG nova.virt.xenapi.vmops [req-3bb6d6b8-15e5-4751-b77b-31ebeb2c9ddc tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] [instance: 6f6a5a93-63ed-4d42-be4c-e81077af680b] Updating progress to 100 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:49:20.971 DEBUG oslo_concurrency.lockutils [req-498ef86d-d8ae-4020-9d9e-56416c5ff0bd tempest-ServerAddressesNegativeTestJSON-1212877604 tempest-ServerAddressesNegativeTestJSON-1725097141] Lock "e0cc4887-0c12-4012-a49e-55fdf90fda04" acquired by "do_terminate_instance" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:49:20.972 DEBUG oslo_concurrency.lockutils [req-498ef86d-d8ae-4020-9d9e-56416c5ff0bd tempest-ServerAddressesNegativeTestJSON-1212877604 tempest-ServerAddressesNegativeTestJSON-1725097141] Lock "e0cc4887-0c12-4012-a49e-55fdf90fda04-events" acquired by "_clear_events" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:49:20.973 DEBUG oslo_concurrency.lockutils [req-498ef86d-d8ae-4020-9d9e-56416c5ff0bd tempest-ServerAddressesNegativeTestJSON-1212877604 tempest-ServerAddressesNegativeTestJSON-1725097141] Lock "e0cc4887-0c12-4012-a49e-55fdf90fda04-events" released by "_clear_events" :: held 0.000s inner 
/usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:49:20.977 INFO nova.compute.manager [req-498ef86d-d8ae-4020-9d9e-56416c5ff0bd tempest-ServerAddressesNegativeTestJSON-1212877604 tempest-ServerAddressesNegativeTestJSON-1725097141] [instance: e0cc4887-0c12-4012-a49e-55fdf90fda04] Terminating instance 2015-08-07 17:49:20.979 INFO nova.virt.xenapi.vmops [req-498ef86d-d8ae-4020-9d9e-56416c5ff0bd tempest-ServerAddressesNegativeTestJSON-1212877604 tempest-ServerAddressesNegativeTestJSON-1725097141] [instance: e0cc4887-0c12-4012-a49e-55fdf90fda04] Destroying VM 2015-08-07 17:49:20.987 DEBUG nova.virt.xenapi.vm_utils [req-498ef86d-d8ae-4020-9d9e-56416c5ff0bd tempest-ServerAddressesNegativeTestJSON-1212877604 tempest-ServerAddressesNegativeTestJSON-1725097141] [instance: e0cc4887-0c12-4012-a49e-55fdf90fda04] Shutting down VM (hard) hard_shutdown_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:348 2015-08-07 17:49:21.764 DEBUG nova.compute.manager [req-3bb6d6b8-15e5-4751-b77b-31ebeb2c9ddc tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] [instance: 6f6a5a93-63ed-4d42-be4c-e81077af680b] Checking state _get_power_state /opt/stack/new/nova/nova/compute/manager.py:1296 2015-08-07 17:49:21.969 DEBUG oslo_concurrency.lockutils [req-3bb6d6b8-15e5-4751-b77b-31ebeb2c9ddc tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] Lock "6f6a5a93-63ed-4d42-be4c-e81077af680b" released by "do_unshelve_instance" :: held 21.136s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:49:22.632 DEBUG nova.virt.xenapi.vmops [req-498ef86d-d8ae-4020-9d9e-56416c5ff0bd tempest-ServerAddressesNegativeTestJSON-1212877604 tempest-ServerAddressesNegativeTestJSON-1725097141] [instance: e0cc4887-0c12-4012-a49e-55fdf90fda04] Destroying VDIs _destroy_vdis /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1428 2015-08-07 17:49:22.637 DEBUG nova.virt.xenapi.vm_utils [req-498ef86d-d8ae-4020-9d9e-56416c5ff0bd tempest-ServerAddressesNegativeTestJSON-1212877604 tempest-ServerAddressesNegativeTestJSON-1725097141] VDI c668aa16-90c1-4371-98a3-52748d977bb4 is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:49:22.646 DEBUG nova.virt.xenapi.vm_utils [req-498ef86d-d8ae-4020-9d9e-56416c5ff0bd tempest-ServerAddressesNegativeTestJSON-1212877604 tempest-ServerAddressesNegativeTestJSON-1725097141] VDI 7efe5ce3-89b3-466e-9204-87463a9f433a is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:49:23.159 DEBUG nova.virt.xenapi.vmops [req-498ef86d-d8ae-4020-9d9e-56416c5ff0bd tempest-ServerAddressesNegativeTestJSON-1212877604 tempest-ServerAddressesNegativeTestJSON-1725097141] [instance: e0cc4887-0c12-4012-a49e-55fdf90fda04] Using RAW or VHD, skipping kernel and ramdisk deletion _destroy_kernel_ramdisk /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1456 2015-08-07 17:49:23.174 DEBUG nova.virt.xenapi.vm_utils [req-498ef86d-d8ae-4020-9d9e-56416c5ff0bd tempest-ServerAddressesNegativeTestJSON-1212877604 tempest-ServerAddressesNegativeTestJSON-1725097141] [instance: e0cc4887-0c12-4012-a49e-55fdf90fda04] VM destroyed destroy_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:324 2015-08-07 17:49:23.174 DEBUG nova.compute.manager [req-498ef86d-d8ae-4020-9d9e-56416c5ff0bd tempest-ServerAddressesNegativeTestJSON-1212877604 tempest-ServerAddressesNegativeTestJSON-1725097141] [instance: e0cc4887-0c12-4012-a49e-55fdf90fda04] 
Deallocating network for instance _deallocate_network /opt/stack/new/nova/nova/compute/manager.py:1792 2015-08-07 17:49:24.587 DEBUG nova.compute.manager [req-498ef86d-d8ae-4020-9d9e-56416c5ff0bd tempest-ServerAddressesNegativeTestJSON-1212877604 tempest-ServerAddressesNegativeTestJSON-1725097141] [instance: e0cc4887-0c12-4012-a49e-55fdf90fda04] terminating bdm BlockDeviceMapping(boot_index=0,connection_info=None,created_at=2015-08-07T17:48:54Z,delete_on_termination=True,deleted=False,deleted_at=None,destination_type='local',device_name='/dev/xvda',device_type='disk',disk_bus=None,guest_format=None,id=76,image_id='a69d8d55-1745-492f-be26-cc75d64fc94d',instance=,instance_uuid=e0cc4887-0c12-4012-a49e-55fdf90fda04,no_device=False,snapshot_id=None,source_type='image',updated_at=2015-08-07T17:48:56Z,volume_id=None,volume_size=None) _cleanup_volumes /opt/stack/new/nova/nova/compute/manager.py:2272 2015-08-07 17:49:24.777 DEBUG oslo_concurrency.lockutils [req-498ef86d-d8ae-4020-9d9e-56416c5ff0bd tempest-ServerAddressesNegativeTestJSON-1212877604 tempest-ServerAddressesNegativeTestJSON-1725097141] Lock "compute_resources" acquired by "update_usage" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:49:24.778 DEBUG nova.objects.instance [req-498ef86d-d8ae-4020-9d9e-56416c5ff0bd tempest-ServerAddressesNegativeTestJSON-1212877604 tempest-ServerAddressesNegativeTestJSON-1725097141] Lazy-loading `numa_topology' on Instance uuid e0cc4887-0c12-4012-a49e-55fdf90fda04 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:49:24.866 DEBUG oslo_concurrency.lockutils [req-498ef86d-d8ae-4020-9d9e-56416c5ff0bd tempest-ServerAddressesNegativeTestJSON-1212877604 tempest-ServerAddressesNegativeTestJSON-1725097141] Lock "compute_resources" released by "update_usage" :: held 0.089s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:49:25.101 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.94 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:49:25.105 DEBUG oslo_concurrency.lockutils [req-498ef86d-d8ae-4020-9d9e-56416c5ff0bd tempest-ServerAddressesNegativeTestJSON-1212877604 tempest-ServerAddressesNegativeTestJSON-1725097141] Lock "e0cc4887-0c12-4012-a49e-55fdf90fda04" released by "do_terminate_instance" :: held 4.134s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:49:27.683 DEBUG nova.compute.manager [req-8a3e38bc-d840-4e1b-8c2f-9ea6dc9e151c tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] [instance: 6f6a5a93-63ed-4d42-be4c-e81077af680b] Checking state _get_power_state /opt/stack/new/nova/nova/compute/manager.py:1296 2015-08-07 17:49:29.345 INFO nova.compute.manager [req-9e33f813-93e7-467d-be5a-8f6741e9051d tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] [instance: 6f6a5a93-63ed-4d42-be4c-e81077af680b] Resuming 2015-08-07 17:49:29.346 DEBUG oslo_concurrency.lockutils [req-9e33f813-93e7-467d-be5a-8f6741e9051d tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] Acquired semaphore "refresh_cache-6f6a5a93-63ed-4d42-be4c-e81077af680b" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:197 2015-08-07 17:49:29.453 DEBUG nova.network.base_api [req-9e33f813-93e7-467d-be5a-8f6741e9051d tempest-ServersNegativeTestJSON-907843016 
tempest-ServersNegativeTestJSON-430946234] [instance: 6f6a5a93-63ed-4d42-be4c-e81077af680b] Updating instance_info_cache with network_info: [VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.7'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:57:45:34', 'active': False, 'type': u'bridge', 'id': u'df69d282-ed60-4e5f-a4f6-4c3a49a16e1c', 'qbg_params': None})] update_instance_cache_with_nw_info /opt/stack/new/nova/nova/network/base_api.py:43 2015-08-07 17:49:29.478 DEBUG oslo_concurrency.lockutils [req-9e33f813-93e7-467d-be5a-8f6741e9051d tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] Releasing semaphore "refresh_cache-6f6a5a93-63ed-4d42-be4c-e81077af680b" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:210 2015-08-07 17:49:33.172 DEBUG oslo_concurrency.lockutils [req-74db4cd3-80ca-41cc-b05b-765706632b15 tempest-ServerMetadataTestJSON-2094219298 tempest-ServerMetadataTestJSON-1459419222] Lock "5cbf6860-028a-46b8-9f4a-4107956a3570" acquired by "_locked_do_build_and_run_instance" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:49:33.209 INFO nova.compute.manager [req-74db4cd3-80ca-41cc-b05b-765706632b15 tempest-ServerMetadataTestJSON-2094219298 tempest-ServerMetadataTestJSON-1459419222] [instance: 5cbf6860-028a-46b8-9f4a-4107956a3570] Starting instance... 
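The claim records that follow at 17:49:33 for instance 5cbf6860-028a-46b8-9f4a-4107956a3570 repeat the arithmetic of the 17:49:01 claim above: the 64 MB flavor plus the 5 MB overhead reported by the XenAPI driver gives a 69 MB memory claim, and the 12280.50 MB memory limit equals the 8187 MB of host RAM multiplied by 1.5, which is consistent with the default ram_allocation_ratio (the option value itself is not shown in this excerpt). A minimal sketch of that arithmetic, treating the 1.5 ratio as an assumption:

# Sketch only (not Nova source): reproduces the numbers reported by
# nova.compute.claims in the records below.
flavor_ram_mb = 64          # "Memory overhead for 64 MB instance"
overhead_mb = 5             # overhead reported by the XenAPI driver
total_ram_mb = 8187         # "Total memory: 8187 MB"
used_ram_mb = 581.0         # "used: 581.00 MB" at the time of this claim
ram_allocation_ratio = 1.5  # assumed; yields the logged 12280.50 MB limit

claim_mb = flavor_ram_mb + overhead_mb          # 69 MB, as logged
limit_mb = total_ram_mb * ram_allocation_ratio  # 12280.5 MB, as logged
free_mb = limit_mb - used_ram_mb                # 11699.5 MB, as logged

# The claim succeeds because the request still fits under the limit.
assert used_ram_mb + claim_mb <= limit_mb
print(claim_mb, limit_mb, free_mb)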
2015-08-07 17:49:33.338 DEBUG oslo_concurrency.lockutils [req-74db4cd3-80ca-41cc-b05b-765706632b15 tempest-ServerMetadataTestJSON-2094219298 tempest-ServerMetadataTestJSON-1459419222] Lock "compute_resources" acquired by "instance_claim" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:49:33.339 DEBUG nova.compute.resource_tracker [req-74db4cd3-80ca-41cc-b05b-765706632b15 tempest-ServerMetadataTestJSON-2094219298 tempest-ServerMetadataTestJSON-1459419222] Memory overhead for 64 MB instance; 5 MB instance_claim /opt/stack/new/nova/nova/compute/resource_tracker.py:126 2015-08-07 17:49:33.344 INFO nova.compute.claims [req-74db4cd3-80ca-41cc-b05b-765706632b15 tempest-ServerMetadataTestJSON-2094219298 tempest-ServerMetadataTestJSON-1459419222] [instance: 5cbf6860-028a-46b8-9f4a-4107956a3570] Attempting claim: memory 69 MB, disk 0 GB 2015-08-07 17:49:33.344 INFO nova.compute.claims [req-74db4cd3-80ca-41cc-b05b-765706632b15 tempest-ServerMetadataTestJSON-2094219298 tempest-ServerMetadataTestJSON-1459419222] [instance: 5cbf6860-028a-46b8-9f4a-4107956a3570] Total memory: 8187 MB, used: 581.00 MB 2015-08-07 17:49:33.345 INFO nova.compute.claims [req-74db4cd3-80ca-41cc-b05b-765706632b15 tempest-ServerMetadataTestJSON-2094219298 tempest-ServerMetadataTestJSON-1459419222] [instance: 5cbf6860-028a-46b8-9f4a-4107956a3570] memory limit: 12280.50 MB, free: 11699.50 MB 2015-08-07 17:49:33.345 INFO nova.compute.claims [req-74db4cd3-80ca-41cc-b05b-765706632b15 tempest-ServerMetadataTestJSON-2094219298 tempest-ServerMetadataTestJSON-1459419222] [instance: 5cbf6860-028a-46b8-9f4a-4107956a3570] Total disk: 27 GB, used: 0.00 GB 2015-08-07 17:49:33.346 INFO nova.compute.claims [req-74db4cd3-80ca-41cc-b05b-765706632b15 tempest-ServerMetadataTestJSON-2094219298 tempest-ServerMetadataTestJSON-1459419222] [instance: 5cbf6860-028a-46b8-9f4a-4107956a3570] disk limit not specified, defaulting to unlimited 2015-08-07 17:49:33.362 DEBUG nova.compute.resources.vcpu [req-74db4cd3-80ca-41cc-b05b-765706632b15 tempest-ServerMetadataTestJSON-2094219298 tempest-ServerMetadataTestJSON-1459419222] Total CPUs: 8 VCPUs, used: 1.00 VCPUs test /opt/stack/new/nova/nova/compute/resources/vcpu.py:52 2015-08-07 17:49:33.362 DEBUG nova.compute.resources.vcpu [req-74db4cd3-80ca-41cc-b05b-765706632b15 tempest-ServerMetadataTestJSON-2094219298 tempest-ServerMetadataTestJSON-1459419222] CPUs limit not specified, defaulting to unlimited test /opt/stack/new/nova/nova/compute/resources/vcpu.py:56 2015-08-07 17:49:33.364 INFO nova.compute.claims [req-74db4cd3-80ca-41cc-b05b-765706632b15 tempest-ServerMetadataTestJSON-2094219298 tempest-ServerMetadataTestJSON-1459419222] [instance: 5cbf6860-028a-46b8-9f4a-4107956a3570] Claim successful 2015-08-07 17:49:33.555 DEBUG oslo_concurrency.lockutils [req-74db4cd3-80ca-41cc-b05b-765706632b15 tempest-ServerMetadataTestJSON-2094219298 tempest-ServerMetadataTestJSON-1459419222] Lock "compute_resources" released by "instance_claim" :: held 0.217s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:49:33.684 DEBUG oslo_concurrency.lockutils [req-74db4cd3-80ca-41cc-b05b-765706632b15 tempest-ServerMetadataTestJSON-2094219298 tempest-ServerMetadataTestJSON-1459419222] Lock "compute_resources" acquired by "update_usage" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:49:33.846 DEBUG oslo_concurrency.lockutils [req-74db4cd3-80ca-41cc-b05b-765706632b15 
tempest-ServerMetadataTestJSON-2094219298 tempest-ServerMetadataTestJSON-1459419222] Lock "compute_resources" released by "update_usage" :: held 0.163s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:49:33.847 DEBUG nova.compute.utils [req-74db4cd3-80ca-41cc-b05b-765706632b15 tempest-ServerMetadataTestJSON-2094219298 tempest-ServerMetadataTestJSON-1459419222] Using /dev/xvd instead of None get_next_device_name /opt/stack/new/nova/nova/compute/utils.py:163 2015-08-07 17:49:33.849 13318 DEBUG nova.compute.manager [-] [instance: 5cbf6860-028a-46b8-9f4a-4107956a3570] Allocating IP information in the background. _allocate_network_async /opt/stack/new/nova/nova/compute/manager.py:1513 2015-08-07 17:49:33.850 13318 DEBUG oslo_concurrency.lockutils [-] Acquired semaphore "refresh_cache-5cbf6860-028a-46b8-9f4a-4107956a3570" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:197 2015-08-07 17:49:34.001 DEBUG nova.compute.manager [req-9e33f813-93e7-467d-be5a-8f6741e9051d tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] [instance: 6f6a5a93-63ed-4d42-be4c-e81077af680b] Checking state _get_power_state /opt/stack/new/nova/nova/compute/manager.py:1296 2015-08-07 17:49:34.214 DEBUG nova.virt.xenapi.vmops [req-74db4cd3-80ca-41cc-b05b-765706632b15 tempest-ServerMetadataTestJSON-2094219298 tempest-ServerMetadataTestJSON-1459419222] [instance: 5cbf6860-028a-46b8-9f4a-4107956a3570] Block device information present: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} spawn /opt/stack/new/nova/nova/virt/xenapi/vmops.py:408 2015-08-07 17:49:34.228 DEBUG nova.virt.xenapi.vm_utils [req-74db4cd3-80ca-41cc-b05b-765706632b15 tempest-ServerMetadataTestJSON-2094219298 tempest-ServerMetadataTestJSON-1459419222] Detected vhd format for image a69d8d55-1745-492f-be26-cc75d64fc94d determine_disk_image_type /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1633 2015-08-07 17:49:34.229 DEBUG nova.virt.xenapi.vmops [req-74db4cd3-80ca-41cc-b05b-765706632b15 tempest-ServerMetadataTestJSON-2094219298 tempest-ServerMetadataTestJSON-1459419222] [instance: 5cbf6860-028a-46b8-9f4a-4107956a3570] Updating progress to 10 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:49:34.383 DEBUG nova.virt.xenapi.vmops [req-74db4cd3-80ca-41cc-b05b-765706632b15 tempest-ServerMetadataTestJSON-2094219298 tempest-ServerMetadataTestJSON-1459419222] [instance: 5cbf6860-028a-46b8-9f4a-4107956a3570] block device info: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} _connect_cinder_volumes /opt/stack/new/nova/nova/virt/xenapi/vmops.py:376 2015-08-07 17:49:34.389 DEBUG oslo_concurrency.lockutils [req-74db4cd3-80ca-41cc-b05b-765706632b15 tempest-ServerMetadataTestJSON-2094219298 tempest-ServerMetadataTestJSON-1459419222] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" acquired by "_create_cached_image_impl" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:49:35.081 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.96 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:49:35.111 DEBUG nova.virt.xenapi.vm_utils [req-74db4cd3-80ca-41cc-b05b-765706632b15 tempest-ServerMetadataTestJSON-2094219298 tempest-ServerMetadataTestJSON-1459419222] Cloned VDI 
OpaqueRef:53b7622d-d7c8-f8ae-75c1-d51f8a03b4ba from VDI OpaqueRef:f392c64e-7886-a0cc-d3a0-780674167307 _clone_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:571 2015-08-07 17:49:35.437 13318 DEBUG nova.network.base_api [-] [instance: 5cbf6860-028a-46b8-9f4a-4107956a3570] Updating instance_info_cache with network_info: [VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.2'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:fc:8b:25', 'active': False, 'type': u'bridge', 'id': u'26a5828a-6822-4346-ae25-415b929d73cf', 'qbg_params': None})] update_instance_cache_with_nw_info /opt/stack/new/nova/nova/network/base_api.py:43 2015-08-07 17:49:35.460 13318 DEBUG oslo_concurrency.lockutils [-] Releasing semaphore "refresh_cache-5cbf6860-028a-46b8-9f4a-4107956a3570" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:210 2015-08-07 17:49:35.460 13318 DEBUG nova.compute.manager [-] [instance: 5cbf6860-028a-46b8-9f4a-4107956a3570] Instance network_info: |[VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.2'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:fc:8b:25', 'active': False, 'type': u'bridge', 'id': u'26a5828a-6822-4346-ae25-415b929d73cf', 'qbg_params': None})]| _allocate_network_async /opt/stack/new/nova/nova/compute/manager.py:1531 2015-08-07 17:49:35.606 DEBUG oslo_concurrency.lockutils [req-74db4cd3-80ca-41cc-b05b-765706632b15 tempest-ServerMetadataTestJSON-2094219298 tempest-ServerMetadataTestJSON-1459419222] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" released by "_create_cached_image_impl" :: held 1.217s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:49:35.607 INFO nova.virt.xenapi.vm_utils [req-74db4cd3-80ca-41cc-b05b-765706632b15 
tempest-ServerMetadataTestJSON-2094219298 tempest-ServerMetadataTestJSON-1459419222] Image creation data, cacheable: True, downloaded: False duration: 1.22 secs for image a69d8d55-1745-492f-be26-cc75d64fc94d 2015-08-07 17:49:36.074 DEBUG nova.virt.xenapi.vmops [req-74db4cd3-80ca-41cc-b05b-765706632b15 tempest-ServerMetadataTestJSON-2094219298 tempest-ServerMetadataTestJSON-1459419222] [instance: 5cbf6860-028a-46b8-9f4a-4107956a3570] Updating progress to 20 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:49:36.212 DEBUG nova.virt.xenapi.vmops [req-74db4cd3-80ca-41cc-b05b-765706632b15 tempest-ServerMetadataTestJSON-2094219298 tempest-ServerMetadataTestJSON-1459419222] [instance: 5cbf6860-028a-46b8-9f4a-4107956a3570] Updating progress to 30 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:49:36.367 DEBUG nova.virt.xenapi.vmops [req-74db4cd3-80ca-41cc-b05b-765706632b15 tempest-ServerMetadataTestJSON-2094219298 tempest-ServerMetadataTestJSON-1459419222] [instance: 5cbf6860-028a-46b8-9f4a-4107956a3570] Using PV kernel: True _create_vm_record /opt/stack/new/nova/nova/virt/xenapi/vmops.py:679 2015-08-07 17:49:36.377 DEBUG nova.virt.xenapi.vm_utils [req-74db4cd3-80ca-41cc-b05b-765706632b15 tempest-ServerMetadataTestJSON-2094219298 tempest-ServerMetadataTestJSON-1459419222] [instance: 5cbf6860-028a-46b8-9f4a-4107956a3570] Created VM create_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:312 2015-08-07 17:49:36.378 DEBUG nova.virt.xenapi.vmops [req-74db4cd3-80ca-41cc-b05b-765706632b15 tempest-ServerMetadataTestJSON-2094219298 tempest-ServerMetadataTestJSON-1459419222] [instance: 5cbf6860-028a-46b8-9f4a-4107956a3570] Updating progress to 40 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:49:36.517 DEBUG nova.virt.xenapi.vm_utils [req-74db4cd3-80ca-41cc-b05b-765706632b15 tempest-ServerMetadataTestJSON-2094219298 tempest-ServerMetadataTestJSON-1459419222] Creating disk-type VBD for VM OpaqueRef:ed438e2e-cd88-c4f7-77c4-9f32bf786ec1, VDI OpaqueRef:53b7622d-d7c8-f8ae-75c1-d51f8a03b4ba ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:49:36.525 DEBUG nova.virt.xenapi.vm_utils [req-74db4cd3-80ca-41cc-b05b-765706632b15 tempest-ServerMetadataTestJSON-2094219298 tempest-ServerMetadataTestJSON-1459419222] Created VBD OpaqueRef:cffa3a04-8e37-c14d-e67b-5c8cc6d62c8a for VM OpaqueRef:ed438e2e-cd88-c4f7-77c4-9f32bf786ec1, VDI OpaqueRef:53b7622d-d7c8-f8ae-75c1-d51f8a03b4ba. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:49:36.754 DEBUG nova.virt.xenapi.vm_utils [req-74db4cd3-80ca-41cc-b05b-765706632b15 tempest-ServerMetadataTestJSON-2094219298 tempest-ServerMetadataTestJSON-1459419222] Created VDI OpaqueRef:79f852e7-aaa8-6321-bcc0-2f312eb28926 (config-2, 67108864, False) on OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99. create_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:509 2015-08-07 17:49:36.758 DEBUG nova.virt.xenapi.vm_utils [req-74db4cd3-80ca-41cc-b05b-765706632b15 tempest-ServerMetadataTestJSON-2094219298 tempest-ServerMetadataTestJSON-1459419222] Creating disk-type VBD for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:79f852e7-aaa8-6321-bcc0-2f312eb28926 ... 
create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:49:36.766 DEBUG nova.virt.xenapi.vm_utils [req-74db4cd3-80ca-41cc-b05b-765706632b15 tempest-ServerMetadataTestJSON-2094219298 tempest-ServerMetadataTestJSON-1459419222] Created VBD OpaqueRef:19192071-c972-6e74-3fbe-deecac2f2d03 for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:79f852e7-aaa8-6321-bcc0-2f312eb28926. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:49:36.767 DEBUG nova.virt.xenapi.vm_utils [req-74db4cd3-80ca-41cc-b05b-765706632b15 tempest-ServerMetadataTestJSON-2094219298 tempest-ServerMetadataTestJSON-1459419222] Plugging VBD OpaqueRef:19192071-c972-6e74-3fbe-deecac2f2d03 ... vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2191 2015-08-07 17:49:36.767 DEBUG oslo_concurrency.lockutils [req-74db4cd3-80ca-41cc-b05b-765706632b15 tempest-ServerMetadataTestJSON-2094219298 tempest-ServerMetadataTestJSON-1459419222] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_plug" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:49:37.655 DEBUG oslo_concurrency.lockutils [req-74db4cd3-80ca-41cc-b05b-765706632b15 tempest-ServerMetadataTestJSON-2094219298 tempest-ServerMetadataTestJSON-1459419222] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_plug" :: held 0.888s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:49:37.656 DEBUG nova.virt.xenapi.vm_utils [req-74db4cd3-80ca-41cc-b05b-765706632b15 tempest-ServerMetadataTestJSON-2094219298 tempest-ServerMetadataTestJSON-1459419222] Plugging VBD OpaqueRef:19192071-c972-6e74-3fbe-deecac2f2d03 done. vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2194 2015-08-07 17:49:37.658 DEBUG nova.virt.xenapi.vm_utils [req-74db4cd3-80ca-41cc-b05b-765706632b15 tempest-ServerMetadataTestJSON-2094219298 tempest-ServerMetadataTestJSON-1459419222] VBD OpaqueRef:19192071-c972-6e74-3fbe-deecac2f2d03 plugged as xvdc vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2197 2015-08-07 17:49:37.715 WARNING nova.virt.configdrive [req-74db4cd3-80ca-41cc-b05b-765706632b15 tempest-ServerMetadataTestJSON-2094219298 tempest-ServerMetadataTestJSON-1459419222] The setting "always" will be deprecated in the Liberty version. 
Please use "True" instead 2015-08-07 17:49:37.716 DEBUG nova.objects.instance [req-74db4cd3-80ca-41cc-b05b-765706632b15 tempest-ServerMetadataTestJSON-2094219298 tempest-ServerMetadataTestJSON-1459419222] Lazy-loading `ec2_ids' on Instance uuid 5cbf6860-028a-46b8-9f4a-4107956a3570 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:49:37.754 DEBUG oslo_concurrency.processutils [req-74db4cd3-80ca-41cc-b05b-765706632b15 tempest-ServerMetadataTestJSON-2094219298 tempest-ServerMetadataTestJSON-1459419222] Running cmd (subprocess): genisoimage -o /tmp/tmpcQBHM1/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmpFNpFX3 execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:49:37.826 DEBUG oslo_concurrency.lockutils [req-798c7626-3188-4ee2-803d-947e6f662f4b tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] Lock "6f6a5a93-63ed-4d42-be4c-e81077af680b" acquired by "do_terminate_instance" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:49:37.827 DEBUG oslo_concurrency.lockutils [req-798c7626-3188-4ee2-803d-947e6f662f4b tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] Lock "6f6a5a93-63ed-4d42-be4c-e81077af680b-events" acquired by "_clear_events" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:49:37.828 DEBUG oslo_concurrency.lockutils [req-798c7626-3188-4ee2-803d-947e6f662f4b tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] Lock "6f6a5a93-63ed-4d42-be4c-e81077af680b-events" released by "_clear_events" :: held 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:49:37.830 INFO nova.compute.manager [req-798c7626-3188-4ee2-803d-947e6f662f4b tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] [instance: 6f6a5a93-63ed-4d42-be4c-e81077af680b] Terminating instance 2015-08-07 17:49:37.832 INFO nova.virt.xenapi.vmops [req-798c7626-3188-4ee2-803d-947e6f662f4b tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] [instance: 6f6a5a93-63ed-4d42-be4c-e81077af680b] Destroying VM 2015-08-07 17:49:37.839 DEBUG oslo_concurrency.processutils [req-74db4cd3-80ca-41cc-b05b-765706632b15 tempest-ServerMetadataTestJSON-2094219298 tempest-ServerMetadataTestJSON-1459419222] CMD "genisoimage -o /tmp/tmpcQBHM1/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmpFNpFX3" returned: 0 in 0.085s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:49:37.841 DEBUG oslo_concurrency.processutils [req-74db4cd3-80ca-41cc-b05b-765706632b15 tempest-ServerMetadataTestJSON-2094219298 tempest-ServerMetadataTestJSON-1459419222] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmpcQBHM1/configdrive of=/dev/xvdc oflag=direct,sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:49:37.910 DEBUG nova.virt.xenapi.vm_utils [req-798c7626-3188-4ee2-803d-947e6f662f4b tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] [instance: 6f6a5a93-63ed-4d42-be4c-e81077af680b] Shutting down VM (hard) hard_shutdown_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:348 
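
The two processutils commands logged just above show the config drive being produced and pushed onto the plugged VBD (xvdc): genisoimage builds the ISO, then dd (via nova-rootwrap) copies it to the block device. A minimal standalone sketch of that same sequence, using plain subprocess and sudo instead of Nova's processutils/rootwrap wiring; the metadata_dir and device arguments are hypothetical stand-ins for the /tmp/tmpFNpFX3 tree and /dev/xvdc seen in the log, and this is not Nova's actual implementation:

import subprocess
import tempfile

def build_and_write_configdrive(metadata_dir, device):
    # Build the ISO9660 config drive with the same genisoimage flags logged above.
    with tempfile.NamedTemporaryFile(suffix="-configdrive") as iso:
        subprocess.check_call([
            "genisoimage", "-o", iso.name,
            "-ldots", "-allow-lowercase", "-allow-multidot", "-l",
            "-publisher", "OpenStack Nova 12.0.0",
            "-quiet", "-J", "-r", "-V", "config-2",
            metadata_dir,
        ])
        # Nova runs dd and sync through nova-rootwrap; plain sudo stands in here.
        subprocess.check_call(["sudo", "dd", "if=" + iso.name,
                               "of=" + device, "oflag=direct,sync"])
        subprocess.check_call(["sudo", "sync"])

# e.g. build_and_write_configdrive("/tmp/tmpFNpFX3", "/dev/xvdc")
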
2015-08-07 17:49:40.531 DEBUG nova.virt.xenapi.vmops [req-798c7626-3188-4ee2-803d-947e6f662f4b tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] [instance: 6f6a5a93-63ed-4d42-be4c-e81077af680b] Destroying VDIs _destroy_vdis /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1428 2015-08-07 17:49:40.542 DEBUG nova.virt.xenapi.vm_utils [req-798c7626-3188-4ee2-803d-947e6f662f4b tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] VDI 35ecb38c-b52a-4647-bc6b-12cc65fa2d25 is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:49:40.550 DEBUG nova.virt.xenapi.vm_utils [req-798c7626-3188-4ee2-803d-947e6f662f4b tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] VDI 8f348115-44ad-470b-82c7-5cea993bd84e is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:49:40.725 DEBUG oslo_concurrency.processutils [req-74db4cd3-80ca-41cc-b05b-765706632b15 tempest-ServerMetadataTestJSON-2094219298 tempest-ServerMetadataTestJSON-1459419222] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmpcQBHM1/configdrive of=/dev/xvdc oflag=direct,sync" returned: 0 in 2.884s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:49:40.729 DEBUG oslo_concurrency.processutils [req-74db4cd3-80ca-41cc-b05b-765706632b15 tempest-ServerMetadataTestJSON-2094219298 tempest-ServerMetadataTestJSON-1459419222] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:49:41.012 DEBUG oslo_concurrency.processutils [req-74db4cd3-80ca-41cc-b05b-765706632b15 tempest-ServerMetadataTestJSON-2094219298 tempest-ServerMetadataTestJSON-1459419222] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf sync" returned: 0 in 0.283s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:49:41.014 DEBUG nova.virt.xenapi.vm_utils [req-74db4cd3-80ca-41cc-b05b-765706632b15 tempest-ServerMetadataTestJSON-2094219298 tempest-ServerMetadataTestJSON-1459419222] Destroying VBD for VDI OpaqueRef:79f852e7-aaa8-6321-bcc0-2f312eb28926 ... 
vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2207 2015-08-07 17:49:41.015 DEBUG oslo_concurrency.lockutils [req-74db4cd3-80ca-41cc-b05b-765706632b15 tempest-ServerMetadataTestJSON-2094219298 tempest-ServerMetadataTestJSON-1459419222] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_unplug" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:49:41.205 DEBUG nova.virt.xenapi.vmops [req-798c7626-3188-4ee2-803d-947e6f662f4b tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] [instance: 6f6a5a93-63ed-4d42-be4c-e81077af680b] Using RAW or VHD, skipping kernel and ramdisk deletion _destroy_kernel_ramdisk /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1456 2015-08-07 17:49:41.214 DEBUG nova.virt.xenapi.vm_utils [req-798c7626-3188-4ee2-803d-947e6f662f4b tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] [instance: 6f6a5a93-63ed-4d42-be4c-e81077af680b] VM destroyed destroy_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:324 2015-08-07 17:49:41.215 DEBUG nova.compute.manager [req-798c7626-3188-4ee2-803d-947e6f662f4b tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] [instance: 6f6a5a93-63ed-4d42-be4c-e81077af680b] Deallocating network for instance _deallocate_network /opt/stack/new/nova/nova/compute/manager.py:1792 2015-08-07 17:49:41.585 DEBUG oslo_concurrency.lockutils [req-74db4cd3-80ca-41cc-b05b-765706632b15 tempest-ServerMetadataTestJSON-2094219298 tempest-ServerMetadataTestJSON-1459419222] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_unplug" :: held 0.570s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:49:41.592 DEBUG nova.virt.xenapi.vm_utils [req-74db4cd3-80ca-41cc-b05b-765706632b15 tempest-ServerMetadataTestJSON-2094219298 tempest-ServerMetadataTestJSON-1459419222] Destroying VBD for VDI OpaqueRef:79f852e7-aaa8-6321-bcc0-2f312eb28926 done. vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2215 2015-08-07 17:49:41.592 DEBUG nova.virt.xenapi.vm_utils [req-74db4cd3-80ca-41cc-b05b-765706632b15 tempest-ServerMetadataTestJSON-2094219298 tempest-ServerMetadataTestJSON-1459419222] Creating disk-type VBD for VM OpaqueRef:ed438e2e-cd88-c4f7-77c4-9f32bf786ec1, VDI OpaqueRef:79f852e7-aaa8-6321-bcc0-2f312eb28926 ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:49:41.598 DEBUG nova.virt.xenapi.vm_utils [req-74db4cd3-80ca-41cc-b05b-765706632b15 tempest-ServerMetadataTestJSON-2094219298 tempest-ServerMetadataTestJSON-1459419222] Created VBD OpaqueRef:c0ebe576-a788-d456-bdfd-074822df28fe for VM OpaqueRef:ed438e2e-cd88-c4f7-77c4-9f32bf786ec1, VDI OpaqueRef:79f852e7-aaa8-6321-bcc0-2f312eb28926. 
create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:49:41.600 DEBUG nova.objects.instance [req-74db4cd3-80ca-41cc-b05b-765706632b15 tempest-ServerMetadataTestJSON-2094219298 tempest-ServerMetadataTestJSON-1459419222] Lazy-loading `pci_devices' on Instance uuid 5cbf6860-028a-46b8-9f4a-4107956a3570 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:49:41.704 DEBUG nova.virt.xenapi.vmops [req-74db4cd3-80ca-41cc-b05b-765706632b15 tempest-ServerMetadataTestJSON-2094219298 tempest-ServerMetadataTestJSON-1459419222] [instance: 5cbf6860-028a-46b8-9f4a-4107956a3570] Updating progress to 50 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:49:41.882 DEBUG oslo_concurrency.lockutils [req-74db4cd3-80ca-41cc-b05b-765706632b15 tempest-ServerMetadataTestJSON-2094219298 tempest-ServerMetadataTestJSON-1459419222] Lock "xenstore-5cbf6860-028a-46b8-9f4a-4107956a3570" acquired by "store_meta" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:49:41.882 DEBUG oslo_concurrency.lockutils [req-74db4cd3-80ca-41cc-b05b-765706632b15 tempest-ServerMetadataTestJSON-2094219298 tempest-ServerMetadataTestJSON-1459419222] Lock "xenstore-5cbf6860-028a-46b8-9f4a-4107956a3570" released by "store_meta" :: held 0.001s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:49:41.883 DEBUG oslo_concurrency.lockutils [req-74db4cd3-80ca-41cc-b05b-765706632b15 tempest-ServerMetadataTestJSON-2094219298 tempest-ServerMetadataTestJSON-1459419222] Lock "xenstore-5cbf6860-028a-46b8-9f4a-4107956a3570" acquired by "store_auto_disk_config" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:49:41.890 DEBUG oslo_concurrency.lockutils [req-74db4cd3-80ca-41cc-b05b-765706632b15 tempest-ServerMetadataTestJSON-2094219298 tempest-ServerMetadataTestJSON-1459419222] Lock "xenstore-5cbf6860-028a-46b8-9f4a-4107956a3570" released by "store_auto_disk_config" :: held 0.007s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:49:41.890 DEBUG nova.virt.xenapi.vmops [req-74db4cd3-80ca-41cc-b05b-765706632b15 tempest-ServerMetadataTestJSON-2094219298 tempest-ServerMetadataTestJSON-1459419222] [instance: 5cbf6860-028a-46b8-9f4a-4107956a3570] Injecting hostname (tempest.common.compute-instance-793392818) into xenstore _inject_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1940 2015-08-07 17:49:41.891 DEBUG oslo_concurrency.lockutils [req-74db4cd3-80ca-41cc-b05b-765706632b15 tempest-ServerMetadataTestJSON-2094219298 tempest-ServerMetadataTestJSON-1459419222] Lock "xenstore-5cbf6860-028a-46b8-9f4a-4107956a3570" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:49:41.895 DEBUG oslo_concurrency.lockutils [req-74db4cd3-80ca-41cc-b05b-765706632b15 tempest-ServerMetadataTestJSON-2094219298 tempest-ServerMetadataTestJSON-1459419222] Lock "xenstore-5cbf6860-028a-46b8-9f4a-4107956a3570" released by "update_hostname" :: held 0.005s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:49:41.896 DEBUG nova.virt.xenapi.vmops [req-74db4cd3-80ca-41cc-b05b-765706632b15 tempest-ServerMetadataTestJSON-2094219298 tempest-ServerMetadataTestJSON-1459419222] [instance: 5cbf6860-028a-46b8-9f4a-4107956a3570] Injecting network info to xenstore inject_network_info 
/opt/stack/new/nova/nova/virt/xenapi/vmops.py:1869 2015-08-07 17:49:41.896 DEBUG oslo_concurrency.lockutils [req-74db4cd3-80ca-41cc-b05b-765706632b15 tempest-ServerMetadataTestJSON-2094219298 tempest-ServerMetadataTestJSON-1459419222] Lock "xenstore-5cbf6860-028a-46b8-9f4a-4107956a3570" acquired by "update_nwinfo" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:49:42.038 DEBUG oslo_concurrency.lockutils [req-74db4cd3-80ca-41cc-b05b-765706632b15 tempest-ServerMetadataTestJSON-2094219298 tempest-ServerMetadataTestJSON-1459419222] Lock "xenstore-5cbf6860-028a-46b8-9f4a-4107956a3570" released by "update_nwinfo" :: held 0.142s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:49:42.040 DEBUG nova.virt.xenapi.vmops [req-74db4cd3-80ca-41cc-b05b-765706632b15 tempest-ServerMetadataTestJSON-2094219298 tempest-ServerMetadataTestJSON-1459419222] [instance: 5cbf6860-028a-46b8-9f4a-4107956a3570] Updating progress to 60 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:49:42.203 DEBUG nova.virt.xenapi.vmops [req-74db4cd3-80ca-41cc-b05b-765706632b15 tempest-ServerMetadataTestJSON-2094219298 tempest-ServerMetadataTestJSON-1459419222] [instance: 5cbf6860-028a-46b8-9f4a-4107956a3570] Creating vifs _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1892 2015-08-07 17:49:42.208 DEBUG nova.virt.xenapi.vmops [req-74db4cd3-80ca-41cc-b05b-765706632b15 tempest-ServerMetadataTestJSON-2094219298 tempest-ServerMetadataTestJSON-1459419222] [instance: 5cbf6860-028a-46b8-9f4a-4107956a3570] Creating VIF for network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1902 2015-08-07 17:49:42.213 DEBUG nova.virt.xenapi.vmops [req-74db4cd3-80ca-41cc-b05b-765706632b15 tempest-ServerMetadataTestJSON-2094219298 tempest-ServerMetadataTestJSON-1459419222] [instance: 5cbf6860-028a-46b8-9f4a-4107956a3570] Created VIF OpaqueRef:d7b04d0d-244c-9cde-4c55-113534834c73, network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1906 2015-08-07 17:49:42.214 DEBUG nova.virt.xenapi.vmops [req-74db4cd3-80ca-41cc-b05b-765706632b15 tempest-ServerMetadataTestJSON-2094219298 tempest-ServerMetadataTestJSON-1459419222] [instance: 5cbf6860-028a-46b8-9f4a-4107956a3570] Updating progress to 70 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:49:42.329 DEBUG nova.compute.manager [req-798c7626-3188-4ee2-803d-947e6f662f4b tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] [instance: 6f6a5a93-63ed-4d42-be4c-e81077af680b] terminating bdm BlockDeviceMapping(boot_index=0,connection_info=None,created_at=2015-08-07T17:48:06Z,delete_on_termination=True,deleted=False,deleted_at=None,destination_type='local',device_name='/dev/xvda',device_type='disk',disk_bus=None,guest_format=None,id=74,image_id='a69d8d55-1745-492f-be26-cc75d64fc94d',instance=,instance_uuid=6f6a5a93-63ed-4d42-be4c-e81077af680b,no_device=False,snapshot_id=None,source_type='image',updated_at=2015-08-07T17:48:07Z,volume_id=None,volume_size=None) _cleanup_volumes /opt/stack/new/nova/nova/compute/manager.py:2272 2015-08-07 17:49:42.387 DEBUG nova.virt.xenapi.vmops [req-74db4cd3-80ca-41cc-b05b-765706632b15 tempest-ServerMetadataTestJSON-2094219298 tempest-ServerMetadataTestJSON-1459419222] [instance: 5cbf6860-028a-46b8-9f4a-4107956a3570] Starting instance _start 
/opt/stack/new/nova/nova/virt/xenapi/vmops.py:327 2015-08-07 17:49:42.469 DEBUG oslo_concurrency.lockutils [req-798c7626-3188-4ee2-803d-947e6f662f4b tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] Lock "compute_resources" acquired by "update_usage" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:49:42.471 DEBUG nova.objects.instance [req-798c7626-3188-4ee2-803d-947e6f662f4b tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] Lazy-loading `numa_topology' on Instance uuid 6f6a5a93-63ed-4d42-be4c-e81077af680b obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:49:42.537 DEBUG oslo_concurrency.lockutils [req-798c7626-3188-4ee2-803d-947e6f662f4b tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] Lock "compute_resources" released by "update_usage" :: held 0.068s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:49:42.771 DEBUG oslo_concurrency.lockutils [req-798c7626-3188-4ee2-803d-947e6f662f4b tempest-ServersNegativeTestJSON-907843016 tempest-ServersNegativeTestJSON-430946234] Lock "6f6a5a93-63ed-4d42-be4c-e81077af680b" released by "do_terminate_instance" :: held 4.945s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:49:45.075 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.97 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:49:46.412 DEBUG nova.virt.xenapi.vmops [req-74db4cd3-80ca-41cc-b05b-765706632b15 tempest-ServerMetadataTestJSON-2094219298 tempest-ServerMetadataTestJSON-1459419222] [instance: 5cbf6860-028a-46b8-9f4a-4107956a3570] Waiting for instance state to become running _wait_for_instance_to_start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:777 2015-08-07 17:49:46.423 DEBUG nova.virt.xenapi.vmops [req-74db4cd3-80ca-41cc-b05b-765706632b15 tempest-ServerMetadataTestJSON-2094219298 tempest-ServerMetadataTestJSON-1459419222] [instance: 5cbf6860-028a-46b8-9f4a-4107956a3570] Updating progress to 80 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:49:46.522 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:49:46.523 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 0.99 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:49:46.603 DEBUG nova.virt.xenapi.vmops [req-74db4cd3-80ca-41cc-b05b-765706632b15 tempest-ServerMetadataTestJSON-2094219298 tempest-ServerMetadataTestJSON-1459419222] [instance: 5cbf6860-028a-46b8-9f4a-4107956a3570] Skip agent setup, not enabled. 
_configure_new_instance_with_agent /opt/stack/new/nova/nova/virt/xenapi/vmops.py:788 2015-08-07 17:49:46.603 DEBUG nova.virt.xenapi.vmops [req-74db4cd3-80ca-41cc-b05b-765706632b15 tempest-ServerMetadataTestJSON-2094219298 tempest-ServerMetadataTestJSON-1459419222] [instance: 5cbf6860-028a-46b8-9f4a-4107956a3570] Removing hostname from xenstore _remove_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1949 2015-08-07 17:49:46.604 DEBUG oslo_concurrency.lockutils [req-74db4cd3-80ca-41cc-b05b-765706632b15 tempest-ServerMetadataTestJSON-2094219298 tempest-ServerMetadataTestJSON-1459419222] Lock "xenstore-5cbf6860-028a-46b8-9f4a-4107956a3570" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:49:46.610 DEBUG oslo_concurrency.lockutils [req-74db4cd3-80ca-41cc-b05b-765706632b15 tempest-ServerMetadataTestJSON-2094219298 tempest-ServerMetadataTestJSON-1459419222] Lock "xenstore-5cbf6860-028a-46b8-9f4a-4107956a3570" released by "update_hostname" :: held 0.006s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:49:46.612 DEBUG nova.virt.xenapi.vmops [req-74db4cd3-80ca-41cc-b05b-765706632b15 tempest-ServerMetadataTestJSON-2094219298 tempest-ServerMetadataTestJSON-1459419222] [instance: 5cbf6860-028a-46b8-9f4a-4107956a3570] Updating progress to 90 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:49:46.768 DEBUG nova.virt.xenapi.vmops [req-74db4cd3-80ca-41cc-b05b-765706632b15 tempest-ServerMetadataTestJSON-2094219298 tempest-ServerMetadataTestJSON-1459419222] [instance: 5cbf6860-028a-46b8-9f4a-4107956a3570] Updating progress to 100 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:49:46.895 DEBUG nova.compute.manager [req-74db4cd3-80ca-41cc-b05b-765706632b15 tempest-ServerMetadataTestJSON-2094219298 tempest-ServerMetadataTestJSON-1459419222] [instance: 5cbf6860-028a-46b8-9f4a-4107956a3570] Checking state _get_power_state /opt/stack/new/nova/nova/compute/manager.py:1296 2015-08-07 17:49:47.097 DEBUG oslo_concurrency.lockutils [req-74db4cd3-80ca-41cc-b05b-765706632b15 tempest-ServerMetadataTestJSON-2094219298 tempest-ServerMetadataTestJSON-1459419222] Lock "5cbf6860-028a-46b8-9f4a-4107956a3570" released by "_locked_do_build_and_run_instance" :: held 13.925s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:49:47.473 DEBUG nova.compute.manager [req-5601f3fb-2382-4b32-8940-e08c04e8e9d8 tempest-ServerMetadataTestJSON-2094219298 tempest-ServerMetadataTestJSON-1459419222] [instance: 5cbf6860-028a-46b8-9f4a-4107956a3570] Changing instance metadata according to {u'key2': [u'+', u'value2'], u'key1': [u'+', u'value1']} change_instance_metadata /opt/stack/new/nova/nova/compute/manager.py:3203 2015-08-07 17:49:47.478 DEBUG oslo_concurrency.lockutils [req-5601f3fb-2382-4b32-8940-e08c04e8e9d8 tempest-ServerMetadataTestJSON-2094219298 tempest-ServerMetadataTestJSON-1459419222] Lock "xenstore-5cbf6860-028a-46b8-9f4a-4107956a3570" acquired by "update_meta" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:49:47.515 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:49:47.515 DEBUG oslo_service.loopingcall 
[req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 1.01 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:49:47.760 DEBUG oslo_concurrency.lockutils [req-5601f3fb-2382-4b32-8940-e08c04e8e9d8 tempest-ServerMetadataTestJSON-2094219298 tempest-ServerMetadataTestJSON-1459419222] Lock "xenstore-5cbf6860-028a-46b8-9f4a-4107956a3570" released by "update_meta" :: held 0.282s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:49:47.953 DEBUG nova.compute.manager [req-5b270199-c2a7-402e-a3a1-7ce9f0158a75 tempest-ServerMetadataTestJSON-2094219298 tempest-ServerMetadataTestJSON-1459419222] [instance: 5cbf6860-028a-46b8-9f4a-4107956a3570] Changing instance metadata according to {u'key1': [u'-']} change_instance_metadata /opt/stack/new/nova/nova/compute/manager.py:3203 2015-08-07 17:49:47.957 DEBUG oslo_concurrency.lockutils [req-5b270199-c2a7-402e-a3a1-7ce9f0158a75 tempest-ServerMetadataTestJSON-2094219298 tempest-ServerMetadataTestJSON-1459419222] Lock "xenstore-5cbf6860-028a-46b8-9f4a-4107956a3570" acquired by "update_meta" :: waited 0.001s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:49:48.098 DEBUG oslo_concurrency.lockutils [req-5b270199-c2a7-402e-a3a1-7ce9f0158a75 tempest-ServerMetadataTestJSON-2094219298 tempest-ServerMetadataTestJSON-1459419222] Lock "xenstore-5cbf6860-028a-46b8-9f4a-4107956a3570" released by "update_meta" :: held 0.140s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:49:48.296 DEBUG nova.compute.manager [req-dbf9d319-7af0-4002-8784-78c52cded576 tempest-ServerMetadataTestJSON-2094219298 tempest-ServerMetadataTestJSON-1459419222] [instance: 5cbf6860-028a-46b8-9f4a-4107956a3570] Changing instance metadata according to {u'key1': [u'+', u'value1']} change_instance_metadata /opt/stack/new/nova/nova/compute/manager.py:3203 2015-08-07 17:49:48.299 DEBUG oslo_concurrency.lockutils [req-dbf9d319-7af0-4002-8784-78c52cded576 tempest-ServerMetadataTestJSON-2094219298 tempest-ServerMetadataTestJSON-1459419222] Lock "xenstore-5cbf6860-028a-46b8-9f4a-4107956a3570" acquired by "update_meta" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:49:48.439 DEBUG oslo_concurrency.lockutils [req-dbf9d319-7af0-4002-8784-78c52cded576 tempest-ServerMetadataTestJSON-2094219298 tempest-ServerMetadataTestJSON-1459419222] Lock "xenstore-5cbf6860-028a-46b8-9f4a-4107956a3570" released by "update_meta" :: held 0.140s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:49:48.523 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:49:48.524 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 1.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:49:48.634 DEBUG nova.compute.manager [req-09f8cd36-d09a-4060-84ba-5cbc98e7aeb6 tempest-ServerMetadataTestJSON-2094219298 tempest-ServerMetadataTestJSON-1459419222] [instance: 5cbf6860-028a-46b8-9f4a-4107956a3570] Changing instance metadata according to {} change_instance_metadata 
/opt/stack/new/nova/nova/compute/manager.py:3203 2015-08-07 17:49:48.637 DEBUG oslo_concurrency.lockutils [req-09f8cd36-d09a-4060-84ba-5cbc98e7aeb6 tempest-ServerMetadataTestJSON-2094219298 tempest-ServerMetadataTestJSON-1459419222] Lock "xenstore-5cbf6860-028a-46b8-9f4a-4107956a3570" acquired by "update_meta" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:49:48.637 DEBUG oslo_concurrency.lockutils [req-09f8cd36-d09a-4060-84ba-5cbc98e7aeb6 tempest-ServerMetadataTestJSON-2094219298 tempest-ServerMetadataTestJSON-1459419222] Lock "xenstore-5cbf6860-028a-46b8-9f4a-4107956a3570" released by "update_meta" :: held 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:49:48.937 DEBUG nova.compute.manager [req-ec654b50-b693-495b-8aff-8debcf14020c tempest-ServerMetadataTestJSON-2094219298 tempest-ServerMetadataTestJSON-1459419222] [instance: 5cbf6860-028a-46b8-9f4a-4107956a3570] Changing instance metadata according to {} change_instance_metadata /opt/stack/new/nova/nova/compute/manager.py:3203 2015-08-07 17:49:48.940 DEBUG oslo_concurrency.lockutils [req-ec654b50-b693-495b-8aff-8debcf14020c tempest-ServerMetadataTestJSON-2094219298 tempest-ServerMetadataTestJSON-1459419222] Lock "xenstore-5cbf6860-028a-46b8-9f4a-4107956a3570" acquired by "update_meta" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:49:48.941 DEBUG oslo_concurrency.lockutils [req-ec654b50-b693-495b-8aff-8debcf14020c tempest-ServerMetadataTestJSON-2094219298 tempest-ServerMetadataTestJSON-1459419222] Lock "xenstore-5cbf6860-028a-46b8-9f4a-4107956a3570" released by "update_meta" :: held 0.001s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:49:49.153 DEBUG nova.compute.manager [req-e4550ae5-0c78-4dc5-8c9b-bfaf191e5fa8 tempest-ServerMetadataTestJSON-2094219298 tempest-ServerMetadataTestJSON-1459419222] [instance: 5cbf6860-028a-46b8-9f4a-4107956a3570] Changing instance metadata according to {u'key2': [u'-'], u'key1': [u'-'], u'meta3': [u'+', u'data3'], u'meta2': [u'+', u'data2']} change_instance_metadata /opt/stack/new/nova/nova/compute/manager.py:3203 2015-08-07 17:49:49.156 DEBUG oslo_concurrency.lockutils [req-e4550ae5-0c78-4dc5-8c9b-bfaf191e5fa8 tempest-ServerMetadataTestJSON-2094219298 tempest-ServerMetadataTestJSON-1459419222] Lock "xenstore-5cbf6860-028a-46b8-9f4a-4107956a3570" acquired by "update_meta" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:49:49.471 DEBUG nova.compute.manager [req-bf83d992-0b92-49a2-9857-37f6296de280 tempest-ServerMetadataTestJSON-2094219298 tempest-ServerMetadataTestJSON-1459419222] [instance: 5cbf6860-028a-46b8-9f4a-4107956a3570] Changing instance metadata according to {u'key2': [u'+', u'value2'], u'meta2': [u'-'], u'meta3': [u'-'], u'key1': [u'+', u'value1']} change_instance_metadata /opt/stack/new/nova/nova/compute/manager.py:3203 2015-08-07 17:49:49.522 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:49:49.523 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Starting heal instance info cache _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5315 2015-08-07 17:49:49.523 
DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Rebuilding the list of instances to heal _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5319 2015-08-07 17:49:49.576 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Acquired semaphore "refresh_cache-5cbf6860-028a-46b8-9f4a-4107956a3570" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:197 2015-08-07 17:49:49.584 DEBUG nova.objects.instance [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lazy-loading `flavor' on Instance uuid 5cbf6860-028a-46b8-9f4a-4107956a3570 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:49:49.685 DEBUG nova.compute.manager [req-15b2aad2-4aa3-468b-9a8e-447ed8e20740 tempest-ServerMetadataTestJSON-2094219298 tempest-ServerMetadataTestJSON-1459419222] [instance: 5cbf6860-028a-46b8-9f4a-4107956a3570] Changing instance metadata according to {u'nova': [u'+', u'alt']} change_instance_metadata /opt/stack/new/nova/nova/compute/manager.py:3203 2015-08-07 17:49:49.703 DEBUG oslo_concurrency.lockutils [req-e4550ae5-0c78-4dc5-8c9b-bfaf191e5fa8 tempest-ServerMetadataTestJSON-2094219298 tempest-ServerMetadataTestJSON-1459419222] Lock "xenstore-5cbf6860-028a-46b8-9f4a-4107956a3570" released by "update_meta" :: held 0.547s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:49:49.705 DEBUG oslo_concurrency.lockutils [req-bf83d992-0b92-49a2-9857-37f6296de280 tempest-ServerMetadataTestJSON-2094219298 tempest-ServerMetadataTestJSON-1459419222] Lock "xenstore-5cbf6860-028a-46b8-9f4a-4107956a3570" acquired by "update_meta" :: waited 0.230s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:49:49.830 DEBUG nova.network.base_api [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: 5cbf6860-028a-46b8-9f4a-4107956a3570] Updating instance_info_cache with network_info: [VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.2'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:fc:8b:25', 'active': False, 'type': u'bridge', 'id': u'26a5828a-6822-4346-ae25-415b929d73cf', 'qbg_params': None})] update_instance_cache_with_nw_info /opt/stack/new/nova/nova/network/base_api.py:43 2015-08-07 17:49:49.856 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Releasing semaphore "refresh_cache-5cbf6860-028a-46b8-9f4a-4107956a3570" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:210 2015-08-07 17:49:49.856 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] 
[instance: 5cbf6860-028a-46b8-9f4a-4107956a3570] Updated the network info_cache for instance _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5376 2015-08-07 17:49:49.857 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 3.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:49:50.051 DEBUG nova.compute.manager [req-7647c71a-4da1-4123-bc8f-e43c09c40790 tempest-ServerMetadataTestJSON-2094219298 tempest-ServerMetadataTestJSON-1459419222] [instance: 5cbf6860-028a-46b8-9f4a-4107956a3570] Changing instance metadata according to {u'nova': [u'-']} change_instance_metadata /opt/stack/new/nova/nova/compute/manager.py:3203 2015-08-07 17:49:50.223 DEBUG oslo_concurrency.lockutils [req-bf83d992-0b92-49a2-9857-37f6296de280 tempest-ServerMetadataTestJSON-2094219298 tempest-ServerMetadataTestJSON-1459419222] Lock "xenstore-5cbf6860-028a-46b8-9f4a-4107956a3570" released by "update_meta" :: held 0.518s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:49:50.223 DEBUG oslo_concurrency.lockutils [req-15b2aad2-4aa3-468b-9a8e-447ed8e20740 tempest-ServerMetadataTestJSON-2094219298 tempest-ServerMetadataTestJSON-1459419222] Lock "xenstore-5cbf6860-028a-46b8-9f4a-4107956a3570" acquired by "update_meta" :: waited 0.528s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:49:50.260 DEBUG nova.compute.manager [req-1d431c14-15ec-45bd-aed3-512e7ac74c3b tempest-ServerMetadataTestJSON-2094219298 tempest-ServerMetadataTestJSON-1459419222] [instance: 5cbf6860-028a-46b8-9f4a-4107956a3570] Changing instance metadata according to {} change_instance_metadata /opt/stack/new/nova/nova/compute/manager.py:3203 2015-08-07 17:49:50.347 DEBUG oslo_concurrency.lockutils [req-15b2aad2-4aa3-468b-9a8e-447ed8e20740 tempest-ServerMetadataTestJSON-2094219298 tempest-ServerMetadataTestJSON-1459419222] Lock "xenstore-5cbf6860-028a-46b8-9f4a-4107956a3570" released by "update_meta" :: held 0.124s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:49:50.348 DEBUG oslo_concurrency.lockutils [req-7647c71a-4da1-4123-bc8f-e43c09c40790 tempest-ServerMetadataTestJSON-2094219298 tempest-ServerMetadataTestJSON-1459419222] Lock "xenstore-5cbf6860-028a-46b8-9f4a-4107956a3570" acquired by "update_meta" :: waited 0.294s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:49:50.473 DEBUG oslo_concurrency.lockutils [req-7647c71a-4da1-4123-bc8f-e43c09c40790 tempest-ServerMetadataTestJSON-2094219298 tempest-ServerMetadataTestJSON-1459419222] Lock "xenstore-5cbf6860-028a-46b8-9f4a-4107956a3570" released by "update_meta" :: held 0.125s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:49:50.474 DEBUG oslo_concurrency.lockutils [req-1d431c14-15ec-45bd-aed3-512e7ac74c3b tempest-ServerMetadataTestJSON-2094219298 tempest-ServerMetadataTestJSON-1459419222] Lock "xenstore-5cbf6860-028a-46b8-9f4a-4107956a3570" acquired by "update_meta" :: waited 0.211s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:49:50.475 DEBUG oslo_concurrency.lockutils [req-1d431c14-15ec-45bd-aed3-512e7ac74c3b tempest-ServerMetadataTestJSON-2094219298 tempest-ServerMetadataTestJSON-1459419222] Lock "xenstore-5cbf6860-028a-46b8-9f4a-4107956a3570" released by "update_meta" :: held 0.000s inner 
/usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:49:50.563 DEBUG nova.compute.manager [req-7a3b65bd-5452-4e62-8752-9ee620d4ee73 tempest-ServerMetadataTestJSON-2094219298 tempest-ServerMetadataTestJSON-1459419222] [instance: 5cbf6860-028a-46b8-9f4a-4107956a3570] Changing instance metadata according to {} change_instance_metadata /opt/stack/new/nova/nova/compute/manager.py:3203 2015-08-07 17:49:50.566 DEBUG oslo_concurrency.lockutils [req-7a3b65bd-5452-4e62-8752-9ee620d4ee73 tempest-ServerMetadataTestJSON-2094219298 tempest-ServerMetadataTestJSON-1459419222] Lock "xenstore-5cbf6860-028a-46b8-9f4a-4107956a3570" acquired by "update_meta" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:49:50.566 DEBUG oslo_concurrency.lockutils [req-7a3b65bd-5452-4e62-8752-9ee620d4ee73 tempest-ServerMetadataTestJSON-2094219298 tempest-ServerMetadataTestJSON-1459419222] Lock "xenstore-5cbf6860-028a-46b8-9f4a-4107956a3570" released by "update_meta" :: held 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:49:50.780 DEBUG nova.compute.manager [req-9d56d140-314c-458f-81c1-23958e0ecfa7 tempest-ServerMetadataTestJSON-2094219298 tempest-ServerMetadataTestJSON-1459419222] [instance: 5cbf6860-028a-46b8-9f4a-4107956a3570] Changing instance metadata according to {u'key3': [u'+', u'value3'], u'key1': [u'+', u'alt1']} change_instance_metadata /opt/stack/new/nova/nova/compute/manager.py:3203 2015-08-07 17:49:50.786 DEBUG oslo_concurrency.lockutils [req-9d56d140-314c-458f-81c1-23958e0ecfa7 tempest-ServerMetadataTestJSON-2094219298 tempest-ServerMetadataTestJSON-1459419222] Lock "xenstore-5cbf6860-028a-46b8-9f4a-4107956a3570" acquired by "update_meta" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:49:51.053 DEBUG oslo_concurrency.lockutils [req-9d56d140-314c-458f-81c1-23958e0ecfa7 tempest-ServerMetadataTestJSON-2094219298 tempest-ServerMetadataTestJSON-1459419222] Lock "xenstore-5cbf6860-028a-46b8-9f4a-4107956a3570" released by "update_meta" :: held 0.266s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:49:51.471 DEBUG oslo_concurrency.lockutils [req-5d8520fe-2525-4530-b41a-73a3b6a641f6 tempest-ServerMetadataTestJSON-2094219298 tempest-ServerMetadataTestJSON-1459419222] Lock "5cbf6860-028a-46b8-9f4a-4107956a3570" acquired by "do_terminate_instance" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:49:51.472 DEBUG oslo_concurrency.lockutils [req-5d8520fe-2525-4530-b41a-73a3b6a641f6 tempest-ServerMetadataTestJSON-2094219298 tempest-ServerMetadataTestJSON-1459419222] Lock "5cbf6860-028a-46b8-9f4a-4107956a3570-events" acquired by "_clear_events" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:49:51.472 DEBUG oslo_concurrency.lockutils [req-5d8520fe-2525-4530-b41a-73a3b6a641f6 tempest-ServerMetadataTestJSON-2094219298 tempest-ServerMetadataTestJSON-1459419222] Lock "5cbf6860-028a-46b8-9f4a-4107956a3570-events" released by "_clear_events" :: held 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:49:51.473 INFO nova.compute.manager [req-5d8520fe-2525-4530-b41a-73a3b6a641f6 tempest-ServerMetadataTestJSON-2094219298 tempest-ServerMetadataTestJSON-1459419222] [instance: 5cbf6860-028a-46b8-9f4a-4107956a3570] 
Terminating instance 2015-08-07 17:49:51.475 INFO nova.virt.xenapi.vmops [req-5d8520fe-2525-4530-b41a-73a3b6a641f6 tempest-ServerMetadataTestJSON-2094219298 tempest-ServerMetadataTestJSON-1459419222] [instance: 5cbf6860-028a-46b8-9f4a-4107956a3570] Destroying VM 2015-08-07 17:49:51.481 DEBUG nova.virt.xenapi.vm_utils [req-5d8520fe-2525-4530-b41a-73a3b6a641f6 tempest-ServerMetadataTestJSON-2094219298 tempest-ServerMetadataTestJSON-1459419222] [instance: 5cbf6860-028a-46b8-9f4a-4107956a3570] Shutting down VM (hard) hard_shutdown_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:348 2015-08-07 17:49:52.856 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:49:52.857 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /opt/stack/new/nova/nova/compute/manager.py:5976 2015-08-07 17:49:52.862 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 3.65 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:49:52.984 DEBUG nova.virt.xenapi.vmops [req-5d8520fe-2525-4530-b41a-73a3b6a641f6 tempest-ServerMetadataTestJSON-2094219298 tempest-ServerMetadataTestJSON-1459419222] [instance: 5cbf6860-028a-46b8-9f4a-4107956a3570] Destroying VDIs _destroy_vdis /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1428 2015-08-07 17:49:52.992 DEBUG nova.virt.xenapi.vm_utils [req-5d8520fe-2525-4530-b41a-73a3b6a641f6 tempest-ServerMetadataTestJSON-2094219298 tempest-ServerMetadataTestJSON-1459419222] VDI 4cce846b-bffe-4e58-acf9-425a375caa1a is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:49:53.002 DEBUG nova.virt.xenapi.vm_utils [req-5d8520fe-2525-4530-b41a-73a3b6a641f6 tempest-ServerMetadataTestJSON-2094219298 tempest-ServerMetadataTestJSON-1459419222] VDI e536c5e0-2788-409f-a344-5e3d629801a5 is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:49:53.558 DEBUG nova.virt.xenapi.vmops [req-5d8520fe-2525-4530-b41a-73a3b6a641f6 tempest-ServerMetadataTestJSON-2094219298 tempest-ServerMetadataTestJSON-1459419222] [instance: 5cbf6860-028a-46b8-9f4a-4107956a3570] Using RAW or VHD, skipping kernel and ramdisk deletion _destroy_kernel_ramdisk /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1456 2015-08-07 17:49:53.568 DEBUG nova.virt.xenapi.vm_utils [req-5d8520fe-2525-4530-b41a-73a3b6a641f6 tempest-ServerMetadataTestJSON-2094219298 tempest-ServerMetadataTestJSON-1459419222] [instance: 5cbf6860-028a-46b8-9f4a-4107956a3570] VM destroyed destroy_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:324 2015-08-07 17:49:53.569 DEBUG nova.compute.manager [req-5d8520fe-2525-4530-b41a-73a3b6a641f6 tempest-ServerMetadataTestJSON-2094219298 tempest-ServerMetadataTestJSON-1459419222] [instance: 5cbf6860-028a-46b8-9f4a-4107956a3570] Deallocating network for instance _deallocate_network /opt/stack/new/nova/nova/compute/manager.py:1792 2015-08-07 17:49:54.829 DEBUG nova.compute.manager [req-5d8520fe-2525-4530-b41a-73a3b6a641f6 tempest-ServerMetadataTestJSON-2094219298 tempest-ServerMetadataTestJSON-1459419222] [instance: 5cbf6860-028a-46b8-9f4a-4107956a3570] terminating bdm 
BlockDeviceMapping(boot_index=0,connection_info=None,created_at=2015-08-07T17:49:32Z,delete_on_termination=True,deleted=False,deleted_at=None,destination_type='local',device_name='/dev/xvda',device_type='disk',disk_bus=None,guest_format=None,id=77,image_id='a69d8d55-1745-492f-be26-cc75d64fc94d',instance=,instance_uuid=5cbf6860-028a-46b8-9f4a-4107956a3570,no_device=False,snapshot_id=None,source_type='image',updated_at=2015-08-07T17:49:33Z,volume_id=None,volume_size=None) _cleanup_volumes /opt/stack/new/nova/nova/compute/manager.py:2272 2015-08-07 17:49:55.005 DEBUG oslo_concurrency.lockutils [req-5d8520fe-2525-4530-b41a-73a3b6a641f6 tempest-ServerMetadataTestJSON-2094219298 tempest-ServerMetadataTestJSON-1459419222] Lock "compute_resources" acquired by "update_usage" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:49:55.005 DEBUG nova.objects.instance [req-5d8520fe-2525-4530-b41a-73a3b6a641f6 tempest-ServerMetadataTestJSON-2094219298 tempest-ServerMetadataTestJSON-1459419222] Lazy-loading `numa_topology' on Instance uuid 5cbf6860-028a-46b8-9f4a-4107956a3570 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:49:55.083 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.96 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:49:55.095 DEBUG oslo_concurrency.lockutils [req-5d8520fe-2525-4530-b41a-73a3b6a641f6 tempest-ServerMetadataTestJSON-2094219298 tempest-ServerMetadataTestJSON-1459419222] Lock "compute_resources" released by "update_usage" :: held 0.090s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:49:55.352 DEBUG oslo_concurrency.lockutils [req-5d8520fe-2525-4530-b41a-73a3b6a641f6 tempest-ServerMetadataTestJSON-2094219298 tempest-ServerMetadataTestJSON-1459419222] Lock "5cbf6860-028a-46b8-9f4a-4107956a3570" released by "do_terminate_instance" :: held 3.881s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:49:55.518 DEBUG oslo_concurrency.lockutils [req-69968516-3c35-4dac-bbc2-c5cd6a1e148e tempest-AttachVolumeTestJSON-1615353033 tempest-AttachVolumeTestJSON-1297584599] Lock "2164dc89-6e1d-45ae-a625-28be5a6a1a05" acquired by "_locked_do_build_and_run_instance" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:49:55.556 INFO nova.compute.manager [req-69968516-3c35-4dac-bbc2-c5cd6a1e148e tempest-AttachVolumeTestJSON-1615353033 tempest-AttachVolumeTestJSON-1297584599] [instance: 2164dc89-6e1d-45ae-a625-28be5a6a1a05] Starting instance... 
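
The resource-claim records at 17:49:33 (instance 5cbf6860-...) and in the block that follows at 17:49:55 (instance 2164dc89-...) both apply the same admission arithmetic: requested memory is the 64 MB flavor plus the 5 MB overhead, and the 12280.50 MB limit against 8187 MB of host memory implies a 1.5x oversubscription factor. A minimal sketch of that check, assuming the 1.5 ratio inferred from the logged numbers; this is illustrative only, not Nova's claims code:

# Admission arithmetic visible in the "nova.compute.claims" lines.
def memory_claim_fits(flavor_mb, overhead_mb, used_mb, total_mb, ratio=1.5):
    requested_mb = flavor_mb + overhead_mb   # 64 + 5 = 69 MB in both claims
    limit_mb = total_mb * ratio              # 8187 * 1.5 = 12280.5 MB
    free_mb = limit_mb - used_mb             # e.g. 12280.5 - 581 = 11699.5 MB
    return requested_mb <= free_mb

# Matches the two "Claim successful" outcomes above and below:
assert memory_claim_fits(64, 5, 581.00, 8187)    # instance 5cbf6860-...
assert memory_claim_fits(64, 5, 512.00, 8187)    # instance 2164dc89-...
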
2015-08-07 17:49:55.769 DEBUG oslo_concurrency.lockutils [req-69968516-3c35-4dac-bbc2-c5cd6a1e148e tempest-AttachVolumeTestJSON-1615353033 tempest-AttachVolumeTestJSON-1297584599] Lock "compute_resources" acquired by "instance_claim" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:49:55.770 DEBUG nova.compute.resource_tracker [req-69968516-3c35-4dac-bbc2-c5cd6a1e148e tempest-AttachVolumeTestJSON-1615353033 tempest-AttachVolumeTestJSON-1297584599] Memory overhead for 64 MB instance; 5 MB instance_claim /opt/stack/new/nova/nova/compute/resource_tracker.py:126 2015-08-07 17:49:55.777 INFO nova.compute.claims [req-69968516-3c35-4dac-bbc2-c5cd6a1e148e tempest-AttachVolumeTestJSON-1615353033 tempest-AttachVolumeTestJSON-1297584599] [instance: 2164dc89-6e1d-45ae-a625-28be5a6a1a05] Attempting claim: memory 69 MB, disk 0 GB 2015-08-07 17:49:55.777 INFO nova.compute.claims [req-69968516-3c35-4dac-bbc2-c5cd6a1e148e tempest-AttachVolumeTestJSON-1615353033 tempest-AttachVolumeTestJSON-1297584599] [instance: 2164dc89-6e1d-45ae-a625-28be5a6a1a05] Total memory: 8187 MB, used: 512.00 MB 2015-08-07 17:49:55.778 INFO nova.compute.claims [req-69968516-3c35-4dac-bbc2-c5cd6a1e148e tempest-AttachVolumeTestJSON-1615353033 tempest-AttachVolumeTestJSON-1297584599] [instance: 2164dc89-6e1d-45ae-a625-28be5a6a1a05] memory limit: 12280.50 MB, free: 11768.50 MB 2015-08-07 17:49:55.778 INFO nova.compute.claims [req-69968516-3c35-4dac-bbc2-c5cd6a1e148e tempest-AttachVolumeTestJSON-1615353033 tempest-AttachVolumeTestJSON-1297584599] [instance: 2164dc89-6e1d-45ae-a625-28be5a6a1a05] Total disk: 27 GB, used: 0.00 GB 2015-08-07 17:49:55.779 INFO nova.compute.claims [req-69968516-3c35-4dac-bbc2-c5cd6a1e148e tempest-AttachVolumeTestJSON-1615353033 tempest-AttachVolumeTestJSON-1297584599] [instance: 2164dc89-6e1d-45ae-a625-28be5a6a1a05] disk limit not specified, defaulting to unlimited 2015-08-07 17:49:55.803 DEBUG nova.compute.resources.vcpu [req-69968516-3c35-4dac-bbc2-c5cd6a1e148e tempest-AttachVolumeTestJSON-1615353033 tempest-AttachVolumeTestJSON-1297584599] Total CPUs: 8 VCPUs, used: 0.00 VCPUs test /opt/stack/new/nova/nova/compute/resources/vcpu.py:52 2015-08-07 17:49:55.804 DEBUG nova.compute.resources.vcpu [req-69968516-3c35-4dac-bbc2-c5cd6a1e148e tempest-AttachVolumeTestJSON-1615353033 tempest-AttachVolumeTestJSON-1297584599] CPUs limit not specified, defaulting to unlimited test /opt/stack/new/nova/nova/compute/resources/vcpu.py:56 2015-08-07 17:49:55.805 INFO nova.compute.claims [req-69968516-3c35-4dac-bbc2-c5cd6a1e148e tempest-AttachVolumeTestJSON-1615353033 tempest-AttachVolumeTestJSON-1297584599] [instance: 2164dc89-6e1d-45ae-a625-28be5a6a1a05] Claim successful 2015-08-07 17:49:56.064 DEBUG oslo_concurrency.lockutils [req-69968516-3c35-4dac-bbc2-c5cd6a1e148e tempest-AttachVolumeTestJSON-1615353033 tempest-AttachVolumeTestJSON-1297584599] Lock "compute_resources" released by "instance_claim" :: held 0.295s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:49:56.267 DEBUG oslo_concurrency.lockutils [req-69968516-3c35-4dac-bbc2-c5cd6a1e148e tempest-AttachVolumeTestJSON-1615353033 tempest-AttachVolumeTestJSON-1297584599] Lock "compute_resources" acquired by "update_usage" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:49:56.345 DEBUG oslo_concurrency.lockutils [req-69968516-3c35-4dac-bbc2-c5cd6a1e148e tempest-AttachVolumeTestJSON-1615353033 
tempest-AttachVolumeTestJSON-1297584599] Lock "compute_resources" released by "update_usage" :: held 0.078s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:49:56.346 DEBUG nova.compute.utils [req-69968516-3c35-4dac-bbc2-c5cd6a1e148e tempest-AttachVolumeTestJSON-1615353033 tempest-AttachVolumeTestJSON-1297584599] Using /dev/xvd instead of None get_next_device_name /opt/stack/new/nova/nova/compute/utils.py:163 2015-08-07 17:49:56.349 13318 DEBUG nova.compute.manager [-] [instance: 2164dc89-6e1d-45ae-a625-28be5a6a1a05] Allocating IP information in the background. _allocate_network_async /opt/stack/new/nova/nova/compute/manager.py:1513 2015-08-07 17:49:56.352 13318 DEBUG oslo_concurrency.lockutils [-] Acquired semaphore "refresh_cache-2164dc89-6e1d-45ae-a625-28be5a6a1a05" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:197 2015-08-07 17:49:56.514 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:49:56.574 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 0.01 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:49:56.582 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:49:56.582 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:49:56.583 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 2.94 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:49:57.154 DEBUG nova.virt.xenapi.vmops [req-69968516-3c35-4dac-bbc2-c5cd6a1e148e tempest-AttachVolumeTestJSON-1615353033 tempest-AttachVolumeTestJSON-1297584599] [instance: 2164dc89-6e1d-45ae-a625-28be5a6a1a05] Block device information present: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} spawn /opt/stack/new/nova/nova/virt/xenapi/vmops.py:408 2015-08-07 17:49:57.165 DEBUG nova.virt.xenapi.vm_utils [req-69968516-3c35-4dac-bbc2-c5cd6a1e148e tempest-AttachVolumeTestJSON-1615353033 tempest-AttachVolumeTestJSON-1297584599] Detected vhd format for image a69d8d55-1745-492f-be26-cc75d64fc94d determine_disk_image_type /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1633 2015-08-07 17:49:57.166 DEBUG nova.virt.xenapi.vmops [req-69968516-3c35-4dac-bbc2-c5cd6a1e148e tempest-AttachVolumeTestJSON-1615353033 tempest-AttachVolumeTestJSON-1297584599] [instance: 2164dc89-6e1d-45ae-a625-28be5a6a1a05] Updating progress to 10 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:49:57.360 DEBUG nova.virt.xenapi.vmops [req-69968516-3c35-4dac-bbc2-c5cd6a1e148e tempest-AttachVolumeTestJSON-1615353033 tempest-AttachVolumeTestJSON-1297584599] [instance: 2164dc89-6e1d-45ae-a625-28be5a6a1a05] block device info: {'swap': None, 'ephemerals': [], 'block_device_mapping': 
[], 'root_device_name': u'/dev/xvda'} _connect_cinder_volumes /opt/stack/new/nova/nova/virt/xenapi/vmops.py:376 2015-08-07 17:49:57.367 DEBUG oslo_concurrency.lockutils [req-69968516-3c35-4dac-bbc2-c5cd6a1e148e tempest-AttachVolumeTestJSON-1615353033 tempest-AttachVolumeTestJSON-1297584599] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" acquired by "_create_cached_image_impl" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:49:58.182 DEBUG nova.virt.xenapi.vm_utils [req-69968516-3c35-4dac-bbc2-c5cd6a1e148e tempest-AttachVolumeTestJSON-1615353033 tempest-AttachVolumeTestJSON-1297584599] Cloned VDI OpaqueRef:5dd43be3-7ce2-ed1e-10fe-4f35610e0df4 from VDI OpaqueRef:f392c64e-7886-a0cc-d3a0-780674167307 _clone_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:571 2015-08-07 17:49:58.283 13318 DEBUG nova.network.base_api [-] [instance: 2164dc89-6e1d-45ae-a625-28be5a6a1a05] Updating instance_info_cache with network_info: [VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.2'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:7b:cc:df', 'active': False, 'type': u'bridge', 'id': u'c96bc5cc-1844-4d56-8c3c-73405d2f5bca', 'qbg_params': None})] update_instance_cache_with_nw_info /opt/stack/new/nova/nova/network/base_api.py:43 2015-08-07 17:49:58.307 13318 DEBUG oslo_concurrency.lockutils [-] Releasing semaphore "refresh_cache-2164dc89-6e1d-45ae-a625-28be5a6a1a05" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:210 2015-08-07 17:49:58.308 13318 DEBUG nova.compute.manager [-] [instance: 2164dc89-6e1d-45ae-a625-28be5a6a1a05] Instance network_info: |[VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.2'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': 
u'fa:16:3e:7b:cc:df', 'active': False, 'type': u'bridge', 'id': u'c96bc5cc-1844-4d56-8c3c-73405d2f5bca', 'qbg_params': None})]| _allocate_network_async /opt/stack/new/nova/nova/compute/manager.py:1531 2015-08-07 17:49:58.674 DEBUG oslo_concurrency.lockutils [req-69968516-3c35-4dac-bbc2-c5cd6a1e148e tempest-AttachVolumeTestJSON-1615353033 tempest-AttachVolumeTestJSON-1297584599] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" released by "_create_cached_image_impl" :: held 1.307s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:49:58.674 INFO nova.virt.xenapi.vm_utils [req-69968516-3c35-4dac-bbc2-c5cd6a1e148e tempest-AttachVolumeTestJSON-1615353033 tempest-AttachVolumeTestJSON-1297584599] Image creation data, cacheable: True, downloaded: False duration: 1.31 secs for image a69d8d55-1745-492f-be26-cc75d64fc94d 2015-08-07 17:49:59.146 DEBUG nova.virt.xenapi.vmops [req-69968516-3c35-4dac-bbc2-c5cd6a1e148e tempest-AttachVolumeTestJSON-1615353033 tempest-AttachVolumeTestJSON-1297584599] [instance: 2164dc89-6e1d-45ae-a625-28be5a6a1a05] Updating progress to 20 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:49:59.290 DEBUG nova.virt.xenapi.vmops [req-69968516-3c35-4dac-bbc2-c5cd6a1e148e tempest-AttachVolumeTestJSON-1615353033 tempest-AttachVolumeTestJSON-1297584599] [instance: 2164dc89-6e1d-45ae-a625-28be5a6a1a05] Updating progress to 30 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:49:59.449 DEBUG nova.virt.xenapi.vmops [req-69968516-3c35-4dac-bbc2-c5cd6a1e148e tempest-AttachVolumeTestJSON-1615353033 tempest-AttachVolumeTestJSON-1297584599] [instance: 2164dc89-6e1d-45ae-a625-28be5a6a1a05] Using PV kernel: True _create_vm_record /opt/stack/new/nova/nova/virt/xenapi/vmops.py:679 2015-08-07 17:49:59.459 DEBUG nova.virt.xenapi.vm_utils [req-69968516-3c35-4dac-bbc2-c5cd6a1e148e tempest-AttachVolumeTestJSON-1615353033 tempest-AttachVolumeTestJSON-1297584599] [instance: 2164dc89-6e1d-45ae-a625-28be5a6a1a05] Created VM create_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:312 2015-08-07 17:49:59.459 DEBUG nova.virt.xenapi.vmops [req-69968516-3c35-4dac-bbc2-c5cd6a1e148e tempest-AttachVolumeTestJSON-1615353033 tempest-AttachVolumeTestJSON-1297584599] [instance: 2164dc89-6e1d-45ae-a625-28be5a6a1a05] Updating progress to 40 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:49:59.523 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:49:59.550 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Auditing locally available compute resources for node localhost.localdomain 2015-08-07 17:49:59.550 DEBUG nova.virt.xenapi.host [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Updating host stats update_status /opt/stack/new/nova/nova/virt/xenapi/host.py:234 2015-08-07 17:49:59.622 DEBUG nova.virt.xenapi.vm_utils [req-69968516-3c35-4dac-bbc2-c5cd6a1e148e tempest-AttachVolumeTestJSON-1615353033 tempest-AttachVolumeTestJSON-1297584599] Creating disk-type VBD for VM OpaqueRef:da0d18cf-08fe-e8c8-7dbb-d03b309f0adc, VDI OpaqueRef:5dd43be3-7ce2-ed1e-10fe-4f35610e0df4 ... 
create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:49:59.631 DEBUG nova.virt.xenapi.vm_utils [req-69968516-3c35-4dac-bbc2-c5cd6a1e148e tempest-AttachVolumeTestJSON-1615353033 tempest-AttachVolumeTestJSON-1297584599] Created VBD OpaqueRef:bb762cb9-2a41-9873-9d9e-fc51a5d439d1 for VM OpaqueRef:da0d18cf-08fe-e8c8-7dbb-d03b309f0adc, VDI OpaqueRef:5dd43be3-7ce2-ed1e-10fe-4f35610e0df4. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:49:59.712 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" acquired by "do_scan" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:49:59.713 DEBUG nova.virt.xenapi.vm_utils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Scanning SR OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99 do_scan /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1839 2015-08-07 17:49:59.946 DEBUG nova.virt.xenapi.vm_utils [req-69968516-3c35-4dac-bbc2-c5cd6a1e148e tempest-AttachVolumeTestJSON-1615353033 tempest-AttachVolumeTestJSON-1297584599] Created VDI OpaqueRef:179268bb-66c4-0854-9eba-f8952554f06a (config-2, 67108864, False) on OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99. create_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:509 2015-08-07 17:49:59.949 DEBUG nova.virt.xenapi.vm_utils [req-69968516-3c35-4dac-bbc2-c5cd6a1e148e tempest-AttachVolumeTestJSON-1615353033 tempest-AttachVolumeTestJSON-1297584599] Creating disk-type VBD for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:179268bb-66c4-0854-9eba-f8952554f06a ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:49:59.957 DEBUG nova.virt.xenapi.vm_utils [req-69968516-3c35-4dac-bbc2-c5cd6a1e148e tempest-AttachVolumeTestJSON-1615353033 tempest-AttachVolumeTestJSON-1297584599] Created VBD OpaqueRef:3acdde2a-e501-5418-afbb-8fbe32b7f018 for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:179268bb-66c4-0854-9eba-f8952554f06a. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:49:59.958 DEBUG nova.virt.xenapi.vm_utils [req-69968516-3c35-4dac-bbc2-c5cd6a1e148e tempest-AttachVolumeTestJSON-1615353033 tempest-AttachVolumeTestJSON-1297584599] Plugging VBD OpaqueRef:3acdde2a-e501-5418-afbb-8fbe32b7f018 ... vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2191 2015-08-07 17:49:59.959 DEBUG oslo_concurrency.lockutils [req-69968516-3c35-4dac-bbc2-c5cd6a1e148e tempest-AttachVolumeTestJSON-1615353033 tempest-AttachVolumeTestJSON-1297584599] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_plug" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:50:00.891 DEBUG oslo_concurrency.lockutils [req-69968516-3c35-4dac-bbc2-c5cd6a1e148e tempest-AttachVolumeTestJSON-1615353033 tempest-AttachVolumeTestJSON-1297584599] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_plug" :: held 0.932s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:50:00.891 DEBUG nova.virt.xenapi.vm_utils [req-69968516-3c35-4dac-bbc2-c5cd6a1e148e tempest-AttachVolumeTestJSON-1615353033 tempest-AttachVolumeTestJSON-1297584599] Plugging VBD OpaqueRef:3acdde2a-e501-5418-afbb-8fbe32b7f018 done. 
vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2194 2015-08-07 17:50:00.894 DEBUG nova.virt.xenapi.vm_utils [req-69968516-3c35-4dac-bbc2-c5cd6a1e148e tempest-AttachVolumeTestJSON-1615353033 tempest-AttachVolumeTestJSON-1297584599] VBD OpaqueRef:3acdde2a-e501-5418-afbb-8fbe32b7f018 plugged as xvdc vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2197 2015-08-07 17:50:00.966 WARNING nova.virt.configdrive [req-69968516-3c35-4dac-bbc2-c5cd6a1e148e tempest-AttachVolumeTestJSON-1615353033 tempest-AttachVolumeTestJSON-1297584599] The setting "always" will be deprecated in the Liberty version. Please use "True" instead 2015-08-07 17:50:00.967 DEBUG nova.objects.instance [req-69968516-3c35-4dac-bbc2-c5cd6a1e148e tempest-AttachVolumeTestJSON-1615353033 tempest-AttachVolumeTestJSON-1297584599] Lazy-loading `ec2_ids' on Instance uuid 2164dc89-6e1d-45ae-a625-28be5a6a1a05 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:50:00.994 DEBUG oslo_concurrency.processutils [req-69968516-3c35-4dac-bbc2-c5cd6a1e148e tempest-AttachVolumeTestJSON-1615353033 tempest-AttachVolumeTestJSON-1297584599] Running cmd (subprocess): genisoimage -o /tmp/tmpGpv_xZ/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmpjhBY5P execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:50:01.075 DEBUG oslo_concurrency.processutils [req-69968516-3c35-4dac-bbc2-c5cd6a1e148e tempest-AttachVolumeTestJSON-1615353033 tempest-AttachVolumeTestJSON-1297584599] CMD "genisoimage -o /tmp/tmpGpv_xZ/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmpjhBY5P" returned: 0 in 0.080s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:50:01.081 DEBUG oslo_concurrency.processutils [req-69968516-3c35-4dac-bbc2-c5cd6a1e148e tempest-AttachVolumeTestJSON-1615353033 tempest-AttachVolumeTestJSON-1297584599] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmpGpv_xZ/configdrive of=/dev/xvdc oflag=direct,sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:50:02.450 DEBUG oslo_concurrency.lockutils [req-1bb20631-2dc4-4dee-bc46-e7be9038ebba tempest-ServerMetadataNegativeTestJSON-356735001 tempest-ServerMetadataNegativeTestJSON-525392375] Lock "73d50eac-e688-4e00-9568-ef698e2e69ac" acquired by "_locked_do_build_and_run_instance" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:50:02.542 INFO nova.compute.manager [req-1bb20631-2dc4-4dee-bc46-e7be9038ebba tempest-ServerMetadataNegativeTestJSON-356735001 tempest-ServerMetadataNegativeTestJSON-525392375] [instance: 73d50eac-e688-4e00-9568-ef698e2e69ac] Starting instance... 
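The config-drive step logged above builds an ISO9660 volume labelled config-2 with genisoimage and then copies it onto the VBD plugged as xvdc using dd run through nova-rootwrap. A rough stand-alone sketch of the same two commands, assuming genisoimage is installed and using placeholder paths instead of the mkdtemp()-style /tmp directories from the log:

    import subprocess

    # Hypothetical paths; the log uses temporary directories such as /tmp/tmpGpv_xZ.
    metadata_dir = "/tmp/configdrive-src"   # directory tree with the instance metadata
    iso_path = "/tmp/configdrive.iso"
    device = "/dev/xvdc"                    # the VBD plugged as xvdc above

    # Same flags as the logged genisoimage invocation.
    subprocess.check_call([
        "genisoimage", "-o", iso_path,
        "-ldots", "-allow-lowercase", "-allow-multidot", "-l",
        "-publisher", "OpenStack Nova 12.0.0",
        "-quiet", "-J", "-r", "-V", "config-2",
        metadata_dir,
    ])

    # Nova runs dd through "sudo nova-rootwrap /etc/nova/rootwrap.conf";
    # plain sudo is shown here for brevity.
    subprocess.check_call([
        "sudo", "dd", "if=" + iso_path, "of=" + device, "oflag=direct,sync",
    ])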
2015-08-07 17:50:02.800 DEBUG oslo_concurrency.lockutils [req-1bb20631-2dc4-4dee-bc46-e7be9038ebba tempest-ServerMetadataNegativeTestJSON-356735001 tempest-ServerMetadataNegativeTestJSON-525392375] Lock "compute_resources" acquired by "instance_claim" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:50:02.801 DEBUG nova.compute.resource_tracker [req-1bb20631-2dc4-4dee-bc46-e7be9038ebba tempest-ServerMetadataNegativeTestJSON-356735001 tempest-ServerMetadataNegativeTestJSON-525392375] Memory overhead for 64 MB instance; 5 MB instance_claim /opt/stack/new/nova/nova/compute/resource_tracker.py:126 2015-08-07 17:50:02.807 INFO nova.compute.claims [req-1bb20631-2dc4-4dee-bc46-e7be9038ebba tempest-ServerMetadataNegativeTestJSON-356735001 tempest-ServerMetadataNegativeTestJSON-525392375] [instance: 73d50eac-e688-4e00-9568-ef698e2e69ac] Attempting claim: memory 69 MB, disk 0 GB 2015-08-07 17:50:02.807 INFO nova.compute.claims [req-1bb20631-2dc4-4dee-bc46-e7be9038ebba tempest-ServerMetadataNegativeTestJSON-356735001 tempest-ServerMetadataNegativeTestJSON-525392375] [instance: 73d50eac-e688-4e00-9568-ef698e2e69ac] Total memory: 8187 MB, used: 581.00 MB 2015-08-07 17:50:02.809 INFO nova.compute.claims [req-1bb20631-2dc4-4dee-bc46-e7be9038ebba tempest-ServerMetadataNegativeTestJSON-356735001 tempest-ServerMetadataNegativeTestJSON-525392375] [instance: 73d50eac-e688-4e00-9568-ef698e2e69ac] memory limit: 12280.50 MB, free: 11699.50 MB 2015-08-07 17:50:02.809 INFO nova.compute.claims [req-1bb20631-2dc4-4dee-bc46-e7be9038ebba tempest-ServerMetadataNegativeTestJSON-356735001 tempest-ServerMetadataNegativeTestJSON-525392375] [instance: 73d50eac-e688-4e00-9568-ef698e2e69ac] Total disk: 27 GB, used: 0.00 GB 2015-08-07 17:50:02.809 INFO nova.compute.claims [req-1bb20631-2dc4-4dee-bc46-e7be9038ebba tempest-ServerMetadataNegativeTestJSON-356735001 tempest-ServerMetadataNegativeTestJSON-525392375] [instance: 73d50eac-e688-4e00-9568-ef698e2e69ac] disk limit not specified, defaulting to unlimited 2015-08-07 17:50:02.839 DEBUG nova.compute.resources.vcpu [req-1bb20631-2dc4-4dee-bc46-e7be9038ebba tempest-ServerMetadataNegativeTestJSON-356735001 tempest-ServerMetadataNegativeTestJSON-525392375] Total CPUs: 8 VCPUs, used: 1.00 VCPUs test /opt/stack/new/nova/nova/compute/resources/vcpu.py:52 2015-08-07 17:50:02.839 DEBUG nova.compute.resources.vcpu [req-1bb20631-2dc4-4dee-bc46-e7be9038ebba tempest-ServerMetadataNegativeTestJSON-356735001 tempest-ServerMetadataNegativeTestJSON-525392375] CPUs limit not specified, defaulting to unlimited test /opt/stack/new/nova/nova/compute/resources/vcpu.py:56 2015-08-07 17:50:02.840 INFO nova.compute.claims [req-1bb20631-2dc4-4dee-bc46-e7be9038ebba tempest-ServerMetadataNegativeTestJSON-356735001 tempest-ServerMetadataNegativeTestJSON-525392375] [instance: 73d50eac-e688-4e00-9568-ef698e2e69ac] Claim successful 2015-08-07 17:50:03.222 DEBUG oslo_concurrency.lockutils [req-1bb20631-2dc4-4dee-bc46-e7be9038ebba tempest-ServerMetadataNegativeTestJSON-356735001 tempest-ServerMetadataNegativeTestJSON-525392375] Lock "compute_resources" released by "instance_claim" :: held 0.421s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:50:03.473 DEBUG oslo_concurrency.lockutils [req-1bb20631-2dc4-4dee-bc46-e7be9038ebba tempest-ServerMetadataNegativeTestJSON-356735001 tempest-ServerMetadataNegativeTestJSON-525392375] Lock "compute_resources" acquired by "update_usage" :: waited 0.000s inner 
/usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:50:03.595 DEBUG oslo_concurrency.lockutils [req-1bb20631-2dc4-4dee-bc46-e7be9038ebba tempest-ServerMetadataNegativeTestJSON-356735001 tempest-ServerMetadataNegativeTestJSON-525392375] Lock "compute_resources" released by "update_usage" :: held 0.122s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:50:03.596 DEBUG nova.compute.utils [req-1bb20631-2dc4-4dee-bc46-e7be9038ebba tempest-ServerMetadataNegativeTestJSON-356735001 tempest-ServerMetadataNegativeTestJSON-525392375] Using /dev/xvd instead of None get_next_device_name /opt/stack/new/nova/nova/compute/utils.py:163 2015-08-07 17:50:03.600 13318 DEBUG nova.compute.manager [-] [instance: 73d50eac-e688-4e00-9568-ef698e2e69ac] Allocating IP information in the background. _allocate_network_async /opt/stack/new/nova/nova/compute/manager.py:1513 2015-08-07 17:50:03.601 13318 DEBUG oslo_concurrency.lockutils [-] Acquired semaphore "refresh_cache-73d50eac-e688-4e00-9568-ef698e2e69ac" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:197 2015-08-07 17:50:04.235 DEBUG nova.virt.xenapi.vmops [req-1bb20631-2dc4-4dee-bc46-e7be9038ebba tempest-ServerMetadataNegativeTestJSON-356735001 tempest-ServerMetadataNegativeTestJSON-525392375] [instance: 73d50eac-e688-4e00-9568-ef698e2e69ac] Block device information present: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} spawn /opt/stack/new/nova/nova/virt/xenapi/vmops.py:408 2015-08-07 17:50:04.246 DEBUG nova.virt.xenapi.vm_utils [req-1bb20631-2dc4-4dee-bc46-e7be9038ebba tempest-ServerMetadataNegativeTestJSON-356735001 tempest-ServerMetadataNegativeTestJSON-525392375] Detected vhd format for image a69d8d55-1745-492f-be26-cc75d64fc94d determine_disk_image_type /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1633 2015-08-07 17:50:04.246 DEBUG nova.virt.xenapi.vmops [req-1bb20631-2dc4-4dee-bc46-e7be9038ebba tempest-ServerMetadataNegativeTestJSON-356735001 tempest-ServerMetadataNegativeTestJSON-525392375] [instance: 73d50eac-e688-4e00-9568-ef698e2e69ac] Updating progress to 10 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:50:04.501 DEBUG nova.virt.xenapi.vmops [req-1bb20631-2dc4-4dee-bc46-e7be9038ebba tempest-ServerMetadataNegativeTestJSON-356735001 tempest-ServerMetadataNegativeTestJSON-525392375] [instance: 73d50eac-e688-4e00-9568-ef698e2e69ac] block device info: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} _connect_cinder_volumes /opt/stack/new/nova/nova/virt/xenapi/vmops.py:376 2015-08-07 17:50:04.513 DEBUG oslo_concurrency.lockutils [req-1bb20631-2dc4-4dee-bc46-e7be9038ebba tempest-ServerMetadataNegativeTestJSON-356735001 tempest-ServerMetadataNegativeTestJSON-525392375] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" acquired by "_create_cached_image_impl" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:50:05.118 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.92 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:50:05.342 DEBUG oslo_concurrency.processutils [req-69968516-3c35-4dac-bbc2-c5cd6a1e148e tempest-AttachVolumeTestJSON-1615353033 tempest-AttachVolumeTestJSON-1297584599] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf dd 
if=/tmp/tmpGpv_xZ/configdrive of=/dev/xvdc oflag=direct,sync" returned: 0 in 4.262s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:50:05.344 DEBUG oslo_concurrency.processutils [req-69968516-3c35-4dac-bbc2-c5cd6a1e148e tempest-AttachVolumeTestJSON-1615353033 tempest-AttachVolumeTestJSON-1297584599] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:50:05.700 DEBUG oslo_concurrency.processutils [req-69968516-3c35-4dac-bbc2-c5cd6a1e148e tempest-AttachVolumeTestJSON-1615353033 tempest-AttachVolumeTestJSON-1297584599] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf sync" returned: 0 in 0.356s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:50:05.703 DEBUG nova.virt.xenapi.vm_utils [req-69968516-3c35-4dac-bbc2-c5cd6a1e148e tempest-AttachVolumeTestJSON-1615353033 tempest-AttachVolumeTestJSON-1297584599] Destroying VBD for VDI OpaqueRef:179268bb-66c4-0854-9eba-f8952554f06a ... vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2207 2015-08-07 17:50:05.704 DEBUG oslo_concurrency.lockutils [req-69968516-3c35-4dac-bbc2-c5cd6a1e148e tempest-AttachVolumeTestJSON-1615353033 tempest-AttachVolumeTestJSON-1297584599] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_unplug" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:50:06.012 DEBUG nova.virt.xenapi.vm_utils [req-1bb20631-2dc4-4dee-bc46-e7be9038ebba tempest-ServerMetadataNegativeTestJSON-356735001 tempest-ServerMetadataNegativeTestJSON-525392375] Cloned VDI OpaqueRef:168f04b0-ee50-c39d-9479-03d4e1346840 from VDI OpaqueRef:f392c64e-7886-a0cc-d3a0-780674167307 _clone_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:571 2015-08-07 17:50:06.138 13318 DEBUG nova.network.base_api [-] [instance: 73d50eac-e688-4e00-9568-ef698e2e69ac] Updating instance_info_cache with network_info: [VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.4'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:91:7e:23', 'active': False, 'type': u'bridge', 'id': u'67e25a12-c9d5-41ec-bcb0-2ae59f8193ae', 'qbg_params': None})] update_instance_cache_with_nw_info /opt/stack/new/nova/nova/network/base_api.py:43 2015-08-07 17:50:06.171 13318 DEBUG oslo_concurrency.lockutils [-] Releasing semaphore "refresh_cache-73d50eac-e688-4e00-9568-ef698e2e69ac" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:210 2015-08-07 17:50:06.171 13318 DEBUG 
nova.compute.manager [-] [instance: 73d50eac-e688-4e00-9568-ef698e2e69ac] Instance network_info: |[VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.4'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:91:7e:23', 'active': False, 'type': u'bridge', 'id': u'67e25a12-c9d5-41ec-bcb0-2ae59f8193ae', 'qbg_params': None})]| _allocate_network_async /opt/stack/new/nova/nova/compute/manager.py:1531 2015-08-07 17:50:06.413 DEBUG oslo_concurrency.lockutils [req-69968516-3c35-4dac-bbc2-c5cd6a1e148e tempest-AttachVolumeTestJSON-1615353033 tempest-AttachVolumeTestJSON-1297584599] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_unplug" :: held 0.709s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:50:06.421 DEBUG nova.virt.xenapi.vm_utils [req-69968516-3c35-4dac-bbc2-c5cd6a1e148e tempest-AttachVolumeTestJSON-1615353033 tempest-AttachVolumeTestJSON-1297584599] Destroying VBD for VDI OpaqueRef:179268bb-66c4-0854-9eba-f8952554f06a done. vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2215 2015-08-07 17:50:06.422 DEBUG nova.virt.xenapi.vm_utils [req-69968516-3c35-4dac-bbc2-c5cd6a1e148e tempest-AttachVolumeTestJSON-1615353033 tempest-AttachVolumeTestJSON-1297584599] Creating disk-type VBD for VM OpaqueRef:da0d18cf-08fe-e8c8-7dbb-d03b309f0adc, VDI OpaqueRef:179268bb-66c4-0854-9eba-f8952554f06a ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:50:06.429 DEBUG nova.virt.xenapi.vm_utils [req-69968516-3c35-4dac-bbc2-c5cd6a1e148e tempest-AttachVolumeTestJSON-1615353033 tempest-AttachVolumeTestJSON-1297584599] Created VBD OpaqueRef:11177a6f-7ac9-2732-6177-59ed51773314 for VM OpaqueRef:da0d18cf-08fe-e8c8-7dbb-d03b309f0adc, VDI OpaqueRef:179268bb-66c4-0854-9eba-f8952554f06a. 
create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:50:06.430 DEBUG nova.objects.instance [req-69968516-3c35-4dac-bbc2-c5cd6a1e148e tempest-AttachVolumeTestJSON-1615353033 tempest-AttachVolumeTestJSON-1297584599] Lazy-loading `pci_devices' on Instance uuid 2164dc89-6e1d-45ae-a625-28be5a6a1a05 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:50:06.546 DEBUG nova.virt.xenapi.vmops [req-69968516-3c35-4dac-bbc2-c5cd6a1e148e tempest-AttachVolumeTestJSON-1615353033 tempest-AttachVolumeTestJSON-1297584599] [instance: 2164dc89-6e1d-45ae-a625-28be5a6a1a05] Updating progress to 50 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:50:06.568 DEBUG oslo_concurrency.lockutils [req-1bb20631-2dc4-4dee-bc46-e7be9038ebba tempest-ServerMetadataNegativeTestJSON-356735001 tempest-ServerMetadataNegativeTestJSON-525392375] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" released by "_create_cached_image_impl" :: held 2.055s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:50:06.569 INFO nova.virt.xenapi.vm_utils [req-1bb20631-2dc4-4dee-bc46-e7be9038ebba tempest-ServerMetadataNegativeTestJSON-356735001 tempest-ServerMetadataNegativeTestJSON-525392375] Image creation data, cacheable: True, downloaded: False duration: 2.07 secs for image a69d8d55-1745-492f-be26-cc75d64fc94d 2015-08-07 17:50:06.756 DEBUG oslo_concurrency.lockutils [req-69968516-3c35-4dac-bbc2-c5cd6a1e148e tempest-AttachVolumeTestJSON-1615353033 tempest-AttachVolumeTestJSON-1297584599] Lock "xenstore-2164dc89-6e1d-45ae-a625-28be5a6a1a05" acquired by "store_meta" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:50:06.757 DEBUG oslo_concurrency.lockutils [req-69968516-3c35-4dac-bbc2-c5cd6a1e148e tempest-AttachVolumeTestJSON-1615353033 tempest-AttachVolumeTestJSON-1297584599] Lock "xenstore-2164dc89-6e1d-45ae-a625-28be5a6a1a05" released by "store_meta" :: held 0.001s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:50:06.758 DEBUG oslo_concurrency.lockutils [req-69968516-3c35-4dac-bbc2-c5cd6a1e148e tempest-AttachVolumeTestJSON-1615353033 tempest-AttachVolumeTestJSON-1297584599] Lock "xenstore-2164dc89-6e1d-45ae-a625-28be5a6a1a05" acquired by "store_auto_disk_config" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:50:06.765 DEBUG oslo_concurrency.lockutils [req-69968516-3c35-4dac-bbc2-c5cd6a1e148e tempest-AttachVolumeTestJSON-1615353033 tempest-AttachVolumeTestJSON-1297584599] Lock "xenstore-2164dc89-6e1d-45ae-a625-28be5a6a1a05" released by "store_auto_disk_config" :: held 0.007s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:50:06.767 DEBUG nova.virt.xenapi.vmops [req-69968516-3c35-4dac-bbc2-c5cd6a1e148e tempest-AttachVolumeTestJSON-1615353033 tempest-AttachVolumeTestJSON-1297584599] [instance: 2164dc89-6e1d-45ae-a625-28be5a6a1a05] Injecting hostname (tempest.common.compute-instance-1442041655) into xenstore _inject_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1940 2015-08-07 17:50:06.767 DEBUG oslo_concurrency.lockutils [req-69968516-3c35-4dac-bbc2-c5cd6a1e148e tempest-AttachVolumeTestJSON-1615353033 tempest-AttachVolumeTestJSON-1297584599] Lock "xenstore-2164dc89-6e1d-45ae-a625-28be5a6a1a05" acquired by "update_hostname" :: waited 0.000s inner 
/usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:50:06.778 DEBUG oslo_concurrency.lockutils [req-69968516-3c35-4dac-bbc2-c5cd6a1e148e tempest-AttachVolumeTestJSON-1615353033 tempest-AttachVolumeTestJSON-1297584599] Lock "xenstore-2164dc89-6e1d-45ae-a625-28be5a6a1a05" released by "update_hostname" :: held 0.010s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:50:06.778 DEBUG nova.virt.xenapi.vmops [req-69968516-3c35-4dac-bbc2-c5cd6a1e148e tempest-AttachVolumeTestJSON-1615353033 tempest-AttachVolumeTestJSON-1297584599] [instance: 2164dc89-6e1d-45ae-a625-28be5a6a1a05] Injecting network info to xenstore inject_network_info /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1869 2015-08-07 17:50:06.779 DEBUG oslo_concurrency.lockutils [req-69968516-3c35-4dac-bbc2-c5cd6a1e148e tempest-AttachVolumeTestJSON-1615353033 tempest-AttachVolumeTestJSON-1297584599] Lock "xenstore-2164dc89-6e1d-45ae-a625-28be5a6a1a05" acquired by "update_nwinfo" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:50:06.947 DEBUG oslo_concurrency.lockutils [req-69968516-3c35-4dac-bbc2-c5cd6a1e148e tempest-AttachVolumeTestJSON-1615353033 tempest-AttachVolumeTestJSON-1297584599] Lock "xenstore-2164dc89-6e1d-45ae-a625-28be5a6a1a05" released by "update_nwinfo" :: held 0.168s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:50:06.948 DEBUG nova.virt.xenapi.vmops [req-69968516-3c35-4dac-bbc2-c5cd6a1e148e tempest-AttachVolumeTestJSON-1615353033 tempest-AttachVolumeTestJSON-1297584599] [instance: 2164dc89-6e1d-45ae-a625-28be5a6a1a05] Updating progress to 60 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:50:07.112 DEBUG nova.virt.xenapi.vmops [req-1bb20631-2dc4-4dee-bc46-e7be9038ebba tempest-ServerMetadataNegativeTestJSON-356735001 tempest-ServerMetadataNegativeTestJSON-525392375] [instance: 73d50eac-e688-4e00-9568-ef698e2e69ac] Updating progress to 20 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:50:07.146 DEBUG nova.virt.xenapi.vmops [req-69968516-3c35-4dac-bbc2-c5cd6a1e148e tempest-AttachVolumeTestJSON-1615353033 tempest-AttachVolumeTestJSON-1297584599] [instance: 2164dc89-6e1d-45ae-a625-28be5a6a1a05] Creating vifs _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1892 2015-08-07 17:50:07.153 DEBUG nova.virt.xenapi.vmops [req-69968516-3c35-4dac-bbc2-c5cd6a1e148e tempest-AttachVolumeTestJSON-1615353033 tempest-AttachVolumeTestJSON-1297584599] [instance: 2164dc89-6e1d-45ae-a625-28be5a6a1a05] Creating VIF for network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1902 2015-08-07 17:50:07.162 DEBUG nova.virt.xenapi.vmops [req-69968516-3c35-4dac-bbc2-c5cd6a1e148e tempest-AttachVolumeTestJSON-1615353033 tempest-AttachVolumeTestJSON-1297584599] [instance: 2164dc89-6e1d-45ae-a625-28be5a6a1a05] Created VIF OpaqueRef:1e8f68d9-7ac8-99a6-48e4-13c9bf51e044, network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1906 2015-08-07 17:50:07.163 DEBUG nova.virt.xenapi.vmops [req-69968516-3c35-4dac-bbc2-c5cd6a1e148e tempest-AttachVolumeTestJSON-1615353033 tempest-AttachVolumeTestJSON-1297584599] [instance: 2164dc89-6e1d-45ae-a625-28be5a6a1a05] Updating progress to 70 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 
17:50:07.268 DEBUG nova.virt.xenapi.vmops [req-1bb20631-2dc4-4dee-bc46-e7be9038ebba tempest-ServerMetadataNegativeTestJSON-356735001 tempest-ServerMetadataNegativeTestJSON-525392375] [instance: 73d50eac-e688-4e00-9568-ef698e2e69ac] Updating progress to 30 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:50:07.312 DEBUG nova.virt.xenapi.vmops [req-69968516-3c35-4dac-bbc2-c5cd6a1e148e tempest-AttachVolumeTestJSON-1615353033 tempest-AttachVolumeTestJSON-1297584599] [instance: 2164dc89-6e1d-45ae-a625-28be5a6a1a05] Starting instance _start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:327 2015-08-07 17:50:07.455 DEBUG nova.virt.xenapi.vmops [req-1bb20631-2dc4-4dee-bc46-e7be9038ebba tempest-ServerMetadataNegativeTestJSON-356735001 tempest-ServerMetadataNegativeTestJSON-525392375] [instance: 73d50eac-e688-4e00-9568-ef698e2e69ac] Using PV kernel: True _create_vm_record /opt/stack/new/nova/nova/virt/xenapi/vmops.py:679 2015-08-07 17:50:07.465 DEBUG nova.virt.xenapi.vm_utils [req-1bb20631-2dc4-4dee-bc46-e7be9038ebba tempest-ServerMetadataNegativeTestJSON-356735001 tempest-ServerMetadataNegativeTestJSON-525392375] [instance: 73d50eac-e688-4e00-9568-ef698e2e69ac] Created VM create_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:312 2015-08-07 17:50:07.466 DEBUG nova.virt.xenapi.vmops [req-1bb20631-2dc4-4dee-bc46-e7be9038ebba tempest-ServerMetadataNegativeTestJSON-356735001 tempest-ServerMetadataNegativeTestJSON-525392375] [instance: 73d50eac-e688-4e00-9568-ef698e2e69ac] Updating progress to 40 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:50:07.644 DEBUG nova.virt.xenapi.vm_utils [req-1bb20631-2dc4-4dee-bc46-e7be9038ebba tempest-ServerMetadataNegativeTestJSON-356735001 tempest-ServerMetadataNegativeTestJSON-525392375] Creating disk-type VBD for VM OpaqueRef:d9243f09-d94f-ad69-6361-fa99b73f0878, VDI OpaqueRef:168f04b0-ee50-c39d-9479-03d4e1346840 ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:50:07.652 DEBUG nova.virt.xenapi.vm_utils [req-1bb20631-2dc4-4dee-bc46-e7be9038ebba tempest-ServerMetadataNegativeTestJSON-356735001 tempest-ServerMetadataNegativeTestJSON-525392375] Created VBD OpaqueRef:cb45efd4-36bb-90b5-e1fe-0478df972707 for VM OpaqueRef:d9243f09-d94f-ad69-6361-fa99b73f0878, VDI OpaqueRef:168f04b0-ee50-c39d-9479-03d4e1346840. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:50:07.969 DEBUG nova.virt.xenapi.vm_utils [req-1bb20631-2dc4-4dee-bc46-e7be9038ebba tempest-ServerMetadataNegativeTestJSON-356735001 tempest-ServerMetadataNegativeTestJSON-525392375] Created VDI OpaqueRef:98f6dd94-5789-737e-aa83-092800a9f2a3 (config-2, 67108864, False) on OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99. create_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:509 2015-08-07 17:50:07.973 DEBUG nova.virt.xenapi.vm_utils [req-1bb20631-2dc4-4dee-bc46-e7be9038ebba tempest-ServerMetadataNegativeTestJSON-356735001 tempest-ServerMetadataNegativeTestJSON-525392375] Creating disk-type VBD for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:98f6dd94-5789-737e-aa83-092800a9f2a3 ... 
create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:50:07.986 DEBUG nova.virt.xenapi.vm_utils [req-1bb20631-2dc4-4dee-bc46-e7be9038ebba tempest-ServerMetadataNegativeTestJSON-356735001 tempest-ServerMetadataNegativeTestJSON-525392375] Created VBD OpaqueRef:01a84e23-36e7-268e-72d5-1d4140595905 for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:98f6dd94-5789-737e-aa83-092800a9f2a3. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:50:07.987 DEBUG nova.virt.xenapi.vm_utils [req-1bb20631-2dc4-4dee-bc46-e7be9038ebba tempest-ServerMetadataNegativeTestJSON-356735001 tempest-ServerMetadataNegativeTestJSON-525392375] Plugging VBD OpaqueRef:01a84e23-36e7-268e-72d5-1d4140595905 ... vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2191 2015-08-07 17:50:07.988 DEBUG oslo_concurrency.lockutils [req-1bb20631-2dc4-4dee-bc46-e7be9038ebba tempest-ServerMetadataNegativeTestJSON-356735001 tempest-ServerMetadataNegativeTestJSON-525392375] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_plug" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:50:09.169 DEBUG oslo_concurrency.lockutils [req-1bb20631-2dc4-4dee-bc46-e7be9038ebba tempest-ServerMetadataNegativeTestJSON-356735001 tempest-ServerMetadataNegativeTestJSON-525392375] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_plug" :: held 1.182s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:50:09.170 DEBUG nova.virt.xenapi.vm_utils [req-1bb20631-2dc4-4dee-bc46-e7be9038ebba tempest-ServerMetadataNegativeTestJSON-356735001 tempest-ServerMetadataNegativeTestJSON-525392375] Plugging VBD OpaqueRef:01a84e23-36e7-268e-72d5-1d4140595905 done. vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2194 2015-08-07 17:50:09.175 DEBUG nova.virt.xenapi.vm_utils [req-1bb20631-2dc4-4dee-bc46-e7be9038ebba tempest-ServerMetadataNegativeTestJSON-356735001 tempest-ServerMetadataNegativeTestJSON-525392375] VBD OpaqueRef:01a84e23-36e7-268e-72d5-1d4140595905 plugged as xvdc vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2197 2015-08-07 17:50:09.269 WARNING nova.virt.configdrive [req-1bb20631-2dc4-4dee-bc46-e7be9038ebba tempest-ServerMetadataNegativeTestJSON-356735001 tempest-ServerMetadataNegativeTestJSON-525392375] The setting "always" will be deprecated in the Liberty version. 
Please use "True" instead 2015-08-07 17:50:09.270 DEBUG nova.objects.instance [req-1bb20631-2dc4-4dee-bc46-e7be9038ebba tempest-ServerMetadataNegativeTestJSON-356735001 tempest-ServerMetadataNegativeTestJSON-525392375] Lazy-loading `ec2_ids' on Instance uuid 73d50eac-e688-4e00-9568-ef698e2e69ac obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:50:09.300 DEBUG oslo_concurrency.processutils [req-1bb20631-2dc4-4dee-bc46-e7be9038ebba tempest-ServerMetadataNegativeTestJSON-356735001 tempest-ServerMetadataNegativeTestJSON-525392375] Running cmd (subprocess): genisoimage -o /tmp/tmpvbYbS1/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmpL7kLdh execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:50:09.408 DEBUG oslo_concurrency.processutils [req-1bb20631-2dc4-4dee-bc46-e7be9038ebba tempest-ServerMetadataNegativeTestJSON-356735001 tempest-ServerMetadataNegativeTestJSON-525392375] CMD "genisoimage -o /tmp/tmpvbYbS1/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmpL7kLdh" returned: 0 in 0.108s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:50:09.415 DEBUG oslo_concurrency.processutils [req-1bb20631-2dc4-4dee-bc46-e7be9038ebba tempest-ServerMetadataNegativeTestJSON-356735001 tempest-ServerMetadataNegativeTestJSON-525392375] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmpvbYbS1/configdrive of=/dev/xvdc oflag=direct,sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:50:13.029 DEBUG oslo_concurrency.processutils [req-1bb20631-2dc4-4dee-bc46-e7be9038ebba tempest-ServerMetadataNegativeTestJSON-356735001 tempest-ServerMetadataNegativeTestJSON-525392375] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmpvbYbS1/configdrive of=/dev/xvdc oflag=direct,sync" returned: 0 in 3.614s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:50:13.033 DEBUG oslo_concurrency.processutils [req-1bb20631-2dc4-4dee-bc46-e7be9038ebba tempest-ServerMetadataNegativeTestJSON-356735001 tempest-ServerMetadataNegativeTestJSON-525392375] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:50:13.342 DEBUG oslo_concurrency.processutils [req-1bb20631-2dc4-4dee-bc46-e7be9038ebba tempest-ServerMetadataNegativeTestJSON-356735001 tempest-ServerMetadataNegativeTestJSON-525392375] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf sync" returned: 0 in 0.308s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:50:13.345 DEBUG nova.virt.xenapi.vm_utils [req-1bb20631-2dc4-4dee-bc46-e7be9038ebba tempest-ServerMetadataNegativeTestJSON-356735001 tempest-ServerMetadataNegativeTestJSON-525392375] Destroying VBD for VDI OpaqueRef:98f6dd94-5789-737e-aa83-092800a9f2a3 ... 
vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2207 2015-08-07 17:50:13.347 DEBUG oslo_concurrency.lockutils [req-1bb20631-2dc4-4dee-bc46-e7be9038ebba tempest-ServerMetadataNegativeTestJSON-356735001 tempest-ServerMetadataNegativeTestJSON-525392375] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_unplug" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:50:13.858 DEBUG nova.virt.xenapi.vmops [req-69968516-3c35-4dac-bbc2-c5cd6a1e148e tempest-AttachVolumeTestJSON-1615353033 tempest-AttachVolumeTestJSON-1297584599] [instance: 2164dc89-6e1d-45ae-a625-28be5a6a1a05] Waiting for instance state to become running _wait_for_instance_to_start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:777 2015-08-07 17:50:13.877 DEBUG nova.virt.xenapi.vmops [req-69968516-3c35-4dac-bbc2-c5cd6a1e148e tempest-AttachVolumeTestJSON-1615353033 tempest-AttachVolumeTestJSON-1297584599] [instance: 2164dc89-6e1d-45ae-a625-28be5a6a1a05] Updating progress to 80 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:50:14.094 DEBUG nova.virt.xenapi.vmops [req-69968516-3c35-4dac-bbc2-c5cd6a1e148e tempest-AttachVolumeTestJSON-1615353033 tempest-AttachVolumeTestJSON-1297584599] [instance: 2164dc89-6e1d-45ae-a625-28be5a6a1a05] Skip agent setup, not enabled. _configure_new_instance_with_agent /opt/stack/new/nova/nova/virt/xenapi/vmops.py:788 2015-08-07 17:50:14.095 DEBUG nova.virt.xenapi.vmops [req-69968516-3c35-4dac-bbc2-c5cd6a1e148e tempest-AttachVolumeTestJSON-1615353033 tempest-AttachVolumeTestJSON-1297584599] [instance: 2164dc89-6e1d-45ae-a625-28be5a6a1a05] Removing hostname from xenstore _remove_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1949 2015-08-07 17:50:14.095 DEBUG oslo_concurrency.lockutils [req-69968516-3c35-4dac-bbc2-c5cd6a1e148e tempest-AttachVolumeTestJSON-1615353033 tempest-AttachVolumeTestJSON-1297584599] Lock "xenstore-2164dc89-6e1d-45ae-a625-28be5a6a1a05" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:50:14.100 DEBUG oslo_concurrency.lockutils [req-69968516-3c35-4dac-bbc2-c5cd6a1e148e tempest-AttachVolumeTestJSON-1615353033 tempest-AttachVolumeTestJSON-1297584599] Lock "xenstore-2164dc89-6e1d-45ae-a625-28be5a6a1a05" released by "update_hostname" :: held 0.005s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:50:14.101 DEBUG nova.virt.xenapi.vmops [req-69968516-3c35-4dac-bbc2-c5cd6a1e148e tempest-AttachVolumeTestJSON-1615353033 tempest-AttachVolumeTestJSON-1297584599] [instance: 2164dc89-6e1d-45ae-a625-28be5a6a1a05] Updating progress to 90 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:50:14.186 DEBUG oslo_concurrency.lockutils [req-1bb20631-2dc4-4dee-bc46-e7be9038ebba tempest-ServerMetadataNegativeTestJSON-356735001 tempest-ServerMetadataNegativeTestJSON-525392375] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_unplug" :: held 0.838s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:50:14.193 DEBUG nova.virt.xenapi.vm_utils [req-1bb20631-2dc4-4dee-bc46-e7be9038ebba tempest-ServerMetadataNegativeTestJSON-356735001 tempest-ServerMetadataNegativeTestJSON-525392375] Destroying VBD for VDI OpaqueRef:98f6dd94-5789-737e-aa83-092800a9f2a3 done. 
vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2215 2015-08-07 17:50:14.193 DEBUG nova.virt.xenapi.vm_utils [req-1bb20631-2dc4-4dee-bc46-e7be9038ebba tempest-ServerMetadataNegativeTestJSON-356735001 tempest-ServerMetadataNegativeTestJSON-525392375] Creating disk-type VBD for VM OpaqueRef:d9243f09-d94f-ad69-6361-fa99b73f0878, VDI OpaqueRef:98f6dd94-5789-737e-aa83-092800a9f2a3 ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:50:14.200 DEBUG nova.virt.xenapi.vm_utils [req-1bb20631-2dc4-4dee-bc46-e7be9038ebba tempest-ServerMetadataNegativeTestJSON-356735001 tempest-ServerMetadataNegativeTestJSON-525392375] Created VBD OpaqueRef:3f03a89f-3446-89c5-b19f-20f0a95d810b for VM OpaqueRef:d9243f09-d94f-ad69-6361-fa99b73f0878, VDI OpaqueRef:98f6dd94-5789-737e-aa83-092800a9f2a3. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:50:14.201 DEBUG nova.objects.instance [req-1bb20631-2dc4-4dee-bc46-e7be9038ebba tempest-ServerMetadataNegativeTestJSON-356735001 tempest-ServerMetadataNegativeTestJSON-525392375] Lazy-loading `pci_devices' on Instance uuid 73d50eac-e688-4e00-9568-ef698e2e69ac obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:50:14.293 DEBUG nova.virt.xenapi.vmops [req-69968516-3c35-4dac-bbc2-c5cd6a1e148e tempest-AttachVolumeTestJSON-1615353033 tempest-AttachVolumeTestJSON-1297584599] [instance: 2164dc89-6e1d-45ae-a625-28be5a6a1a05] Updating progress to 100 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:50:14.305 DEBUG nova.virt.xenapi.vmops [req-1bb20631-2dc4-4dee-bc46-e7be9038ebba tempest-ServerMetadataNegativeTestJSON-356735001 tempest-ServerMetadataNegativeTestJSON-525392375] [instance: 73d50eac-e688-4e00-9568-ef698e2e69ac] Updating progress to 50 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:50:14.474 DEBUG oslo_concurrency.lockutils [req-1bb20631-2dc4-4dee-bc46-e7be9038ebba tempest-ServerMetadataNegativeTestJSON-356735001 tempest-ServerMetadataNegativeTestJSON-525392375] Lock "xenstore-73d50eac-e688-4e00-9568-ef698e2e69ac" acquired by "store_meta" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:50:14.475 DEBUG oslo_concurrency.lockutils [req-1bb20631-2dc4-4dee-bc46-e7be9038ebba tempest-ServerMetadataNegativeTestJSON-356735001 tempest-ServerMetadataNegativeTestJSON-525392375] Lock "xenstore-73d50eac-e688-4e00-9568-ef698e2e69ac" released by "store_meta" :: held 0.001s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:50:14.475 DEBUG oslo_concurrency.lockutils [req-1bb20631-2dc4-4dee-bc46-e7be9038ebba tempest-ServerMetadataNegativeTestJSON-356735001 tempest-ServerMetadataNegativeTestJSON-525392375] Lock "xenstore-73d50eac-e688-4e00-9568-ef698e2e69ac" acquired by "store_auto_disk_config" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:50:14.483 DEBUG oslo_concurrency.lockutils [req-1bb20631-2dc4-4dee-bc46-e7be9038ebba tempest-ServerMetadataNegativeTestJSON-356735001 tempest-ServerMetadataNegativeTestJSON-525392375] Lock "xenstore-73d50eac-e688-4e00-9568-ef698e2e69ac" released by "store_auto_disk_config" :: held 0.008s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:50:14.484 DEBUG nova.virt.xenapi.vmops [req-1bb20631-2dc4-4dee-bc46-e7be9038ebba tempest-ServerMetadataNegativeTestJSON-356735001 
tempest-ServerMetadataNegativeTestJSON-525392375] [instance: 73d50eac-e688-4e00-9568-ef698e2e69ac] Injecting hostname (tempest.common.compute-instance-887164447) into xenstore _inject_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1940 2015-08-07 17:50:14.484 DEBUG oslo_concurrency.lockutils [req-1bb20631-2dc4-4dee-bc46-e7be9038ebba tempest-ServerMetadataNegativeTestJSON-356735001 tempest-ServerMetadataNegativeTestJSON-525392375] Lock "xenstore-73d50eac-e688-4e00-9568-ef698e2e69ac" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:50:14.493 DEBUG nova.compute.manager [req-69968516-3c35-4dac-bbc2-c5cd6a1e148e tempest-AttachVolumeTestJSON-1615353033 tempest-AttachVolumeTestJSON-1297584599] [instance: 2164dc89-6e1d-45ae-a625-28be5a6a1a05] Checking state _get_power_state /opt/stack/new/nova/nova/compute/manager.py:1296 2015-08-07 17:50:14.496 DEBUG oslo_concurrency.lockutils [req-1bb20631-2dc4-4dee-bc46-e7be9038ebba tempest-ServerMetadataNegativeTestJSON-356735001 tempest-ServerMetadataNegativeTestJSON-525392375] Lock "xenstore-73d50eac-e688-4e00-9568-ef698e2e69ac" released by "update_hostname" :: held 0.012s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:50:14.496 DEBUG nova.virt.xenapi.vmops [req-1bb20631-2dc4-4dee-bc46-e7be9038ebba tempest-ServerMetadataNegativeTestJSON-356735001 tempest-ServerMetadataNegativeTestJSON-525392375] [instance: 73d50eac-e688-4e00-9568-ef698e2e69ac] Injecting network info to xenstore inject_network_info /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1869 2015-08-07 17:50:14.497 DEBUG oslo_concurrency.lockutils [req-1bb20631-2dc4-4dee-bc46-e7be9038ebba tempest-ServerMetadataNegativeTestJSON-356735001 tempest-ServerMetadataNegativeTestJSON-525392375] Lock "xenstore-73d50eac-e688-4e00-9568-ef698e2e69ac" acquired by "update_nwinfo" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:50:14.666 DEBUG oslo_concurrency.lockutils [req-1bb20631-2dc4-4dee-bc46-e7be9038ebba tempest-ServerMetadataNegativeTestJSON-356735001 tempest-ServerMetadataNegativeTestJSON-525392375] Lock "xenstore-73d50eac-e688-4e00-9568-ef698e2e69ac" released by "update_nwinfo" :: held 0.169s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:50:14.666 DEBUG nova.virt.xenapi.vmops [req-1bb20631-2dc4-4dee-bc46-e7be9038ebba tempest-ServerMetadataNegativeTestJSON-356735001 tempest-ServerMetadataNegativeTestJSON-525392375] [instance: 73d50eac-e688-4e00-9568-ef698e2e69ac] Updating progress to 60 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:50:14.803 DEBUG oslo_concurrency.lockutils [req-69968516-3c35-4dac-bbc2-c5cd6a1e148e tempest-AttachVolumeTestJSON-1615353033 tempest-AttachVolumeTestJSON-1297584599] Lock "2164dc89-6e1d-45ae-a625-28be5a6a1a05" released by "_locked_do_build_and_run_instance" :: held 19.285s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:50:14.882 DEBUG nova.virt.xenapi.vmops [req-1bb20631-2dc4-4dee-bc46-e7be9038ebba tempest-ServerMetadataNegativeTestJSON-356735001 tempest-ServerMetadataNegativeTestJSON-525392375] [instance: 73d50eac-e688-4e00-9568-ef698e2e69ac] Creating vifs _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1892 2015-08-07 17:50:14.889 DEBUG nova.virt.xenapi.vmops [req-1bb20631-2dc4-4dee-bc46-e7be9038ebba 
tempest-ServerMetadataNegativeTestJSON-356735001 tempest-ServerMetadataNegativeTestJSON-525392375] [instance: 73d50eac-e688-4e00-9568-ef698e2e69ac] Creating VIF for network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1902 2015-08-07 17:50:14.900 DEBUG nova.virt.xenapi.vmops [req-1bb20631-2dc4-4dee-bc46-e7be9038ebba tempest-ServerMetadataNegativeTestJSON-356735001 tempest-ServerMetadataNegativeTestJSON-525392375] [instance: 73d50eac-e688-4e00-9568-ef698e2e69ac] Created VIF OpaqueRef:1849a5cc-1c5f-e737-deb4-358dc5a24d98, network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1906 2015-08-07 17:50:14.901 DEBUG nova.virt.xenapi.vmops [req-1bb20631-2dc4-4dee-bc46-e7be9038ebba tempest-ServerMetadataNegativeTestJSON-356735001 tempest-ServerMetadataNegativeTestJSON-525392375] [instance: 73d50eac-e688-4e00-9568-ef698e2e69ac] Updating progress to 70 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:50:15.070 DEBUG nova.virt.xenapi.vmops [req-1bb20631-2dc4-4dee-bc46-e7be9038ebba tempest-ServerMetadataNegativeTestJSON-356735001 tempest-ServerMetadataNegativeTestJSON-525392375] [instance: 73d50eac-e688-4e00-9568-ef698e2e69ac] Starting instance _start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:327 2015-08-07 17:50:15.076 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.97 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:50:18.291 DEBUG oslo_concurrency.lockutils [req-55562b5a-ff48-40ef-ad1b-76fd8c61e84a tempest-AttachVolumeTestJSON-1615353033 tempest-AttachVolumeTestJSON-1297584599] Lock "2164dc89-6e1d-45ae-a625-28be5a6a1a05" acquired by "do_reserve" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:50:18.315 DEBUG nova.compute.utils [req-55562b5a-ff48-40ef-ad1b-76fd8c61e84a tempest-AttachVolumeTestJSON-1615353033 tempest-AttachVolumeTestJSON-1297584599] Using /dev/xvd instead of /dev/vd get_next_device_name /opt/stack/new/nova/nova/compute/utils.py:163 2015-08-07 17:50:18.353 DEBUG oslo_concurrency.lockutils [req-55562b5a-ff48-40ef-ad1b-76fd8c61e84a tempest-AttachVolumeTestJSON-1615353033 tempest-AttachVolumeTestJSON-1297584599] Lock "2164dc89-6e1d-45ae-a625-28be5a6a1a05" released by "do_reserve" :: held 0.062s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:50:18.773 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" released by "do_scan" :: held 19.061s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:50:18.967 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor: free VCPUs: 1 _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:524 2015-08-07 17:50:18.968 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor: assignable PCI devices: [] _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:532 2015-08-07 17:50:18.968 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor/Node resource view: name=localhost.localdomain free_ram=989MB free_disk=16GB free_vcpus=1 pci_devices=[] _report_hypervisor_resource_view 
/opt/stack/new/nova/nova/compute/resource_tracker.py:546 2015-08-07 17:50:18.969 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "compute_resources" acquired by "_update_available_resource" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:50:19.169 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Total usable vcpus: 8, total allocated vcpus: 7 2015-08-07 17:50:19.170 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Final resource view: name=localhost.localdomain phys_ram=8187MB used_ram=650MB phys_disk=27GB used_disk=0GB total_vcpus=8 used_vcpus=7 pci_stats=None 2015-08-07 17:50:19.224 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Compute_service record updated for devstack:localhost.localdomain 2015-08-07 17:50:19.224 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "compute_resources" released by "_update_available_resource" :: held 0.256s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:50:19.225 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 9.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:50:19.256 DEBUG oslo_concurrency.lockutils [req-55562b5a-ff48-40ef-ad1b-76fd8c61e84a tempest-AttachVolumeTestJSON-1615353033 tempest-AttachVolumeTestJSON-1297584599] Lock "2164dc89-6e1d-45ae-a625-28be5a6a1a05" acquired by "do_attach_volume" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:50:19.257 INFO nova.compute.manager [req-55562b5a-ff48-40ef-ad1b-76fd8c61e84a tempest-AttachVolumeTestJSON-1615353033 tempest-AttachVolumeTestJSON-1297584599] [instance: 2164dc89-6e1d-45ae-a625-28be5a6a1a05] Attaching volume 01c32d1e-d0d1-4293-93c6-f09dca8c8f27 to /dev/xvdb 2015-08-07 17:50:19.260 DEBUG keystoneclient.session [req-55562b5a-ff48-40ef-ad1b-76fd8c61e84a tempest-AttachVolumeTestJSON-1615353033 tempest-AttachVolumeTestJSON-1297584599] REQ: curl -g -i -X GET http://192.168.33.1:8776/v2/a44dc889c66e4839966ea0d1def34b3c/volumes/01c32d1e-d0d1-4293-93c6-f09dca8c8f27 -H "User-Agent: python-cinderclient" -H "Accept: application/json" -H "X-Auth-Token: {SHA1}8aeedc24330524e08e7235ade1736ea926abc04a" _http_log_request /usr/local/lib/python2.7/dist-packages/keystoneclient/session.py:195 2015-08-07 17:50:19.754 DEBUG nova.virt.xenapi.vmops [req-1bb20631-2dc4-4dee-bc46-e7be9038ebba tempest-ServerMetadataNegativeTestJSON-356735001 tempest-ServerMetadataNegativeTestJSON-525392375] [instance: 73d50eac-e688-4e00-9568-ef698e2e69ac] Waiting for instance state to become running _wait_for_instance_to_start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:777 2015-08-07 17:50:19.773 DEBUG nova.virt.xenapi.vmops [req-1bb20631-2dc4-4dee-bc46-e7be9038ebba tempest-ServerMetadataNegativeTestJSON-356735001 tempest-ServerMetadataNegativeTestJSON-525392375] [instance: 73d50eac-e688-4e00-9568-ef698e2e69ac] Updating progress to 80 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:50:19.960 DEBUG keystoneclient.session [req-55562b5a-ff48-40ef-ad1b-76fd8c61e84a tempest-AttachVolumeTestJSON-1615353033 tempest-AttachVolumeTestJSON-1297584599] RESP: [200] content-length: 921 x-compute-request-id: 
req-26d5896e-7d8b-43b5-8819-9309120f223f connection: keep-alive date: Fri, 07 Aug 2015 17:50:19 GMT content-type: application/json x-openstack-request-id: req-26d5896e-7d8b-43b5-8819-9309120f223f RESP BODY: {"volume": {"attachments": [], "links": [{"href": "http://192.168.33.1:8776/v2/a44dc889c66e4839966ea0d1def34b3c/volumes/01c32d1e-d0d1-4293-93c6-f09dca8c8f27", "rel": "self"}, {"href": "http://192.168.33.1:8776/a44dc889c66e4839966ea0d1def34b3c/volumes/01c32d1e-d0d1-4293-93c6-f09dca8c8f27", "rel": "bookmark"}], "availability_zone": "nova", "encrypted": false, "os-volume-replication:extended_status": null, "volume_type": "lvmdriver-1", "snapshot_id": null, "id": "01c32d1e-d0d1-4293-93c6-f09dca8c8f27", "size": 1, "user_id": "a38ba21c302d46569fcf62b9308fedbd", "os-vol-tenant-attr:tenant_id": "a44dc889c66e4839966ea0d1def34b3c", "metadata": {}, "status": "attaching", "description": null, "multiattach": false, "source_volid": null, "consistencygroup_id": null, "name": "test", "bootable": "false", "created_at": "2015-08-07T17:50:16.000000", "os-volume-replication:driver_data": null, "replication_status": "disabled"}} _http_log_response /usr/local/lib/python2.7/dist-packages/keystoneclient/session.py:224 2015-08-07 17:50:19.962 DEBUG keystoneclient.session [req-55562b5a-ff48-40ef-ad1b-76fd8c61e84a tempest-AttachVolumeTestJSON-1615353033 tempest-AttachVolumeTestJSON-1297584599] REQ: curl -g -i -X POST http://192.168.33.1:8776/v2/a44dc889c66e4839966ea0d1def34b3c/volumes/01c32d1e-d0d1-4293-93c6-f09dca8c8f27/action -H "User-Agent: python-cinderclient" -H "Content-Type: application/json" -H "Accept: application/json" -H "X-Auth-Token: {SHA1}8aeedc24330524e08e7235ade1736ea926abc04a" -d '{"os-initialize_connection": {"connector": {"ip": "192.168.33.2", "initiator": "iqn.2015-08.com.example:3b035c79", "host": "localhost.localdomain"}}}' _http_log_request /usr/local/lib/python2.7/dist-packages/keystoneclient/session.py:195 2015-08-07 17:50:19.971 DEBUG nova.virt.xenapi.vmops [req-1bb20631-2dc4-4dee-bc46-e7be9038ebba tempest-ServerMetadataNegativeTestJSON-356735001 tempest-ServerMetadataNegativeTestJSON-525392375] [instance: 73d50eac-e688-4e00-9568-ef698e2e69ac] Skip agent setup, not enabled. 
_configure_new_instance_with_agent /opt/stack/new/nova/nova/virt/xenapi/vmops.py:788 2015-08-07 17:50:19.971 DEBUG nova.virt.xenapi.vmops [req-1bb20631-2dc4-4dee-bc46-e7be9038ebba tempest-ServerMetadataNegativeTestJSON-356735001 tempest-ServerMetadataNegativeTestJSON-525392375] [instance: 73d50eac-e688-4e00-9568-ef698e2e69ac] Removing hostname from xenstore _remove_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1949 2015-08-07 17:50:19.972 DEBUG oslo_concurrency.lockutils [req-1bb20631-2dc4-4dee-bc46-e7be9038ebba tempest-ServerMetadataNegativeTestJSON-356735001 tempest-ServerMetadataNegativeTestJSON-525392375] Lock "xenstore-73d50eac-e688-4e00-9568-ef698e2e69ac" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:50:19.977 DEBUG oslo_concurrency.lockutils [req-1bb20631-2dc4-4dee-bc46-e7be9038ebba tempest-ServerMetadataNegativeTestJSON-356735001 tempest-ServerMetadataNegativeTestJSON-525392375] Lock "xenstore-73d50eac-e688-4e00-9568-ef698e2e69ac" released by "update_hostname" :: held 0.005s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:50:19.978 DEBUG nova.virt.xenapi.vmops [req-1bb20631-2dc4-4dee-bc46-e7be9038ebba tempest-ServerMetadataNegativeTestJSON-356735001 tempest-ServerMetadataNegativeTestJSON-525392375] [instance: 73d50eac-e688-4e00-9568-ef698e2e69ac] Updating progress to 90 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:50:20.281 DEBUG nova.virt.xenapi.vmops [req-1bb20631-2dc4-4dee-bc46-e7be9038ebba tempest-ServerMetadataNegativeTestJSON-356735001 tempest-ServerMetadataNegativeTestJSON-525392375] [instance: 73d50eac-e688-4e00-9568-ef698e2e69ac] Updating progress to 100 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:50:20.478 DEBUG nova.compute.manager [req-1bb20631-2dc4-4dee-bc46-e7be9038ebba tempest-ServerMetadataNegativeTestJSON-356735001 tempest-ServerMetadataNegativeTestJSON-525392375] [instance: 73d50eac-e688-4e00-9568-ef698e2e69ac] Checking state _get_power_state /opt/stack/new/nova/nova/compute/manager.py:1296 2015-08-07 17:50:20.748 DEBUG oslo_concurrency.lockutils [req-1bb20631-2dc4-4dee-bc46-e7be9038ebba tempest-ServerMetadataNegativeTestJSON-356735001 tempest-ServerMetadataNegativeTestJSON-525392375] Lock "73d50eac-e688-4e00-9568-ef698e2e69ac" released by "_locked_do_build_and_run_instance" :: held 18.298s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:50:22.126 DEBUG keystoneclient.session [req-55562b5a-ff48-40ef-ad1b-76fd8c61e84a tempest-AttachVolumeTestJSON-1615353033 tempest-AttachVolumeTestJSON-1297584599] RESP: [200] content-length: 449 x-compute-request-id: req-88edd14b-a6d6-496e-82cd-7cb65ed70560 connection: keep-alive date: Fri, 07 Aug 2015 17:50:22 GMT content-type: application/json x-openstack-request-id: req-88edd14b-a6d6-496e-82cd-7cb65ed70560 RESP BODY: {"connection_info": {"driver_volume_type": "iscsi", "data": {"auth_password": "iS58dXcVA6qHH2Xm", "target_discovered": false, "encrypted": false, "qos_specs": null, "target_iqn": "iqn.2010-10.org.openstack:volume-01c32d1e-d0d1-4293-93c6-f09dca8c8f27", "target_portal": "104.130.119.114:3260", "volume_id": "01c32d1e-d0d1-4293-93c6-f09dca8c8f27", "target_lun": 1, "access_mode": "rw", "auth_username": "nHq23kkiArmEQ8Jq3qp4", "auth_method": "CHAP"}}} _http_log_response /usr/local/lib/python2.7/dist-packages/keystoneclient/session.py:224 
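[editor's note] The RESP body above returns the iSCSI connection details that the XenAPI volume code unpacks in the next few entries (the _parse_volume_info line at volume_utils.py:80 logs the resulting (vol_id,host,port,iqn) tuple). As an illustration only, and not Nova's actual implementation, a minimal Python sketch of that unpacking using the literal values from the response above:

    # Hedged sketch: pull the fields _parse_volume_info reports out of the
    # os-initialize_connection response body shown in the log above.
    import json

    resp_body = ('{"connection_info": {"driver_volume_type": "iscsi", "data": {'
                 '"target_iqn": "iqn.2010-10.org.openstack:volume-01c32d1e-d0d1-4293-93c6-f09dca8c8f27", '
                 '"target_portal": "104.130.119.114:3260", '
                 '"volume_id": "01c32d1e-d0d1-4293-93c6-f09dca8c8f27", '
                 '"target_lun": 1, "access_mode": "rw", "auth_method": "CHAP"}}}')

    data = json.loads(resp_body)["connection_info"]["data"]
    host, port = data["target_portal"].split(":")   # "104.130.119.114", "3260"
    print((data["volume_id"], host, int(port), data["target_iqn"]))

Run as-is, this prints the same (vol_id,host,port,iqn) tuple that appears in the _parse_volume_info entry at 17:50:22.130 below; the driver then uses those values to introduce a temporary SR and plug a PBD for the volume. [end note]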
2015-08-07 17:50:22.130 DEBUG nova.virt.xenapi.volume_utils [req-55562b5a-ff48-40ef-ad1b-76fd8c61e84a tempest-AttachVolumeTestJSON-1615353033 tempest-AttachVolumeTestJSON-1297584599] (vol_id,host,port,iqn): (01c32d1e-d0d1-4293-93c6-f09dca8c8f27,104.130.119.114,3260,iqn.2010-10.org.openstack:volume-01c32d1e-d0d1-4293-93c6-f09dca8c8f27) _parse_volume_info /opt/stack/new/nova/nova/virt/xenapi/volume_utils.py:80 2015-08-07 17:50:22.134 DEBUG nova.virt.xenapi.volume_utils [req-55562b5a-ff48-40ef-ad1b-76fd8c61e84a tempest-AttachVolumeTestJSON-1615353033 tempest-AttachVolumeTestJSON-1297584599] Introducing SR tempSR-01c32d1e-d0d1-4293-93c6-f09dca8c8f27 introduce_sr /opt/stack/new/nova/nova/virt/xenapi/volume_utils.py:119 2015-08-07 17:50:22.139 DEBUG nova.virt.xenapi.volume_utils [req-55562b5a-ff48-40ef-ad1b-76fd8c61e84a tempest-AttachVolumeTestJSON-1615353033 tempest-AttachVolumeTestJSON-1297584599] Creating PBD for SR introduce_sr /opt/stack/new/nova/nova/virt/xenapi/volume_utils.py:126 2015-08-07 17:50:22.150 DEBUG nova.virt.xenapi.volume_utils [req-55562b5a-ff48-40ef-ad1b-76fd8c61e84a tempest-AttachVolumeTestJSON-1615353033 tempest-AttachVolumeTestJSON-1297584599] Plugging SR introduce_sr /opt/stack/new/nova/nova/virt/xenapi/volume_utils.py:129 2015-08-07 17:50:25.086 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.96 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:50:27.940 DEBUG oslo_concurrency.lockutils [req-19d4dd8c-9710-4ac7-946c-44e93e56d360 tempest-ServerMetadataNegativeTestJSON-356735001 tempest-ServerMetadataNegativeTestJSON-525392375] Lock "73d50eac-e688-4e00-9568-ef698e2e69ac" acquired by "do_terminate_instance" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:50:27.941 DEBUG oslo_concurrency.lockutils [req-19d4dd8c-9710-4ac7-946c-44e93e56d360 tempest-ServerMetadataNegativeTestJSON-356735001 tempest-ServerMetadataNegativeTestJSON-525392375] Lock "73d50eac-e688-4e00-9568-ef698e2e69ac-events" acquired by "_clear_events" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:50:27.941 DEBUG oslo_concurrency.lockutils [req-19d4dd8c-9710-4ac7-946c-44e93e56d360 tempest-ServerMetadataNegativeTestJSON-356735001 tempest-ServerMetadataNegativeTestJSON-525392375] Lock "73d50eac-e688-4e00-9568-ef698e2e69ac-events" released by "_clear_events" :: held 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:50:27.943 INFO nova.compute.manager [req-19d4dd8c-9710-4ac7-946c-44e93e56d360 tempest-ServerMetadataNegativeTestJSON-356735001 tempest-ServerMetadataNegativeTestJSON-525392375] [instance: 73d50eac-e688-4e00-9568-ef698e2e69ac] Terminating instance 2015-08-07 17:50:27.944 INFO nova.virt.xenapi.vmops [req-19d4dd8c-9710-4ac7-946c-44e93e56d360 tempest-ServerMetadataNegativeTestJSON-356735001 tempest-ServerMetadataNegativeTestJSON-525392375] [instance: 73d50eac-e688-4e00-9568-ef698e2e69ac] Destroying VM 2015-08-07 17:50:27.952 DEBUG nova.virt.xenapi.vm_utils [req-19d4dd8c-9710-4ac7-946c-44e93e56d360 tempest-ServerMetadataNegativeTestJSON-356735001 tempest-ServerMetadataNegativeTestJSON-525392375] [instance: 73d50eac-e688-4e00-9568-ef698e2e69ac] Shutting down VM (hard) hard_shutdown_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:348 2015-08-07 17:50:28.223 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] 
Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:50:28.224 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 19.29 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:50:29.616 DEBUG nova.virt.xenapi.vmops [req-19d4dd8c-9710-4ac7-946c-44e93e56d360 tempest-ServerMetadataNegativeTestJSON-356735001 tempest-ServerMetadataNegativeTestJSON-525392375] [instance: 73d50eac-e688-4e00-9568-ef698e2e69ac] Destroying VDIs _destroy_vdis /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1428 2015-08-07 17:50:29.627 DEBUG nova.virt.xenapi.vm_utils [req-19d4dd8c-9710-4ac7-946c-44e93e56d360 tempest-ServerMetadataNegativeTestJSON-356735001 tempest-ServerMetadataNegativeTestJSON-525392375] VDI 9ca8d1a2-31b5-4575-9df4-d4738afaa81a is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:50:29.633 DEBUG nova.virt.xenapi.vm_utils [req-19d4dd8c-9710-4ac7-946c-44e93e56d360 tempest-ServerMetadataNegativeTestJSON-356735001 tempest-ServerMetadataNegativeTestJSON-525392375] VDI a490baa9-f27b-4e28-9724-5927bb530a7d is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:50:30.156 DEBUG nova.virt.xenapi.vmops [req-19d4dd8c-9710-4ac7-946c-44e93e56d360 tempest-ServerMetadataNegativeTestJSON-356735001 tempest-ServerMetadataNegativeTestJSON-525392375] [instance: 73d50eac-e688-4e00-9568-ef698e2e69ac] Using RAW or VHD, skipping kernel and ramdisk deletion _destroy_kernel_ramdisk /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1456 2015-08-07 17:50:30.167 DEBUG nova.virt.xenapi.vm_utils [req-19d4dd8c-9710-4ac7-946c-44e93e56d360 tempest-ServerMetadataNegativeTestJSON-356735001 tempest-ServerMetadataNegativeTestJSON-525392375] [instance: 73d50eac-e688-4e00-9568-ef698e2e69ac] VM destroyed destroy_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:324 2015-08-07 17:50:30.167 DEBUG nova.compute.manager [req-19d4dd8c-9710-4ac7-946c-44e93e56d360 tempest-ServerMetadataNegativeTestJSON-356735001 tempest-ServerMetadataNegativeTestJSON-525392375] [instance: 73d50eac-e688-4e00-9568-ef698e2e69ac] Deallocating network for instance _deallocate_network /opt/stack/new/nova/nova/compute/manager.py:1792 2015-08-07 17:50:31.349 DEBUG nova.compute.manager [req-19d4dd8c-9710-4ac7-946c-44e93e56d360 tempest-ServerMetadataNegativeTestJSON-356735001 tempest-ServerMetadataNegativeTestJSON-525392375] [instance: 73d50eac-e688-4e00-9568-ef698e2e69ac] terminating bdm BlockDeviceMapping(boot_index=0,connection_info=None,created_at=2015-08-07T17:50:01Z,delete_on_termination=True,deleted=False,deleted_at=None,destination_type='local',device_name='/dev/xvda',device_type='disk',disk_bus=None,guest_format=None,id=79,image_id='a69d8d55-1745-492f-be26-cc75d64fc94d',instance=,instance_uuid=73d50eac-e688-4e00-9568-ef698e2e69ac,no_device=False,snapshot_id=None,source_type='image',updated_at=2015-08-07T17:50:03Z,volume_id=None,volume_size=None) _cleanup_volumes /opt/stack/new/nova/nova/compute/manager.py:2272 2015-08-07 17:50:31.504 DEBUG oslo_concurrency.lockutils [req-19d4dd8c-9710-4ac7-946c-44e93e56d360 tempest-ServerMetadataNegativeTestJSON-356735001 tempest-ServerMetadataNegativeTestJSON-525392375] Lock "compute_resources" acquired by "update_usage" :: waited 0.000s inner 
/usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:50:31.506 DEBUG nova.objects.instance [req-19d4dd8c-9710-4ac7-946c-44e93e56d360 tempest-ServerMetadataNegativeTestJSON-356735001 tempest-ServerMetadataNegativeTestJSON-525392375] Lazy-loading `numa_topology' on Instance uuid 73d50eac-e688-4e00-9568-ef698e2e69ac obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:50:31.575 DEBUG oslo_concurrency.lockutils [req-19d4dd8c-9710-4ac7-946c-44e93e56d360 tempest-ServerMetadataNegativeTestJSON-356735001 tempest-ServerMetadataNegativeTestJSON-525392375] Lock "compute_resources" released by "update_usage" :: held 0.071s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:50:31.802 DEBUG oslo_concurrency.lockutils [req-19d4dd8c-9710-4ac7-946c-44e93e56d360 tempest-ServerMetadataNegativeTestJSON-356735001 tempest-ServerMetadataNegativeTestJSON-525392375] Lock "73d50eac-e688-4e00-9568-ef698e2e69ac" released by "do_terminate_instance" :: held 3.862s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:50:35.083 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.96 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:50:38.922 DEBUG nova.virt.xenapi.volumeops [req-55562b5a-ff48-40ef-ad1b-76fd8c61e84a tempest-AttachVolumeTestJSON-1615353033 tempest-AttachVolumeTestJSON-1297584599] Connect volume to hypervisor: {u'access_mode': u'rw', u'target_discovered': False, u'encrypted': False, u'qos_specs': None, u'target_iqn': u'iqn.2010-10.org.openstack:volume-01c32d1e-d0d1-4293-93c6-f09dca8c8f27', u'target_portal': u'104.130.119.114:3260', u'volume_id': u'01c32d1e-d0d1-4293-93c6-f09dca8c8f27', u'target_lun': 1, u'auth_password': u'iS58dXcVA6qHH2Xm', u'auth_username': u'nHq23kkiArmEQ8Jq3qp4', u'auth_method': u'CHAP'} _connect_hypervisor_to_volume /opt/stack/new/nova/nova/virt/xenapi/volumeops.py:94 2015-08-07 17:50:38.932 DEBUG nova.virt.xenapi.volume_utils [req-55562b5a-ff48-40ef-ad1b-76fd8c61e84a tempest-AttachVolumeTestJSON-1615353033 tempest-AttachVolumeTestJSON-1297584599] {'sm_config': {'LUNid': '1', 'SCSIid': '33000000100000001'}, 'managed': False, 'snapshots': [], 'allowed_operations': ['forget', 'destroy', 'copy', 'snapshot'], 'on_boot': 'persist', 'name_description': '', 'read_only': False, 'uuid': 'e55110f1-56b1-d008-537d-c3081511f123', 'storage_lock': False, 'name_label': '', 'tags': [], 'location': 'e55110f1-56b1-d008-537d-c3081511f123', 'metadata_of_pool': 'OpaqueRef:NULL', 'type': 'user', 'sharable': False, 'snapshot_time': , 'parent': 'OpaqueRef:NULL', 'missing': False, 'xenstore_data': {}, 'crash_dumps': [], 'virtual_size': '1073741824', 'is_a_snapshot': False, 'current_operations': {}, 'snapshot_of': 'OpaqueRef:NULL', 'SR': 'OpaqueRef:deb38ba2-c81e-8c7c-1fc4-12ec907f4937', 'other_config': {}, 'physical_utilisation': '0', 'allow_caching': False, 'metadata_latest': False, 'VBDs': []} introduce_vdi /opt/stack/new/nova/nova/virt/xenapi/volume_utils.py:176 2015-08-07 17:50:39.408 DEBUG oslo_concurrency.lockutils [req-82680149-d423-4dfa-8dd5-f6d1ad5fb516 tempest-ServerPersonalityTestJSON-1134914142 tempest-ServerPersonalityTestJSON-167288719] Lock "63bde6ef-e9af-4de5-a116-60694ef11c08" acquired by "_locked_do_build_and_run_instance" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:50:39.452 INFO nova.compute.manager 
[req-82680149-d423-4dfa-8dd5-f6d1ad5fb516 tempest-ServerPersonalityTestJSON-1134914142 tempest-ServerPersonalityTestJSON-167288719] [instance: 63bde6ef-e9af-4de5-a116-60694ef11c08] Starting instance... 2015-08-07 17:50:39.462 INFO nova.virt.xenapi.volumeops [req-55562b5a-ff48-40ef-ad1b-76fd8c61e84a tempest-AttachVolumeTestJSON-1615353033 tempest-AttachVolumeTestJSON-1297584599] Connected volume (vdi_uuid): e55110f1-56b1-d008-537d-c3081511f123 2015-08-07 17:50:39.462 DEBUG nova.virt.xenapi.volumeops [req-55562b5a-ff48-40ef-ad1b-76fd8c61e84a tempest-AttachVolumeTestJSON-1615353033 tempest-AttachVolumeTestJSON-1297584599] Attach_volume vdi: OpaqueRef:1eaf913e-f17c-ef9d-3440-0e9676095253 vm: OpaqueRef:da0d18cf-08fe-e8c8-7dbb-d03b309f0adc _attach_volume_to_vm /opt/stack/new/nova/nova/virt/xenapi/volumeops.py:114 2015-08-07 17:50:39.462 DEBUG nova.virt.xenapi.vm_utils [req-55562b5a-ff48-40ef-ad1b-76fd8c61e84a tempest-AttachVolumeTestJSON-1615353033 tempest-AttachVolumeTestJSON-1297584599] Creating disk-type VBD for VM OpaqueRef:da0d18cf-08fe-e8c8-7dbb-d03b309f0adc, VDI OpaqueRef:1eaf913e-f17c-ef9d-3440-0e9676095253 ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:50:39.471 DEBUG nova.virt.xenapi.vm_utils [req-55562b5a-ff48-40ef-ad1b-76fd8c61e84a tempest-AttachVolumeTestJSON-1615353033 tempest-AttachVolumeTestJSON-1297584599] Created VBD OpaqueRef:ab4a9281-296f-eb8f-5e7a-0b234f47842f for VM OpaqueRef:da0d18cf-08fe-e8c8-7dbb-d03b309f0adc, VDI OpaqueRef:1eaf913e-f17c-ef9d-3440-0e9676095253. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:50:39.477 DEBUG nova.virt.xenapi.volumeops [req-55562b5a-ff48-40ef-ad1b-76fd8c61e84a tempest-AttachVolumeTestJSON-1615353033 tempest-AttachVolumeTestJSON-1297584599] Plugging VBD: OpaqueRef:ab4a9281-296f-eb8f-5e7a-0b234f47842f _attach_volume_to_vm /opt/stack/new/nova/nova/virt/xenapi/volumeops.py:124 2015-08-07 17:50:39.478 DEBUG oslo_concurrency.lockutils [req-55562b5a-ff48-40ef-ad1b-76fd8c61e84a tempest-AttachVolumeTestJSON-1615353033 tempest-AttachVolumeTestJSON-1297584599] Lock "xenapi-vbd-OpaqueRef:da0d18cf-08fe-e8c8-7dbb-d03b309f0adc" acquired by "synchronized_plug" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:50:39.639 DEBUG oslo_concurrency.lockutils [req-82680149-d423-4dfa-8dd5-f6d1ad5fb516 tempest-ServerPersonalityTestJSON-1134914142 tempest-ServerPersonalityTestJSON-167288719] Lock "compute_resources" acquired by "instance_claim" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:50:39.640 DEBUG nova.compute.resource_tracker [req-82680149-d423-4dfa-8dd5-f6d1ad5fb516 tempest-ServerPersonalityTestJSON-1134914142 tempest-ServerPersonalityTestJSON-167288719] Memory overhead for 64 MB instance; 5 MB instance_claim /opt/stack/new/nova/nova/compute/resource_tracker.py:126 2015-08-07 17:50:39.648 INFO nova.compute.claims [req-82680149-d423-4dfa-8dd5-f6d1ad5fb516 tempest-ServerPersonalityTestJSON-1134914142 tempest-ServerPersonalityTestJSON-167288719] [instance: 63bde6ef-e9af-4de5-a116-60694ef11c08] Attempting claim: memory 69 MB, disk 0 GB 2015-08-07 17:50:39.649 INFO nova.compute.claims [req-82680149-d423-4dfa-8dd5-f6d1ad5fb516 tempest-ServerPersonalityTestJSON-1134914142 tempest-ServerPersonalityTestJSON-167288719] [instance: 63bde6ef-e9af-4de5-a116-60694ef11c08] Total memory: 8187 MB, used: 581.00 MB 2015-08-07 17:50:39.650 INFO nova.compute.claims 
[req-82680149-d423-4dfa-8dd5-f6d1ad5fb516 tempest-ServerPersonalityTestJSON-1134914142 tempest-ServerPersonalityTestJSON-167288719] [instance: 63bde6ef-e9af-4de5-a116-60694ef11c08] memory limit: 12280.50 MB, free: 11699.50 MB 2015-08-07 17:50:39.650 INFO nova.compute.claims [req-82680149-d423-4dfa-8dd5-f6d1ad5fb516 tempest-ServerPersonalityTestJSON-1134914142 tempest-ServerPersonalityTestJSON-167288719] [instance: 63bde6ef-e9af-4de5-a116-60694ef11c08] Total disk: 27 GB, used: 0.00 GB 2015-08-07 17:50:39.651 INFO nova.compute.claims [req-82680149-d423-4dfa-8dd5-f6d1ad5fb516 tempest-ServerPersonalityTestJSON-1134914142 tempest-ServerPersonalityTestJSON-167288719] [instance: 63bde6ef-e9af-4de5-a116-60694ef11c08] disk limit not specified, defaulting to unlimited 2015-08-07 17:50:39.675 DEBUG nova.compute.resources.vcpu [req-82680149-d423-4dfa-8dd5-f6d1ad5fb516 tempest-ServerPersonalityTestJSON-1134914142 tempest-ServerPersonalityTestJSON-167288719] Total CPUs: 8 VCPUs, used: 1.00 VCPUs test /opt/stack/new/nova/nova/compute/resources/vcpu.py:52 2015-08-07 17:50:39.676 DEBUG nova.compute.resources.vcpu [req-82680149-d423-4dfa-8dd5-f6d1ad5fb516 tempest-ServerPersonalityTestJSON-1134914142 tempest-ServerPersonalityTestJSON-167288719] CPUs limit not specified, defaulting to unlimited test /opt/stack/new/nova/nova/compute/resources/vcpu.py:56 2015-08-07 17:50:39.676 INFO nova.compute.claims [req-82680149-d423-4dfa-8dd5-f6d1ad5fb516 tempest-ServerPersonalityTestJSON-1134914142 tempest-ServerPersonalityTestJSON-167288719] [instance: 63bde6ef-e9af-4de5-a116-60694ef11c08] Claim successful 2015-08-07 17:50:39.952 DEBUG oslo_concurrency.lockutils [req-82680149-d423-4dfa-8dd5-f6d1ad5fb516 tempest-ServerPersonalityTestJSON-1134914142 tempest-ServerPersonalityTestJSON-167288719] Lock "compute_resources" released by "instance_claim" :: held 0.314s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:50:40.127 DEBUG oslo_concurrency.lockutils [req-82680149-d423-4dfa-8dd5-f6d1ad5fb516 tempest-ServerPersonalityTestJSON-1134914142 tempest-ServerPersonalityTestJSON-167288719] Lock "compute_resources" acquired by "update_usage" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:50:40.221 DEBUG oslo_concurrency.lockutils [req-82680149-d423-4dfa-8dd5-f6d1ad5fb516 tempest-ServerPersonalityTestJSON-1134914142 tempest-ServerPersonalityTestJSON-167288719] Lock "compute_resources" released by "update_usage" :: held 0.093s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:50:40.222 DEBUG nova.compute.utils [req-82680149-d423-4dfa-8dd5-f6d1ad5fb516 tempest-ServerPersonalityTestJSON-1134914142 tempest-ServerPersonalityTestJSON-167288719] Using /dev/xvd instead of None get_next_device_name /opt/stack/new/nova/nova/compute/utils.py:163 2015-08-07 17:50:40.227 13318 DEBUG nova.compute.manager [-] [instance: 63bde6ef-e9af-4de5-a116-60694ef11c08] Allocating IP information in the background. 
_allocate_network_async /opt/stack/new/nova/nova/compute/manager.py:1513 2015-08-07 17:50:40.229 13318 DEBUG oslo_concurrency.lockutils [-] Acquired semaphore "refresh_cache-63bde6ef-e9af-4de5-a116-60694ef11c08" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:197 2015-08-07 17:50:40.581 DEBUG oslo_concurrency.lockutils [req-55562b5a-ff48-40ef-ad1b-76fd8c61e84a tempest-AttachVolumeTestJSON-1615353033 tempest-AttachVolumeTestJSON-1297584599] Lock "xenapi-vbd-OpaqueRef:da0d18cf-08fe-e8c8-7dbb-d03b309f0adc" released by "synchronized_plug" :: held 1.102s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:50:40.582 INFO nova.virt.xenapi.volumeops [req-55562b5a-ff48-40ef-ad1b-76fd8c61e84a tempest-AttachVolumeTestJSON-1615353033 tempest-AttachVolumeTestJSON-1297584599] Dev 1 attached to instance instance-0000004c 2015-08-07 17:50:40.635 DEBUG keystoneclient.session [req-55562b5a-ff48-40ef-ad1b-76fd8c61e84a tempest-AttachVolumeTestJSON-1615353033 tempest-AttachVolumeTestJSON-1297584599] REQ: curl -g -i -X POST http://192.168.33.1:8776/v2/a44dc889c66e4839966ea0d1def34b3c/volumes/01c32d1e-d0d1-4293-93c6-f09dca8c8f27/action -H "User-Agent: python-cinderclient" -H "Content-Type: application/json" -H "Accept: application/json" -H "X-Auth-Token: {SHA1}8aeedc24330524e08e7235ade1736ea926abc04a" -d '{"os-attach": {"instance_uuid": "2164dc89-6e1d-45ae-a625-28be5a6a1a05", "mountpoint": "/dev/xvdb", "mode": "rw"}}' _http_log_request /usr/local/lib/python2.7/dist-packages/keystoneclient/session.py:195 2015-08-07 17:50:40.727 DEBUG nova.virt.xenapi.vmops [req-82680149-d423-4dfa-8dd5-f6d1ad5fb516 tempest-ServerPersonalityTestJSON-1134914142 tempest-ServerPersonalityTestJSON-167288719] [instance: 63bde6ef-e9af-4de5-a116-60694ef11c08] Block device information present: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} spawn /opt/stack/new/nova/nova/virt/xenapi/vmops.py:408 2015-08-07 17:50:40.739 DEBUG nova.virt.xenapi.vm_utils [req-82680149-d423-4dfa-8dd5-f6d1ad5fb516 tempest-ServerPersonalityTestJSON-1134914142 tempest-ServerPersonalityTestJSON-167288719] Detected vhd format for image a69d8d55-1745-492f-be26-cc75d64fc94d determine_disk_image_type /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1633 2015-08-07 17:50:40.740 DEBUG nova.virt.xenapi.vmops [req-82680149-d423-4dfa-8dd5-f6d1ad5fb516 tempest-ServerPersonalityTestJSON-1134914142 tempest-ServerPersonalityTestJSON-167288719] [instance: 63bde6ef-e9af-4de5-a116-60694ef11c08] Updating progress to 10 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:50:40.971 DEBUG nova.virt.xenapi.vmops [req-82680149-d423-4dfa-8dd5-f6d1ad5fb516 tempest-ServerPersonalityTestJSON-1134914142 tempest-ServerPersonalityTestJSON-167288719] [instance: 63bde6ef-e9af-4de5-a116-60694ef11c08] block device info: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} _connect_cinder_volumes /opt/stack/new/nova/nova/virt/xenapi/vmops.py:376 2015-08-07 17:50:40.978 DEBUG oslo_concurrency.lockutils [req-82680149-d423-4dfa-8dd5-f6d1ad5fb516 tempest-ServerPersonalityTestJSON-1134914142 tempest-ServerPersonalityTestJSON-167288719] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" acquired by "_create_cached_image_impl" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:50:41.826 DEBUG nova.virt.xenapi.vm_utils 
[req-82680149-d423-4dfa-8dd5-f6d1ad5fb516 tempest-ServerPersonalityTestJSON-1134914142 tempest-ServerPersonalityTestJSON-167288719] Cloned VDI OpaqueRef:a8781eed-f4a1-fc4e-8666-e99d2e059942 from VDI OpaqueRef:f392c64e-7886-a0cc-d3a0-780674167307 _clone_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:571 2015-08-07 17:50:42.331 13318 DEBUG nova.network.base_api [-] [instance: 63bde6ef-e9af-4de5-a116-60694ef11c08] Updating instance_info_cache with network_info: [VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.4'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:32:71:b7', 'active': False, 'type': u'bridge', 'id': u'bcc4d727-018d-4a93-81eb-f164128f0a9e', 'qbg_params': None})] update_instance_cache_with_nw_info /opt/stack/new/nova/nova/network/base_api.py:43 2015-08-07 17:50:42.367 13318 DEBUG oslo_concurrency.lockutils [-] Releasing semaphore "refresh_cache-63bde6ef-e9af-4de5-a116-60694ef11c08" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:210 2015-08-07 17:50:42.367 13318 DEBUG nova.compute.manager [-] [instance: 63bde6ef-e9af-4de5-a116-60694ef11c08] Instance network_info: |[VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.4'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:32:71:b7', 'active': False, 'type': u'bridge', 'id': u'bcc4d727-018d-4a93-81eb-f164128f0a9e', 'qbg_params': None})]| _allocate_network_async /opt/stack/new/nova/nova/compute/manager.py:1531 2015-08-07 17:50:42.383 DEBUG oslo_concurrency.lockutils [req-82680149-d423-4dfa-8dd5-f6d1ad5fb516 tempest-ServerPersonalityTestJSON-1134914142 tempest-ServerPersonalityTestJSON-167288719] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" released by "_create_cached_image_impl" :: held 1.404s inner 
/usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:50:42.383 INFO nova.virt.xenapi.vm_utils [req-82680149-d423-4dfa-8dd5-f6d1ad5fb516 tempest-ServerPersonalityTestJSON-1134914142 tempest-ServerPersonalityTestJSON-167288719] Image creation data, cacheable: True, downloaded: False duration: 1.41 secs for image a69d8d55-1745-492f-be26-cc75d64fc94d 2015-08-07 17:50:42.661 DEBUG keystoneclient.session [req-55562b5a-ff48-40ef-ad1b-76fd8c61e84a tempest-AttachVolumeTestJSON-1615353033 tempest-AttachVolumeTestJSON-1297584599] RESP: [202] date: Fri, 07 Aug 2015 17:50:42 GMT connection: keep-alive content-type: text/html; charset=UTF-8 content-length: 0 x-openstack-request-id: req-3a521d58-78d0-4058-8677-d7496837c0d2 _http_log_response /usr/local/lib/python2.7/dist-packages/keystoneclient/session.py:224 2015-08-07 17:50:42.700 DEBUG oslo_concurrency.lockutils [req-55562b5a-ff48-40ef-ad1b-76fd8c61e84a tempest-AttachVolumeTestJSON-1615353033 tempest-AttachVolumeTestJSON-1297584599] Lock "2164dc89-6e1d-45ae-a625-28be5a6a1a05" released by "do_attach_volume" :: held 23.444s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:50:42.910 DEBUG nova.virt.xenapi.vmops [req-82680149-d423-4dfa-8dd5-f6d1ad5fb516 tempest-ServerPersonalityTestJSON-1134914142 tempest-ServerPersonalityTestJSON-167288719] [instance: 63bde6ef-e9af-4de5-a116-60694ef11c08] Updating progress to 20 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:50:43.054 DEBUG nova.virt.xenapi.vmops [req-82680149-d423-4dfa-8dd5-f6d1ad5fb516 tempest-ServerPersonalityTestJSON-1134914142 tempest-ServerPersonalityTestJSON-167288719] [instance: 63bde6ef-e9af-4de5-a116-60694ef11c08] Updating progress to 30 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:50:43.220 DEBUG nova.virt.xenapi.vmops [req-82680149-d423-4dfa-8dd5-f6d1ad5fb516 tempest-ServerPersonalityTestJSON-1134914142 tempest-ServerPersonalityTestJSON-167288719] [instance: 63bde6ef-e9af-4de5-a116-60694ef11c08] Using PV kernel: True _create_vm_record /opt/stack/new/nova/nova/virt/xenapi/vmops.py:679 2015-08-07 17:50:43.230 DEBUG nova.virt.xenapi.vm_utils [req-82680149-d423-4dfa-8dd5-f6d1ad5fb516 tempest-ServerPersonalityTestJSON-1134914142 tempest-ServerPersonalityTestJSON-167288719] [instance: 63bde6ef-e9af-4de5-a116-60694ef11c08] Created VM create_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:312 2015-08-07 17:50:43.231 DEBUG nova.virt.xenapi.vmops [req-82680149-d423-4dfa-8dd5-f6d1ad5fb516 tempest-ServerPersonalityTestJSON-1134914142 tempest-ServerPersonalityTestJSON-167288719] [instance: 63bde6ef-e9af-4de5-a116-60694ef11c08] Updating progress to 40 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:50:43.420 DEBUG nova.virt.xenapi.vm_utils [req-82680149-d423-4dfa-8dd5-f6d1ad5fb516 tempest-ServerPersonalityTestJSON-1134914142 tempest-ServerPersonalityTestJSON-167288719] Creating disk-type VBD for VM OpaqueRef:8262a5e5-6934-4dd5-8fb6-721c3e18749f, VDI OpaqueRef:a8781eed-f4a1-fc4e-8666-e99d2e059942 ... 
create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:50:43.430 DEBUG nova.virt.xenapi.vm_utils [req-82680149-d423-4dfa-8dd5-f6d1ad5fb516 tempest-ServerPersonalityTestJSON-1134914142 tempest-ServerPersonalityTestJSON-167288719] Created VBD OpaqueRef:91215e5d-17cb-bf2d-9b03-984a5f5f091c for VM OpaqueRef:8262a5e5-6934-4dd5-8fb6-721c3e18749f, VDI OpaqueRef:a8781eed-f4a1-fc4e-8666-e99d2e059942. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:50:43.730 DEBUG nova.virt.xenapi.vm_utils [req-82680149-d423-4dfa-8dd5-f6d1ad5fb516 tempest-ServerPersonalityTestJSON-1134914142 tempest-ServerPersonalityTestJSON-167288719] Created VDI OpaqueRef:5d99004f-f210-290e-dbef-9e792fd6aa56 (config-2, 67108864, False) on OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99. create_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:509 2015-08-07 17:50:43.733 DEBUG nova.virt.xenapi.vm_utils [req-82680149-d423-4dfa-8dd5-f6d1ad5fb516 tempest-ServerPersonalityTestJSON-1134914142 tempest-ServerPersonalityTestJSON-167288719] Creating disk-type VBD for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:5d99004f-f210-290e-dbef-9e792fd6aa56 ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:50:43.742 DEBUG nova.virt.xenapi.vm_utils [req-82680149-d423-4dfa-8dd5-f6d1ad5fb516 tempest-ServerPersonalityTestJSON-1134914142 tempest-ServerPersonalityTestJSON-167288719] Created VBD OpaqueRef:edb38b0c-f17e-3067-3db7-3ffd84bbbbd9 for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:5d99004f-f210-290e-dbef-9e792fd6aa56. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:50:43.743 DEBUG nova.virt.xenapi.vm_utils [req-82680149-d423-4dfa-8dd5-f6d1ad5fb516 tempest-ServerPersonalityTestJSON-1134914142 tempest-ServerPersonalityTestJSON-167288719] Plugging VBD OpaqueRef:edb38b0c-f17e-3067-3db7-3ffd84bbbbd9 ... 
vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2191 2015-08-07 17:50:43.743 DEBUG oslo_concurrency.lockutils [req-82680149-d423-4dfa-8dd5-f6d1ad5fb516 tempest-ServerPersonalityTestJSON-1134914142 tempest-ServerPersonalityTestJSON-167288719] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_plug" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:50:44.761 INFO nova.compute.manager [req-79a368e9-cf81-42e4-8e99-3d26f9c3b421 tempest-AttachVolumeTestJSON-1615353033 tempest-AttachVolumeTestJSON-1297584599] [instance: 2164dc89-6e1d-45ae-a625-28be5a6a1a05] Detach volume 01c32d1e-d0d1-4293-93c6-f09dca8c8f27 from mountpoint /dev/xvdb 2015-08-07 17:50:44.766 DEBUG nova.virt.xenapi.volumeops [req-79a368e9-cf81-42e4-8e99-3d26f9c3b421 tempest-AttachVolumeTestJSON-1615353033 tempest-AttachVolumeTestJSON-1297584599] Detach_volume: instance-0000004c, /dev/xvdb detach_volume /opt/stack/new/nova/nova/virt/xenapi/volumeops.py:134 2015-08-07 17:50:44.781 DEBUG oslo_concurrency.lockutils [req-79a368e9-cf81-42e4-8e99-3d26f9c3b421 tempest-AttachVolumeTestJSON-1615353033 tempest-AttachVolumeTestJSON-1297584599] Lock "xenapi-vbd-OpaqueRef:da0d18cf-08fe-e8c8-7dbb-d03b309f0adc" acquired by "synchronized_unplug" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:50:44.865 DEBUG oslo_concurrency.lockutils [req-82680149-d423-4dfa-8dd5-f6d1ad5fb516 tempest-ServerPersonalityTestJSON-1134914142 tempest-ServerPersonalityTestJSON-167288719] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_plug" :: held 1.122s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:50:44.866 DEBUG nova.virt.xenapi.vm_utils [req-82680149-d423-4dfa-8dd5-f6d1ad5fb516 tempest-ServerPersonalityTestJSON-1134914142 tempest-ServerPersonalityTestJSON-167288719] Plugging VBD OpaqueRef:edb38b0c-f17e-3067-3db7-3ffd84bbbbd9 done. vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2194 2015-08-07 17:50:44.871 DEBUG nova.virt.xenapi.vm_utils [req-82680149-d423-4dfa-8dd5-f6d1ad5fb516 tempest-ServerPersonalityTestJSON-1134914142 tempest-ServerPersonalityTestJSON-167288719] VBD OpaqueRef:edb38b0c-f17e-3067-3db7-3ffd84bbbbd9 plugged as xvdc vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2197 2015-08-07 17:50:44.945 WARNING nova.virt.configdrive [req-82680149-d423-4dfa-8dd5-f6d1ad5fb516 tempest-ServerPersonalityTestJSON-1134914142 tempest-ServerPersonalityTestJSON-167288719] The setting "always" will be deprecated in the Liberty version. 
Please use "True" instead 2015-08-07 17:50:44.946 DEBUG nova.objects.instance [req-82680149-d423-4dfa-8dd5-f6d1ad5fb516 tempest-ServerPersonalityTestJSON-1134914142 tempest-ServerPersonalityTestJSON-167288719] Lazy-loading `ec2_ids' on Instance uuid 63bde6ef-e9af-4de5-a116-60694ef11c08 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:50:44.981 DEBUG oslo_concurrency.processutils [req-82680149-d423-4dfa-8dd5-f6d1ad5fb516 tempest-ServerPersonalityTestJSON-1134914142 tempest-ServerPersonalityTestJSON-167288719] Running cmd (subprocess): genisoimage -o /tmp/tmpp4zutZ/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmphodjn_ execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:50:45.088 DEBUG oslo_concurrency.processutils [req-82680149-d423-4dfa-8dd5-f6d1ad5fb516 tempest-ServerPersonalityTestJSON-1134914142 tempest-ServerPersonalityTestJSON-167288719] CMD "genisoimage -o /tmp/tmpp4zutZ/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmphodjn_" returned: 0 in 0.107s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:50:45.102 DEBUG oslo_concurrency.processutils [req-82680149-d423-4dfa-8dd5-f6d1ad5fb516 tempest-ServerPersonalityTestJSON-1134914142 tempest-ServerPersonalityTestJSON-167288719] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmpp4zutZ/configdrive of=/dev/xvdc oflag=direct,sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:50:45.185 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.88 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:50:45.678 DEBUG oslo_concurrency.lockutils [req-79a368e9-cf81-42e4-8e99-3d26f9c3b421 tempest-AttachVolumeTestJSON-1615353033 tempest-AttachVolumeTestJSON-1297584599] Lock "xenapi-vbd-OpaqueRef:da0d18cf-08fe-e8c8-7dbb-d03b309f0adc" released by "synchronized_unplug" :: held 0.897s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:50:45.691 DEBUG nova.virt.xenapi.volume_utils [req-79a368e9-cf81-42e4-8e99-3d26f9c3b421 tempest-AttachVolumeTestJSON-1615353033 tempest-AttachVolumeTestJSON-1297584599] Forgetting SR... 
forget_sr /opt/stack/new/nova/nova/virt/xenapi/volume_utils.py:237 2015-08-07 17:50:47.515 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:50:47.516 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 0.01 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:50:47.524 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:50:47.525 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 2.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:50:48.641 DEBUG oslo_concurrency.processutils [req-82680149-d423-4dfa-8dd5-f6d1ad5fb516 tempest-ServerPersonalityTestJSON-1134914142 tempest-ServerPersonalityTestJSON-167288719] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmpp4zutZ/configdrive of=/dev/xvdc oflag=direct,sync" returned: 0 in 3.539s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:50:48.643 DEBUG oslo_concurrency.processutils [req-82680149-d423-4dfa-8dd5-f6d1ad5fb516 tempest-ServerPersonalityTestJSON-1134914142 tempest-ServerPersonalityTestJSON-167288719] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:50:48.930 DEBUG oslo_concurrency.processutils [req-82680149-d423-4dfa-8dd5-f6d1ad5fb516 tempest-ServerPersonalityTestJSON-1134914142 tempest-ServerPersonalityTestJSON-167288719] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf sync" returned: 0 in 0.287s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:50:48.933 DEBUG nova.virt.xenapi.vm_utils [req-82680149-d423-4dfa-8dd5-f6d1ad5fb516 tempest-ServerPersonalityTestJSON-1134914142 tempest-ServerPersonalityTestJSON-167288719] Destroying VBD for VDI OpaqueRef:5d99004f-f210-290e-dbef-9e792fd6aa56 ... 
vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2207 2015-08-07 17:50:48.934 DEBUG oslo_concurrency.lockutils [req-82680149-d423-4dfa-8dd5-f6d1ad5fb516 tempest-ServerPersonalityTestJSON-1134914142 tempest-ServerPersonalityTestJSON-167288719] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_unplug" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:50:49.523 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:50:49.524 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 2.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:50:49.535 DEBUG oslo_concurrency.lockutils [req-82680149-d423-4dfa-8dd5-f6d1ad5fb516 tempest-ServerPersonalityTestJSON-1134914142 tempest-ServerPersonalityTestJSON-167288719] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_unplug" :: held 0.601s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:50:49.542 DEBUG nova.virt.xenapi.vm_utils [req-82680149-d423-4dfa-8dd5-f6d1ad5fb516 tempest-ServerPersonalityTestJSON-1134914142 tempest-ServerPersonalityTestJSON-167288719] Destroying VBD for VDI OpaqueRef:5d99004f-f210-290e-dbef-9e792fd6aa56 done. vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2215 2015-08-07 17:50:49.543 DEBUG nova.virt.xenapi.vm_utils [req-82680149-d423-4dfa-8dd5-f6d1ad5fb516 tempest-ServerPersonalityTestJSON-1134914142 tempest-ServerPersonalityTestJSON-167288719] Creating disk-type VBD for VM OpaqueRef:8262a5e5-6934-4dd5-8fb6-721c3e18749f, VDI OpaqueRef:5d99004f-f210-290e-dbef-9e792fd6aa56 ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:50:49.551 DEBUG nova.virt.xenapi.vm_utils [req-82680149-d423-4dfa-8dd5-f6d1ad5fb516 tempest-ServerPersonalityTestJSON-1134914142 tempest-ServerPersonalityTestJSON-167288719] Created VBD OpaqueRef:b1919a2c-611a-e0b9-c350-22b6b4ccc845 for VM OpaqueRef:8262a5e5-6934-4dd5-8fb6-721c3e18749f, VDI OpaqueRef:5d99004f-f210-290e-dbef-9e792fd6aa56. 
create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:50:49.552 DEBUG nova.objects.instance [req-82680149-d423-4dfa-8dd5-f6d1ad5fb516 tempest-ServerPersonalityTestJSON-1134914142 tempest-ServerPersonalityTestJSON-167288719] Lazy-loading `pci_devices' on Instance uuid 63bde6ef-e9af-4de5-a116-60694ef11c08 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:50:49.657 DEBUG nova.virt.xenapi.vmops [req-82680149-d423-4dfa-8dd5-f6d1ad5fb516 tempest-ServerPersonalityTestJSON-1134914142 tempest-ServerPersonalityTestJSON-167288719] [instance: 63bde6ef-e9af-4de5-a116-60694ef11c08] Updating progress to 50 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:50:49.851 DEBUG oslo_concurrency.lockutils [req-82680149-d423-4dfa-8dd5-f6d1ad5fb516 tempest-ServerPersonalityTestJSON-1134914142 tempest-ServerPersonalityTestJSON-167288719] Lock "xenstore-63bde6ef-e9af-4de5-a116-60694ef11c08" acquired by "store_meta" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:50:49.852 DEBUG oslo_concurrency.lockutils [req-82680149-d423-4dfa-8dd5-f6d1ad5fb516 tempest-ServerPersonalityTestJSON-1134914142 tempest-ServerPersonalityTestJSON-167288719] Lock "xenstore-63bde6ef-e9af-4de5-a116-60694ef11c08" released by "store_meta" :: held 0.001s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:50:49.853 DEBUG oslo_concurrency.lockutils [req-82680149-d423-4dfa-8dd5-f6d1ad5fb516 tempest-ServerPersonalityTestJSON-1134914142 tempest-ServerPersonalityTestJSON-167288719] Lock "xenstore-63bde6ef-e9af-4de5-a116-60694ef11c08" acquired by "store_auto_disk_config" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:50:49.859 DEBUG oslo_concurrency.lockutils [req-82680149-d423-4dfa-8dd5-f6d1ad5fb516 tempest-ServerPersonalityTestJSON-1134914142 tempest-ServerPersonalityTestJSON-167288719] Lock "xenstore-63bde6ef-e9af-4de5-a116-60694ef11c08" released by "store_auto_disk_config" :: held 0.006s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:50:49.859 DEBUG nova.virt.xenapi.vmops [req-82680149-d423-4dfa-8dd5-f6d1ad5fb516 tempest-ServerPersonalityTestJSON-1134914142 tempest-ServerPersonalityTestJSON-167288719] [instance: 63bde6ef-e9af-4de5-a116-60694ef11c08] Injecting hostname (tempest.common.compute-instance-1234520138) into xenstore _inject_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1940 2015-08-07 17:50:49.860 DEBUG oslo_concurrency.lockutils [req-82680149-d423-4dfa-8dd5-f6d1ad5fb516 tempest-ServerPersonalityTestJSON-1134914142 tempest-ServerPersonalityTestJSON-167288719] Lock "xenstore-63bde6ef-e9af-4de5-a116-60694ef11c08" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:50:49.865 DEBUG oslo_concurrency.lockutils [req-82680149-d423-4dfa-8dd5-f6d1ad5fb516 tempest-ServerPersonalityTestJSON-1134914142 tempest-ServerPersonalityTestJSON-167288719] Lock "xenstore-63bde6ef-e9af-4de5-a116-60694ef11c08" released by "update_hostname" :: held 0.005s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:50:49.866 DEBUG nova.virt.xenapi.vmops [req-82680149-d423-4dfa-8dd5-f6d1ad5fb516 tempest-ServerPersonalityTestJSON-1134914142 tempest-ServerPersonalityTestJSON-167288719] [instance: 63bde6ef-e9af-4de5-a116-60694ef11c08] 
Injecting network info to xenstore inject_network_info /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1869 2015-08-07 17:50:49.866 DEBUG oslo_concurrency.lockutils [req-82680149-d423-4dfa-8dd5-f6d1ad5fb516 tempest-ServerPersonalityTestJSON-1134914142 tempest-ServerPersonalityTestJSON-167288719] Lock "xenstore-63bde6ef-e9af-4de5-a116-60694ef11c08" acquired by "update_nwinfo" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:50:49.996 DEBUG oslo_concurrency.lockutils [req-82680149-d423-4dfa-8dd5-f6d1ad5fb516 tempest-ServerPersonalityTestJSON-1134914142 tempest-ServerPersonalityTestJSON-167288719] Lock "xenstore-63bde6ef-e9af-4de5-a116-60694ef11c08" released by "update_nwinfo" :: held 0.130s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:50:49.997 DEBUG nova.virt.xenapi.vmops [req-82680149-d423-4dfa-8dd5-f6d1ad5fb516 tempest-ServerPersonalityTestJSON-1134914142 tempest-ServerPersonalityTestJSON-167288719] [instance: 63bde6ef-e9af-4de5-a116-60694ef11c08] Updating progress to 60 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:50:50.161 DEBUG nova.virt.xenapi.vmops [req-82680149-d423-4dfa-8dd5-f6d1ad5fb516 tempest-ServerPersonalityTestJSON-1134914142 tempest-ServerPersonalityTestJSON-167288719] [instance: 63bde6ef-e9af-4de5-a116-60694ef11c08] Creating vifs _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1892 2015-08-07 17:50:50.166 DEBUG nova.virt.xenapi.vmops [req-82680149-d423-4dfa-8dd5-f6d1ad5fb516 tempest-ServerPersonalityTestJSON-1134914142 tempest-ServerPersonalityTestJSON-167288719] [instance: 63bde6ef-e9af-4de5-a116-60694ef11c08] Creating VIF for network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1902 2015-08-07 17:50:50.173 DEBUG nova.virt.xenapi.vmops [req-82680149-d423-4dfa-8dd5-f6d1ad5fb516 tempest-ServerPersonalityTestJSON-1134914142 tempest-ServerPersonalityTestJSON-167288719] [instance: 63bde6ef-e9af-4de5-a116-60694ef11c08] Created VIF OpaqueRef:20deb528-7a4d-40c9-6c39-c5f0a78c9689, network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1906 2015-08-07 17:50:50.174 DEBUG nova.virt.xenapi.vmops [req-82680149-d423-4dfa-8dd5-f6d1ad5fb516 tempest-ServerPersonalityTestJSON-1134914142 tempest-ServerPersonalityTestJSON-167288719] [instance: 63bde6ef-e9af-4de5-a116-60694ef11c08] Updating progress to 70 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:50:50.347 DEBUG nova.virt.xenapi.vmops [req-82680149-d423-4dfa-8dd5-f6d1ad5fb516 tempest-ServerPersonalityTestJSON-1134914142 tempest-ServerPersonalityTestJSON-167288719] [instance: 63bde6ef-e9af-4de5-a116-60694ef11c08] Starting instance _start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:327 2015-08-07 17:50:51.522 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:50:51.523 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Starting heal instance info cache _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5315 2015-08-07 17:50:51.523 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Rebuilding the list of instances to heal 
_heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5319 2015-08-07 17:50:51.575 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: 63bde6ef-e9af-4de5-a116-60694ef11c08] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5329 2015-08-07 17:50:51.577 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Acquired semaphore "refresh_cache-2164dc89-6e1d-45ae-a625-28be5a6a1a05" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:197 2015-08-07 17:50:51.577 DEBUG nova.objects.instance [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lazy-loading `flavor' on Instance uuid 2164dc89-6e1d-45ae-a625-28be5a6a1a05 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:50:51.824 DEBUG nova.network.base_api [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: 2164dc89-6e1d-45ae-a625-28be5a6a1a05] Updating instance_info_cache with network_info: [VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.2'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:7b:cc:df', 'active': False, 'type': u'bridge', 'id': u'c96bc5cc-1844-4d56-8c3c-73405d2f5bca', 'qbg_params': None})] update_instance_cache_with_nw_info /opt/stack/new/nova/nova/network/base_api.py:43 2015-08-07 17:50:51.852 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Releasing semaphore "refresh_cache-2164dc89-6e1d-45ae-a625-28be5a6a1a05" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:210 2015-08-07 17:50:51.853 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: 2164dc89-6e1d-45ae-a625-28be5a6a1a05] Updated the network info_cache for instance _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5376 2015-08-07 17:50:51.854 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 2.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:50:53.854 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:50:53.855 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /opt/stack/new/nova/nova/compute/manager.py:5976 2015-08-07 17:50:53.855 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 3.67 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:50:54.818 DEBUG nova.virt.xenapi.vmops [req-82680149-d423-4dfa-8dd5-f6d1ad5fb516 tempest-ServerPersonalityTestJSON-1134914142 tempest-ServerPersonalityTestJSON-167288719] [instance: 63bde6ef-e9af-4de5-a116-60694ef11c08] Waiting for instance state to become running _wait_for_instance_to_start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:777 2015-08-07 17:50:54.837 DEBUG nova.virt.xenapi.vmops [req-82680149-d423-4dfa-8dd5-f6d1ad5fb516 tempest-ServerPersonalityTestJSON-1134914142 tempest-ServerPersonalityTestJSON-167288719] [instance: 63bde6ef-e9af-4de5-a116-60694ef11c08] Updating progress to 80 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:50:55.023 DEBUG nova.virt.xenapi.vmops [req-82680149-d423-4dfa-8dd5-f6d1ad5fb516 tempest-ServerPersonalityTestJSON-1134914142 tempest-ServerPersonalityTestJSON-167288719] [instance: 63bde6ef-e9af-4de5-a116-60694ef11c08] Skip agent setup, not enabled. _configure_new_instance_with_agent /opt/stack/new/nova/nova/virt/xenapi/vmops.py:788 2015-08-07 17:50:55.024 DEBUG nova.virt.xenapi.vmops [req-82680149-d423-4dfa-8dd5-f6d1ad5fb516 tempest-ServerPersonalityTestJSON-1134914142 tempest-ServerPersonalityTestJSON-167288719] [instance: 63bde6ef-e9af-4de5-a116-60694ef11c08] Removing hostname from xenstore _remove_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1949 2015-08-07 17:50:55.024 DEBUG oslo_concurrency.lockutils [req-82680149-d423-4dfa-8dd5-f6d1ad5fb516 tempest-ServerPersonalityTestJSON-1134914142 tempest-ServerPersonalityTestJSON-167288719] Lock "xenstore-63bde6ef-e9af-4de5-a116-60694ef11c08" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:50:55.029 DEBUG oslo_concurrency.lockutils [req-82680149-d423-4dfa-8dd5-f6d1ad5fb516 tempest-ServerPersonalityTestJSON-1134914142 tempest-ServerPersonalityTestJSON-167288719] Lock "xenstore-63bde6ef-e9af-4de5-a116-60694ef11c08" released by "update_hostname" :: held 0.005s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:50:55.029 DEBUG nova.virt.xenapi.vmops [req-82680149-d423-4dfa-8dd5-f6d1ad5fb516 tempest-ServerPersonalityTestJSON-1134914142 tempest-ServerPersonalityTestJSON-167288719] [instance: 63bde6ef-e9af-4de5-a116-60694ef11c08] Updating progress to 90 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:50:55.104 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.96 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:50:55.193 DEBUG nova.virt.xenapi.vmops [req-82680149-d423-4dfa-8dd5-f6d1ad5fb516 tempest-ServerPersonalityTestJSON-1134914142 tempest-ServerPersonalityTestJSON-167288719] [instance: 63bde6ef-e9af-4de5-a116-60694ef11c08] Updating progress to 100 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:50:55.370 DEBUG nova.compute.manager [req-82680149-d423-4dfa-8dd5-f6d1ad5fb516 tempest-ServerPersonalityTestJSON-1134914142 tempest-ServerPersonalityTestJSON-167288719] [instance: 63bde6ef-e9af-4de5-a116-60694ef11c08] Checking state 
_get_power_state /opt/stack/new/nova/nova/compute/manager.py:1296 2015-08-07 17:50:55.536 DEBUG nova.compute.utils [req-82680149-d423-4dfa-8dd5-f6d1ad5fb516 tempest-ServerPersonalityTestJSON-1134914142 tempest-ServerPersonalityTestJSON-167288719] [instance: 63bde6ef-e9af-4de5-a116-60694ef11c08] Unexpected task state: expecting (u'spawning',) but the actual state is deleting notify_about_instance_usage /opt/stack/new/nova/nova/compute/utils.py:283 2015-08-07 17:50:55.537 DEBUG nova.compute.manager [req-82680149-d423-4dfa-8dd5-f6d1ad5fb516 tempest-ServerPersonalityTestJSON-1134914142 tempest-ServerPersonalityTestJSON-167288719] [instance: 63bde6ef-e9af-4de5-a116-60694ef11c08] Instance disappeared during build. _do_build_and_run_instance /opt/stack/new/nova/nova/compute/manager.py:1929 2015-08-07 17:50:55.538 DEBUG nova.compute.manager [req-82680149-d423-4dfa-8dd5-f6d1ad5fb516 tempest-ServerPersonalityTestJSON-1134914142 tempest-ServerPersonalityTestJSON-167288719] [instance: 63bde6ef-e9af-4de5-a116-60694ef11c08] Deallocating network for instance _deallocate_network /opt/stack/new/nova/nova/compute/manager.py:1792 2015-08-07 17:50:55.887 DEBUG oslo_concurrency.lockutils [req-82680149-d423-4dfa-8dd5-f6d1ad5fb516 tempest-ServerPersonalityTestJSON-1134914142 tempest-ServerPersonalityTestJSON-167288719] Lock "63bde6ef-e9af-4de5-a116-60694ef11c08" released by "_locked_do_build_and_run_instance" :: held 16.478s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:50:55.888 DEBUG oslo_concurrency.lockutils [req-b22b74d9-cac7-4d00-83fc-1c197871f167 tempest-ServerPersonalityTestJSON-1134914142 tempest-ServerPersonalityTestJSON-167288719] Lock "63bde6ef-e9af-4de5-a116-60694ef11c08" acquired by "do_terminate_instance" :: waited 13.152s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:50:55.889 DEBUG oslo_concurrency.lockutils [req-b22b74d9-cac7-4d00-83fc-1c197871f167 tempest-ServerPersonalityTestJSON-1134914142 tempest-ServerPersonalityTestJSON-167288719] Lock "63bde6ef-e9af-4de5-a116-60694ef11c08-events" acquired by "_clear_events" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:50:55.889 DEBUG oslo_concurrency.lockutils [req-b22b74d9-cac7-4d00-83fc-1c197871f167 tempest-ServerPersonalityTestJSON-1134914142 tempest-ServerPersonalityTestJSON-167288719] Lock "63bde6ef-e9af-4de5-a116-60694ef11c08-events" released by "_clear_events" :: held 0.001s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:50:55.891 INFO nova.compute.manager [req-b22b74d9-cac7-4d00-83fc-1c197871f167 tempest-ServerPersonalityTestJSON-1134914142 tempest-ServerPersonalityTestJSON-167288719] [instance: 63bde6ef-e9af-4de5-a116-60694ef11c08] Terminating instance 2015-08-07 17:50:55.893 INFO nova.virt.xenapi.vmops [req-b22b74d9-cac7-4d00-83fc-1c197871f167 tempest-ServerPersonalityTestJSON-1134914142 tempest-ServerPersonalityTestJSON-167288719] [instance: 63bde6ef-e9af-4de5-a116-60694ef11c08] Destroying VM 2015-08-07 17:50:55.901 DEBUG nova.virt.xenapi.vm_utils [req-b22b74d9-cac7-4d00-83fc-1c197871f167 tempest-ServerPersonalityTestJSON-1134914142 tempest-ServerPersonalityTestJSON-167288719] [instance: 63bde6ef-e9af-4de5-a116-60694ef11c08] Shutting down VM (hard) hard_shutdown_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:348 2015-08-07 17:50:57.522 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic 
task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:50:57.523 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 1.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:50:57.985 DEBUG nova.virt.xenapi.vmops [req-b22b74d9-cac7-4d00-83fc-1c197871f167 tempest-ServerPersonalityTestJSON-1134914142 tempest-ServerPersonalityTestJSON-167288719] [instance: 63bde6ef-e9af-4de5-a116-60694ef11c08] Destroying VDIs _destroy_vdis /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1428 2015-08-07 17:50:57.997 DEBUG nova.virt.xenapi.vm_utils [req-b22b74d9-cac7-4d00-83fc-1c197871f167 tempest-ServerPersonalityTestJSON-1134914142 tempest-ServerPersonalityTestJSON-167288719] VDI 684ad7e5-7949-4a86-b532-57f5cbb7f698 is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:50:58.005 DEBUG nova.virt.xenapi.vm_utils [req-b22b74d9-cac7-4d00-83fc-1c197871f167 tempest-ServerPersonalityTestJSON-1134914142 tempest-ServerPersonalityTestJSON-167288719] VDI ef108dd8-68ab-4988-a5b0-8a6b4546cd0c is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:50:58.403 INFO nova.virt.xenapi.volumeops [req-79a368e9-cf81-42e4-8e99-3d26f9c3b421 tempest-AttachVolumeTestJSON-1615353033 tempest-AttachVolumeTestJSON-1297584599] Mountpoint /dev/xvdb detached from instance instance-0000004c 2015-08-07 17:50:58.406 DEBUG keystoneclient.session [req-79a368e9-cf81-42e4-8e99-3d26f9c3b421 tempest-AttachVolumeTestJSON-1615353033 tempest-AttachVolumeTestJSON-1297584599] REQ: curl -g -i -X POST http://192.168.33.1:8776/v2/a44dc889c66e4839966ea0d1def34b3c/volumes/01c32d1e-d0d1-4293-93c6-f09dca8c8f27/action -H "User-Agent: python-cinderclient" -H "Content-Type: application/json" -H "Accept: application/json" -H "X-Auth-Token: {SHA1}8aeedc24330524e08e7235ade1736ea926abc04a" -d '{"os-terminate_connection": {"connector": {"ip": "192.168.33.2", "initiator": "iqn.2015-08.com.example:3b035c79", "host": "localhost.localdomain"}}}' _http_log_request /usr/local/lib/python2.7/dist-packages/keystoneclient/session.py:195 2015-08-07 17:50:58.522 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:50:58.523 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 2.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:50:58.564 DEBUG nova.virt.xenapi.vmops [req-b22b74d9-cac7-4d00-83fc-1c197871f167 tempest-ServerPersonalityTestJSON-1134914142 tempest-ServerPersonalityTestJSON-167288719] [instance: 63bde6ef-e9af-4de5-a116-60694ef11c08] Using RAW or VHD, skipping kernel and ramdisk deletion _destroy_kernel_ramdisk /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1456 2015-08-07 17:50:58.576 DEBUG nova.virt.xenapi.vm_utils [req-b22b74d9-cac7-4d00-83fc-1c197871f167 tempest-ServerPersonalityTestJSON-1134914142 tempest-ServerPersonalityTestJSON-167288719] [instance: 63bde6ef-e9af-4de5-a116-60694ef11c08] VM destroyed destroy_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:324 2015-08-07 17:50:58.579 DEBUG nova.compute.manager 
[req-b22b74d9-cac7-4d00-83fc-1c197871f167 tempest-ServerPersonalityTestJSON-1134914142 tempest-ServerPersonalityTestJSON-167288719] [instance: 63bde6ef-e9af-4de5-a116-60694ef11c08] Deallocating network for instance _deallocate_network /opt/stack/new/nova/nova/compute/manager.py:1792 2015-08-07 17:50:58.621 DEBUG keystoneclient.session [req-79a368e9-cf81-42e4-8e99-3d26f9c3b421 tempest-AttachVolumeTestJSON-1615353033 tempest-AttachVolumeTestJSON-1297584599] RESP: [202] date: Fri, 07 Aug 2015 17:50:58 GMT connection: keep-alive content-type: text/html; charset=UTF-8 content-length: 0 x-openstack-request-id: req-6feb6e9b-813c-47a4-8cd0-e3dedcbd98c9 _http_log_response /usr/local/lib/python2.7/dist-packages/keystoneclient/session.py:224 2015-08-07 17:50:58.667 DEBUG keystoneclient.session [req-79a368e9-cf81-42e4-8e99-3d26f9c3b421 tempest-AttachVolumeTestJSON-1615353033 tempest-AttachVolumeTestJSON-1297584599] REQ: curl -g -i -X POST http://192.168.33.1:8776/v2/a44dc889c66e4839966ea0d1def34b3c/volumes/01c32d1e-d0d1-4293-93c6-f09dca8c8f27/action -H "User-Agent: python-cinderclient" -H "Content-Type: application/json" -H "Accept: application/json" -H "X-Auth-Token: {SHA1}8aeedc24330524e08e7235ade1736ea926abc04a" -d '{"os-detach": {"attachment_id": null}}' _http_log_request /usr/local/lib/python2.7/dist-packages/keystoneclient/session.py:195 2015-08-07 17:50:59.260 DEBUG nova.compute.manager [req-b22b74d9-cac7-4d00-83fc-1c197871f167 tempest-ServerPersonalityTestJSON-1134914142 tempest-ServerPersonalityTestJSON-167288719] [instance: 63bde6ef-e9af-4de5-a116-60694ef11c08] terminating bdm BlockDeviceMapping(boot_index=0,connection_info=None,created_at=2015-08-07T17:50:39Z,delete_on_termination=True,deleted=False,deleted_at=None,destination_type='local',device_name='/dev/xvda',device_type='disk',disk_bus=None,guest_format=None,id=81,image_id='a69d8d55-1745-492f-be26-cc75d64fc94d',instance=,instance_uuid=63bde6ef-e9af-4de5-a116-60694ef11c08,no_device=False,snapshot_id=None,source_type='image',updated_at=2015-08-07T17:50:40Z,volume_id=None,volume_size=None) _cleanup_volumes /opt/stack/new/nova/nova/compute/manager.py:2272 2015-08-07 17:50:59.410 DEBUG oslo_concurrency.lockutils [req-b22b74d9-cac7-4d00-83fc-1c197871f167 tempest-ServerPersonalityTestJSON-1134914142 tempest-ServerPersonalityTestJSON-167288719] Lock "compute_resources" acquired by "update_usage" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:50:59.411 DEBUG nova.objects.instance [req-b22b74d9-cac7-4d00-83fc-1c197871f167 tempest-ServerPersonalityTestJSON-1134914142 tempest-ServerPersonalityTestJSON-167288719] Lazy-loading `numa_topology' on Instance uuid 63bde6ef-e9af-4de5-a116-60694ef11c08 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:50:59.484 DEBUG oslo_concurrency.lockutils [req-b22b74d9-cac7-4d00-83fc-1c197871f167 tempest-ServerPersonalityTestJSON-1134914142 tempest-ServerPersonalityTestJSON-167288719] Lock "compute_resources" released by "update_usage" :: held 0.074s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:50:59.706 DEBUG oslo_concurrency.lockutils [req-b22b74d9-cac7-4d00-83fc-1c197871f167 tempest-ServerPersonalityTestJSON-1134914142 tempest-ServerPersonalityTestJSON-167288719] Lock "63bde6ef-e9af-4de5-a116-60694ef11c08" released by "do_terminate_instance" :: held 3.818s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:51:00.523 DEBUG 
oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:51:00.559 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Auditing locally available compute resources for node localhost.localdomain 2015-08-07 17:51:00.559 DEBUG nova.virt.xenapi.host [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Updating host stats update_status /opt/stack/new/nova/nova/virt/xenapi/host.py:234 2015-08-07 17:51:00.696 DEBUG keystoneclient.session [req-79a368e9-cf81-42e4-8e99-3d26f9c3b421 tempest-AttachVolumeTestJSON-1615353033 tempest-AttachVolumeTestJSON-1297584599] RESP: [202] date: Fri, 07 Aug 2015 17:51:00 GMT connection: keep-alive content-type: text/html; charset=UTF-8 content-length: 0 x-openstack-request-id: req-89c56020-b8d3-4290-82e7-efa9460a8f91 _http_log_response /usr/local/lib/python2.7/dist-packages/keystoneclient/session.py:224 2015-08-07 17:51:00.740 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" acquired by "do_scan" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:51:00.741 DEBUG nova.virt.xenapi.vm_utils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Scanning SR OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99 do_scan /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1839 2015-08-07 17:51:01.128 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" released by "do_scan" :: held 0.389s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:51:01.411 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor: free VCPUs: 1 _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:524 2015-08-07 17:51:01.412 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor: assignable PCI devices: [] _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:532 2015-08-07 17:51:01.412 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor/Node resource view: name=localhost.localdomain free_ram=922MB free_disk=16GB free_vcpus=1 pci_devices=[] _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:546 2015-08-07 17:51:01.413 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "compute_resources" acquired by "_update_available_resource" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:51:01.692 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Total usable vcpus: 8, total allocated vcpus: 7 2015-08-07 17:51:01.693 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Final resource view: name=localhost.localdomain phys_ram=8187MB used_ram=581MB phys_disk=27GB used_disk=0GB total_vcpus=8 used_vcpus=7 pci_stats=None 2015-08-07 17:51:01.897 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Compute_service record updated for devstack:localhost.localdomain 2015-08-07 
17:51:01.898 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "compute_resources" released by "_update_available_resource" :: held 0.485s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:51:01.898 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 9.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:51:03.511 DEBUG oslo_concurrency.lockutils [req-ccb4899c-d39b-429d-a04e-2d68c15a9fd2 tempest-AttachVolumeTestJSON-1615353033 tempest-AttachVolumeTestJSON-1297584599] Lock "2164dc89-6e1d-45ae-a625-28be5a6a1a05" acquired by "do_terminate_instance" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:51:03.512 DEBUG oslo_concurrency.lockutils [req-ccb4899c-d39b-429d-a04e-2d68c15a9fd2 tempest-AttachVolumeTestJSON-1615353033 tempest-AttachVolumeTestJSON-1297584599] Lock "2164dc89-6e1d-45ae-a625-28be5a6a1a05-events" acquired by "_clear_events" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:51:03.512 DEBUG oslo_concurrency.lockutils [req-ccb4899c-d39b-429d-a04e-2d68c15a9fd2 tempest-AttachVolumeTestJSON-1615353033 tempest-AttachVolumeTestJSON-1297584599] Lock "2164dc89-6e1d-45ae-a625-28be5a6a1a05-events" released by "_clear_events" :: held 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:51:03.514 INFO nova.compute.manager [req-ccb4899c-d39b-429d-a04e-2d68c15a9fd2 tempest-AttachVolumeTestJSON-1615353033 tempest-AttachVolumeTestJSON-1297584599] [instance: 2164dc89-6e1d-45ae-a625-28be5a6a1a05] Terminating instance 2015-08-07 17:51:03.515 INFO nova.virt.xenapi.vmops [req-ccb4899c-d39b-429d-a04e-2d68c15a9fd2 tempest-AttachVolumeTestJSON-1615353033 tempest-AttachVolumeTestJSON-1297584599] [instance: 2164dc89-6e1d-45ae-a625-28be5a6a1a05] Destroying VM 2015-08-07 17:51:03.522 DEBUG nova.virt.xenapi.vm_utils [req-ccb4899c-d39b-429d-a04e-2d68c15a9fd2 tempest-AttachVolumeTestJSON-1615353033 tempest-AttachVolumeTestJSON-1297584599] [instance: 2164dc89-6e1d-45ae-a625-28be5a6a1a05] Shutting down VM (hard) hard_shutdown_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:348 2015-08-07 17:51:05.104 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.96 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:51:05.296 DEBUG nova.virt.xenapi.vmops [req-ccb4899c-d39b-429d-a04e-2d68c15a9fd2 tempest-AttachVolumeTestJSON-1615353033 tempest-AttachVolumeTestJSON-1297584599] [instance: 2164dc89-6e1d-45ae-a625-28be5a6a1a05] Destroying VDIs _destroy_vdis /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1428 2015-08-07 17:51:05.305 DEBUG nova.virt.xenapi.vm_utils [req-ccb4899c-d39b-429d-a04e-2d68c15a9fd2 tempest-AttachVolumeTestJSON-1615353033 tempest-AttachVolumeTestJSON-1297584599] VDI 925904bf-cc32-44d6-b260-769046a5afca is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:51:05.314 DEBUG nova.virt.xenapi.vm_utils [req-ccb4899c-d39b-429d-a04e-2d68c15a9fd2 tempest-AttachVolumeTestJSON-1615353033 tempest-AttachVolumeTestJSON-1297584599] VDI b4fa61be-6f02-4ce2-9559-4f87e5ebd6e4 is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:51:05.900 DEBUG nova.virt.xenapi.vmops 
[req-ccb4899c-d39b-429d-a04e-2d68c15a9fd2 tempest-AttachVolumeTestJSON-1615353033 tempest-AttachVolumeTestJSON-1297584599] [instance: 2164dc89-6e1d-45ae-a625-28be5a6a1a05] Using RAW or VHD, skipping kernel and ramdisk deletion _destroy_kernel_ramdisk /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1456 2015-08-07 17:51:05.909 DEBUG nova.virt.xenapi.vm_utils [req-ccb4899c-d39b-429d-a04e-2d68c15a9fd2 tempest-AttachVolumeTestJSON-1615353033 tempest-AttachVolumeTestJSON-1297584599] [instance: 2164dc89-6e1d-45ae-a625-28be5a6a1a05] VM destroyed destroy_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:324 2015-08-07 17:51:05.910 DEBUG nova.compute.manager [req-ccb4899c-d39b-429d-a04e-2d68c15a9fd2 tempest-AttachVolumeTestJSON-1615353033 tempest-AttachVolumeTestJSON-1297584599] [instance: 2164dc89-6e1d-45ae-a625-28be5a6a1a05] Deallocating network for instance _deallocate_network /opt/stack/new/nova/nova/compute/manager.py:1792 2015-08-07 17:51:07.184 DEBUG oslo_concurrency.lockutils [req-8d9df4f5-74fd-4e85-b5ac-123ddad0ed9d tempest-VirtualInterfacesTestJSON-1588532739 tempest-VirtualInterfacesTestJSON-887492551] Lock "9523109c-a66d-45b8-92dd-1083ef01ae2f" acquired by "_locked_do_build_and_run_instance" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:51:07.224 DEBUG nova.compute.manager [req-ccb4899c-d39b-429d-a04e-2d68c15a9fd2 tempest-AttachVolumeTestJSON-1615353033 tempest-AttachVolumeTestJSON-1297584599] [instance: 2164dc89-6e1d-45ae-a625-28be5a6a1a05] terminating bdm BlockDeviceMapping(boot_index=0,connection_info=None,created_at=2015-08-07T17:49:55Z,delete_on_termination=True,deleted=False,deleted_at=None,destination_type='local',device_name='/dev/xvda',device_type='disk',disk_bus=None,guest_format=None,id=78,image_id='a69d8d55-1745-492f-be26-cc75d64fc94d',instance=,instance_uuid=2164dc89-6e1d-45ae-a625-28be5a6a1a05,no_device=False,snapshot_id=None,source_type='image',updated_at=2015-08-07T17:49:56Z,volume_id=None,volume_size=None) _cleanup_volumes /opt/stack/new/nova/nova/compute/manager.py:2272 2015-08-07 17:51:07.234 INFO nova.compute.manager [req-8d9df4f5-74fd-4e85-b5ac-123ddad0ed9d tempest-VirtualInterfacesTestJSON-1588532739 tempest-VirtualInterfacesTestJSON-887492551] [instance: 9523109c-a66d-45b8-92dd-1083ef01ae2f] Starting instance... 
2015-08-07 17:51:07.450 DEBUG oslo_concurrency.lockutils [req-ccb4899c-d39b-429d-a04e-2d68c15a9fd2 tempest-AttachVolumeTestJSON-1615353033 tempest-AttachVolumeTestJSON-1297584599] Lock "compute_resources" acquired by "update_usage" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:51:07.451 DEBUG nova.objects.instance [req-ccb4899c-d39b-429d-a04e-2d68c15a9fd2 tempest-AttachVolumeTestJSON-1615353033 tempest-AttachVolumeTestJSON-1297584599] Lazy-loading `numa_topology' on Instance uuid 2164dc89-6e1d-45ae-a625-28be5a6a1a05 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:51:07.533 DEBUG oslo_concurrency.lockutils [req-ccb4899c-d39b-429d-a04e-2d68c15a9fd2 tempest-AttachVolumeTestJSON-1615353033 tempest-AttachVolumeTestJSON-1297584599] Lock "compute_resources" released by "update_usage" :: held 0.083s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:51:07.540 DEBUG oslo_concurrency.lockutils [req-8d9df4f5-74fd-4e85-b5ac-123ddad0ed9d tempest-VirtualInterfacesTestJSON-1588532739 tempest-VirtualInterfacesTestJSON-887492551] Lock "compute_resources" acquired by "instance_claim" :: waited 0.082s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:51:07.541 DEBUG nova.compute.resource_tracker [req-8d9df4f5-74fd-4e85-b5ac-123ddad0ed9d tempest-VirtualInterfacesTestJSON-1588532739 tempest-VirtualInterfacesTestJSON-887492551] Memory overhead for 64 MB instance; 5 MB instance_claim /opt/stack/new/nova/nova/compute/resource_tracker.py:126 2015-08-07 17:51:07.547 INFO nova.compute.claims [req-8d9df4f5-74fd-4e85-b5ac-123ddad0ed9d tempest-VirtualInterfacesTestJSON-1588532739 tempest-VirtualInterfacesTestJSON-887492551] [instance: 9523109c-a66d-45b8-92dd-1083ef01ae2f] Attempting claim: memory 69 MB, disk 0 GB 2015-08-07 17:51:07.547 INFO nova.compute.claims [req-8d9df4f5-74fd-4e85-b5ac-123ddad0ed9d tempest-VirtualInterfacesTestJSON-1588532739 tempest-VirtualInterfacesTestJSON-887492551] [instance: 9523109c-a66d-45b8-92dd-1083ef01ae2f] Total memory: 8187 MB, used: 512.00 MB 2015-08-07 17:51:07.548 INFO nova.compute.claims [req-8d9df4f5-74fd-4e85-b5ac-123ddad0ed9d tempest-VirtualInterfacesTestJSON-1588532739 tempest-VirtualInterfacesTestJSON-887492551] [instance: 9523109c-a66d-45b8-92dd-1083ef01ae2f] memory limit: 12280.50 MB, free: 11768.50 MB 2015-08-07 17:51:07.548 INFO nova.compute.claims [req-8d9df4f5-74fd-4e85-b5ac-123ddad0ed9d tempest-VirtualInterfacesTestJSON-1588532739 tempest-VirtualInterfacesTestJSON-887492551] [instance: 9523109c-a66d-45b8-92dd-1083ef01ae2f] Total disk: 27 GB, used: 0.00 GB 2015-08-07 17:51:07.549 INFO nova.compute.claims [req-8d9df4f5-74fd-4e85-b5ac-123ddad0ed9d tempest-VirtualInterfacesTestJSON-1588532739 tempest-VirtualInterfacesTestJSON-887492551] [instance: 9523109c-a66d-45b8-92dd-1083ef01ae2f] disk limit not specified, defaulting to unlimited 2015-08-07 17:51:07.579 DEBUG nova.compute.resources.vcpu [req-8d9df4f5-74fd-4e85-b5ac-123ddad0ed9d tempest-VirtualInterfacesTestJSON-1588532739 tempest-VirtualInterfacesTestJSON-887492551] Total CPUs: 8 VCPUs, used: 0.00 VCPUs test /opt/stack/new/nova/nova/compute/resources/vcpu.py:52 2015-08-07 17:51:07.580 DEBUG nova.compute.resources.vcpu [req-8d9df4f5-74fd-4e85-b5ac-123ddad0ed9d tempest-VirtualInterfacesTestJSON-1588532739 tempest-VirtualInterfacesTestJSON-887492551] CPUs limit not specified, defaulting to unlimited test 
/opt/stack/new/nova/nova/compute/resources/vcpu.py:56 2015-08-07 17:51:07.580 INFO nova.compute.claims [req-8d9df4f5-74fd-4e85-b5ac-123ddad0ed9d tempest-VirtualInterfacesTestJSON-1588532739 tempest-VirtualInterfacesTestJSON-887492551] [instance: 9523109c-a66d-45b8-92dd-1083ef01ae2f] Claim successful 2015-08-07 17:51:07.774 DEBUG oslo_concurrency.lockutils [req-ccb4899c-d39b-429d-a04e-2d68c15a9fd2 tempest-AttachVolumeTestJSON-1615353033 tempest-AttachVolumeTestJSON-1297584599] Lock "2164dc89-6e1d-45ae-a625-28be5a6a1a05" released by "do_terminate_instance" :: held 4.264s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:51:07.817 DEBUG oslo_concurrency.lockutils [req-8d9df4f5-74fd-4e85-b5ac-123ddad0ed9d tempest-VirtualInterfacesTestJSON-1588532739 tempest-VirtualInterfacesTestJSON-887492551] Lock "compute_resources" released by "instance_claim" :: held 0.277s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:51:07.986 DEBUG oslo_concurrency.lockutils [req-8d9df4f5-74fd-4e85-b5ac-123ddad0ed9d tempest-VirtualInterfacesTestJSON-1588532739 tempest-VirtualInterfacesTestJSON-887492551] Lock "compute_resources" acquired by "update_usage" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:51:08.057 DEBUG oslo_concurrency.lockutils [req-8d9df4f5-74fd-4e85-b5ac-123ddad0ed9d tempest-VirtualInterfacesTestJSON-1588532739 tempest-VirtualInterfacesTestJSON-887492551] Lock "compute_resources" released by "update_usage" :: held 0.071s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:51:08.058 DEBUG nova.compute.utils [req-8d9df4f5-74fd-4e85-b5ac-123ddad0ed9d tempest-VirtualInterfacesTestJSON-1588532739 tempest-VirtualInterfacesTestJSON-887492551] Using /dev/xvd instead of None get_next_device_name /opt/stack/new/nova/nova/compute/utils.py:163 2015-08-07 17:51:08.063 13318 DEBUG nova.compute.manager [-] [instance: 9523109c-a66d-45b8-92dd-1083ef01ae2f] Allocating IP information in the background. 
_allocate_network_async /opt/stack/new/nova/nova/compute/manager.py:1513 2015-08-07 17:51:08.064 13318 DEBUG oslo_concurrency.lockutils [-] Acquired semaphore "refresh_cache-9523109c-a66d-45b8-92dd-1083ef01ae2f" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:197 2015-08-07 17:51:08.460 DEBUG nova.virt.xenapi.vmops [req-8d9df4f5-74fd-4e85-b5ac-123ddad0ed9d tempest-VirtualInterfacesTestJSON-1588532739 tempest-VirtualInterfacesTestJSON-887492551] [instance: 9523109c-a66d-45b8-92dd-1083ef01ae2f] Block device information present: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} spawn /opt/stack/new/nova/nova/virt/xenapi/vmops.py:408 2015-08-07 17:51:08.471 DEBUG nova.virt.xenapi.vm_utils [req-8d9df4f5-74fd-4e85-b5ac-123ddad0ed9d tempest-VirtualInterfacesTestJSON-1588532739 tempest-VirtualInterfacesTestJSON-887492551] Detected vhd format for image a69d8d55-1745-492f-be26-cc75d64fc94d determine_disk_image_type /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1633 2015-08-07 17:51:08.472 DEBUG nova.virt.xenapi.vmops [req-8d9df4f5-74fd-4e85-b5ac-123ddad0ed9d tempest-VirtualInterfacesTestJSON-1588532739 tempest-VirtualInterfacesTestJSON-887492551] [instance: 9523109c-a66d-45b8-92dd-1083ef01ae2f] Updating progress to 10 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:51:08.690 DEBUG nova.virt.xenapi.vmops [req-8d9df4f5-74fd-4e85-b5ac-123ddad0ed9d tempest-VirtualInterfacesTestJSON-1588532739 tempest-VirtualInterfacesTestJSON-887492551] [instance: 9523109c-a66d-45b8-92dd-1083ef01ae2f] block device info: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} _connect_cinder_volumes /opt/stack/new/nova/nova/virt/xenapi/vmops.py:376 2015-08-07 17:51:08.702 DEBUG oslo_concurrency.lockutils [req-8d9df4f5-74fd-4e85-b5ac-123ddad0ed9d tempest-VirtualInterfacesTestJSON-1588532739 tempest-VirtualInterfacesTestJSON-887492551] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" acquired by "_create_cached_image_impl" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:51:09.542 DEBUG nova.virt.xenapi.vm_utils [req-8d9df4f5-74fd-4e85-b5ac-123ddad0ed9d tempest-VirtualInterfacesTestJSON-1588532739 tempest-VirtualInterfacesTestJSON-887492551] Cloned VDI OpaqueRef:9f64aa1f-d8a9-6063-1d86-71127ad3021c from VDI OpaqueRef:f392c64e-7886-a0cc-d3a0-780674167307 _clone_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:571 2015-08-07 17:51:10.020 13318 DEBUG nova.network.base_api [-] [instance: 9523109c-a66d-45b8-92dd-1083ef01ae2f] Updating instance_info_cache with network_info: [VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.2'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': 
u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:80:52:e5', 'active': False, 'type': u'bridge', 'id': u'04c08598-8b62-478b-a14d-28becba5faf8', 'qbg_params': None})] update_instance_cache_with_nw_info /opt/stack/new/nova/nova/network/base_api.py:43 2015-08-07 17:51:10.046 13318 DEBUG oslo_concurrency.lockutils [-] Releasing semaphore "refresh_cache-9523109c-a66d-45b8-92dd-1083ef01ae2f" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:210 2015-08-07 17:51:10.046 13318 DEBUG nova.compute.manager [-] [instance: 9523109c-a66d-45b8-92dd-1083ef01ae2f] Instance network_info: |[VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.2'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:80:52:e5', 'active': False, 'type': u'bridge', 'id': u'04c08598-8b62-478b-a14d-28becba5faf8', 'qbg_params': None})]| _allocate_network_async /opt/stack/new/nova/nova/compute/manager.py:1531 2015-08-07 17:51:10.097 DEBUG oslo_concurrency.lockutils [req-8d9df4f5-74fd-4e85-b5ac-123ddad0ed9d tempest-VirtualInterfacesTestJSON-1588532739 tempest-VirtualInterfacesTestJSON-887492551] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" released by "_create_cached_image_impl" :: held 1.395s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:51:10.097 INFO nova.virt.xenapi.vm_utils [req-8d9df4f5-74fd-4e85-b5ac-123ddad0ed9d tempest-VirtualInterfacesTestJSON-1588532739 tempest-VirtualInterfacesTestJSON-887492551] Image creation data, cacheable: True, downloaded: False duration: 1.41 secs for image a69d8d55-1745-492f-be26-cc75d64fc94d 2015-08-07 17:51:10.626 DEBUG nova.virt.xenapi.vmops [req-8d9df4f5-74fd-4e85-b5ac-123ddad0ed9d tempest-VirtualInterfacesTestJSON-1588532739 tempest-VirtualInterfacesTestJSON-887492551] [instance: 9523109c-a66d-45b8-92dd-1083ef01ae2f] Updating progress to 20 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:51:10.826 DEBUG nova.virt.xenapi.vmops [req-8d9df4f5-74fd-4e85-b5ac-123ddad0ed9d tempest-VirtualInterfacesTestJSON-1588532739 tempest-VirtualInterfacesTestJSON-887492551] [instance: 9523109c-a66d-45b8-92dd-1083ef01ae2f] Updating progress to 30 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:51:10.897 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 
17:51:10.898 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 16.62 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:51:11.000 DEBUG nova.virt.xenapi.vmops [req-8d9df4f5-74fd-4e85-b5ac-123ddad0ed9d tempest-VirtualInterfacesTestJSON-1588532739 tempest-VirtualInterfacesTestJSON-887492551] [instance: 9523109c-a66d-45b8-92dd-1083ef01ae2f] Using PV kernel: True _create_vm_record /opt/stack/new/nova/nova/virt/xenapi/vmops.py:679 2015-08-07 17:51:11.011 DEBUG nova.virt.xenapi.vm_utils [req-8d9df4f5-74fd-4e85-b5ac-123ddad0ed9d tempest-VirtualInterfacesTestJSON-1588532739 tempest-VirtualInterfacesTestJSON-887492551] [instance: 9523109c-a66d-45b8-92dd-1083ef01ae2f] Created VM create_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:312 2015-08-07 17:51:11.011 DEBUG nova.virt.xenapi.vmops [req-8d9df4f5-74fd-4e85-b5ac-123ddad0ed9d tempest-VirtualInterfacesTestJSON-1588532739 tempest-VirtualInterfacesTestJSON-887492551] [instance: 9523109c-a66d-45b8-92dd-1083ef01ae2f] Updating progress to 40 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:51:11.177 DEBUG nova.virt.xenapi.vm_utils [req-8d9df4f5-74fd-4e85-b5ac-123ddad0ed9d tempest-VirtualInterfacesTestJSON-1588532739 tempest-VirtualInterfacesTestJSON-887492551] Creating disk-type VBD for VM OpaqueRef:c3599e51-cbf5-072a-64e6-6668e61b8424, VDI OpaqueRef:9f64aa1f-d8a9-6063-1d86-71127ad3021c ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:51:11.184 DEBUG nova.virt.xenapi.vm_utils [req-8d9df4f5-74fd-4e85-b5ac-123ddad0ed9d tempest-VirtualInterfacesTestJSON-1588532739 tempest-VirtualInterfacesTestJSON-887492551] Created VBD OpaqueRef:b2eb29d2-4e18-2477-3ef7-616c79eeaa0c for VM OpaqueRef:c3599e51-cbf5-072a-64e6-6668e61b8424, VDI OpaqueRef:9f64aa1f-d8a9-6063-1d86-71127ad3021c. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:51:11.450 DEBUG nova.virt.xenapi.vm_utils [req-8d9df4f5-74fd-4e85-b5ac-123ddad0ed9d tempest-VirtualInterfacesTestJSON-1588532739 tempest-VirtualInterfacesTestJSON-887492551] Created VDI OpaqueRef:4b999796-d5c0-08de-0ecc-72267e5b2219 (config-2, 67108864, False) on OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99. create_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:509 2015-08-07 17:51:11.453 DEBUG nova.virt.xenapi.vm_utils [req-8d9df4f5-74fd-4e85-b5ac-123ddad0ed9d tempest-VirtualInterfacesTestJSON-1588532739 tempest-VirtualInterfacesTestJSON-887492551] Creating disk-type VBD for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:4b999796-d5c0-08de-0ecc-72267e5b2219 ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:51:11.463 DEBUG nova.virt.xenapi.vm_utils [req-8d9df4f5-74fd-4e85-b5ac-123ddad0ed9d tempest-VirtualInterfacesTestJSON-1588532739 tempest-VirtualInterfacesTestJSON-887492551] Created VBD OpaqueRef:b8b5bcd4-8b9f-c495-fdc9-e438e6784244 for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:4b999796-d5c0-08de-0ecc-72267e5b2219. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:51:11.463 DEBUG nova.virt.xenapi.vm_utils [req-8d9df4f5-74fd-4e85-b5ac-123ddad0ed9d tempest-VirtualInterfacesTestJSON-1588532739 tempest-VirtualInterfacesTestJSON-887492551] Plugging VBD OpaqueRef:b8b5bcd4-8b9f-c495-fdc9-e438e6784244 ... 
vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2191 2015-08-07 17:51:11.464 DEBUG oslo_concurrency.lockutils [req-8d9df4f5-74fd-4e85-b5ac-123ddad0ed9d tempest-VirtualInterfacesTestJSON-1588532739 tempest-VirtualInterfacesTestJSON-887492551] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_plug" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:51:12.523 DEBUG oslo_concurrency.lockutils [req-8d9df4f5-74fd-4e85-b5ac-123ddad0ed9d tempest-VirtualInterfacesTestJSON-1588532739 tempest-VirtualInterfacesTestJSON-887492551] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_plug" :: held 1.059s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:51:12.523 DEBUG nova.virt.xenapi.vm_utils [req-8d9df4f5-74fd-4e85-b5ac-123ddad0ed9d tempest-VirtualInterfacesTestJSON-1588532739 tempest-VirtualInterfacesTestJSON-887492551] Plugging VBD OpaqueRef:b8b5bcd4-8b9f-c495-fdc9-e438e6784244 done. vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2194 2015-08-07 17:51:12.527 DEBUG nova.virt.xenapi.vm_utils [req-8d9df4f5-74fd-4e85-b5ac-123ddad0ed9d tempest-VirtualInterfacesTestJSON-1588532739 tempest-VirtualInterfacesTestJSON-887492551] VBD OpaqueRef:b8b5bcd4-8b9f-c495-fdc9-e438e6784244 plugged as xvdc vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2197 2015-08-07 17:51:12.602 WARNING nova.virt.configdrive [req-8d9df4f5-74fd-4e85-b5ac-123ddad0ed9d tempest-VirtualInterfacesTestJSON-1588532739 tempest-VirtualInterfacesTestJSON-887492551] The setting "always" will be deprecated in the Liberty version. Please use "True" instead 2015-08-07 17:51:12.603 DEBUG nova.objects.instance [req-8d9df4f5-74fd-4e85-b5ac-123ddad0ed9d tempest-VirtualInterfacesTestJSON-1588532739 tempest-VirtualInterfacesTestJSON-887492551] Lazy-loading `ec2_ids' on Instance uuid 9523109c-a66d-45b8-92dd-1083ef01ae2f obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:51:12.636 DEBUG oslo_concurrency.processutils [req-8d9df4f5-74fd-4e85-b5ac-123ddad0ed9d tempest-VirtualInterfacesTestJSON-1588532739 tempest-VirtualInterfacesTestJSON-887492551] Running cmd (subprocess): genisoimage -o /tmp/tmpH4mNpU/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmpqyCxYO execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:51:12.727 DEBUG oslo_concurrency.processutils [req-8d9df4f5-74fd-4e85-b5ac-123ddad0ed9d tempest-VirtualInterfacesTestJSON-1588532739 tempest-VirtualInterfacesTestJSON-887492551] CMD "genisoimage -o /tmp/tmpH4mNpU/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmpqyCxYO" returned: 0 in 0.092s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:51:12.733 DEBUG oslo_concurrency.processutils [req-8d9df4f5-74fd-4e85-b5ac-123ddad0ed9d tempest-VirtualInterfacesTestJSON-1588532739 tempest-VirtualInterfacesTestJSON-887492551] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmpH4mNpU/configdrive of=/dev/xvdc oflag=direct,sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:51:15.148 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.92 seconds _run_loop 
/usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:51:16.183 DEBUG oslo_concurrency.processutils [req-8d9df4f5-74fd-4e85-b5ac-123ddad0ed9d tempest-VirtualInterfacesTestJSON-1588532739 tempest-VirtualInterfacesTestJSON-887492551] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmpH4mNpU/configdrive of=/dev/xvdc oflag=direct,sync" returned: 0 in 3.450s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:51:16.185 DEBUG oslo_concurrency.processutils [req-8d9df4f5-74fd-4e85-b5ac-123ddad0ed9d tempest-VirtualInterfacesTestJSON-1588532739 tempest-VirtualInterfacesTestJSON-887492551] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:51:16.545 DEBUG oslo_concurrency.processutils [req-8d9df4f5-74fd-4e85-b5ac-123ddad0ed9d tempest-VirtualInterfacesTestJSON-1588532739 tempest-VirtualInterfacesTestJSON-887492551] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf sync" returned: 0 in 0.359s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:51:16.548 DEBUG nova.virt.xenapi.vm_utils [req-8d9df4f5-74fd-4e85-b5ac-123ddad0ed9d tempest-VirtualInterfacesTestJSON-1588532739 tempest-VirtualInterfacesTestJSON-887492551] Destroying VBD for VDI OpaqueRef:4b999796-d5c0-08de-0ecc-72267e5b2219 ... vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2207 2015-08-07 17:51:16.549 DEBUG oslo_concurrency.lockutils [req-8d9df4f5-74fd-4e85-b5ac-123ddad0ed9d tempest-VirtualInterfacesTestJSON-1588532739 tempest-VirtualInterfacesTestJSON-887492551] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_unplug" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:51:17.183 DEBUG oslo_concurrency.lockutils [req-8d9df4f5-74fd-4e85-b5ac-123ddad0ed9d tempest-VirtualInterfacesTestJSON-1588532739 tempest-VirtualInterfacesTestJSON-887492551] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_unplug" :: held 0.634s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:51:17.190 DEBUG nova.virt.xenapi.vm_utils [req-8d9df4f5-74fd-4e85-b5ac-123ddad0ed9d tempest-VirtualInterfacesTestJSON-1588532739 tempest-VirtualInterfacesTestJSON-887492551] Destroying VBD for VDI OpaqueRef:4b999796-d5c0-08de-0ecc-72267e5b2219 done. vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2215 2015-08-07 17:51:17.191 DEBUG nova.virt.xenapi.vm_utils [req-8d9df4f5-74fd-4e85-b5ac-123ddad0ed9d tempest-VirtualInterfacesTestJSON-1588532739 tempest-VirtualInterfacesTestJSON-887492551] Creating disk-type VBD for VM OpaqueRef:c3599e51-cbf5-072a-64e6-6668e61b8424, VDI OpaqueRef:4b999796-d5c0-08de-0ecc-72267e5b2219 ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:51:17.198 DEBUG nova.virt.xenapi.vm_utils [req-8d9df4f5-74fd-4e85-b5ac-123ddad0ed9d tempest-VirtualInterfacesTestJSON-1588532739 tempest-VirtualInterfacesTestJSON-887492551] Created VBD OpaqueRef:ab3248c8-4036-ed2b-ab6c-68f37443fa2d for VM OpaqueRef:c3599e51-cbf5-072a-64e6-6668e61b8424, VDI OpaqueRef:4b999796-d5c0-08de-0ecc-72267e5b2219. 
create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:51:17.199 DEBUG nova.objects.instance [req-8d9df4f5-74fd-4e85-b5ac-123ddad0ed9d tempest-VirtualInterfacesTestJSON-1588532739 tempest-VirtualInterfacesTestJSON-887492551] Lazy-loading `pci_devices' on Instance uuid 9523109c-a66d-45b8-92dd-1083ef01ae2f obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:51:17.297 DEBUG nova.virt.xenapi.vmops [req-8d9df4f5-74fd-4e85-b5ac-123ddad0ed9d tempest-VirtualInterfacesTestJSON-1588532739 tempest-VirtualInterfacesTestJSON-887492551] [instance: 9523109c-a66d-45b8-92dd-1083ef01ae2f] Updating progress to 50 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:51:17.519 DEBUG oslo_concurrency.lockutils [req-8d9df4f5-74fd-4e85-b5ac-123ddad0ed9d tempest-VirtualInterfacesTestJSON-1588532739 tempest-VirtualInterfacesTestJSON-887492551] Lock "xenstore-9523109c-a66d-45b8-92dd-1083ef01ae2f" acquired by "store_meta" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:51:17.520 DEBUG oslo_concurrency.lockutils [req-8d9df4f5-74fd-4e85-b5ac-123ddad0ed9d tempest-VirtualInterfacesTestJSON-1588532739 tempest-VirtualInterfacesTestJSON-887492551] Lock "xenstore-9523109c-a66d-45b8-92dd-1083ef01ae2f" released by "store_meta" :: held 0.001s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:51:17.521 DEBUG oslo_concurrency.lockutils [req-8d9df4f5-74fd-4e85-b5ac-123ddad0ed9d tempest-VirtualInterfacesTestJSON-1588532739 tempest-VirtualInterfacesTestJSON-887492551] Lock "xenstore-9523109c-a66d-45b8-92dd-1083ef01ae2f" acquired by "store_auto_disk_config" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:51:17.529 DEBUG oslo_concurrency.lockutils [req-8d9df4f5-74fd-4e85-b5ac-123ddad0ed9d tempest-VirtualInterfacesTestJSON-1588532739 tempest-VirtualInterfacesTestJSON-887492551] Lock "xenstore-9523109c-a66d-45b8-92dd-1083ef01ae2f" released by "store_auto_disk_config" :: held 0.008s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:51:17.530 DEBUG nova.virt.xenapi.vmops [req-8d9df4f5-74fd-4e85-b5ac-123ddad0ed9d tempest-VirtualInterfacesTestJSON-1588532739 tempest-VirtualInterfacesTestJSON-887492551] [instance: 9523109c-a66d-45b8-92dd-1083ef01ae2f] Injecting hostname (tempest.common.compute-instance-1645973810) into xenstore _inject_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1940 2015-08-07 17:51:17.530 DEBUG oslo_concurrency.lockutils [req-8d9df4f5-74fd-4e85-b5ac-123ddad0ed9d tempest-VirtualInterfacesTestJSON-1588532739 tempest-VirtualInterfacesTestJSON-887492551] Lock "xenstore-9523109c-a66d-45b8-92dd-1083ef01ae2f" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:51:17.538 DEBUG oslo_concurrency.lockutils [req-8d9df4f5-74fd-4e85-b5ac-123ddad0ed9d tempest-VirtualInterfacesTestJSON-1588532739 tempest-VirtualInterfacesTestJSON-887492551] Lock "xenstore-9523109c-a66d-45b8-92dd-1083ef01ae2f" released by "update_hostname" :: held 0.007s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:51:17.540 DEBUG nova.virt.xenapi.vmops [req-8d9df4f5-74fd-4e85-b5ac-123ddad0ed9d tempest-VirtualInterfacesTestJSON-1588532739 tempest-VirtualInterfacesTestJSON-887492551] [instance: 9523109c-a66d-45b8-92dd-1083ef01ae2f] 
Injecting network info to xenstore inject_network_info /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1869 2015-08-07 17:51:17.540 DEBUG oslo_concurrency.lockutils [req-8d9df4f5-74fd-4e85-b5ac-123ddad0ed9d tempest-VirtualInterfacesTestJSON-1588532739 tempest-VirtualInterfacesTestJSON-887492551] Lock "xenstore-9523109c-a66d-45b8-92dd-1083ef01ae2f" acquired by "update_nwinfo" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:51:17.712 DEBUG oslo_concurrency.lockutils [req-8d9df4f5-74fd-4e85-b5ac-123ddad0ed9d tempest-VirtualInterfacesTestJSON-1588532739 tempest-VirtualInterfacesTestJSON-887492551] Lock "xenstore-9523109c-a66d-45b8-92dd-1083ef01ae2f" released by "update_nwinfo" :: held 0.171s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:51:17.712 DEBUG nova.virt.xenapi.vmops [req-8d9df4f5-74fd-4e85-b5ac-123ddad0ed9d tempest-VirtualInterfacesTestJSON-1588532739 tempest-VirtualInterfacesTestJSON-887492551] [instance: 9523109c-a66d-45b8-92dd-1083ef01ae2f] Updating progress to 60 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:51:17.929 DEBUG nova.virt.xenapi.vmops [req-8d9df4f5-74fd-4e85-b5ac-123ddad0ed9d tempest-VirtualInterfacesTestJSON-1588532739 tempest-VirtualInterfacesTestJSON-887492551] [instance: 9523109c-a66d-45b8-92dd-1083ef01ae2f] Creating vifs _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1892 2015-08-07 17:51:17.935 DEBUG nova.virt.xenapi.vmops [req-8d9df4f5-74fd-4e85-b5ac-123ddad0ed9d tempest-VirtualInterfacesTestJSON-1588532739 tempest-VirtualInterfacesTestJSON-887492551] [instance: 9523109c-a66d-45b8-92dd-1083ef01ae2f] Creating VIF for network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1902 2015-08-07 17:51:17.942 DEBUG nova.virt.xenapi.vmops [req-8d9df4f5-74fd-4e85-b5ac-123ddad0ed9d tempest-VirtualInterfacesTestJSON-1588532739 tempest-VirtualInterfacesTestJSON-887492551] [instance: 9523109c-a66d-45b8-92dd-1083ef01ae2f] Created VIF OpaqueRef:b887b367-dc2d-9610-1345-e8efaecc0fe2, network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1906 2015-08-07 17:51:17.943 DEBUG nova.virt.xenapi.vmops [req-8d9df4f5-74fd-4e85-b5ac-123ddad0ed9d tempest-VirtualInterfacesTestJSON-1588532739 tempest-VirtualInterfacesTestJSON-887492551] [instance: 9523109c-a66d-45b8-92dd-1083ef01ae2f] Updating progress to 70 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:51:18.130 DEBUG nova.virt.xenapi.vmops [req-8d9df4f5-74fd-4e85-b5ac-123ddad0ed9d tempest-VirtualInterfacesTestJSON-1588532739 tempest-VirtualInterfacesTestJSON-887492551] [instance: 9523109c-a66d-45b8-92dd-1083ef01ae2f] Starting instance _start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:327 2015-08-07 17:51:22.702 DEBUG nova.virt.xenapi.vmops [req-8d9df4f5-74fd-4e85-b5ac-123ddad0ed9d tempest-VirtualInterfacesTestJSON-1588532739 tempest-VirtualInterfacesTestJSON-887492551] [instance: 9523109c-a66d-45b8-92dd-1083ef01ae2f] Waiting for instance state to become running _wait_for_instance_to_start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:777 2015-08-07 17:51:22.715 DEBUG nova.virt.xenapi.vmops [req-8d9df4f5-74fd-4e85-b5ac-123ddad0ed9d tempest-VirtualInterfacesTestJSON-1588532739 tempest-VirtualInterfacesTestJSON-887492551] [instance: 9523109c-a66d-45b8-92dd-1083ef01ae2f] Updating progress to 80 _update_instance_progress 
/opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:51:23.115 DEBUG nova.virt.xenapi.vmops [req-8d9df4f5-74fd-4e85-b5ac-123ddad0ed9d tempest-VirtualInterfacesTestJSON-1588532739 tempest-VirtualInterfacesTestJSON-887492551] [instance: 9523109c-a66d-45b8-92dd-1083ef01ae2f] Skip agent setup, not enabled. _configure_new_instance_with_agent /opt/stack/new/nova/nova/virt/xenapi/vmops.py:788 2015-08-07 17:51:23.115 DEBUG nova.virt.xenapi.vmops [req-8d9df4f5-74fd-4e85-b5ac-123ddad0ed9d tempest-VirtualInterfacesTestJSON-1588532739 tempest-VirtualInterfacesTestJSON-887492551] [instance: 9523109c-a66d-45b8-92dd-1083ef01ae2f] Removing hostname from xenstore _remove_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1949 2015-08-07 17:51:23.116 DEBUG oslo_concurrency.lockutils [req-8d9df4f5-74fd-4e85-b5ac-123ddad0ed9d tempest-VirtualInterfacesTestJSON-1588532739 tempest-VirtualInterfacesTestJSON-887492551] Lock "xenstore-9523109c-a66d-45b8-92dd-1083ef01ae2f" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:51:23.120 DEBUG oslo_concurrency.lockutils [req-8d9df4f5-74fd-4e85-b5ac-123ddad0ed9d tempest-VirtualInterfacesTestJSON-1588532739 tempest-VirtualInterfacesTestJSON-887492551] Lock "xenstore-9523109c-a66d-45b8-92dd-1083ef01ae2f" released by "update_hostname" :: held 0.004s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:51:23.121 DEBUG nova.virt.xenapi.vmops [req-8d9df4f5-74fd-4e85-b5ac-123ddad0ed9d tempest-VirtualInterfacesTestJSON-1588532739 tempest-VirtualInterfacesTestJSON-887492551] [instance: 9523109c-a66d-45b8-92dd-1083ef01ae2f] Updating progress to 90 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:51:23.294 DEBUG nova.virt.xenapi.vmops [req-8d9df4f5-74fd-4e85-b5ac-123ddad0ed9d tempest-VirtualInterfacesTestJSON-1588532739 tempest-VirtualInterfacesTestJSON-887492551] [instance: 9523109c-a66d-45b8-92dd-1083ef01ae2f] Updating progress to 100 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:51:23.709 DEBUG nova.compute.manager [req-8d9df4f5-74fd-4e85-b5ac-123ddad0ed9d tempest-VirtualInterfacesTestJSON-1588532739 tempest-VirtualInterfacesTestJSON-887492551] [instance: 9523109c-a66d-45b8-92dd-1083ef01ae2f] Checking state _get_power_state /opt/stack/new/nova/nova/compute/manager.py:1296 2015-08-07 17:51:24.002 DEBUG oslo_concurrency.lockutils [req-8d9df4f5-74fd-4e85-b5ac-123ddad0ed9d tempest-VirtualInterfacesTestJSON-1588532739 tempest-VirtualInterfacesTestJSON-887492551] Lock "9523109c-a66d-45b8-92dd-1083ef01ae2f" released by "_locked_do_build_and_run_instance" :: held 16.817s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:51:24.912 DEBUG oslo_concurrency.lockutils [req-a0898f2d-5161-4242-968c-062eb2aaca89 tempest-VirtualInterfacesTestJSON-1588532739 tempest-VirtualInterfacesTestJSON-887492551] Lock "9523109c-a66d-45b8-92dd-1083ef01ae2f" acquired by "do_terminate_instance" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:51:24.913 DEBUG oslo_concurrency.lockutils [req-a0898f2d-5161-4242-968c-062eb2aaca89 tempest-VirtualInterfacesTestJSON-1588532739 tempest-VirtualInterfacesTestJSON-887492551] Lock "9523109c-a66d-45b8-92dd-1083ef01ae2f-events" acquired by "_clear_events" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 
2015-08-07 17:51:24.913 DEBUG oslo_concurrency.lockutils [req-a0898f2d-5161-4242-968c-062eb2aaca89 tempest-VirtualInterfacesTestJSON-1588532739 tempest-VirtualInterfacesTestJSON-887492551] Lock "9523109c-a66d-45b8-92dd-1083ef01ae2f-events" released by "_clear_events" :: held 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:51:24.915 INFO nova.compute.manager [req-a0898f2d-5161-4242-968c-062eb2aaca89 tempest-VirtualInterfacesTestJSON-1588532739 tempest-VirtualInterfacesTestJSON-887492551] [instance: 9523109c-a66d-45b8-92dd-1083ef01ae2f] Terminating instance 2015-08-07 17:51:24.916 INFO nova.virt.xenapi.vmops [req-a0898f2d-5161-4242-968c-062eb2aaca89 tempest-VirtualInterfacesTestJSON-1588532739 tempest-VirtualInterfacesTestJSON-887492551] [instance: 9523109c-a66d-45b8-92dd-1083ef01ae2f] Destroying VM 2015-08-07 17:51:24.923 DEBUG nova.virt.xenapi.vm_utils [req-a0898f2d-5161-4242-968c-062eb2aaca89 tempest-VirtualInterfacesTestJSON-1588532739 tempest-VirtualInterfacesTestJSON-887492551] [instance: 9523109c-a66d-45b8-92dd-1083ef01ae2f] Shutting down VM (hard) hard_shutdown_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:348 2015-08-07 17:51:25.111 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.96 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:51:26.728 DEBUG nova.virt.xenapi.vmops [req-a0898f2d-5161-4242-968c-062eb2aaca89 tempest-VirtualInterfacesTestJSON-1588532739 tempest-VirtualInterfacesTestJSON-887492551] [instance: 9523109c-a66d-45b8-92dd-1083ef01ae2f] Destroying VDIs _destroy_vdis /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1428 2015-08-07 17:51:26.745 DEBUG nova.virt.xenapi.vm_utils [req-a0898f2d-5161-4242-968c-062eb2aaca89 tempest-VirtualInterfacesTestJSON-1588532739 tempest-VirtualInterfacesTestJSON-887492551] VDI b2bafd3f-7c26-4231-a544-f8e281508016 is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:51:26.753 DEBUG nova.virt.xenapi.vm_utils [req-a0898f2d-5161-4242-968c-062eb2aaca89 tempest-VirtualInterfacesTestJSON-1588532739 tempest-VirtualInterfacesTestJSON-887492551] VDI 7a39b9c0-1d40-49b4-b953-aa543d3ffad1 is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:51:27.355 DEBUG nova.virt.xenapi.vmops [req-a0898f2d-5161-4242-968c-062eb2aaca89 tempest-VirtualInterfacesTestJSON-1588532739 tempest-VirtualInterfacesTestJSON-887492551] [instance: 9523109c-a66d-45b8-92dd-1083ef01ae2f] Using RAW or VHD, skipping kernel and ramdisk deletion _destroy_kernel_ramdisk /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1456 2015-08-07 17:51:27.366 DEBUG nova.virt.xenapi.vm_utils [req-a0898f2d-5161-4242-968c-062eb2aaca89 tempest-VirtualInterfacesTestJSON-1588532739 tempest-VirtualInterfacesTestJSON-887492551] [instance: 9523109c-a66d-45b8-92dd-1083ef01ae2f] VM destroyed destroy_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:324 2015-08-07 17:51:27.367 DEBUG nova.compute.manager [req-a0898f2d-5161-4242-968c-062eb2aaca89 tempest-VirtualInterfacesTestJSON-1588532739 tempest-VirtualInterfacesTestJSON-887492551] [instance: 9523109c-a66d-45b8-92dd-1083ef01ae2f] Deallocating network for instance _deallocate_network /opt/stack/new/nova/nova/compute/manager.py:1792 2015-08-07 17:51:27.524 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks 
/usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:51:27.525 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Cleaning up deleted instances _run_pending_deletes /opt/stack/new/nova/nova/compute/manager.py:6276 2015-08-07 17:51:27.784 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] There are 14 instances to clean _run_pending_deletes /opt/stack/new/nova/nova/compute/manager.py:6285 2015-08-07 17:51:27.784 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: 63bde6ef-e9af-4de5-a116-60694ef11c08] Instance has had 0 of 5 cleanup attempts _run_pending_deletes /opt/stack/new/nova/nova/compute/manager.py:6293 2015-08-07 17:51:27.970 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: 73d50eac-e688-4e00-9568-ef698e2e69ac] Instance has had 0 of 5 cleanup attempts _run_pending_deletes /opt/stack/new/nova/nova/compute/manager.py:6293 2015-08-07 17:51:28.176 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: 2164dc89-6e1d-45ae-a625-28be5a6a1a05] Instance has had 0 of 5 cleanup attempts _run_pending_deletes /opt/stack/new/nova/nova/compute/manager.py:6293 2015-08-07 17:51:28.367 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: 5cbf6860-028a-46b8-9f4a-4107956a3570] Instance has had 0 of 5 cleanup attempts _run_pending_deletes /opt/stack/new/nova/nova/compute/manager.py:6293 2015-08-07 17:51:28.773 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: e0cc4887-0c12-4012-a49e-55fdf90fda04] Instance has had 0 of 5 cleanup attempts _run_pending_deletes /opt/stack/new/nova/nova/compute/manager.py:6293 2015-08-07 17:51:28.944 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: 6f6a5a93-63ed-4d42-be4c-e81077af680b] Instance has had 0 of 5 cleanup attempts _run_pending_deletes /opt/stack/new/nova/nova/compute/manager.py:6293 2015-08-07 17:51:28.995 DEBUG nova.compute.manager [req-a0898f2d-5161-4242-968c-062eb2aaca89 tempest-VirtualInterfacesTestJSON-1588532739 tempest-VirtualInterfacesTestJSON-887492551] [instance: 9523109c-a66d-45b8-92dd-1083ef01ae2f] terminating bdm BlockDeviceMapping(boot_index=0,connection_info=None,created_at=2015-08-07T17:51:06Z,delete_on_termination=True,deleted=False,deleted_at=None,destination_type='local',device_name='/dev/xvda',device_type='disk',disk_bus=None,guest_format=None,id=82,image_id='a69d8d55-1745-492f-be26-cc75d64fc94d',instance=,instance_uuid=9523109c-a66d-45b8-92dd-1083ef01ae2f,no_device=False,snapshot_id=None,source_type='image',updated_at=2015-08-07T17:51:08Z,volume_id=None,volume_size=None) _cleanup_volumes /opt/stack/new/nova/nova/compute/manager.py:2272 2015-08-07 17:51:29.104 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: 8709c80f-ab24-4aa9-ad2c-64ea4c51ff51] Instance has had 0 of 5 cleanup attempts _run_pending_deletes /opt/stack/new/nova/nova/compute/manager.py:6293 2015-08-07 17:51:29.150 DEBUG oslo_concurrency.lockutils [req-a0898f2d-5161-4242-968c-062eb2aaca89 tempest-VirtualInterfacesTestJSON-1588532739 tempest-VirtualInterfacesTestJSON-887492551] Lock "compute_resources" acquired by "update_usage" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:51:29.151 DEBUG nova.objects.instance [req-a0898f2d-5161-4242-968c-062eb2aaca89 
tempest-VirtualInterfacesTestJSON-1588532739 tempest-VirtualInterfacesTestJSON-887492551] Lazy-loading `numa_topology' on Instance uuid 9523109c-a66d-45b8-92dd-1083ef01ae2f obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:51:29.232 DEBUG oslo_concurrency.lockutils [req-a0898f2d-5161-4242-968c-062eb2aaca89 tempest-VirtualInterfacesTestJSON-1588532739 tempest-VirtualInterfacesTestJSON-887492551] Lock "compute_resources" released by "update_usage" :: held 0.081s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:51:29.264 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: 535c4da4-7eef-4c5e-b79f-4647f6857432] Instance has had 0 of 5 cleanup attempts _run_pending_deletes /opt/stack/new/nova/nova/compute/manager.py:6293 2015-08-07 17:51:29.408 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: 5efe11cd-8475-4ce1-b0c5-86c4d930d74f] Instance has had 0 of 5 cleanup attempts _run_pending_deletes /opt/stack/new/nova/nova/compute/manager.py:6293 2015-08-07 17:51:29.459 DEBUG oslo_concurrency.lockutils [req-a0898f2d-5161-4242-968c-062eb2aaca89 tempest-VirtualInterfacesTestJSON-1588532739 tempest-VirtualInterfacesTestJSON-887492551] Lock "9523109c-a66d-45b8-92dd-1083ef01ae2f" released by "do_terminate_instance" :: held 4.547s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:51:29.550 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: 3d063a71-d4d9-49d1-ae99-79d2dc71c6d3] Instance has had 0 of 5 cleanup attempts _run_pending_deletes /opt/stack/new/nova/nova/compute/manager.py:6293 2015-08-07 17:51:29.749 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: b4eac7df-4935-4b77-8307-8c8cabe2c038] Instance has had 0 of 5 cleanup attempts _run_pending_deletes /opt/stack/new/nova/nova/compute/manager.py:6293 2015-08-07 17:51:29.912 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: b197c990-eecd-403b-b8d7-9e57e7053a16] Instance has had 0 of 5 cleanup attempts _run_pending_deletes /opt/stack/new/nova/nova/compute/manager.py:6293 2015-08-07 17:51:30.042 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: af2ef72d-4895-4de0-bd40-aaa2ac498091] Instance has had 0 of 5 cleanup attempts _run_pending_deletes /opt/stack/new/nova/nova/compute/manager.py:6293 2015-08-07 17:51:30.186 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: 69bd2eeb-96e0-42ba-9643-c7c085279a18] Instance has had 0 of 5 cleanup attempts _run_pending_deletes /opt/stack/new/nova/nova/compute/manager.py:6293 2015-08-07 17:51:30.339 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 19.17 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:51:35.105 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.97 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:51:45.110 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.96 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:51:49.514 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running 
periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:51:49.515 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 0.01 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:51:49.523 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:51:49.524 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 2.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:51:51.522 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:51:51.546 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 1.97 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:51:53.522 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:51:53.523 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Starting heal instance info cache _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5315 2015-08-07 17:51:53.523 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Rebuilding the list of instances to heal _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5319 2015-08-07 17:51:53.577 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Didn't find any instances for network info cache update. _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5386 2015-08-07 17:51:53.578 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 2.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:51:55.108 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.96 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:51:55.578 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:51:55.579 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /opt/stack/new/nova/nova/compute/manager.py:5976 2015-08-07 17:51:55.580 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 3.94 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:51:59.522 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:51:59.523 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 1.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:52:00.522 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:52:00.546 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Auditing locally available compute resources for node localhost.localdomain 2015-08-07 17:52:00.551 DEBUG nova.virt.xenapi.host [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Updating host stats update_status /opt/stack/new/nova/nova/virt/xenapi/host.py:234 2015-08-07 17:52:00.712 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" acquired by "do_scan" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:52:00.713 DEBUG nova.virt.xenapi.vm_utils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Scanning SR OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99 do_scan /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1839 2015-08-07 17:52:01.032 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" released by "do_scan" :: held 0.320s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:52:01.212 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor: free VCPUs: 2 _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:524 2015-08-07 17:52:01.212 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor: assignable PCI devices: [] _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:532 2015-08-07 17:52:01.213 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor/Node resource view: name=localhost.localdomain free_ram=989MB free_disk=16GB free_vcpus=2 pci_devices=[] _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:546 2015-08-07 17:52:01.213 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "compute_resources" acquired by "_update_available_resource" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:52:01.336 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Total usable vcpus: 8, total allocated vcpus: 6 2015-08-07 17:52:01.337 INFO 
nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Final resource view: name=localhost.localdomain phys_ram=8187MB used_ram=512MB phys_disk=27GB used_disk=0GB total_vcpus=8 used_vcpus=6 pci_stats=None 2015-08-07 17:52:01.392 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Compute_service record updated for devstack:localhost.localdomain 2015-08-07 17:52:01.393 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "compute_resources" released by "_update_available_resource" :: held 0.180s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:52:01.393 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:52:01.394 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 0.12 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:52:01.515 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:52:01.571 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 8.01 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:52:05.103 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.97 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:52:09.578 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:52:09.579 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 39.94 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:52:15.113 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.96 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:52:25.101 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.97 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:52:35.104 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.97 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:52:45.108 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.97 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:52:49.522 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:52:49.522 DEBUG oslo_service.loopingcall 
[req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 1.99 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:52:51.514 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:52:51.515 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 0.01 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:52:51.523 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:52:51.524 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 2.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:52:53.522 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:52:53.522 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Starting heal instance info cache _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5315 2015-08-07 17:52:53.523 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Rebuilding the list of instances to heal _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5319 2015-08-07 17:52:53.553 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Didn't find any instances for network info cache update. _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5386 2015-08-07 17:52:53.555 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 4.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:52:55.122 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.95 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:52:57.557 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:52:57.557 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /opt/stack/new/nova/nova/compute/manager.py:5976 2015-08-07 17:52:57.558 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 2.96 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:53:00.522 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:53:00.523 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:53:00.523 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 2.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:53:02.524 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:53:02.552 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Auditing locally available compute resources for node localhost.localdomain 2015-08-07 17:53:02.552 DEBUG nova.virt.xenapi.host [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Updating host stats update_status /opt/stack/new/nova/nova/virt/xenapi/host.py:234 2015-08-07 17:53:02.688 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" acquired by "do_scan" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:53:02.688 DEBUG nova.virt.xenapi.vm_utils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Scanning SR OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99 do_scan /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1839 2015-08-07 17:53:02.951 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" released by "do_scan" :: held 0.263s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:53:03.082 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor: free VCPUs: 2 _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:524 2015-08-07 17:53:03.083 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor: assignable PCI devices: [] _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:532 2015-08-07 17:53:03.084 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor/Node resource view: name=localhost.localdomain free_ram=989MB free_disk=16GB free_vcpus=2 pci_devices=[] _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:546 2015-08-07 17:53:03.084 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "compute_resources" acquired by "_update_available_resource" :: waited 0.000s inner 
/usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:53:03.182 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Total usable vcpus: 8, total allocated vcpus: 6 2015-08-07 17:53:03.183 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Final resource view: name=localhost.localdomain phys_ram=8187MB used_ram=512MB phys_disk=27GB used_disk=0GB total_vcpus=8 used_vcpus=6 pci_stats=None 2015-08-07 17:53:03.236 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Compute_service record updated for devstack:localhost.localdomain 2015-08-07 17:53:03.237 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "compute_resources" released by "_update_available_resource" :: held 0.153s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:53:03.237 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 8.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:53:05.104 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.97 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:53:11.234 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:53:11.235 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 38.29 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:53:15.111 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.97 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:53:25.138 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.94 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:53:35.109 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.97 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:53:45.124 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.95 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:53:49.522 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:53:49.523 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 1.99 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:53:51.515 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:53:51.516 DEBUG oslo_service.loopingcall 
[req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 0.01 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:53:51.523 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:53:51.524 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 3.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:53:54.522 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:53:54.523 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Starting heal instance info cache _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5315 2015-08-07 17:53:54.523 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Rebuilding the list of instances to heal _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5319 2015-08-07 17:53:54.563 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Didn't find any instances for network info cache update. _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5386 2015-08-07 17:53:54.564 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 5.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:53:55.137 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.94 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:53:59.565 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:53:59.566 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /opt/stack/new/nova/nova/compute/manager.py:5976 2015-08-07 17:53:59.567 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 1.95 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:54:01.523 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:54:01.524 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 0.99 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:54:02.515 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:54:02.569 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 0.01 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:54:02.577 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:54:02.577 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 0.95 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:54:03.523 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:54:03.551 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Auditing locally available compute resources for node localhost.localdomain 2015-08-07 17:54:03.552 DEBUG nova.virt.xenapi.host [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Updating host stats update_status /opt/stack/new/nova/nova/virt/xenapi/host.py:234 2015-08-07 17:54:03.707 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" acquired by "do_scan" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:54:03.707 DEBUG nova.virt.xenapi.vm_utils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Scanning SR OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99 do_scan /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1839 2015-08-07 17:54:04.012 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" released by "do_scan" :: held 0.306s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:54:04.167 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor: free VCPUs: 2 _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:524 2015-08-07 
17:54:04.168 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor: assignable PCI devices: [] _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:532 2015-08-07 17:54:04.168 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor/Node resource view: name=localhost.localdomain free_ram=989MB free_disk=16GB free_vcpus=2 pci_devices=[] _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:546 2015-08-07 17:54:04.168 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "compute_resources" acquired by "_update_available_resource" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:54:04.264 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Total usable vcpus: 8, total allocated vcpus: 6 2015-08-07 17:54:04.265 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Final resource view: name=localhost.localdomain phys_ram=8187MB used_ram=512MB phys_disk=27GB used_disk=0GB total_vcpus=8 used_vcpus=6 pci_stats=None 2015-08-07 17:54:04.309 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Compute_service record updated for devstack:localhost.localdomain 2015-08-07 17:54:04.309 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "compute_resources" released by "_update_available_resource" :: held 0.141s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:54:04.310 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 9.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:54:05.113 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.97 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:54:13.308 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:54:13.309 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 37.21 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:54:15.134 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.95 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:54:25.150 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.93 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:54:35.146 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.93 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:54:45.159 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.92 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:54:50.522 DEBUG oslo_service.periodic_task 
[req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:54:50.523 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 0.99 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:54:51.515 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:54:51.516 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 0.01 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:54:51.523 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:54:51.524 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 3.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:54:54.522 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:54:54.523 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Starting heal instance info cache _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5315 2015-08-07 17:54:54.524 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Rebuilding the list of instances to heal _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5319 2015-08-07 17:54:54.568 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Didn't find any instances for network info cache update. _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5386 2015-08-07 17:54:54.569 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 7.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:54:56.433 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 8.65 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:55:01.569 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:55:01.570 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /opt/stack/new/nova/nova/compute/manager.py:5976 2015-08-07 17:55:01.570 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:55:01.571 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_shelved_instances run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:55:01.571 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 2.95 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:55:04.523 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:55:04.523 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 1.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:55:05.524 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:55:05.544 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Auditing locally available compute resources for node localhost.localdomain 2015-08-07 17:55:05.545 DEBUG nova.virt.xenapi.host [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Updating host stats update_status /opt/stack/new/nova/nova/virt/xenapi/host.py:234 2015-08-07 17:55:05.707 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" acquired by "do_scan" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:55:05.708 DEBUG nova.virt.xenapi.vm_utils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Scanning SR OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99 do_scan /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1839 2015-08-07 17:55:05.849 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.23 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:55:06.199 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" released by "do_scan" :: held 0.491s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:55:06.379 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor: free VCPUs: 2 _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:524 2015-08-07 17:55:06.379 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor: assignable PCI devices: [] _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:532 2015-08-07 17:55:06.379 DEBUG nova.compute.resource_tracker 
[req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor/Node resource view: name=localhost.localdomain free_ram=989MB free_disk=15GB free_vcpus=2 pci_devices=[] _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:546 2015-08-07 17:55:06.380 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "compute_resources" acquired by "_update_available_resource" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:55:06.474 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Total usable vcpus: 8, total allocated vcpus: 6 2015-08-07 17:55:06.475 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Final resource view: name=localhost.localdomain phys_ram=8187MB used_ram=512MB phys_disk=27GB used_disk=0GB total_vcpus=8 used_vcpus=6 pci_stats=None 2015-08-07 17:55:07.629 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Compute_service record updated for devstack:localhost.localdomain 2015-08-07 17:55:07.630 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "compute_resources" released by "_update_available_resource" :: held 1.250s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:55:07.631 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 7.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:55:14.629 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:55:14.630 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 6.89 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:55:15.185 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.90 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:55:21.523 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:55:21.536 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 29.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:55:25.125 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.96 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:55:35.132 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.95 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:55:45.142 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.94 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:55:50.535 DEBUG oslo_service.periodic_task 
[req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:55:50.536 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 0.98 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:55:51.515 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:55:51.516 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 1.01 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:55:52.524 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:55:52.525 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 1.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:55:53.523 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_bandwidth_usage run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:55:53.524 INFO nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Updating bandwidth usage cache 2015-08-07 17:55:53.712 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 0.81 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:55:54.523 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:55:54.524 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Starting heal instance info cache _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5315 2015-08-07 17:55:54.524 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Rebuilding the list of instances to heal _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5319 2015-08-07 17:55:54.574 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Didn't find any instances for network info cache update. 
_heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5386 2015-08-07 17:55:54.575 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 8.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:55:55.121 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.96 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:56:02.575 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:56:02.576 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 0.95 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:56:03.524 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:56:03.525 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /opt/stack/new/nova/nova/compute/manager.py:5976 2015-08-07 17:56:03.525 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 2.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:56:05.402 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.68 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:56:05.523 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:56:05.553 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Auditing locally available compute resources for node localhost.localdomain 2015-08-07 17:56:05.553 DEBUG nova.virt.xenapi.host [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Updating host stats update_status /opt/stack/new/nova/nova/virt/xenapi/host.py:234 2015-08-07 17:56:05.709 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" acquired by "do_scan" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:56:05.710 DEBUG nova.virt.xenapi.vm_utils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Scanning SR OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99 do_scan /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1839 2015-08-07 17:56:06.022 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" released by "do_scan" :: held 0.312s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:56:06.208 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor: free 
VCPUs: 2 _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:524 2015-08-07 17:56:06.208 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor: assignable PCI devices: [] _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:532 2015-08-07 17:56:06.209 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor/Node resource view: name=localhost.localdomain free_ram=989MB free_disk=15GB free_vcpus=2 pci_devices=[] _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:546 2015-08-07 17:56:06.209 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "compute_resources" acquired by "_update_available_resource" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:56:06.315 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Total usable vcpus: 8, total allocated vcpus: 6 2015-08-07 17:56:06.315 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Final resource view: name=localhost.localdomain phys_ram=8187MB used_ram=512MB phys_disk=27GB used_disk=0GB total_vcpus=8 used_vcpus=6 pci_stats=None 2015-08-07 17:56:06.928 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Compute_service record updated for devstack:localhost.localdomain 2015-08-07 17:56:06.929 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "compute_resources" released by "_update_available_resource" :: held 0.720s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:56:06.930 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:56:06.930 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 0.58 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:56:07.516 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:56:07.562 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 7.01 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:56:14.570 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:56:14.623 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 18.90 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:56:15.132 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.95 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 
17:56:26.521 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 8.57 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:56:33.528 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:56:33.529 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Cleaning up deleted instances _run_pending_deletes /opt/stack/new/nova/nova/compute/manager.py:6276 2015-08-07 17:56:33.583 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] There are 1 instances to clean _run_pending_deletes /opt/stack/new/nova/nova/compute/manager.py:6285 2015-08-07 17:56:33.584 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: 9523109c-a66d-45b8-92dd-1083ef01ae2f] Instance has had 0 of 5 cleanup attempts _run_pending_deletes /opt/stack/new/nova/nova/compute/manager.py:6293 2015-08-07 17:56:34.547 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 16.99 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:56:36.675 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 8.41 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:56:45.123 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.96 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:56:51.542 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:56:51.542 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 0.97 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:56:52.515 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:56:52.516 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 1.01 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:56:53.524 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:56:53.524 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 1.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:56:54.523 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks 
/usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:56:54.523 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Starting heal instance info cache _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5315 2015-08-07 17:56:54.524 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Rebuilding the list of instances to heal _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5319 2015-08-07 17:56:54.568 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Didn't find any instances for network info cache update. _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5386 2015-08-07 17:56:54.569 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 8.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:56:55.141 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.95 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:57:02.569 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:57:02.570 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 0.95 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:57:03.524 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:57:03.525 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /opt/stack/new/nova/nova/compute/manager.py:5976 2015-08-07 17:57:03.526 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 3.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:57:05.148 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.94 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:57:06.523 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:57:06.524 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 1.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:57:07.524 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:57:07.554 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Auditing locally available compute resources for node localhost.localdomain 2015-08-07 17:57:07.554 DEBUG nova.virt.xenapi.host [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Updating host stats update_status /opt/stack/new/nova/nova/virt/xenapi/host.py:234 2015-08-07 17:57:07.726 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" acquired by "do_scan" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:57:07.726 DEBUG nova.virt.xenapi.vm_utils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Scanning SR OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99 do_scan /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1839 2015-08-07 17:57:08.034 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" released by "do_scan" :: held 0.309s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:57:08.206 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor: free VCPUs: 2 _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:524 2015-08-07 17:57:08.207 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor: assignable PCI devices: [] _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:532 2015-08-07 17:57:08.207 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor/Node resource view: name=localhost.localdomain free_ram=989MB free_disk=14GB free_vcpus=2 pci_devices=[] _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:546 2015-08-07 17:57:08.207 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "compute_resources" acquired by "_update_available_resource" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 
17:57:08.355 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Total usable vcpus: 8, total allocated vcpus: 6 2015-08-07 17:57:08.355 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Final resource view: name=localhost.localdomain phys_ram=8187MB used_ram=512MB phys_disk=27GB used_disk=0GB total_vcpus=8 used_vcpus=6 pci_stats=None 2015-08-07 17:57:08.419 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Compute_service record updated for devstack:localhost.localdomain 2015-08-07 17:57:08.421 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "compute_resources" released by "_update_available_resource" :: held 0.213s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:57:08.422 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 9.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:57:15.145 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.95 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:57:15.928 DEBUG oslo_concurrency.lockutils [req-4047e9a3-15f1-4d8c-adbe-3c5cf65b6288 tempest-VolumesV1ActionsTest-310692007 tempest-VolumesV1ActionsTest-1175605846] Lock "4ac30f2c-18b7-42f7-8407-97f68e4ec552" acquired by "_locked_do_build_and_run_instance" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:57:15.967 INFO nova.compute.manager [req-4047e9a3-15f1-4d8c-adbe-3c5cf65b6288 tempest-VolumesV1ActionsTest-310692007 tempest-VolumesV1ActionsTest-1175605846] [instance: 4ac30f2c-18b7-42f7-8407-97f68e4ec552] Starting instance... 
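The resource-claim figures in the records that follow are internally consistent: the 69 MB claim is the 64 MB flavor RAM plus the 5 MB XenAPI memory overhead, and the 12280.50 MB memory limit matches the 8187 MB of host RAM scaled by a ram_allocation_ratio of 1.5. The configured ratio is not visible in this excerpt, so 1.5 (nova's default in this era) is an assumption here; a minimal re-derivation, with illustrative names rather than nova's internals:

    # Re-derivation of the claim arithmetic logged below.
    # ram_allocation_ratio = 1.5 is assumed; the value actually set in
    # /etc/nova/nova.conf is not shown in this excerpt.
    flavor_ram_mb = 64          # flavor RAM for the tempest instance
    overhead_mb = 5             # "Memory overhead for 64 MB instance; 5 MB"
    claim_mb = flavor_ram_mb + overhead_mb                   # 69 MB claim

    phys_ram_mb = 8187          # host RAM reported by the resource tracker
    used_ram_mb = 512           # RAM already accounted before this claim
    ram_allocation_ratio = 1.5  # assumed (nova default at the time)

    memory_limit_mb = phys_ram_mb * ram_allocation_ratio    # 12280.5 MB
    free_mb = memory_limit_mb - used_ram_mb                  # 11768.5 MB

    assert (claim_mb, memory_limit_mb, free_mb) == (69, 12280.5, 11768.5)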
2015-08-07 17:57:16.165 DEBUG oslo_concurrency.lockutils [req-4047e9a3-15f1-4d8c-adbe-3c5cf65b6288 tempest-VolumesV1ActionsTest-310692007 tempest-VolumesV1ActionsTest-1175605846] Lock "compute_resources" acquired by "instance_claim" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:57:16.165 DEBUG nova.compute.resource_tracker [req-4047e9a3-15f1-4d8c-adbe-3c5cf65b6288 tempest-VolumesV1ActionsTest-310692007 tempest-VolumesV1ActionsTest-1175605846] Memory overhead for 64 MB instance; 5 MB instance_claim /opt/stack/new/nova/nova/compute/resource_tracker.py:126 2015-08-07 17:57:16.175 INFO nova.compute.claims [req-4047e9a3-15f1-4d8c-adbe-3c5cf65b6288 tempest-VolumesV1ActionsTest-310692007 tempest-VolumesV1ActionsTest-1175605846] [instance: 4ac30f2c-18b7-42f7-8407-97f68e4ec552] Attempting claim: memory 69 MB, disk 0 GB 2015-08-07 17:57:16.176 INFO nova.compute.claims [req-4047e9a3-15f1-4d8c-adbe-3c5cf65b6288 tempest-VolumesV1ActionsTest-310692007 tempest-VolumesV1ActionsTest-1175605846] [instance: 4ac30f2c-18b7-42f7-8407-97f68e4ec552] Total memory: 8187 MB, used: 512.00 MB 2015-08-07 17:57:16.176 INFO nova.compute.claims [req-4047e9a3-15f1-4d8c-adbe-3c5cf65b6288 tempest-VolumesV1ActionsTest-310692007 tempest-VolumesV1ActionsTest-1175605846] [instance: 4ac30f2c-18b7-42f7-8407-97f68e4ec552] memory limit: 12280.50 MB, free: 11768.50 MB 2015-08-07 17:57:16.177 INFO nova.compute.claims [req-4047e9a3-15f1-4d8c-adbe-3c5cf65b6288 tempest-VolumesV1ActionsTest-310692007 tempest-VolumesV1ActionsTest-1175605846] [instance: 4ac30f2c-18b7-42f7-8407-97f68e4ec552] Total disk: 27 GB, used: 0.00 GB 2015-08-07 17:57:16.177 INFO nova.compute.claims [req-4047e9a3-15f1-4d8c-adbe-3c5cf65b6288 tempest-VolumesV1ActionsTest-310692007 tempest-VolumesV1ActionsTest-1175605846] [instance: 4ac30f2c-18b7-42f7-8407-97f68e4ec552] disk limit not specified, defaulting to unlimited 2015-08-07 17:57:16.199 DEBUG nova.compute.resources.vcpu [req-4047e9a3-15f1-4d8c-adbe-3c5cf65b6288 tempest-VolumesV1ActionsTest-310692007 tempest-VolumesV1ActionsTest-1175605846] Total CPUs: 8 VCPUs, used: 0.00 VCPUs test /opt/stack/new/nova/nova/compute/resources/vcpu.py:52 2015-08-07 17:57:16.200 DEBUG nova.compute.resources.vcpu [req-4047e9a3-15f1-4d8c-adbe-3c5cf65b6288 tempest-VolumesV1ActionsTest-310692007 tempest-VolumesV1ActionsTest-1175605846] CPUs limit not specified, defaulting to unlimited test /opt/stack/new/nova/nova/compute/resources/vcpu.py:56 2015-08-07 17:57:16.201 INFO nova.compute.claims [req-4047e9a3-15f1-4d8c-adbe-3c5cf65b6288 tempest-VolumesV1ActionsTest-310692007 tempest-VolumesV1ActionsTest-1175605846] [instance: 4ac30f2c-18b7-42f7-8407-97f68e4ec552] Claim successful 2015-08-07 17:57:16.495 DEBUG oslo_concurrency.lockutils [req-4047e9a3-15f1-4d8c-adbe-3c5cf65b6288 tempest-VolumesV1ActionsTest-310692007 tempest-VolumesV1ActionsTest-1175605846] Lock "compute_resources" released by "instance_claim" :: held 0.330s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:57:16.710 DEBUG oslo_concurrency.lockutils [req-4047e9a3-15f1-4d8c-adbe-3c5cf65b6288 tempest-VolumesV1ActionsTest-310692007 tempest-VolumesV1ActionsTest-1175605846] Lock "compute_resources" acquired by "update_usage" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:57:16.813 DEBUG oslo_concurrency.lockutils [req-4047e9a3-15f1-4d8c-adbe-3c5cf65b6288 tempest-VolumesV1ActionsTest-310692007 
tempest-VolumesV1ActionsTest-1175605846] Lock "compute_resources" released by "update_usage" :: held 0.103s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:57:16.814 DEBUG nova.compute.utils [req-4047e9a3-15f1-4d8c-adbe-3c5cf65b6288 tempest-VolumesV1ActionsTest-310692007 tempest-VolumesV1ActionsTest-1175605846] Using /dev/xvd instead of None get_next_device_name /opt/stack/new/nova/nova/compute/utils.py:163 2015-08-07 17:57:16.817 13318 DEBUG nova.compute.manager [-] [instance: 4ac30f2c-18b7-42f7-8407-97f68e4ec552] Allocating IP information in the background. _allocate_network_async /opt/stack/new/nova/nova/compute/manager.py:1513 2015-08-07 17:57:16.817 13318 DEBUG oslo_concurrency.lockutils [-] Acquired semaphore "refresh_cache-4ac30f2c-18b7-42f7-8407-97f68e4ec552" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:197 2015-08-07 17:57:17.246 DEBUG nova.virt.xenapi.vmops [req-4047e9a3-15f1-4d8c-adbe-3c5cf65b6288 tempest-VolumesV1ActionsTest-310692007 tempest-VolumesV1ActionsTest-1175605846] [instance: 4ac30f2c-18b7-42f7-8407-97f68e4ec552] Block device information present: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} spawn /opt/stack/new/nova/nova/virt/xenapi/vmops.py:408 2015-08-07 17:57:17.259 DEBUG nova.virt.xenapi.vm_utils [req-4047e9a3-15f1-4d8c-adbe-3c5cf65b6288 tempest-VolumesV1ActionsTest-310692007 tempest-VolumesV1ActionsTest-1175605846] Detected vhd format for image a69d8d55-1745-492f-be26-cc75d64fc94d determine_disk_image_type /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1633 2015-08-07 17:57:17.260 DEBUG nova.virt.xenapi.vmops [req-4047e9a3-15f1-4d8c-adbe-3c5cf65b6288 tempest-VolumesV1ActionsTest-310692007 tempest-VolumesV1ActionsTest-1175605846] [instance: 4ac30f2c-18b7-42f7-8407-97f68e4ec552] Updating progress to 10 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:57:17.420 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:57:17.421 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 9.07 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:57:17.447 DEBUG nova.virt.xenapi.vmops [req-4047e9a3-15f1-4d8c-adbe-3c5cf65b6288 tempest-VolumesV1ActionsTest-310692007 tempest-VolumesV1ActionsTest-1175605846] [instance: 4ac30f2c-18b7-42f7-8407-97f68e4ec552] block device info: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} _connect_cinder_volumes /opt/stack/new/nova/nova/virt/xenapi/vmops.py:376 2015-08-07 17:57:17.456 DEBUG oslo_concurrency.lockutils [req-4047e9a3-15f1-4d8c-adbe-3c5cf65b6288 tempest-VolumesV1ActionsTest-310692007 tempest-VolumesV1ActionsTest-1175605846] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" acquired by "_create_cached_image_impl" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:57:18.260 DEBUG nova.virt.xenapi.vm_utils [req-4047e9a3-15f1-4d8c-adbe-3c5cf65b6288 tempest-VolumesV1ActionsTest-310692007 tempest-VolumesV1ActionsTest-1175605846] Cloned VDI OpaqueRef:a5d8d014-ed8c-0ac5-4a5f-c67d75095df6 from VDI OpaqueRef:f392c64e-7886-a0cc-d3a0-780674167307 
_clone_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:571 2015-08-07 17:57:18.798 DEBUG oslo_concurrency.lockutils [req-4047e9a3-15f1-4d8c-adbe-3c5cf65b6288 tempest-VolumesV1ActionsTest-310692007 tempest-VolumesV1ActionsTest-1175605846] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" released by "_create_cached_image_impl" :: held 1.342s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:57:18.799 INFO nova.virt.xenapi.vm_utils [req-4047e9a3-15f1-4d8c-adbe-3c5cf65b6288 tempest-VolumesV1ActionsTest-310692007 tempest-VolumesV1ActionsTest-1175605846] Image creation data, cacheable: True, downloaded: False duration: 1.35 secs for image a69d8d55-1745-492f-be26-cc75d64fc94d 2015-08-07 17:57:18.864 13318 DEBUG nova.network.base_api [-] [instance: 4ac30f2c-18b7-42f7-8407-97f68e4ec552] Updating instance_info_cache with network_info: [VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.2'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:77:93:0a', 'active': False, 'type': u'bridge', 'id': u'b5baf029-076b-44bc-b96d-6e7db126821a', 'qbg_params': None})] update_instance_cache_with_nw_info /opt/stack/new/nova/nova/network/base_api.py:43 2015-08-07 17:57:18.905 13318 DEBUG oslo_concurrency.lockutils [-] Releasing semaphore "refresh_cache-4ac30f2c-18b7-42f7-8407-97f68e4ec552" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:210 2015-08-07 17:57:18.906 13318 DEBUG nova.compute.manager [-] [instance: 4ac30f2c-18b7-42f7-8407-97f68e4ec552] Instance network_info: |[VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.2'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:77:93:0a', 'active': False, 'type': u'bridge', 'id': u'b5baf029-076b-44bc-b96d-6e7db126821a', 'qbg_params': 
None})]| _allocate_network_async /opt/stack/new/nova/nova/compute/manager.py:1531 2015-08-07 17:57:19.433 DEBUG nova.virt.xenapi.vmops [req-4047e9a3-15f1-4d8c-adbe-3c5cf65b6288 tempest-VolumesV1ActionsTest-310692007 tempest-VolumesV1ActionsTest-1175605846] [instance: 4ac30f2c-18b7-42f7-8407-97f68e4ec552] Updating progress to 20 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:57:19.789 DEBUG nova.virt.xenapi.vmops [req-4047e9a3-15f1-4d8c-adbe-3c5cf65b6288 tempest-VolumesV1ActionsTest-310692007 tempest-VolumesV1ActionsTest-1175605846] [instance: 4ac30f2c-18b7-42f7-8407-97f68e4ec552] Updating progress to 30 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:57:19.957 DEBUG nova.virt.xenapi.vmops [req-4047e9a3-15f1-4d8c-adbe-3c5cf65b6288 tempest-VolumesV1ActionsTest-310692007 tempest-VolumesV1ActionsTest-1175605846] [instance: 4ac30f2c-18b7-42f7-8407-97f68e4ec552] Using PV kernel: True _create_vm_record /opt/stack/new/nova/nova/virt/xenapi/vmops.py:679 2015-08-07 17:57:19.967 DEBUG nova.virt.xenapi.vm_utils [req-4047e9a3-15f1-4d8c-adbe-3c5cf65b6288 tempest-VolumesV1ActionsTest-310692007 tempest-VolumesV1ActionsTest-1175605846] [instance: 4ac30f2c-18b7-42f7-8407-97f68e4ec552] Created VM create_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:312 2015-08-07 17:57:19.967 DEBUG nova.virt.xenapi.vmops [req-4047e9a3-15f1-4d8c-adbe-3c5cf65b6288 tempest-VolumesV1ActionsTest-310692007 tempest-VolumesV1ActionsTest-1175605846] [instance: 4ac30f2c-18b7-42f7-8407-97f68e4ec552] Updating progress to 40 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:57:20.137 DEBUG nova.virt.xenapi.vm_utils [req-4047e9a3-15f1-4d8c-adbe-3c5cf65b6288 tempest-VolumesV1ActionsTest-310692007 tempest-VolumesV1ActionsTest-1175605846] Creating disk-type VBD for VM OpaqueRef:4ea281ab-26ac-db80-f943-e3bbf87272b2, VDI OpaqueRef:a5d8d014-ed8c-0ac5-4a5f-c67d75095df6 ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:57:20.143 DEBUG nova.virt.xenapi.vm_utils [req-4047e9a3-15f1-4d8c-adbe-3c5cf65b6288 tempest-VolumesV1ActionsTest-310692007 tempest-VolumesV1ActionsTest-1175605846] Created VBD OpaqueRef:094e8c01-f97e-196d-b326-9cc01b1c444a for VM OpaqueRef:4ea281ab-26ac-db80-f943-e3bbf87272b2, VDI OpaqueRef:a5d8d014-ed8c-0ac5-4a5f-c67d75095df6. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:57:20.414 DEBUG nova.virt.xenapi.vm_utils [req-4047e9a3-15f1-4d8c-adbe-3c5cf65b6288 tempest-VolumesV1ActionsTest-310692007 tempest-VolumesV1ActionsTest-1175605846] Created VDI OpaqueRef:4b34f4cc-d6b2-a64a-1911-1e51699d8432 (config-2, 67108864, False) on OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99. create_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:509 2015-08-07 17:57:20.417 DEBUG nova.virt.xenapi.vm_utils [req-4047e9a3-15f1-4d8c-adbe-3c5cf65b6288 tempest-VolumesV1ActionsTest-310692007 tempest-VolumesV1ActionsTest-1175605846] Creating disk-type VBD for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:4b34f4cc-d6b2-a64a-1911-1e51699d8432 ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:57:20.429 DEBUG nova.virt.xenapi.vm_utils [req-4047e9a3-15f1-4d8c-adbe-3c5cf65b6288 tempest-VolumesV1ActionsTest-310692007 tempest-VolumesV1ActionsTest-1175605846] Created VBD OpaqueRef:32c7d0c9-a6ca-d7da-f39e-edd910962fe5 for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:4b34f4cc-d6b2-a64a-1911-1e51699d8432. 
create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:57:20.430 DEBUG nova.virt.xenapi.vm_utils [req-4047e9a3-15f1-4d8c-adbe-3c5cf65b6288 tempest-VolumesV1ActionsTest-310692007 tempest-VolumesV1ActionsTest-1175605846] Plugging VBD OpaqueRef:32c7d0c9-a6ca-d7da-f39e-edd910962fe5 ... vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2191 2015-08-07 17:57:20.431 DEBUG oslo_concurrency.lockutils [req-4047e9a3-15f1-4d8c-adbe-3c5cf65b6288 tempest-VolumesV1ActionsTest-310692007 tempest-VolumesV1ActionsTest-1175605846] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_plug" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:57:21.601 DEBUG oslo_concurrency.lockutils [req-4047e9a3-15f1-4d8c-adbe-3c5cf65b6288 tempest-VolumesV1ActionsTest-310692007 tempest-VolumesV1ActionsTest-1175605846] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_plug" :: held 1.171s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:57:21.602 DEBUG nova.virt.xenapi.vm_utils [req-4047e9a3-15f1-4d8c-adbe-3c5cf65b6288 tempest-VolumesV1ActionsTest-310692007 tempest-VolumesV1ActionsTest-1175605846] Plugging VBD OpaqueRef:32c7d0c9-a6ca-d7da-f39e-edd910962fe5 done. vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2194 2015-08-07 17:57:21.605 DEBUG nova.virt.xenapi.vm_utils [req-4047e9a3-15f1-4d8c-adbe-3c5cf65b6288 tempest-VolumesV1ActionsTest-310692007 tempest-VolumesV1ActionsTest-1175605846] VBD OpaqueRef:32c7d0c9-a6ca-d7da-f39e-edd910962fe5 plugged as xvdc vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2197 2015-08-07 17:57:21.718 WARNING nova.virt.configdrive [req-4047e9a3-15f1-4d8c-adbe-3c5cf65b6288 tempest-VolumesV1ActionsTest-310692007 tempest-VolumesV1ActionsTest-1175605846] The setting "always" will be deprecated in the Liberty version. 
Please use "True" instead 2015-08-07 17:57:21.719 DEBUG nova.objects.instance [req-4047e9a3-15f1-4d8c-adbe-3c5cf65b6288 tempest-VolumesV1ActionsTest-310692007 tempest-VolumesV1ActionsTest-1175605846] Lazy-loading `ec2_ids' on Instance uuid 4ac30f2c-18b7-42f7-8407-97f68e4ec552 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:57:21.768 DEBUG oslo_concurrency.processutils [req-4047e9a3-15f1-4d8c-adbe-3c5cf65b6288 tempest-VolumesV1ActionsTest-310692007 tempest-VolumesV1ActionsTest-1175605846] Running cmd (subprocess): genisoimage -o /tmp/tmpayrfHh/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmpeeuawq execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:57:21.959 DEBUG oslo_concurrency.processutils [req-4047e9a3-15f1-4d8c-adbe-3c5cf65b6288 tempest-VolumesV1ActionsTest-310692007 tempest-VolumesV1ActionsTest-1175605846] CMD "genisoimage -o /tmp/tmpayrfHh/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmpeeuawq" returned: 0 in 0.191s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:57:21.965 DEBUG oslo_concurrency.processutils [req-4047e9a3-15f1-4d8c-adbe-3c5cf65b6288 tempest-VolumesV1ActionsTest-310692007 tempest-VolumesV1ActionsTest-1175605846] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmpayrfHh/configdrive of=/dev/xvdc oflag=direct,sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:57:25.174 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.92 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:57:26.498 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:57:26.575 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Triggering sync for uuid 4ac30f2c-18b7-42f7-8407-97f68e4ec552 _sync_power_states /opt/stack/new/nova/nova/compute/manager.py:5780 2015-08-07 17:57:26.578 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 26.02 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:57:27.338 DEBUG oslo_concurrency.processutils [req-4047e9a3-15f1-4d8c-adbe-3c5cf65b6288 tempest-VolumesV1ActionsTest-310692007 tempest-VolumesV1ActionsTest-1175605846] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmpayrfHh/configdrive of=/dev/xvdc oflag=direct,sync" returned: 0 in 5.373s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:57:27.340 DEBUG oslo_concurrency.processutils [req-4047e9a3-15f1-4d8c-adbe-3c5cf65b6288 tempest-VolumesV1ActionsTest-310692007 tempest-VolumesV1ActionsTest-1175605846] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:57:27.819 DEBUG oslo_concurrency.processutils [req-4047e9a3-15f1-4d8c-adbe-3c5cf65b6288 tempest-VolumesV1ActionsTest-310692007 tempest-VolumesV1ActionsTest-1175605846] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf 
sync" returned: 0 in 0.479s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:57:27.822 DEBUG nova.virt.xenapi.vm_utils [req-4047e9a3-15f1-4d8c-adbe-3c5cf65b6288 tempest-VolumesV1ActionsTest-310692007 tempest-VolumesV1ActionsTest-1175605846] Destroying VBD for VDI OpaqueRef:4b34f4cc-d6b2-a64a-1911-1e51699d8432 ... vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2207 2015-08-07 17:57:27.826 DEBUG oslo_concurrency.lockutils [req-4047e9a3-15f1-4d8c-adbe-3c5cf65b6288 tempest-VolumesV1ActionsTest-310692007 tempest-VolumesV1ActionsTest-1175605846] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_unplug" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:57:28.794 DEBUG oslo_concurrency.lockutils [req-4047e9a3-15f1-4d8c-adbe-3c5cf65b6288 tempest-VolumesV1ActionsTest-310692007 tempest-VolumesV1ActionsTest-1175605846] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_unplug" :: held 0.968s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:57:28.806 DEBUG nova.virt.xenapi.vm_utils [req-4047e9a3-15f1-4d8c-adbe-3c5cf65b6288 tempest-VolumesV1ActionsTest-310692007 tempest-VolumesV1ActionsTest-1175605846] Destroying VBD for VDI OpaqueRef:4b34f4cc-d6b2-a64a-1911-1e51699d8432 done. vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2215 2015-08-07 17:57:28.807 DEBUG nova.virt.xenapi.vm_utils [req-4047e9a3-15f1-4d8c-adbe-3c5cf65b6288 tempest-VolumesV1ActionsTest-310692007 tempest-VolumesV1ActionsTest-1175605846] Creating disk-type VBD for VM OpaqueRef:4ea281ab-26ac-db80-f943-e3bbf87272b2, VDI OpaqueRef:4b34f4cc-d6b2-a64a-1911-1e51699d8432 ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:57:28.819 DEBUG nova.virt.xenapi.vm_utils [req-4047e9a3-15f1-4d8c-adbe-3c5cf65b6288 tempest-VolumesV1ActionsTest-310692007 tempest-VolumesV1ActionsTest-1175605846] Created VBD OpaqueRef:360357aa-46a8-7b45-cda7-07a88894e71f for VM OpaqueRef:4ea281ab-26ac-db80-f943-e3bbf87272b2, VDI OpaqueRef:4b34f4cc-d6b2-a64a-1911-1e51699d8432. 
create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:57:28.820 DEBUG nova.objects.instance [req-4047e9a3-15f1-4d8c-adbe-3c5cf65b6288 tempest-VolumesV1ActionsTest-310692007 tempest-VolumesV1ActionsTest-1175605846] Lazy-loading `pci_devices' on Instance uuid 4ac30f2c-18b7-42f7-8407-97f68e4ec552 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:57:28.954 DEBUG nova.virt.xenapi.vmops [req-4047e9a3-15f1-4d8c-adbe-3c5cf65b6288 tempest-VolumesV1ActionsTest-310692007 tempest-VolumesV1ActionsTest-1175605846] [instance: 4ac30f2c-18b7-42f7-8407-97f68e4ec552] Updating progress to 50 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:57:29.229 DEBUG oslo_concurrency.lockutils [req-4047e9a3-15f1-4d8c-adbe-3c5cf65b6288 tempest-VolumesV1ActionsTest-310692007 tempest-VolumesV1ActionsTest-1175605846] Lock "xenstore-4ac30f2c-18b7-42f7-8407-97f68e4ec552" acquired by "store_meta" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:57:29.230 DEBUG oslo_concurrency.lockutils [req-4047e9a3-15f1-4d8c-adbe-3c5cf65b6288 tempest-VolumesV1ActionsTest-310692007 tempest-VolumesV1ActionsTest-1175605846] Lock "xenstore-4ac30f2c-18b7-42f7-8407-97f68e4ec552" released by "store_meta" :: held 0.001s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:57:29.232 DEBUG oslo_concurrency.lockutils [req-4047e9a3-15f1-4d8c-adbe-3c5cf65b6288 tempest-VolumesV1ActionsTest-310692007 tempest-VolumesV1ActionsTest-1175605846] Lock "xenstore-4ac30f2c-18b7-42f7-8407-97f68e4ec552" acquired by "store_auto_disk_config" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:57:29.242 DEBUG oslo_concurrency.lockutils [req-4047e9a3-15f1-4d8c-adbe-3c5cf65b6288 tempest-VolumesV1ActionsTest-310692007 tempest-VolumesV1ActionsTest-1175605846] Lock "xenstore-4ac30f2c-18b7-42f7-8407-97f68e4ec552" released by "store_auto_disk_config" :: held 0.010s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:57:29.243 DEBUG nova.virt.xenapi.vmops [req-4047e9a3-15f1-4d8c-adbe-3c5cf65b6288 tempest-VolumesV1ActionsTest-310692007 tempest-VolumesV1ActionsTest-1175605846] [instance: 4ac30f2c-18b7-42f7-8407-97f68e4ec552] Injecting hostname (tempest-volumesv1actionstest-instance-515087037) into xenstore _inject_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1940 2015-08-07 17:57:29.243 DEBUG oslo_concurrency.lockutils [req-4047e9a3-15f1-4d8c-adbe-3c5cf65b6288 tempest-VolumesV1ActionsTest-310692007 tempest-VolumesV1ActionsTest-1175605846] Lock "xenstore-4ac30f2c-18b7-42f7-8407-97f68e4ec552" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:57:29.253 DEBUG oslo_concurrency.lockutils [req-4047e9a3-15f1-4d8c-adbe-3c5cf65b6288 tempest-VolumesV1ActionsTest-310692007 tempest-VolumesV1ActionsTest-1175605846] Lock "xenstore-4ac30f2c-18b7-42f7-8407-97f68e4ec552" released by "update_hostname" :: held 0.010s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:57:29.254 DEBUG nova.virt.xenapi.vmops [req-4047e9a3-15f1-4d8c-adbe-3c5cf65b6288 tempest-VolumesV1ActionsTest-310692007 tempest-VolumesV1ActionsTest-1175605846] [instance: 4ac30f2c-18b7-42f7-8407-97f68e4ec552] Injecting network info to xenstore inject_network_info 
/opt/stack/new/nova/nova/virt/xenapi/vmops.py:1869 2015-08-07 17:57:29.255 DEBUG oslo_concurrency.lockutils [req-4047e9a3-15f1-4d8c-adbe-3c5cf65b6288 tempest-VolumesV1ActionsTest-310692007 tempest-VolumesV1ActionsTest-1175605846] Lock "xenstore-4ac30f2c-18b7-42f7-8407-97f68e4ec552" acquired by "update_nwinfo" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:57:29.427 DEBUG oslo_concurrency.lockutils [req-4047e9a3-15f1-4d8c-adbe-3c5cf65b6288 tempest-VolumesV1ActionsTest-310692007 tempest-VolumesV1ActionsTest-1175605846] Lock "xenstore-4ac30f2c-18b7-42f7-8407-97f68e4ec552" released by "update_nwinfo" :: held 0.172s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:57:29.428 DEBUG nova.virt.xenapi.vmops [req-4047e9a3-15f1-4d8c-adbe-3c5cf65b6288 tempest-VolumesV1ActionsTest-310692007 tempest-VolumesV1ActionsTest-1175605846] [instance: 4ac30f2c-18b7-42f7-8407-97f68e4ec552] Updating progress to 60 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:57:29.675 DEBUG nova.virt.xenapi.vmops [req-4047e9a3-15f1-4d8c-adbe-3c5cf65b6288 tempest-VolumesV1ActionsTest-310692007 tempest-VolumesV1ActionsTest-1175605846] [instance: 4ac30f2c-18b7-42f7-8407-97f68e4ec552] Creating vifs _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1892 2015-08-07 17:57:29.683 DEBUG nova.virt.xenapi.vmops [req-4047e9a3-15f1-4d8c-adbe-3c5cf65b6288 tempest-VolumesV1ActionsTest-310692007 tempest-VolumesV1ActionsTest-1175605846] [instance: 4ac30f2c-18b7-42f7-8407-97f68e4ec552] Creating VIF for network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1902 2015-08-07 17:57:29.691 DEBUG nova.virt.xenapi.vmops [req-4047e9a3-15f1-4d8c-adbe-3c5cf65b6288 tempest-VolumesV1ActionsTest-310692007 tempest-VolumesV1ActionsTest-1175605846] [instance: 4ac30f2c-18b7-42f7-8407-97f68e4ec552] Created VIF OpaqueRef:972c6842-ab22-c4a8-6ce6-aff028912f07, network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1906 2015-08-07 17:57:29.692 DEBUG nova.virt.xenapi.vmops [req-4047e9a3-15f1-4d8c-adbe-3c5cf65b6288 tempest-VolumesV1ActionsTest-310692007 tempest-VolumesV1ActionsTest-1175605846] [instance: 4ac30f2c-18b7-42f7-8407-97f68e4ec552] Updating progress to 70 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:57:29.994 DEBUG nova.virt.xenapi.vmops [req-4047e9a3-15f1-4d8c-adbe-3c5cf65b6288 tempest-VolumesV1ActionsTest-310692007 tempest-VolumesV1ActionsTest-1175605846] [instance: 4ac30f2c-18b7-42f7-8407-97f68e4ec552] Starting instance _start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:327 2015-08-07 17:57:36.747 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 8.35 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:57:37.505 DEBUG nova.virt.xenapi.vmops [req-4047e9a3-15f1-4d8c-adbe-3c5cf65b6288 tempest-VolumesV1ActionsTest-310692007 tempest-VolumesV1ActionsTest-1175605846] [instance: 4ac30f2c-18b7-42f7-8407-97f68e4ec552] Waiting for instance state to become running _wait_for_instance_to_start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:777 2015-08-07 17:57:37.535 DEBUG nova.virt.xenapi.vmops [req-4047e9a3-15f1-4d8c-adbe-3c5cf65b6288 tempest-VolumesV1ActionsTest-310692007 tempest-VolumesV1ActionsTest-1175605846] [instance: 4ac30f2c-18b7-42f7-8407-97f68e4ec552] Updating 
progress to 80 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:57:40.435 DEBUG nova.virt.xenapi.vmops [req-4047e9a3-15f1-4d8c-adbe-3c5cf65b6288 tempest-VolumesV1ActionsTest-310692007 tempest-VolumesV1ActionsTest-1175605846] [instance: 4ac30f2c-18b7-42f7-8407-97f68e4ec552] Skip agent setup, not enabled. _configure_new_instance_with_agent /opt/stack/new/nova/nova/virt/xenapi/vmops.py:788 2015-08-07 17:57:40.436 DEBUG nova.virt.xenapi.vmops [req-4047e9a3-15f1-4d8c-adbe-3c5cf65b6288 tempest-VolumesV1ActionsTest-310692007 tempest-VolumesV1ActionsTest-1175605846] [instance: 4ac30f2c-18b7-42f7-8407-97f68e4ec552] Removing hostname from xenstore _remove_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1949 2015-08-07 17:57:40.436 DEBUG oslo_concurrency.lockutils [req-4047e9a3-15f1-4d8c-adbe-3c5cf65b6288 tempest-VolumesV1ActionsTest-310692007 tempest-VolumesV1ActionsTest-1175605846] Lock "xenstore-4ac30f2c-18b7-42f7-8407-97f68e4ec552" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:57:40.444 DEBUG oslo_concurrency.lockutils [req-4047e9a3-15f1-4d8c-adbe-3c5cf65b6288 tempest-VolumesV1ActionsTest-310692007 tempest-VolumesV1ActionsTest-1175605846] Lock "xenstore-4ac30f2c-18b7-42f7-8407-97f68e4ec552" released by "update_hostname" :: held 0.008s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:57:40.445 DEBUG nova.virt.xenapi.vmops [req-4047e9a3-15f1-4d8c-adbe-3c5cf65b6288 tempest-VolumesV1ActionsTest-310692007 tempest-VolumesV1ActionsTest-1175605846] [instance: 4ac30f2c-18b7-42f7-8407-97f68e4ec552] Updating progress to 90 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:57:40.818 DEBUG nova.virt.xenapi.vmops [req-4047e9a3-15f1-4d8c-adbe-3c5cf65b6288 tempest-VolumesV1ActionsTest-310692007 tempest-VolumesV1ActionsTest-1175605846] [instance: 4ac30f2c-18b7-42f7-8407-97f68e4ec552] Updating progress to 100 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:57:43.564 DEBUG nova.compute.manager [req-4047e9a3-15f1-4d8c-adbe-3c5cf65b6288 tempest-VolumesV1ActionsTest-310692007 tempest-VolumesV1ActionsTest-1175605846] [instance: 4ac30f2c-18b7-42f7-8407-97f68e4ec552] Checking state _get_power_state /opt/stack/new/nova/nova/compute/manager.py:1296 2015-08-07 17:57:46.802 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 8.30 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:57:46.929 DEBUG oslo_concurrency.lockutils [req-4047e9a3-15f1-4d8c-adbe-3c5cf65b6288 tempest-VolumesV1ActionsTest-310692007 tempest-VolumesV1ActionsTest-1175605846] Lock "4ac30f2c-18b7-42f7-8407-97f68e4ec552" released by "_locked_do_build_and_run_instance" :: held 31.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:57:46.930 13318 DEBUG oslo_concurrency.lockutils [-] Lock "4ac30f2c-18b7-42f7-8407-97f68e4ec552" acquired by "query_driver_power_state_and_sync" :: waited 20.352s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:57:46.930 13318 INFO nova.compute.manager [-] [instance: 4ac30f2c-18b7-42f7-8407-97f68e4ec552] During sync_power_state the instance has a pending task (spawning). Skip. 
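The 20.352s wait reported just above is the per-instance lock at work: "_locked_do_build_and_run_instance" held lock "4ac30f2c-18b7-42f7-8407-97f68e4ec552" for 31.000s, so the power-state sync triggered at 17:57:26 could not run "query_driver_power_state_and_sync" until the build released it. The 'acquired by ... :: waited' / 'released by ... :: held' pairs throughout this log are emitted by the wrapper that oslo.concurrency's synchronized decorator puts around the target function (lockutils.py:251/262 above). A minimal sketch of that pattern, not nova's actual code and with a placeholder body:

    from oslo_concurrency import lockutils

    # The lock name matches the instance UUID used in this log; the body is
    # a placeholder. Calling the decorated function logs the same
    # 'acquired by' / 'released by' lines seen above (at DEBUG level).
    @lockutils.synchronized('4ac30f2c-18b7-42f7-8407-97f68e4ec552')
    def query_driver_power_state_and_sync():
        pass  # in the log: waited 20.352s behind the build, held for 0.001s

    query_driver_power_state_and_sync()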
2015-08-07 17:57:46.931 13318 DEBUG oslo_concurrency.lockutils [-] Lock "4ac30f2c-18b7-42f7-8407-97f68e4ec552" released by "query_driver_power_state_and_sync" :: held 0.001s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:57:52.605 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:57:53.143 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 0.37 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:57:53.516 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:57:53.517 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 1.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:57:54.523 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:57:54.524 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:57:54.525 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Starting heal instance info cache _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5315 2015-08-07 17:57:54.525 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Rebuilding the list of instances to heal _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5319 2015-08-07 17:57:54.603 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Acquired semaphore "refresh_cache-4ac30f2c-18b7-42f7-8407-97f68e4ec552" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:197 2015-08-07 17:57:54.608 DEBUG nova.objects.instance [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lazy-loading `flavor' on Instance uuid 4ac30f2c-18b7-42f7-8407-97f68e4ec552 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:57:54.934 DEBUG nova.network.base_api [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: 4ac30f2c-18b7-42f7-8407-97f68e4ec552] Updating instance_info_cache with network_info: [VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.2'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': 
None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:77:93:0a', 'active': False, 'type': u'bridge', 'id': u'b5baf029-076b-44bc-b96d-6e7db126821a', 'qbg_params': None})] update_instance_cache_with_nw_info /opt/stack/new/nova/nova/network/base_api.py:43 2015-08-07 17:57:54.967 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Releasing semaphore "refresh_cache-4ac30f2c-18b7-42f7-8407-97f68e4ec552" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:210 2015-08-07 17:57:54.968 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: 4ac30f2c-18b7-42f7-8407-97f68e4ec552] Updated the network info_cache for instance _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5376 2015-08-07 17:57:54.969 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 9.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:57:56.795 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 8.30 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:58:03.969 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:58:03.969 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /opt/stack/new/nova/nova/compute/manager.py:5976 2015-08-07 17:58:03.970 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 0.55 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:58:04.523 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:58:04.525 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 2.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:58:06.524 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:58:06.776 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 1.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:58:07.064 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 8.04 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:58:07.776 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:58:07.831 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Auditing locally available compute resources for node localhost.localdomain 2015-08-07 17:58:07.832 DEBUG nova.virt.xenapi.host [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Updating host stats update_status /opt/stack/new/nova/nova/virt/xenapi/host.py:234 2015-08-07 17:58:08.116 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" acquired by "do_scan" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:58:08.117 DEBUG nova.virt.xenapi.vm_utils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Scanning SR OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99 do_scan /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1839 2015-08-07 17:58:09.042 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" released by "do_scan" :: held 0.925s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:58:09.301 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor: free VCPUs: 1 _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:524 2015-08-07 17:58:09.302 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor: assignable PCI devices: [] _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:532 2015-08-07 17:58:09.302 DEBUG nova.compute.resource_tracker 
[req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor/Node resource view: name=localhost.localdomain free_ram=922MB free_disk=14GB free_vcpus=1 pci_devices=[] _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:546 2015-08-07 17:58:09.303 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "compute_resources" acquired by "_update_available_resource" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:58:09.518 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Total usable vcpus: 8, total allocated vcpus: 7 2015-08-07 17:58:09.519 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Final resource view: name=localhost.localdomain phys_ram=8187MB used_ram=581MB phys_disk=27GB used_disk=0GB total_vcpus=8 used_vcpus=7 pci_stats=None 2015-08-07 17:58:09.858 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Compute_service record updated for devstack:localhost.localdomain 2015-08-07 17:58:09.858 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "compute_resources" released by "_update_available_resource" :: held 0.555s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:58:09.860 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 2.65 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:58:12.516 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:58:12.588 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 6.01 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:58:15.795 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.32 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:58:18.596 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:58:18.597 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 35.92 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:58:25.376 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.74 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:58:33.094 DEBUG oslo_concurrency.lockutils [req-5cffdb2d-0a89-4236-89f3-7ab02651200f tempest-VolumesV1ActionsTest-310692007 tempest-VolumesV1ActionsTest-1175605846] Lock "4ac30f2c-18b7-42f7-8407-97f68e4ec552" acquired by "do_terminate_instance" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:58:33.095 DEBUG oslo_concurrency.lockutils [req-5cffdb2d-0a89-4236-89f3-7ab02651200f 
tempest-VolumesV1ActionsTest-310692007 tempest-VolumesV1ActionsTest-1175605846] Lock "4ac30f2c-18b7-42f7-8407-97f68e4ec552-events" acquired by "_clear_events" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:58:33.096 DEBUG oslo_concurrency.lockutils [req-5cffdb2d-0a89-4236-89f3-7ab02651200f tempest-VolumesV1ActionsTest-310692007 tempest-VolumesV1ActionsTest-1175605846] Lock "4ac30f2c-18b7-42f7-8407-97f68e4ec552-events" released by "_clear_events" :: held 0.001s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:58:33.099 INFO nova.compute.manager [req-5cffdb2d-0a89-4236-89f3-7ab02651200f tempest-VolumesV1ActionsTest-310692007 tempest-VolumesV1ActionsTest-1175605846] [instance: 4ac30f2c-18b7-42f7-8407-97f68e4ec552] Terminating instance 2015-08-07 17:58:33.107 INFO nova.virt.xenapi.vmops [req-5cffdb2d-0a89-4236-89f3-7ab02651200f tempest-VolumesV1ActionsTest-310692007 tempest-VolumesV1ActionsTest-1175605846] [instance: 4ac30f2c-18b7-42f7-8407-97f68e4ec552] Destroying VM 2015-08-07 17:58:33.130 DEBUG nova.virt.xenapi.vm_utils [req-5cffdb2d-0a89-4236-89f3-7ab02651200f tempest-VolumesV1ActionsTest-310692007 tempest-VolumesV1ActionsTest-1175605846] [instance: 4ac30f2c-18b7-42f7-8407-97f68e4ec552] Shutting down VM (hard) hard_shutdown_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:348 2015-08-07 17:58:35.276 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.84 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:58:37.012 DEBUG nova.virt.xenapi.vmops [req-5cffdb2d-0a89-4236-89f3-7ab02651200f tempest-VolumesV1ActionsTest-310692007 tempest-VolumesV1ActionsTest-1175605846] [instance: 4ac30f2c-18b7-42f7-8407-97f68e4ec552] Destroying VDIs _destroy_vdis /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1428 2015-08-07 17:58:37.032 DEBUG nova.virt.xenapi.vm_utils [req-5cffdb2d-0a89-4236-89f3-7ab02651200f tempest-VolumesV1ActionsTest-310692007 tempest-VolumesV1ActionsTest-1175605846] VDI e799030f-1ebc-4b19-b94e-4f9caaa73dd2 is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:58:37.061 DEBUG nova.virt.xenapi.vm_utils [req-5cffdb2d-0a89-4236-89f3-7ab02651200f tempest-VolumesV1ActionsTest-310692007 tempest-VolumesV1ActionsTest-1175605846] VDI 36f5769d-53e1-47ef-aa62-0a81cd1a7ac5 is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 17:58:38.580 DEBUG nova.virt.xenapi.vmops [req-5cffdb2d-0a89-4236-89f3-7ab02651200f tempest-VolumesV1ActionsTest-310692007 tempest-VolumesV1ActionsTest-1175605846] [instance: 4ac30f2c-18b7-42f7-8407-97f68e4ec552] Using RAW or VHD, skipping kernel and ramdisk deletion _destroy_kernel_ramdisk /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1456 2015-08-07 17:58:38.595 DEBUG nova.virt.xenapi.vm_utils [req-5cffdb2d-0a89-4236-89f3-7ab02651200f tempest-VolumesV1ActionsTest-310692007 tempest-VolumesV1ActionsTest-1175605846] [instance: 4ac30f2c-18b7-42f7-8407-97f68e4ec552] VM destroyed destroy_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:324 2015-08-07 17:58:38.596 DEBUG nova.compute.manager [req-5cffdb2d-0a89-4236-89f3-7ab02651200f tempest-VolumesV1ActionsTest-310692007 tempest-VolumesV1ActionsTest-1175605846] [instance: 4ac30f2c-18b7-42f7-8407-97f68e4ec552] Deallocating network for instance _deallocate_network /opt/stack/new/nova/nova/compute/manager.py:1792 2015-08-07 17:58:41.664 DEBUG 
nova.compute.manager [req-5cffdb2d-0a89-4236-89f3-7ab02651200f tempest-VolumesV1ActionsTest-310692007 tempest-VolumesV1ActionsTest-1175605846] [instance: 4ac30f2c-18b7-42f7-8407-97f68e4ec552] terminating bdm BlockDeviceMapping(boot_index=0,connection_info=None,created_at=2015-08-07T17:57:15Z,delete_on_termination=True,deleted=False,deleted_at=None,destination_type='local',device_name='/dev/xvda',device_type='disk',disk_bus=None,guest_format=None,id=83,image_id='a69d8d55-1745-492f-be26-cc75d64fc94d',instance=,instance_uuid=4ac30f2c-18b7-42f7-8407-97f68e4ec552,no_device=False,snapshot_id=None,source_type='image',updated_at=2015-08-07T17:57:16Z,volume_id=None,volume_size=None) _cleanup_volumes /opt/stack/new/nova/nova/compute/manager.py:2272 2015-08-07 17:58:41.908 DEBUG oslo_concurrency.lockutils [req-5cffdb2d-0a89-4236-89f3-7ab02651200f tempest-VolumesV1ActionsTest-310692007 tempest-VolumesV1ActionsTest-1175605846] Lock "compute_resources" acquired by "update_usage" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:58:41.909 DEBUG nova.objects.instance [req-5cffdb2d-0a89-4236-89f3-7ab02651200f tempest-VolumesV1ActionsTest-310692007 tempest-VolumesV1ActionsTest-1175605846] Lazy-loading `numa_topology' on Instance uuid 4ac30f2c-18b7-42f7-8407-97f68e4ec552 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:58:42.058 DEBUG oslo_concurrency.lockutils [req-5cffdb2d-0a89-4236-89f3-7ab02651200f tempest-VolumesV1ActionsTest-310692007 tempest-VolumesV1ActionsTest-1175605846] Lock "compute_resources" released by "update_usage" :: held 0.150s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:58:42.460 DEBUG oslo_concurrency.lockutils [req-5cffdb2d-0a89-4236-89f3-7ab02651200f tempest-VolumesV1ActionsTest-310692007 tempest-VolumesV1ActionsTest-1175605846] Lock "4ac30f2c-18b7-42f7-8407-97f68e4ec552" released by "do_terminate_instance" :: held 9.366s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:58:45.241 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.87 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:58:54.516 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:58:54.517 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 0.01 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:58:54.526 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:58:54.527 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:58:54.527 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 1.00 seconds _run_loop 
/usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:58:55.205 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.91 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:58:55.524 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:58:55.525 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Starting heal instance info cache _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5315 2015-08-07 17:58:55.525 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Rebuilding the list of instances to heal _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5319 2015-08-07 17:58:55.576 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Didn't find any instances for network info cache update. _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5386 2015-08-07 17:58:55.577 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 9.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:59:04.579 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:59:04.589 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 0.94 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:59:05.218 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.90 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:59:05.534 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:59:05.551 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /opt/stack/new/nova/nova/compute/manager.py:5976 2015-08-07 17:59:05.552 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 1.97 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:59:07.524 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:59:07.565 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Auditing locally available compute resources for node localhost.localdomain 2015-08-07 17:59:07.565 DEBUG nova.virt.xenapi.host [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Updating host stats update_status /opt/stack/new/nova/nova/virt/xenapi/host.py:234 2015-08-07 17:59:07.778 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" acquired by "do_scan" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:59:07.779 DEBUG nova.virt.xenapi.vm_utils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Scanning SR OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99 do_scan /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1839 2015-08-07 17:59:08.222 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" released by "do_scan" :: held 0.444s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:59:08.440 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor: free VCPUs: 2 _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:524 2015-08-07 17:59:08.441 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor: assignable PCI devices: [] _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:532 2015-08-07 17:59:08.441 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor/Node resource view: name=localhost.localdomain free_ram=989MB free_disk=13GB free_vcpus=2 pci_devices=[] _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:546 2015-08-07 17:59:08.442 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "compute_resources" acquired by "_update_available_resource" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:59:08.611 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Total usable vcpus: 8, total allocated vcpus: 6 2015-08-07 17:59:08.611 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Final resource view: name=localhost.localdomain phys_ram=8187MB used_ram=512MB phys_disk=27GB used_disk=0GB total_vcpus=8 used_vcpus=6 pci_stats=None 2015-08-07 17:59:08.764 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Compute_service record updated for devstack:localhost.localdomain 2015-08-07 17:59:08.765 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock 
"compute_resources" released by "_update_available_resource" :: held 0.323s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:59:08.765 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:59:08.766 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 12.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:59:10.986 DEBUG oslo_concurrency.lockutils [req-74ece255-e6a6-47e1-aa76-2d9d59bc9a63 tempest-VolumesV2ActionsTest-495177293 tempest-VolumesV2ActionsTest-888667377] Lock "80959541-071e-4ff3-a3d8-1128f1c8c5be" acquired by "_locked_do_build_and_run_instance" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:59:11.083 INFO nova.compute.manager [req-74ece255-e6a6-47e1-aa76-2d9d59bc9a63 tempest-VolumesV2ActionsTest-495177293 tempest-VolumesV2ActionsTest-888667377] [instance: 80959541-071e-4ff3-a3d8-1128f1c8c5be] Starting instance... 2015-08-07 17:59:11.476 DEBUG oslo_concurrency.lockutils [req-74ece255-e6a6-47e1-aa76-2d9d59bc9a63 tempest-VolumesV2ActionsTest-495177293 tempest-VolumesV2ActionsTest-888667377] Lock "compute_resources" acquired by "instance_claim" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:59:11.477 DEBUG nova.compute.resource_tracker [req-74ece255-e6a6-47e1-aa76-2d9d59bc9a63 tempest-VolumesV2ActionsTest-495177293 tempest-VolumesV2ActionsTest-888667377] Memory overhead for 64 MB instance; 5 MB instance_claim /opt/stack/new/nova/nova/compute/resource_tracker.py:126 2015-08-07 17:59:11.486 INFO nova.compute.claims [req-74ece255-e6a6-47e1-aa76-2d9d59bc9a63 tempest-VolumesV2ActionsTest-495177293 tempest-VolumesV2ActionsTest-888667377] [instance: 80959541-071e-4ff3-a3d8-1128f1c8c5be] Attempting claim: memory 69 MB, disk 0 GB 2015-08-07 17:59:11.487 INFO nova.compute.claims [req-74ece255-e6a6-47e1-aa76-2d9d59bc9a63 tempest-VolumesV2ActionsTest-495177293 tempest-VolumesV2ActionsTest-888667377] [instance: 80959541-071e-4ff3-a3d8-1128f1c8c5be] Total memory: 8187 MB, used: 512.00 MB 2015-08-07 17:59:11.487 INFO nova.compute.claims [req-74ece255-e6a6-47e1-aa76-2d9d59bc9a63 tempest-VolumesV2ActionsTest-495177293 tempest-VolumesV2ActionsTest-888667377] [instance: 80959541-071e-4ff3-a3d8-1128f1c8c5be] memory limit: 12280.50 MB, free: 11768.50 MB 2015-08-07 17:59:11.488 INFO nova.compute.claims [req-74ece255-e6a6-47e1-aa76-2d9d59bc9a63 tempest-VolumesV2ActionsTest-495177293 tempest-VolumesV2ActionsTest-888667377] [instance: 80959541-071e-4ff3-a3d8-1128f1c8c5be] Total disk: 27 GB, used: 0.00 GB 2015-08-07 17:59:11.488 INFO nova.compute.claims [req-74ece255-e6a6-47e1-aa76-2d9d59bc9a63 tempest-VolumesV2ActionsTest-495177293 tempest-VolumesV2ActionsTest-888667377] [instance: 80959541-071e-4ff3-a3d8-1128f1c8c5be] disk limit not specified, defaulting to unlimited 2015-08-07 17:59:11.525 DEBUG nova.compute.resources.vcpu [req-74ece255-e6a6-47e1-aa76-2d9d59bc9a63 tempest-VolumesV2ActionsTest-495177293 tempest-VolumesV2ActionsTest-888667377] Total CPUs: 8 VCPUs, used: 0.00 VCPUs test /opt/stack/new/nova/nova/compute/resources/vcpu.py:52 2015-08-07 17:59:11.532 DEBUG nova.compute.resources.vcpu 
[req-74ece255-e6a6-47e1-aa76-2d9d59bc9a63 tempest-VolumesV2ActionsTest-495177293 tempest-VolumesV2ActionsTest-888667377] CPUs limit not specified, defaulting to unlimited test /opt/stack/new/nova/nova/compute/resources/vcpu.py:56 2015-08-07 17:59:11.533 INFO nova.compute.claims [req-74ece255-e6a6-47e1-aa76-2d9d59bc9a63 tempest-VolumesV2ActionsTest-495177293 tempest-VolumesV2ActionsTest-888667377] [instance: 80959541-071e-4ff3-a3d8-1128f1c8c5be] Claim successful 2015-08-07 17:59:12.555 DEBUG oslo_concurrency.lockutils [req-74ece255-e6a6-47e1-aa76-2d9d59bc9a63 tempest-VolumesV2ActionsTest-495177293 tempest-VolumesV2ActionsTest-888667377] Lock "compute_resources" released by "instance_claim" :: held 1.079s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:59:12.951 DEBUG oslo_concurrency.lockutils [req-74ece255-e6a6-47e1-aa76-2d9d59bc9a63 tempest-VolumesV2ActionsTest-495177293 tempest-VolumesV2ActionsTest-888667377] Lock "compute_resources" acquired by "update_usage" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:59:13.131 DEBUG oslo_concurrency.lockutils [req-74ece255-e6a6-47e1-aa76-2d9d59bc9a63 tempest-VolumesV2ActionsTest-495177293 tempest-VolumesV2ActionsTest-888667377] Lock "compute_resources" released by "update_usage" :: held 0.180s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:59:13.134 DEBUG nova.compute.utils [req-74ece255-e6a6-47e1-aa76-2d9d59bc9a63 tempest-VolumesV2ActionsTest-495177293 tempest-VolumesV2ActionsTest-888667377] Using /dev/xvd instead of None get_next_device_name /opt/stack/new/nova/nova/compute/utils.py:163 2015-08-07 17:59:13.142 13318 DEBUG nova.compute.manager [-] [instance: 80959541-071e-4ff3-a3d8-1128f1c8c5be] Allocating IP information in the background. 
_allocate_network_async /opt/stack/new/nova/nova/compute/manager.py:1513 2015-08-07 17:59:13.144 13318 DEBUG oslo_concurrency.lockutils [-] Acquired semaphore "refresh_cache-80959541-071e-4ff3-a3d8-1128f1c8c5be" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:197 2015-08-07 17:59:14.034 DEBUG nova.virt.xenapi.vmops [req-74ece255-e6a6-47e1-aa76-2d9d59bc9a63 tempest-VolumesV2ActionsTest-495177293 tempest-VolumesV2ActionsTest-888667377] [instance: 80959541-071e-4ff3-a3d8-1128f1c8c5be] Block device information present: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} spawn /opt/stack/new/nova/nova/virt/xenapi/vmops.py:408 2015-08-07 17:59:14.048 DEBUG nova.virt.xenapi.vm_utils [req-74ece255-e6a6-47e1-aa76-2d9d59bc9a63 tempest-VolumesV2ActionsTest-495177293 tempest-VolumesV2ActionsTest-888667377] Detected vhd format for image a69d8d55-1745-492f-be26-cc75d64fc94d determine_disk_image_type /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1633 2015-08-07 17:59:14.049 DEBUG nova.virt.xenapi.vmops [req-74ece255-e6a6-47e1-aa76-2d9d59bc9a63 tempest-VolumesV2ActionsTest-495177293 tempest-VolumesV2ActionsTest-888667377] [instance: 80959541-071e-4ff3-a3d8-1128f1c8c5be] Updating progress to 10 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:59:14.521 DEBUG nova.virt.xenapi.vmops [req-74ece255-e6a6-47e1-aa76-2d9d59bc9a63 tempest-VolumesV2ActionsTest-495177293 tempest-VolumesV2ActionsTest-888667377] [instance: 80959541-071e-4ff3-a3d8-1128f1c8c5be] block device info: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} _connect_cinder_volumes /opt/stack/new/nova/nova/virt/xenapi/vmops.py:376 2015-08-07 17:59:14.538 DEBUG oslo_concurrency.lockutils [req-74ece255-e6a6-47e1-aa76-2d9d59bc9a63 tempest-VolumesV2ActionsTest-495177293 tempest-VolumesV2ActionsTest-888667377] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" acquired by "_create_cached_image_impl" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:59:15.185 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.94 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:59:16.253 DEBUG nova.virt.xenapi.vm_utils [req-74ece255-e6a6-47e1-aa76-2d9d59bc9a63 tempest-VolumesV2ActionsTest-495177293 tempest-VolumesV2ActionsTest-888667377] Cloned VDI OpaqueRef:50bfe0e2-56f1-a1f2-6071-2b10046de2a5 from VDI OpaqueRef:f392c64e-7886-a0cc-d3a0-780674167307 _clone_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:571 2015-08-07 17:59:16.886 13318 DEBUG nova.network.base_api [-] [instance: 80959541-071e-4ff3-a3d8-1128f1c8c5be] Updating instance_info_cache with network_info: [VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.2'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 
'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:1d:bc:fc', 'active': False, 'type': u'bridge', 'id': u'5db5e76e-d799-49e6-a10e-6110769e1550', 'qbg_params': None})] update_instance_cache_with_nw_info /opt/stack/new/nova/nova/network/base_api.py:43 2015-08-07 17:59:16.922 13318 DEBUG oslo_concurrency.lockutils [-] Releasing semaphore "refresh_cache-80959541-071e-4ff3-a3d8-1128f1c8c5be" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:210 2015-08-07 17:59:16.922 13318 DEBUG nova.compute.manager [-] [instance: 80959541-071e-4ff3-a3d8-1128f1c8c5be] Instance network_info: |[VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.2'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:1d:bc:fc', 'active': False, 'type': u'bridge', 'id': u'5db5e76e-d799-49e6-a10e-6110769e1550', 'qbg_params': None})]| _allocate_network_async /opt/stack/new/nova/nova/compute/manager.py:1531 2015-08-07 17:59:17.074 DEBUG oslo_concurrency.lockutils [req-74ece255-e6a6-47e1-aa76-2d9d59bc9a63 tempest-VolumesV2ActionsTest-495177293 tempest-VolumesV2ActionsTest-888667377] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" released by "_create_cached_image_impl" :: held 2.536s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:59:17.075 INFO nova.virt.xenapi.vm_utils [req-74ece255-e6a6-47e1-aa76-2d9d59bc9a63 tempest-VolumesV2ActionsTest-495177293 tempest-VolumesV2ActionsTest-888667377] Image creation data, cacheable: True, downloaded: False duration: 2.55 secs for image a69d8d55-1745-492f-be26-cc75d64fc94d 2015-08-07 17:59:18.132 DEBUG nova.virt.xenapi.vmops [req-74ece255-e6a6-47e1-aa76-2d9d59bc9a63 tempest-VolumesV2ActionsTest-495177293 tempest-VolumesV2ActionsTest-888667377] [instance: 80959541-071e-4ff3-a3d8-1128f1c8c5be] Updating progress to 20 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:59:18.425 DEBUG nova.virt.xenapi.vmops [req-74ece255-e6a6-47e1-aa76-2d9d59bc9a63 tempest-VolumesV2ActionsTest-495177293 tempest-VolumesV2ActionsTest-888667377] [instance: 80959541-071e-4ff3-a3d8-1128f1c8c5be] Updating progress to 30 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:59:18.809 DEBUG nova.virt.xenapi.vmops [req-74ece255-e6a6-47e1-aa76-2d9d59bc9a63 tempest-VolumesV2ActionsTest-495177293 tempest-VolumesV2ActionsTest-888667377] [instance: 
80959541-071e-4ff3-a3d8-1128f1c8c5be] Using PV kernel: True _create_vm_record /opt/stack/new/nova/nova/virt/xenapi/vmops.py:679 2015-08-07 17:59:18.835 DEBUG nova.virt.xenapi.vm_utils [req-74ece255-e6a6-47e1-aa76-2d9d59bc9a63 tempest-VolumesV2ActionsTest-495177293 tempest-VolumesV2ActionsTest-888667377] [instance: 80959541-071e-4ff3-a3d8-1128f1c8c5be] Created VM create_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:312 2015-08-07 17:59:18.835 DEBUG nova.virt.xenapi.vmops [req-74ece255-e6a6-47e1-aa76-2d9d59bc9a63 tempest-VolumesV2ActionsTest-495177293 tempest-VolumesV2ActionsTest-888667377] [instance: 80959541-071e-4ff3-a3d8-1128f1c8c5be] Updating progress to 40 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 17:59:19.163 DEBUG nova.virt.xenapi.vm_utils [req-74ece255-e6a6-47e1-aa76-2d9d59bc9a63 tempest-VolumesV2ActionsTest-495177293 tempest-VolumesV2ActionsTest-888667377] Creating disk-type VBD for VM OpaqueRef:64170bed-5918-bc35-f68e-23640ca22996, VDI OpaqueRef:50bfe0e2-56f1-a1f2-6071-2b10046de2a5 ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:59:19.175 DEBUG nova.virt.xenapi.vm_utils [req-74ece255-e6a6-47e1-aa76-2d9d59bc9a63 tempest-VolumesV2ActionsTest-495177293 tempest-VolumesV2ActionsTest-888667377] Created VBD OpaqueRef:523f2613-d8b3-55f4-1fe3-75282fc691a2 for VM OpaqueRef:64170bed-5918-bc35-f68e-23640ca22996, VDI OpaqueRef:50bfe0e2-56f1-a1f2-6071-2b10046de2a5. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:59:20.434 DEBUG nova.virt.xenapi.vm_utils [req-74ece255-e6a6-47e1-aa76-2d9d59bc9a63 tempest-VolumesV2ActionsTest-495177293 tempest-VolumesV2ActionsTest-888667377] Created VDI OpaqueRef:8e715b06-852b-3f4d-b6df-75d9386cc095 (config-2, 67108864, False) on OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99. create_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:509 2015-08-07 17:59:20.441 DEBUG nova.virt.xenapi.vm_utils [req-74ece255-e6a6-47e1-aa76-2d9d59bc9a63 tempest-VolumesV2ActionsTest-495177293 tempest-VolumesV2ActionsTest-888667377] Creating disk-type VBD for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:8e715b06-852b-3f4d-b6df-75d9386cc095 ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 17:59:20.454 DEBUG nova.virt.xenapi.vm_utils [req-74ece255-e6a6-47e1-aa76-2d9d59bc9a63 tempest-VolumesV2ActionsTest-495177293 tempest-VolumesV2ActionsTest-888667377] Created VBD OpaqueRef:31216a8c-0940-3d7a-9cf9-f0634bd9cf0d for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:8e715b06-852b-3f4d-b6df-75d9386cc095. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 17:59:20.455 DEBUG nova.virt.xenapi.vm_utils [req-74ece255-e6a6-47e1-aa76-2d9d59bc9a63 tempest-VolumesV2ActionsTest-495177293 tempest-VolumesV2ActionsTest-888667377] Plugging VBD OpaqueRef:31216a8c-0940-3d7a-9cf9-f0634bd9cf0d ... 
vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2191 2015-08-07 17:59:20.456 DEBUG oslo_concurrency.lockutils [req-74ece255-e6a6-47e1-aa76-2d9d59bc9a63 tempest-VolumesV2ActionsTest-495177293 tempest-VolumesV2ActionsTest-888667377] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_plug" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 17:59:20.768 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:59:20.769 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 33.75 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:59:22.150 DEBUG oslo_concurrency.lockutils [req-74ece255-e6a6-47e1-aa76-2d9d59bc9a63 tempest-VolumesV2ActionsTest-495177293 tempest-VolumesV2ActionsTest-888667377] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_plug" :: held 1.695s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 17:59:22.151 DEBUG nova.virt.xenapi.vm_utils [req-74ece255-e6a6-47e1-aa76-2d9d59bc9a63 tempest-VolumesV2ActionsTest-495177293 tempest-VolumesV2ActionsTest-888667377] Plugging VBD OpaqueRef:31216a8c-0940-3d7a-9cf9-f0634bd9cf0d done. vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2194 2015-08-07 17:59:22.155 DEBUG nova.virt.xenapi.vm_utils [req-74ece255-e6a6-47e1-aa76-2d9d59bc9a63 tempest-VolumesV2ActionsTest-495177293 tempest-VolumesV2ActionsTest-888667377] VBD OpaqueRef:31216a8c-0940-3d7a-9cf9-f0634bd9cf0d plugged as xvdc vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2197 2015-08-07 17:59:22.253 WARNING nova.virt.configdrive [req-74ece255-e6a6-47e1-aa76-2d9d59bc9a63 tempest-VolumesV2ActionsTest-495177293 tempest-VolumesV2ActionsTest-888667377] The setting "always" will be deprecated in the Liberty version. 
Please use "True" instead 2015-08-07 17:59:22.254 DEBUG nova.objects.instance [req-74ece255-e6a6-47e1-aa76-2d9d59bc9a63 tempest-VolumesV2ActionsTest-495177293 tempest-VolumesV2ActionsTest-888667377] Lazy-loading `ec2_ids' on Instance uuid 80959541-071e-4ff3-a3d8-1128f1c8c5be obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 17:59:22.294 DEBUG oslo_concurrency.processutils [req-74ece255-e6a6-47e1-aa76-2d9d59bc9a63 tempest-VolumesV2ActionsTest-495177293 tempest-VolumesV2ActionsTest-888667377] Running cmd (subprocess): genisoimage -o /tmp/tmpwRozj2/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmpvFLq81 execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:59:22.461 DEBUG oslo_concurrency.processutils [req-74ece255-e6a6-47e1-aa76-2d9d59bc9a63 tempest-VolumesV2ActionsTest-495177293 tempest-VolumesV2ActionsTest-888667377] CMD "genisoimage -o /tmp/tmpwRozj2/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmpvFLq81" returned: 0 in 0.167s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:59:22.467 DEBUG oslo_concurrency.processutils [req-74ece255-e6a6-47e1-aa76-2d9d59bc9a63 tempest-VolumesV2ActionsTest-495177293 tempest-VolumesV2ActionsTest-888667377] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmpwRozj2/configdrive of=/dev/xvdc oflag=direct,sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:59:34.129 DEBUG oslo_concurrency.processutils [req-74ece255-e6a6-47e1-aa76-2d9d59bc9a63 tempest-VolumesV2ActionsTest-495177293 tempest-VolumesV2ActionsTest-888667377] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmpwRozj2/configdrive of=/dev/xvdc oflag=direct,sync" returned: 0 in 11.662s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 17:59:34.132 DEBUG oslo_concurrency.processutils [req-74ece255-e6a6-47e1-aa76-2d9d59bc9a63 tempest-VolumesV2ActionsTest-495177293 tempest-VolumesV2ActionsTest-888667377] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 17:59:54.527 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:59:54.530 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:59:54.531 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 1.98 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:59:56.517 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:59:56.518 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > 
sleeping for 0.01 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 17:59:56.524 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 17:59:56.525 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Starting heal instance info cache _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5315 2015-08-07 17:59:56.525 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Rebuilding the list of instances to heal _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5319 2015-08-07 17:59:56.597 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: 80959541-071e-4ff3-a3d8-1128f1c8c5be] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5329 2015-08-07 17:59:56.597 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Didn't find any instances for network info cache update. _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5386 2015-08-07 17:59:56.598 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 9.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:00:05.599 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:00:05.600 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /opt/stack/new/nova/nova/compute/manager.py:5976 2015-08-07 18:00:05.600 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 0.92 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:00:06.524 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:00:06.525 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 1.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:00:07.524 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:00:07.525 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 2.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:00:09.526 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:00:09.570 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Auditing locally available compute resources for node localhost.localdomain 2015-08-07 18:00:09.571 DEBUG nova.virt.xenapi.host [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Updating host stats update_status /opt/stack/new/nova/nova/virt/xenapi/host.py:234 2015-08-07 18:00:09.810 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" acquired by "do_scan" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 18:00:09.811 DEBUG nova.virt.xenapi.vm_utils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Scanning SR OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99 do_scan /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1839 2015-08-07 18:00:10.731 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" released by "do_scan" :: held 0.921s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 18:00:10.990 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor: free VCPUs: 2 _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:524 2015-08-07 18:00:10.990 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor: assignable PCI devices: [] _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:532 2015-08-07 18:00:10.991 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor/Node resource view: name=localhost.localdomain free_ram=989MB free_disk=12GB free_vcpus=2 pci_devices=[] _report_hypervisor_resource_view 
/opt/stack/new/nova/nova/compute/resource_tracker.py:546 2015-08-07 18:00:10.992 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "compute_resources" acquired by "_update_available_resource" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 18:00:11.230 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Total usable vcpus: 8, total allocated vcpus: 6 2015-08-07 18:00:11.231 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Final resource view: name=localhost.localdomain phys_ram=8187MB used_ram=581MB phys_disk=27GB used_disk=0GB total_vcpus=8 used_vcpus=6 pci_stats=None 2015-08-07 18:00:11.498 13318 WARNING oslo_service.loopingcall [-] Function > run outlasted interval by 36.38 sec 2015-08-07 18:00:11.499 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 0.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:00:11.552 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Compute_service record updated for devstack:localhost.localdomain 2015-08-07 18:00:11.553 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "compute_resources" released by "_update_available_resource" :: held 0.561s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 18:00:11.554 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 4.96 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:00:11.580 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.92 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:00:12.069 DEBUG oslo_concurrency.processutils [req-74ece255-e6a6-47e1-aa76-2d9d59bc9a63 tempest-VolumesV2ActionsTest-495177293 tempest-VolumesV2ActionsTest-888667377] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf sync" returned: 0 in 37.937s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 18:00:12.070 DEBUG nova.virt.xenapi.vm_utils [req-74ece255-e6a6-47e1-aa76-2d9d59bc9a63 tempest-VolumesV2ActionsTest-495177293 tempest-VolumesV2ActionsTest-888667377] Destroying VBD for VDI OpaqueRef:8e715b06-852b-3f4d-b6df-75d9386cc095 ... 
vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2207 2015-08-07 18:00:12.071 DEBUG oslo_concurrency.lockutils [req-74ece255-e6a6-47e1-aa76-2d9d59bc9a63 tempest-VolumesV2ActionsTest-495177293 tempest-VolumesV2ActionsTest-888667377] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_unplug" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 18:00:13.014 DEBUG oslo_concurrency.lockutils [req-74ece255-e6a6-47e1-aa76-2d9d59bc9a63 tempest-VolumesV2ActionsTest-495177293 tempest-VolumesV2ActionsTest-888667377] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_unplug" :: held 0.943s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 18:00:13.029 DEBUG nova.virt.xenapi.vm_utils [req-74ece255-e6a6-47e1-aa76-2d9d59bc9a63 tempest-VolumesV2ActionsTest-495177293 tempest-VolumesV2ActionsTest-888667377] Destroying VBD for VDI OpaqueRef:8e715b06-852b-3f4d-b6df-75d9386cc095 done. vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2215 2015-08-07 18:00:13.033 DEBUG nova.virt.xenapi.vm_utils [req-74ece255-e6a6-47e1-aa76-2d9d59bc9a63 tempest-VolumesV2ActionsTest-495177293 tempest-VolumesV2ActionsTest-888667377] Creating disk-type VBD for VM OpaqueRef:64170bed-5918-bc35-f68e-23640ca22996, VDI OpaqueRef:8e715b06-852b-3f4d-b6df-75d9386cc095 ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 18:00:13.042 DEBUG nova.virt.xenapi.vm_utils [req-74ece255-e6a6-47e1-aa76-2d9d59bc9a63 tempest-VolumesV2ActionsTest-495177293 tempest-VolumesV2ActionsTest-888667377] Created VBD OpaqueRef:ec78fb10-3364-1e48-bcc6-513a07e28eef for VM OpaqueRef:64170bed-5918-bc35-f68e-23640ca22996, VDI OpaqueRef:8e715b06-852b-3f4d-b6df-75d9386cc095. 
create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 18:00:13.043 DEBUG nova.objects.instance [req-74ece255-e6a6-47e1-aa76-2d9d59bc9a63 tempest-VolumesV2ActionsTest-495177293 tempest-VolumesV2ActionsTest-888667377] Lazy-loading `pci_devices' on Instance uuid 80959541-071e-4ff3-a3d8-1128f1c8c5be obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 18:00:13.214 DEBUG nova.virt.xenapi.vmops [req-74ece255-e6a6-47e1-aa76-2d9d59bc9a63 tempest-VolumesV2ActionsTest-495177293 tempest-VolumesV2ActionsTest-888667377] [instance: 80959541-071e-4ff3-a3d8-1128f1c8c5be] Updating progress to 50 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 18:00:13.810 DEBUG oslo_concurrency.lockutils [req-74ece255-e6a6-47e1-aa76-2d9d59bc9a63 tempest-VolumesV2ActionsTest-495177293 tempest-VolumesV2ActionsTest-888667377] Lock "xenstore-80959541-071e-4ff3-a3d8-1128f1c8c5be" acquired by "store_meta" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 18:00:13.811 DEBUG oslo_concurrency.lockutils [req-74ece255-e6a6-47e1-aa76-2d9d59bc9a63 tempest-VolumesV2ActionsTest-495177293 tempest-VolumesV2ActionsTest-888667377] Lock "xenstore-80959541-071e-4ff3-a3d8-1128f1c8c5be" released by "store_meta" :: held 0.001s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 18:00:13.812 DEBUG oslo_concurrency.lockutils [req-74ece255-e6a6-47e1-aa76-2d9d59bc9a63 tempest-VolumesV2ActionsTest-495177293 tempest-VolumesV2ActionsTest-888667377] Lock "xenstore-80959541-071e-4ff3-a3d8-1128f1c8c5be" acquired by "store_auto_disk_config" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 18:00:13.831 DEBUG oslo_concurrency.lockutils [req-74ece255-e6a6-47e1-aa76-2d9d59bc9a63 tempest-VolumesV2ActionsTest-495177293 tempest-VolumesV2ActionsTest-888667377] Lock "xenstore-80959541-071e-4ff3-a3d8-1128f1c8c5be" released by "store_auto_disk_config" :: held 0.019s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 18:00:13.832 DEBUG nova.virt.xenapi.vmops [req-74ece255-e6a6-47e1-aa76-2d9d59bc9a63 tempest-VolumesV2ActionsTest-495177293 tempest-VolumesV2ActionsTest-888667377] [instance: 80959541-071e-4ff3-a3d8-1128f1c8c5be] Injecting hostname (tempest-volumesv2actionstest-instance-260706124) into xenstore _inject_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1940 2015-08-07 18:00:13.832 DEBUG oslo_concurrency.lockutils [req-74ece255-e6a6-47e1-aa76-2d9d59bc9a63 tempest-VolumesV2ActionsTest-495177293 tempest-VolumesV2ActionsTest-888667377] Lock "xenstore-80959541-071e-4ff3-a3d8-1128f1c8c5be" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 18:00:13.843 DEBUG oslo_concurrency.lockutils [req-74ece255-e6a6-47e1-aa76-2d9d59bc9a63 tempest-VolumesV2ActionsTest-495177293 tempest-VolumesV2ActionsTest-888667377] Lock "xenstore-80959541-071e-4ff3-a3d8-1128f1c8c5be" released by "update_hostname" :: held 0.010s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 18:00:13.843 DEBUG nova.virt.xenapi.vmops [req-74ece255-e6a6-47e1-aa76-2d9d59bc9a63 tempest-VolumesV2ActionsTest-495177293 tempest-VolumesV2ActionsTest-888667377] [instance: 80959541-071e-4ff3-a3d8-1128f1c8c5be] Injecting network info to xenstore inject_network_info /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1869 
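[editorial note] The repeated Lock "xenstore-<uuid>" acquired by "store_meta"/"update_hostname"/"update_nwinfo" :: waited ... / released ... :: held ... pairs above are emitted by oslo.concurrency's synchronized decorator, which nova uses to serialize xenstore writes per instance. A minimal, self-contained sketch of that locking pattern follows; the lock-name prefix, the function name update_nwinfo and its body are illustrative assumptions, not nova's actual implementation.

    # Sketch (assumption, not nova's code) of the per-instance lock pattern
    # behind the 'acquired by ... :: waited' / 'released ... :: held' lines.
    from oslo_concurrency import lockutils

    INSTANCE_UUID = '80959541-071e-4ff3-a3d8-1128f1c8c5be'  # uuid taken from the log above

    @lockutils.synchronized('xenstore-' + INSTANCE_UUID)
    def update_nwinfo():
        # Placeholder body: the real code writes the instance's network info
        # keys into xenstore while the per-instance semaphore is held.
        pass

    update_nwinfo()  # lockutils logs the acquire/release at DEBUG level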
2015-08-07 18:00:13.844 DEBUG oslo_concurrency.lockutils [req-74ece255-e6a6-47e1-aa76-2d9d59bc9a63 tempest-VolumesV2ActionsTest-495177293 tempest-VolumesV2ActionsTest-888667377] Lock "xenstore-80959541-071e-4ff3-a3d8-1128f1c8c5be" acquired by "update_nwinfo" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 18:00:14.046 DEBUG oslo_concurrency.lockutils [req-74ece255-e6a6-47e1-aa76-2d9d59bc9a63 tempest-VolumesV2ActionsTest-495177293 tempest-VolumesV2ActionsTest-888667377] Lock "xenstore-80959541-071e-4ff3-a3d8-1128f1c8c5be" released by "update_nwinfo" :: held 0.201s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 18:00:14.046 DEBUG nova.virt.xenapi.vmops [req-74ece255-e6a6-47e1-aa76-2d9d59bc9a63 tempest-VolumesV2ActionsTest-495177293 tempest-VolumesV2ActionsTest-888667377] [instance: 80959541-071e-4ff3-a3d8-1128f1c8c5be] Updating progress to 60 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 18:00:14.334 DEBUG nova.virt.xenapi.vmops [req-74ece255-e6a6-47e1-aa76-2d9d59bc9a63 tempest-VolumesV2ActionsTest-495177293 tempest-VolumesV2ActionsTest-888667377] [instance: 80959541-071e-4ff3-a3d8-1128f1c8c5be] Creating vifs _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1892 2015-08-07 18:00:14.342 DEBUG nova.virt.xenapi.vmops [req-74ece255-e6a6-47e1-aa76-2d9d59bc9a63 tempest-VolumesV2ActionsTest-495177293 tempest-VolumesV2ActionsTest-888667377] [instance: 80959541-071e-4ff3-a3d8-1128f1c8c5be] Creating VIF for network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1902 2015-08-07 18:00:14.350 DEBUG nova.virt.xenapi.vmops [req-74ece255-e6a6-47e1-aa76-2d9d59bc9a63 tempest-VolumesV2ActionsTest-495177293 tempest-VolumesV2ActionsTest-888667377] [instance: 80959541-071e-4ff3-a3d8-1128f1c8c5be] Created VIF OpaqueRef:24a3d589-010f-a501-e39b-42618f7b8508, network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1906 2015-08-07 18:00:14.351 DEBUG nova.virt.xenapi.vmops [req-74ece255-e6a6-47e1-aa76-2d9d59bc9a63 tempest-VolumesV2ActionsTest-495177293 tempest-VolumesV2ActionsTest-888667377] [instance: 80959541-071e-4ff3-a3d8-1128f1c8c5be] Updating progress to 70 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 18:00:14.889 DEBUG nova.virt.xenapi.vmops [req-74ece255-e6a6-47e1-aa76-2d9d59bc9a63 tempest-VolumesV2ActionsTest-495177293 tempest-VolumesV2ActionsTest-888667377] [instance: 80959541-071e-4ff3-a3d8-1128f1c8c5be] Starting instance _start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:327 2015-08-07 18:00:16.517 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:00:16.583 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 5.01 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:00:21.590 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:00:21.591 DEBUG oslo_service.loopingcall 
[req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 33.93 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:00:21.994 DEBUG nova.virt.xenapi.vmops [req-74ece255-e6a6-47e1-aa76-2d9d59bc9a63 tempest-VolumesV2ActionsTest-495177293 tempest-VolumesV2ActionsTest-888667377] [instance: 80959541-071e-4ff3-a3d8-1128f1c8c5be] Waiting for instance state to become running _wait_for_instance_to_start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:777 2015-08-07 18:00:22.618 DEBUG nova.virt.xenapi.vmops [req-74ece255-e6a6-47e1-aa76-2d9d59bc9a63 tempest-VolumesV2ActionsTest-495177293 tempest-VolumesV2ActionsTest-888667377] [instance: 80959541-071e-4ff3-a3d8-1128f1c8c5be] Updating progress to 80 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 18:00:22.911 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 8.59 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:00:23.159 DEBUG nova.virt.xenapi.vmops [req-74ece255-e6a6-47e1-aa76-2d9d59bc9a63 tempest-VolumesV2ActionsTest-495177293 tempest-VolumesV2ActionsTest-888667377] [instance: 80959541-071e-4ff3-a3d8-1128f1c8c5be] Skip agent setup, not enabled. _configure_new_instance_with_agent /opt/stack/new/nova/nova/virt/xenapi/vmops.py:788 2015-08-07 18:00:23.159 DEBUG nova.virt.xenapi.vmops [req-74ece255-e6a6-47e1-aa76-2d9d59bc9a63 tempest-VolumesV2ActionsTest-495177293 tempest-VolumesV2ActionsTest-888667377] [instance: 80959541-071e-4ff3-a3d8-1128f1c8c5be] Removing hostname from xenstore _remove_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1949 2015-08-07 18:00:23.160 DEBUG oslo_concurrency.lockutils [req-74ece255-e6a6-47e1-aa76-2d9d59bc9a63 tempest-VolumesV2ActionsTest-495177293 tempest-VolumesV2ActionsTest-888667377] Lock "xenstore-80959541-071e-4ff3-a3d8-1128f1c8c5be" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 18:00:23.166 DEBUG oslo_concurrency.lockutils [req-74ece255-e6a6-47e1-aa76-2d9d59bc9a63 tempest-VolumesV2ActionsTest-495177293 tempest-VolumesV2ActionsTest-888667377] Lock "xenstore-80959541-071e-4ff3-a3d8-1128f1c8c5be" released by "update_hostname" :: held 0.006s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 18:00:23.166 DEBUG nova.virt.xenapi.vmops [req-74ece255-e6a6-47e1-aa76-2d9d59bc9a63 tempest-VolumesV2ActionsTest-495177293 tempest-VolumesV2ActionsTest-888667377] [instance: 80959541-071e-4ff3-a3d8-1128f1c8c5be] Updating progress to 90 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 18:00:25.665 DEBUG nova.virt.xenapi.vmops [req-74ece255-e6a6-47e1-aa76-2d9d59bc9a63 tempest-VolumesV2ActionsTest-495177293 tempest-VolumesV2ActionsTest-888667377] [instance: 80959541-071e-4ff3-a3d8-1128f1c8c5be] Updating progress to 100 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 18:00:26.040 DEBUG nova.compute.manager [req-74ece255-e6a6-47e1-aa76-2d9d59bc9a63 tempest-VolumesV2ActionsTest-495177293 tempest-VolumesV2ActionsTest-888667377] [instance: 80959541-071e-4ff3-a3d8-1128f1c8c5be] Checking state _get_power_state /opt/stack/new/nova/nova/compute/manager.py:1296 2015-08-07 18:00:28.442 DEBUG oslo_concurrency.lockutils [req-74ece255-e6a6-47e1-aa76-2d9d59bc9a63 tempest-VolumesV2ActionsTest-495177293 
tempest-VolumesV2ActionsTest-888667377] Lock "80959541-071e-4ff3-a3d8-1128f1c8c5be" released by "_locked_do_build_and_run_instance" :: held 77.456s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 18:00:32.885 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 8.62 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:00:43.186 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 8.32 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:00:52.046 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.46 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:00:55.524 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:00:55.525 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:00:55.526 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 0.99 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:00:56.517 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:00:56.518 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 1.01 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:00:57.524 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:00:57.525 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Starting heal instance info cache _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5315 2015-08-07 18:00:57.526 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Rebuilding the list of instances to heal _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5319 2015-08-07 18:00:57.595 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Acquired semaphore "refresh_cache-80959541-071e-4ff3-a3d8-1128f1c8c5be" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:197 2015-08-07 18:00:57.595 DEBUG nova.objects.instance [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lazy-loading `flavor' on Instance uuid 80959541-071e-4ff3-a3d8-1128f1c8c5be obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 18:00:58.046 DEBUG nova.network.base_api [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: 80959541-071e-4ff3-a3d8-1128f1c8c5be] Updating instance_info_cache with network_info: [VIF({'profile': None, 
'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.2'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:1d:bc:fc', 'active': False, 'type': u'bridge', 'id': u'5db5e76e-d799-49e6-a10e-6110769e1550', 'qbg_params': None})] update_instance_cache_with_nw_info /opt/stack/new/nova/nova/network/base_api.py:43 2015-08-07 18:00:58.109 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Releasing semaphore "refresh_cache-80959541-071e-4ff3-a3d8-1128f1c8c5be" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:210 2015-08-07 18:00:58.111 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: 80959541-071e-4ff3-a3d8-1128f1c8c5be] Updated the network info_cache for instance _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5376 2015-08-07 18:00:58.111 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 9.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:01:01.685 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.82 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:01:07.111 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:01:07.112 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /opt/stack/new/nova/nova/compute/manager.py:5976 2015-08-07 18:01:07.113 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 1.41 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:01:07.804 DEBUG oslo_concurrency.lockutils [req-8e8c445d-d645-4a14-8c36-61af1e085aeb tempest-VolumesV2ActionsTest-495177293 tempest-VolumesV2ActionsTest-888667377] Lock "80959541-071e-4ff3-a3d8-1128f1c8c5be" acquired by "do_terminate_instance" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 18:01:07.805 DEBUG oslo_concurrency.lockutils [req-8e8c445d-d645-4a14-8c36-61af1e085aeb tempest-VolumesV2ActionsTest-495177293 tempest-VolumesV2ActionsTest-888667377] Lock "80959541-071e-4ff3-a3d8-1128f1c8c5be-events" acquired by "_clear_events" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 18:01:07.805 DEBUG oslo_concurrency.lockutils [req-8e8c445d-d645-4a14-8c36-61af1e085aeb tempest-VolumesV2ActionsTest-495177293 tempest-VolumesV2ActionsTest-888667377] Lock "80959541-071e-4ff3-a3d8-1128f1c8c5be-events" released by "_clear_events" :: held 0.001s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 18:01:07.817 INFO nova.compute.manager [req-8e8c445d-d645-4a14-8c36-61af1e085aeb tempest-VolumesV2ActionsTest-495177293 tempest-VolumesV2ActionsTest-888667377] [instance: 80959541-071e-4ff3-a3d8-1128f1c8c5be] Terminating instance 2015-08-07 18:01:07.839 INFO nova.virt.xenapi.vmops [req-8e8c445d-d645-4a14-8c36-61af1e085aeb tempest-VolumesV2ActionsTest-495177293 tempest-VolumesV2ActionsTest-888667377] [instance: 80959541-071e-4ff3-a3d8-1128f1c8c5be] Destroying VM 2015-08-07 18:01:07.874 DEBUG nova.virt.xenapi.vm_utils [req-8e8c445d-d645-4a14-8c36-61af1e085aeb tempest-VolumesV2ActionsTest-495177293 tempest-VolumesV2ActionsTest-888667377] [instance: 80959541-071e-4ff3-a3d8-1128f1c8c5be] Shutting down VM (hard) hard_shutdown_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:348 2015-08-07 18:01:08.528 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:01:08.529 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 0.99 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:01:09.524 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:01:09.526 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 2.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:01:11.526 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:01:11.573 INFO nova.compute.resource_tracker 
[req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Auditing locally available compute resources for node localhost.localdomain 2015-08-07 18:01:11.574 DEBUG nova.virt.xenapi.host [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Updating host stats update_status /opt/stack/new/nova/nova/virt/xenapi/host.py:234 2015-08-07 18:01:11.938 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.57 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:01:12.242 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" acquired by "do_scan" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 18:01:12.243 DEBUG nova.virt.xenapi.vm_utils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Scanning SR OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99 do_scan /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1839 2015-08-07 18:01:13.072 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" released by "do_scan" :: held 0.829s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 18:01:13.445 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor: free VCPUs: 1 _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:524 2015-08-07 18:01:13.446 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor: assignable PCI devices: [] _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:532 2015-08-07 18:01:13.447 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor/Node resource view: name=localhost.localdomain free_ram=922MB free_disk=12GB free_vcpus=1 pci_devices=[] _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:546 2015-08-07 18:01:13.447 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "compute_resources" acquired by "_update_available_resource" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 18:01:13.685 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Total usable vcpus: 8, total allocated vcpus: 7 2015-08-07 18:01:13.687 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Final resource view: name=localhost.localdomain phys_ram=8187MB used_ram=581MB phys_disk=27GB used_disk=0GB total_vcpus=8 used_vcpus=7 pci_stats=None 2015-08-07 18:01:14.608 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Compute_service record updated for devstack:localhost.localdomain 2015-08-07 18:01:14.609 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "compute_resources" released by "_update_available_resource" :: held 1.162s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 18:01:14.610 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 10.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:01:17.912 DEBUG 
nova.virt.xenapi.vmops [req-8e8c445d-d645-4a14-8c36-61af1e085aeb tempest-VolumesV2ActionsTest-495177293 tempest-VolumesV2ActionsTest-888667377] [instance: 80959541-071e-4ff3-a3d8-1128f1c8c5be] Destroying VDIs _destroy_vdis /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1428 2015-08-07 18:01:17.933 DEBUG nova.virt.xenapi.vm_utils [req-8e8c445d-d645-4a14-8c36-61af1e085aeb tempest-VolumesV2ActionsTest-495177293 tempest-VolumesV2ActionsTest-888667377] VDI 52d3934c-7093-4b69-961b-aa8a9db03b8b is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 18:01:17.943 DEBUG nova.virt.xenapi.vm_utils [req-8e8c445d-d645-4a14-8c36-61af1e085aeb tempest-VolumesV2ActionsTest-495177293 tempest-VolumesV2ActionsTest-888667377] VDI 1ddae277-0665-4b40-bb67-0b1e72982701 is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 18:01:18.940 DEBUG nova.virt.xenapi.vmops [req-8e8c445d-d645-4a14-8c36-61af1e085aeb tempest-VolumesV2ActionsTest-495177293 tempest-VolumesV2ActionsTest-888667377] [instance: 80959541-071e-4ff3-a3d8-1128f1c8c5be] Using RAW or VHD, skipping kernel and ramdisk deletion _destroy_kernel_ramdisk /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1456 2015-08-07 18:01:18.954 DEBUG nova.virt.xenapi.vm_utils [req-8e8c445d-d645-4a14-8c36-61af1e085aeb tempest-VolumesV2ActionsTest-495177293 tempest-VolumesV2ActionsTest-888667377] [instance: 80959541-071e-4ff3-a3d8-1128f1c8c5be] VM destroyed destroy_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:324 2015-08-07 18:01:18.955 DEBUG nova.compute.manager [req-8e8c445d-d645-4a14-8c36-61af1e085aeb tempest-VolumesV2ActionsTest-495177293 tempest-VolumesV2ActionsTest-888667377] [instance: 80959541-071e-4ff3-a3d8-1128f1c8c5be] Deallocating network for instance _deallocate_network /opt/stack/new/nova/nova/compute/manager.py:1792 2015-08-07 18:01:21.635 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.87 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:01:21.777 DEBUG nova.compute.manager [req-8e8c445d-d645-4a14-8c36-61af1e085aeb tempest-VolumesV2ActionsTest-495177293 tempest-VolumesV2ActionsTest-888667377] [instance: 80959541-071e-4ff3-a3d8-1128f1c8c5be] terminating bdm BlockDeviceMapping(boot_index=0,connection_info=None,created_at=2015-08-07T17:59:10Z,delete_on_termination=True,deleted=False,deleted_at=None,destination_type='local',device_name='/dev/xvda',device_type='disk',disk_bus=None,guest_format=None,id=84,image_id='a69d8d55-1745-492f-be26-cc75d64fc94d',instance=,instance_uuid=80959541-071e-4ff3-a3d8-1128f1c8c5be,no_device=False,snapshot_id=None,source_type='image',updated_at=2015-08-07T17:59:13Z,volume_id=None,volume_size=None) _cleanup_volumes /opt/stack/new/nova/nova/compute/manager.py:2272 2015-08-07 18:01:22.105 DEBUG oslo_concurrency.lockutils [req-8e8c445d-d645-4a14-8c36-61af1e085aeb tempest-VolumesV2ActionsTest-495177293 tempest-VolumesV2ActionsTest-888667377] Lock "compute_resources" acquired by "update_usage" :: waited 0.001s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 18:01:22.109 DEBUG nova.objects.instance [req-8e8c445d-d645-4a14-8c36-61af1e085aeb tempest-VolumesV2ActionsTest-495177293 tempest-VolumesV2ActionsTest-888667377] Lazy-loading `numa_topology' on Instance uuid 80959541-071e-4ff3-a3d8-1128f1c8c5be obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 18:01:22.235 DEBUG oslo_concurrency.lockutils 
[req-8e8c445d-d645-4a14-8c36-61af1e085aeb tempest-VolumesV2ActionsTest-495177293 tempest-VolumesV2ActionsTest-888667377] Lock "compute_resources" released by "update_usage" :: held 0.130s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 18:01:22.804 DEBUG oslo_concurrency.lockutils [req-8e8c445d-d645-4a14-8c36-61af1e085aeb tempest-VolumesV2ActionsTest-495177293 tempest-VolumesV2ActionsTest-888667377] Lock "80959541-071e-4ff3-a3d8-1128f1c8c5be" released by "do_terminate_instance" :: held 15.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 18:01:24.616 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:01:24.618 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 13.91 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:01:31.785 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.72 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:01:38.558 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:01:38.559 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Cleaning up deleted instances _run_pending_deletes /opt/stack/new/nova/nova/compute/manager.py:6276 2015-08-07 18:01:38.782 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] There are 2 instances to clean _run_pending_deletes /opt/stack/new/nova/nova/compute/manager.py:6285 2015-08-07 18:01:38.783 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: 80959541-071e-4ff3-a3d8-1128f1c8c5be] Instance has had 0 of 5 cleanup attempts _run_pending_deletes /opt/stack/new/nova/nova/compute/manager.py:6293 2015-08-07 18:01:39.603 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: 4ac30f2c-18b7-42f7-8407-97f68e4ec552] Instance has had 0 of 5 cleanup attempts _run_pending_deletes /opt/stack/new/nova/nova/compute/manager.py:6293 2015-08-07 18:01:40.141 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 17.37 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:01:42.401 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.11 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:01:51.655 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.85 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:01:57.522 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:01:57.523 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] 
Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:01:57.524 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 0.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:01:57.526 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:01:57.527 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 1.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:01:58.524 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:01:58.525 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Starting heal instance info cache _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5315 2015-08-07 18:01:58.525 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Rebuilding the list of instances to heal _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5319 2015-08-07 18:01:58.614 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Didn't find any instances for network info cache update. _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5386 2015-08-07 18:01:58.615 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 8.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:02:01.814 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.69 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:02:05.530 DEBUG oslo_concurrency.lockutils [req-4e081328-97da-4726-be0a-a3394f02cfdf tempest-VolumesV1NegativeTest-1182726137 tempest-VolumesV1NegativeTest-811196696] Lock "22d8a4e4-aaa0-48f9-8c03-7dfd930df25b" acquired by "_locked_do_build_and_run_instance" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 18:02:05.639 INFO nova.compute.manager [req-4e081328-97da-4726-be0a-a3394f02cfdf tempest-VolumesV1NegativeTest-1182726137 tempest-VolumesV1NegativeTest-811196696] [instance: 22d8a4e4-aaa0-48f9-8c03-7dfd930df25b] Starting instance... 
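[annotation] The 'Lock ... acquired by ... :: waited' / 'released by ... :: held' pairs throughout this log (for example the "_locked_do_build_and_run_instance" lock taken just above before "Starting instance...") are emitted by oslo.concurrency's lockutils wrappers, as the lockutils.py:251/262 source references indicate. A minimal sketch of the two usage forms follows; it is illustrative only, not nova's actual code, and the lock names and function names are placeholders:

    # Illustrative sketch (not nova's code): how oslo.concurrency produces the
    # "acquired by"/"released by" lock messages seen in this log.
    from oslo_concurrency import lockutils

    # Decorator form: concurrent callers of build_instance() using the same
    # lock name are serialized; entry and exit log the waited/held times.
    @lockutils.synchronized('example-instance-uuid')
    def build_instance():
        pass

    # Context-manager form, comparable to the "refresh_cache-<uuid>"
    # semaphores acquired while healing the instance info cache.
    def refresh_cache():
        with lockutils.lock('refresh_cache-example-instance-uuid'):
            pass
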
2015-08-07 18:02:06.071 DEBUG oslo_concurrency.lockutils [req-4e081328-97da-4726-be0a-a3394f02cfdf tempest-VolumesV1NegativeTest-1182726137 tempest-VolumesV1NegativeTest-811196696] Lock "compute_resources" acquired by "instance_claim" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 18:02:06.072 DEBUG nova.compute.resource_tracker [req-4e081328-97da-4726-be0a-a3394f02cfdf tempest-VolumesV1NegativeTest-1182726137 tempest-VolumesV1NegativeTest-811196696] Memory overhead for 64 MB instance; 5 MB instance_claim /opt/stack/new/nova/nova/compute/resource_tracker.py:126 2015-08-07 18:02:06.116 INFO nova.compute.claims [req-4e081328-97da-4726-be0a-a3394f02cfdf tempest-VolumesV1NegativeTest-1182726137 tempest-VolumesV1NegativeTest-811196696] [instance: 22d8a4e4-aaa0-48f9-8c03-7dfd930df25b] Attempting claim: memory 69 MB, disk 0 GB 2015-08-07 18:02:06.117 INFO nova.compute.claims [req-4e081328-97da-4726-be0a-a3394f02cfdf tempest-VolumesV1NegativeTest-1182726137 tempest-VolumesV1NegativeTest-811196696] [instance: 22d8a4e4-aaa0-48f9-8c03-7dfd930df25b] Total memory: 8187 MB, used: 512.00 MB 2015-08-07 18:02:06.130 INFO nova.compute.claims [req-4e081328-97da-4726-be0a-a3394f02cfdf tempest-VolumesV1NegativeTest-1182726137 tempest-VolumesV1NegativeTest-811196696] [instance: 22d8a4e4-aaa0-48f9-8c03-7dfd930df25b] memory limit: 12280.50 MB, free: 11768.50 MB 2015-08-07 18:02:06.130 INFO nova.compute.claims [req-4e081328-97da-4726-be0a-a3394f02cfdf tempest-VolumesV1NegativeTest-1182726137 tempest-VolumesV1NegativeTest-811196696] [instance: 22d8a4e4-aaa0-48f9-8c03-7dfd930df25b] Total disk: 27 GB, used: 0.00 GB 2015-08-07 18:02:06.131 INFO nova.compute.claims [req-4e081328-97da-4726-be0a-a3394f02cfdf tempest-VolumesV1NegativeTest-1182726137 tempest-VolumesV1NegativeTest-811196696] [instance: 22d8a4e4-aaa0-48f9-8c03-7dfd930df25b] disk limit not specified, defaulting to unlimited 2015-08-07 18:02:06.252 DEBUG nova.compute.resources.vcpu [req-4e081328-97da-4726-be0a-a3394f02cfdf tempest-VolumesV1NegativeTest-1182726137 tempest-VolumesV1NegativeTest-811196696] Total CPUs: 8 VCPUs, used: 0.00 VCPUs test /opt/stack/new/nova/nova/compute/resources/vcpu.py:52 2015-08-07 18:02:06.253 DEBUG nova.compute.resources.vcpu [req-4e081328-97da-4726-be0a-a3394f02cfdf tempest-VolumesV1NegativeTest-1182726137 tempest-VolumesV1NegativeTest-811196696] CPUs limit not specified, defaulting to unlimited test /opt/stack/new/nova/nova/compute/resources/vcpu.py:56 2015-08-07 18:02:06.253 INFO nova.compute.claims [req-4e081328-97da-4726-be0a-a3394f02cfdf tempest-VolumesV1NegativeTest-1182726137 tempest-VolumesV1NegativeTest-811196696] [instance: 22d8a4e4-aaa0-48f9-8c03-7dfd930df25b] Claim successful 2015-08-07 18:02:06.632 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:02:06.637 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /opt/stack/new/nova/nova/compute/manager.py:5976 2015-08-07 18:02:06.638 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 2.89 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:02:06.891 DEBUG oslo_concurrency.lockutils [req-4e081328-97da-4726-be0a-a3394f02cfdf tempest-VolumesV1NegativeTest-1182726137 tempest-VolumesV1NegativeTest-811196696] Lock "compute_resources" released by "instance_claim" :: held 0.821s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 18:02:07.181 DEBUG oslo_concurrency.lockutils [req-4e081328-97da-4726-be0a-a3394f02cfdf tempest-VolumesV1NegativeTest-1182726137 tempest-VolumesV1NegativeTest-811196696] Lock "compute_resources" acquired by "update_usage" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 18:02:07.369 DEBUG oslo_concurrency.lockutils [req-4e081328-97da-4726-be0a-a3394f02cfdf tempest-VolumesV1NegativeTest-1182726137 tempest-VolumesV1NegativeTest-811196696] Lock "compute_resources" released by "update_usage" :: held 0.187s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 18:02:07.370 DEBUG nova.compute.utils [req-4e081328-97da-4726-be0a-a3394f02cfdf tempest-VolumesV1NegativeTest-1182726137 tempest-VolumesV1NegativeTest-811196696] Using /dev/xvd instead of None get_next_device_name /opt/stack/new/nova/nova/compute/utils.py:163 2015-08-07 18:02:07.376 13318 DEBUG nova.compute.manager [-] [instance: 22d8a4e4-aaa0-48f9-8c03-7dfd930df25b] Allocating IP information in the background. _allocate_network_async /opt/stack/new/nova/nova/compute/manager.py:1513 2015-08-07 18:02:07.384 13318 DEBUG oslo_concurrency.lockutils [-] Acquired semaphore "refresh_cache-22d8a4e4-aaa0-48f9-8c03-7dfd930df25b" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:197 2015-08-07 18:02:08.900 DEBUG nova.virt.xenapi.vmops [req-4e081328-97da-4726-be0a-a3394f02cfdf tempest-VolumesV1NegativeTest-1182726137 tempest-VolumesV1NegativeTest-811196696] [instance: 22d8a4e4-aaa0-48f9-8c03-7dfd930df25b] Block device information present: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} spawn /opt/stack/new/nova/nova/virt/xenapi/vmops.py:408 2015-08-07 18:02:08.926 DEBUG nova.virt.xenapi.vm_utils [req-4e081328-97da-4726-be0a-a3394f02cfdf tempest-VolumesV1NegativeTest-1182726137 tempest-VolumesV1NegativeTest-811196696] Detected vhd format for image a69d8d55-1745-492f-be26-cc75d64fc94d determine_disk_image_type /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1633 2015-08-07 18:02:08.927 DEBUG nova.virt.xenapi.vmops [req-4e081328-97da-4726-be0a-a3394f02cfdf tempest-VolumesV1NegativeTest-1182726137 tempest-VolumesV1NegativeTest-811196696] [instance: 22d8a4e4-aaa0-48f9-8c03-7dfd930df25b] Updating progress to 10 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 18:02:09.525 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:02:09.526 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 1.00 seconds _run_loop 
/usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:02:09.897 DEBUG nova.virt.xenapi.vmops [req-4e081328-97da-4726-be0a-a3394f02cfdf tempest-VolumesV1NegativeTest-1182726137 tempest-VolumesV1NegativeTest-811196696] [instance: 22d8a4e4-aaa0-48f9-8c03-7dfd930df25b] block device info: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} _connect_cinder_volumes /opt/stack/new/nova/nova/virt/xenapi/vmops.py:376 2015-08-07 18:02:09.963 DEBUG oslo_concurrency.lockutils [req-4e081328-97da-4726-be0a-a3394f02cfdf tempest-VolumesV1NegativeTest-1182726137 tempest-VolumesV1NegativeTest-811196696] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" acquired by "_create_cached_image_impl" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 18:02:10.524 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:02:10.525 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 3.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:02:11.902 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.62 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:02:13.422 DEBUG nova.virt.xenapi.vm_utils [req-4e081328-97da-4726-be0a-a3394f02cfdf tempest-VolumesV1NegativeTest-1182726137 tempest-VolumesV1NegativeTest-811196696] Cloned VDI OpaqueRef:4384d580-8d62-0370-e7f8-9954c7abb205 from VDI OpaqueRef:f392c64e-7886-a0cc-d3a0-780674167307 _clone_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:571 2015-08-07 18:02:13.526 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:02:13.588 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Auditing locally available compute resources for node localhost.localdomain 2015-08-07 18:02:13.589 DEBUG nova.virt.xenapi.host [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Updating host stats update_status /opt/stack/new/nova/nova/virt/xenapi/host.py:234 2015-08-07 18:02:14.316 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" acquired by "do_scan" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 18:02:14.317 DEBUG nova.virt.xenapi.vm_utils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Scanning SR OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99 do_scan /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1839 2015-08-07 18:02:15.608 13318 DEBUG nova.network.base_api [-] [instance: 22d8a4e4-aaa0-48f9-8c03-7dfd930df25b] Updating instance_info_cache with network_info: [VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.2'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 
'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:7d:67:05', 'active': False, 'type': u'bridge', 'id': u'0ed2e137-d780-4149-b4f3-900d4aa3a033', 'qbg_params': None})] update_instance_cache_with_nw_info /opt/stack/new/nova/nova/network/base_api.py:43 2015-08-07 18:02:15.616 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" released by "do_scan" :: held 1.300s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 18:02:15.688 13318 DEBUG oslo_concurrency.lockutils [-] Releasing semaphore "refresh_cache-22d8a4e4-aaa0-48f9-8c03-7dfd930df25b" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:210 2015-08-07 18:02:15.689 13318 DEBUG nova.compute.manager [-] [instance: 22d8a4e4-aaa0-48f9-8c03-7dfd930df25b] Instance network_info: |[VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.2'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:7d:67:05', 'active': False, 'type': u'bridge', 'id': u'0ed2e137-d780-4149-b4f3-900d4aa3a033', 'qbg_params': None})]| _allocate_network_async /opt/stack/new/nova/nova/compute/manager.py:1531 2015-08-07 18:02:15.902 DEBUG oslo_concurrency.lockutils [req-4e081328-97da-4726-be0a-a3394f02cfdf tempest-VolumesV1NegativeTest-1182726137 tempest-VolumesV1NegativeTest-811196696] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" released by "_create_cached_image_impl" :: held 5.939s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 18:02:15.906 INFO nova.virt.xenapi.vm_utils [req-4e081328-97da-4726-be0a-a3394f02cfdf tempest-VolumesV1NegativeTest-1182726137 tempest-VolumesV1NegativeTest-811196696] Image creation data, cacheable: True, downloaded: False duration: 6.01 secs for image a69d8d55-1745-492f-be26-cc75d64fc94d 2015-08-07 18:02:16.009 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor: free VCPUs: 2 
_report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:524 2015-08-07 18:02:16.010 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor: assignable PCI devices: [] _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:532 2015-08-07 18:02:16.010 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor/Node resource view: name=localhost.localdomain free_ram=989MB free_disk=12GB free_vcpus=2 pci_devices=[] _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:546 2015-08-07 18:02:16.011 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "compute_resources" acquired by "_update_available_resource" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 18:02:16.293 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Total usable vcpus: 8, total allocated vcpus: 6 2015-08-07 18:02:16.294 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Final resource view: name=localhost.localdomain phys_ram=8187MB used_ram=581MB phys_disk=27GB used_disk=0GB total_vcpus=8 used_vcpus=6 pci_stats=None 2015-08-07 18:02:16.431 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Compute_service record updated for devstack:localhost.localdomain 2015-08-07 18:02:16.432 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "compute_resources" released by "_update_available_resource" :: held 0.421s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 18:02:16.434 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 0.08 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:02:16.518 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:02:16.587 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 6.01 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:02:17.074 DEBUG nova.virt.xenapi.vmops [req-4e081328-97da-4726-be0a-a3394f02cfdf tempest-VolumesV1NegativeTest-1182726137 tempest-VolumesV1NegativeTest-811196696] [instance: 22d8a4e4-aaa0-48f9-8c03-7dfd930df25b] Updating progress to 20 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 18:02:17.424 DEBUG nova.virt.xenapi.vmops [req-4e081328-97da-4726-be0a-a3394f02cfdf tempest-VolumesV1NegativeTest-1182726137 tempest-VolumesV1NegativeTest-811196696] [instance: 22d8a4e4-aaa0-48f9-8c03-7dfd930df25b] Updating progress to 30 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 18:02:17.810 DEBUG nova.virt.xenapi.vmops [req-4e081328-97da-4726-be0a-a3394f02cfdf tempest-VolumesV1NegativeTest-1182726137 tempest-VolumesV1NegativeTest-811196696] [instance: 22d8a4e4-aaa0-48f9-8c03-7dfd930df25b] Using PV kernel: True _create_vm_record /opt/stack/new/nova/nova/virt/xenapi/vmops.py:679 2015-08-07 18:02:17.822 DEBUG 
nova.virt.xenapi.vm_utils [req-4e081328-97da-4726-be0a-a3394f02cfdf tempest-VolumesV1NegativeTest-1182726137 tempest-VolumesV1NegativeTest-811196696] [instance: 22d8a4e4-aaa0-48f9-8c03-7dfd930df25b] Created VM create_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:312 2015-08-07 18:02:17.823 DEBUG nova.virt.xenapi.vmops [req-4e081328-97da-4726-be0a-a3394f02cfdf tempest-VolumesV1NegativeTest-1182726137 tempest-VolumesV1NegativeTest-811196696] [instance: 22d8a4e4-aaa0-48f9-8c03-7dfd930df25b] Updating progress to 40 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 18:02:18.248 DEBUG nova.virt.xenapi.vm_utils [req-4e081328-97da-4726-be0a-a3394f02cfdf tempest-VolumesV1NegativeTest-1182726137 tempest-VolumesV1NegativeTest-811196696] Creating disk-type VBD for VM OpaqueRef:0f69563f-a2d8-b1b3-0132-d4388beb8ae6, VDI OpaqueRef:4384d580-8d62-0370-e7f8-9954c7abb205 ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 18:02:18.258 DEBUG nova.virt.xenapi.vm_utils [req-4e081328-97da-4726-be0a-a3394f02cfdf tempest-VolumesV1NegativeTest-1182726137 tempest-VolumesV1NegativeTest-811196696] Created VBD OpaqueRef:5c10ade3-c4cf-b4c3-cbbc-9b4751b85e07 for VM OpaqueRef:0f69563f-a2d8-b1b3-0132-d4388beb8ae6, VDI OpaqueRef:4384d580-8d62-0370-e7f8-9954c7abb205. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 18:02:19.066 DEBUG nova.virt.xenapi.vm_utils [req-4e081328-97da-4726-be0a-a3394f02cfdf tempest-VolumesV1NegativeTest-1182726137 tempest-VolumesV1NegativeTest-811196696] Created VDI OpaqueRef:b77844d9-bb55-74fc-59a6-900b88f19bf1 (config-2, 67108864, False) on OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99. create_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:509 2015-08-07 18:02:19.077 DEBUG nova.virt.xenapi.vm_utils [req-4e081328-97da-4726-be0a-a3394f02cfdf tempest-VolumesV1NegativeTest-1182726137 tempest-VolumesV1NegativeTest-811196696] Creating disk-type VBD for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:b77844d9-bb55-74fc-59a6-900b88f19bf1 ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 18:02:19.092 DEBUG nova.virt.xenapi.vm_utils [req-4e081328-97da-4726-be0a-a3394f02cfdf tempest-VolumesV1NegativeTest-1182726137 tempest-VolumesV1NegativeTest-811196696] Created VBD OpaqueRef:ea24bc94-3330-e64a-e8b4-7d563824c670 for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:b77844d9-bb55-74fc-59a6-900b88f19bf1. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 18:02:19.104 DEBUG nova.virt.xenapi.vm_utils [req-4e081328-97da-4726-be0a-a3394f02cfdf tempest-VolumesV1NegativeTest-1182726137 tempest-VolumesV1NegativeTest-811196696] Plugging VBD OpaqueRef:ea24bc94-3330-e64a-e8b4-7d563824c670 ... 
vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2191 2015-08-07 18:02:19.105 DEBUG oslo_concurrency.lockutils [req-4e081328-97da-4726-be0a-a3394f02cfdf tempest-VolumesV1NegativeTest-1182726137 tempest-VolumesV1NegativeTest-811196696] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_plug" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 18:02:19.964 DEBUG oslo_concurrency.lockutils [req-8c9a44a8-3bf9-4d79-be70-4e9431339f25 tempest-VolumesV2NegativeTest-1064344268 tempest-VolumesV2NegativeTest-1543094192] Lock "914582e2-004f-4e79-af9c-a6a8290b66ea" acquired by "_locked_do_build_and_run_instance" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 18:02:20.337 INFO nova.compute.manager [req-8c9a44a8-3bf9-4d79-be70-4e9431339f25 tempest-VolumesV2NegativeTest-1064344268 tempest-VolumesV2NegativeTest-1543094192] [instance: 914582e2-004f-4e79-af9c-a6a8290b66ea] Starting instance... 2015-08-07 18:02:20.677 DEBUG oslo_concurrency.lockutils [req-8c9a44a8-3bf9-4d79-be70-4e9431339f25 tempest-VolumesV2NegativeTest-1064344268 tempest-VolumesV2NegativeTest-1543094192] Lock "compute_resources" acquired by "instance_claim" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 18:02:20.678 DEBUG nova.compute.resource_tracker [req-8c9a44a8-3bf9-4d79-be70-4e9431339f25 tempest-VolumesV2NegativeTest-1064344268 tempest-VolumesV2NegativeTest-1543094192] Memory overhead for 64 MB instance; 5 MB instance_claim /opt/stack/new/nova/nova/compute/resource_tracker.py:126 2015-08-07 18:02:20.687 INFO nova.compute.claims [req-8c9a44a8-3bf9-4d79-be70-4e9431339f25 tempest-VolumesV2NegativeTest-1064344268 tempest-VolumesV2NegativeTest-1543094192] [instance: 914582e2-004f-4e79-af9c-a6a8290b66ea] Attempting claim: memory 69 MB, disk 0 GB 2015-08-07 18:02:20.694 INFO nova.compute.claims [req-8c9a44a8-3bf9-4d79-be70-4e9431339f25 tempest-VolumesV2NegativeTest-1064344268 tempest-VolumesV2NegativeTest-1543094192] [instance: 914582e2-004f-4e79-af9c-a6a8290b66ea] Total memory: 8187 MB, used: 581.00 MB 2015-08-07 18:02:20.695 INFO nova.compute.claims [req-8c9a44a8-3bf9-4d79-be70-4e9431339f25 tempest-VolumesV2NegativeTest-1064344268 tempest-VolumesV2NegativeTest-1543094192] [instance: 914582e2-004f-4e79-af9c-a6a8290b66ea] memory limit: 12280.50 MB, free: 11699.50 MB 2015-08-07 18:02:20.695 INFO nova.compute.claims [req-8c9a44a8-3bf9-4d79-be70-4e9431339f25 tempest-VolumesV2NegativeTest-1064344268 tempest-VolumesV2NegativeTest-1543094192] [instance: 914582e2-004f-4e79-af9c-a6a8290b66ea] Total disk: 27 GB, used: 0.00 GB 2015-08-07 18:02:20.696 INFO nova.compute.claims [req-8c9a44a8-3bf9-4d79-be70-4e9431339f25 tempest-VolumesV2NegativeTest-1064344268 tempest-VolumesV2NegativeTest-1543094192] [instance: 914582e2-004f-4e79-af9c-a6a8290b66ea] disk limit not specified, defaulting to unlimited 2015-08-07 18:02:20.740 DEBUG nova.compute.resources.vcpu [req-8c9a44a8-3bf9-4d79-be70-4e9431339f25 tempest-VolumesV2NegativeTest-1064344268 tempest-VolumesV2NegativeTest-1543094192] Total CPUs: 8 VCPUs, used: 1.00 VCPUs test /opt/stack/new/nova/nova/compute/resources/vcpu.py:52 2015-08-07 18:02:20.741 DEBUG nova.compute.resources.vcpu [req-8c9a44a8-3bf9-4d79-be70-4e9431339f25 tempest-VolumesV2NegativeTest-1064344268 tempest-VolumesV2NegativeTest-1543094192] CPUs limit not specified, defaulting to unlimited test 
/opt/stack/new/nova/nova/compute/resources/vcpu.py:56 2015-08-07 18:02:20.741 INFO nova.compute.claims [req-8c9a44a8-3bf9-4d79-be70-4e9431339f25 tempest-VolumesV2NegativeTest-1064344268 tempest-VolumesV2NegativeTest-1543094192] [instance: 914582e2-004f-4e79-af9c-a6a8290b66ea] Claim successful 2015-08-07 18:02:21.337 DEBUG oslo_concurrency.lockutils [req-8c9a44a8-3bf9-4d79-be70-4e9431339f25 tempest-VolumesV2NegativeTest-1064344268 tempest-VolumesV2NegativeTest-1543094192] Lock "compute_resources" released by "instance_claim" :: held 0.661s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 18:02:21.581 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.94 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:02:21.770 DEBUG oslo_concurrency.lockutils [req-8c9a44a8-3bf9-4d79-be70-4e9431339f25 tempest-VolumesV2NegativeTest-1064344268 tempest-VolumesV2NegativeTest-1543094192] Lock "compute_resources" acquired by "update_usage" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 18:02:22.088 DEBUG oslo_concurrency.lockutils [req-8c9a44a8-3bf9-4d79-be70-4e9431339f25 tempest-VolumesV2NegativeTest-1064344268 tempest-VolumesV2NegativeTest-1543094192] Lock "compute_resources" released by "update_usage" :: held 0.318s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 18:02:22.089 DEBUG nova.compute.utils [req-8c9a44a8-3bf9-4d79-be70-4e9431339f25 tempest-VolumesV2NegativeTest-1064344268 tempest-VolumesV2NegativeTest-1543094192] Using /dev/xvd instead of None get_next_device_name /opt/stack/new/nova/nova/compute/utils.py:163 2015-08-07 18:02:22.094 13318 DEBUG nova.compute.manager [-] [instance: 914582e2-004f-4e79-af9c-a6a8290b66ea] Allocating IP information in the background. _allocate_network_async /opt/stack/new/nova/nova/compute/manager.py:1513 2015-08-07 18:02:22.095 13318 DEBUG oslo_concurrency.lockutils [-] Acquired semaphore "refresh_cache-914582e2-004f-4e79-af9c-a6a8290b66ea" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:197 2015-08-07 18:02:22.595 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:02:22.596 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 35.92 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:02:23.094 DEBUG oslo_concurrency.lockutils [req-4e081328-97da-4726-be0a-a3394f02cfdf tempest-VolumesV1NegativeTest-1182726137 tempest-VolumesV1NegativeTest-811196696] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_plug" :: held 3.989s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 18:02:23.095 DEBUG nova.virt.xenapi.vm_utils [req-4e081328-97da-4726-be0a-a3394f02cfdf tempest-VolumesV1NegativeTest-1182726137 tempest-VolumesV1NegativeTest-811196696] Plugging VBD OpaqueRef:ea24bc94-3330-e64a-e8b4-7d563824c670 done. 
vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2194 2015-08-07 18:02:23.099 DEBUG nova.virt.xenapi.vm_utils [req-4e081328-97da-4726-be0a-a3394f02cfdf tempest-VolumesV1NegativeTest-1182726137 tempest-VolumesV1NegativeTest-811196696] VBD OpaqueRef:ea24bc94-3330-e64a-e8b4-7d563824c670 plugged as xvdc vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2197 2015-08-07 18:02:23.362 WARNING nova.virt.configdrive [req-4e081328-97da-4726-be0a-a3394f02cfdf tempest-VolumesV1NegativeTest-1182726137 tempest-VolumesV1NegativeTest-811196696] The setting "always" will be deprecated in the Liberty version. Please use "True" instead 2015-08-07 18:02:23.363 DEBUG nova.objects.instance [req-4e081328-97da-4726-be0a-a3394f02cfdf tempest-VolumesV1NegativeTest-1182726137 tempest-VolumesV1NegativeTest-811196696] Lazy-loading `ec2_ids' on Instance uuid 22d8a4e4-aaa0-48f9-8c03-7dfd930df25b obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 18:02:23.459 DEBUG oslo_concurrency.processutils [req-4e081328-97da-4726-be0a-a3394f02cfdf tempest-VolumesV1NegativeTest-1182726137 tempest-VolumesV1NegativeTest-811196696] Running cmd (subprocess): genisoimage -o /tmp/tmppcO9Fr/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmpIewg9R execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 18:02:24.051 DEBUG nova.virt.xenapi.vmops [req-8c9a44a8-3bf9-4d79-be70-4e9431339f25 tempest-VolumesV2NegativeTest-1064344268 tempest-VolumesV2NegativeTest-1543094192] [instance: 914582e2-004f-4e79-af9c-a6a8290b66ea] Block device information present: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} spawn /opt/stack/new/nova/nova/virt/xenapi/vmops.py:408 2015-08-07 18:02:24.095 DEBUG nova.virt.xenapi.vm_utils [req-8c9a44a8-3bf9-4d79-be70-4e9431339f25 tempest-VolumesV2NegativeTest-1064344268 tempest-VolumesV2NegativeTest-1543094192] Detected vhd format for image a69d8d55-1745-492f-be26-cc75d64fc94d determine_disk_image_type /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1633 2015-08-07 18:02:24.097 DEBUG nova.virt.xenapi.vmops [req-8c9a44a8-3bf9-4d79-be70-4e9431339f25 tempest-VolumesV2NegativeTest-1064344268 tempest-VolumesV2NegativeTest-1543094192] [instance: 914582e2-004f-4e79-af9c-a6a8290b66ea] Updating progress to 10 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 18:02:24.302 DEBUG oslo_concurrency.processutils [req-4e081328-97da-4726-be0a-a3394f02cfdf tempest-VolumesV1NegativeTest-1182726137 tempest-VolumesV1NegativeTest-811196696] CMD "genisoimage -o /tmp/tmppcO9Fr/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmpIewg9R" returned: 0 in 0.843s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 18:02:24.306 DEBUG oslo_concurrency.processutils [req-4e081328-97da-4726-be0a-a3394f02cfdf tempest-VolumesV1NegativeTest-1182726137 tempest-VolumesV1NegativeTest-811196696] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmppcO9Fr/configdrive of=/dev/xvdc oflag=direct,sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 18:02:24.816 DEBUG nova.virt.xenapi.vmops [req-8c9a44a8-3bf9-4d79-be70-4e9431339f25 tempest-VolumesV2NegativeTest-1064344268 tempest-VolumesV2NegativeTest-1543094192] [instance: 
914582e2-004f-4e79-af9c-a6a8290b66ea] block device info: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} _connect_cinder_volumes /opt/stack/new/nova/nova/virt/xenapi/vmops.py:376 2015-08-07 18:02:24.858 DEBUG oslo_concurrency.lockutils [req-8c9a44a8-3bf9-4d79-be70-4e9431339f25 tempest-VolumesV2NegativeTest-1064344268 tempest-VolumesV2NegativeTest-1543094192] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" acquired by "_create_cached_image_impl" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 18:02:28.523 DEBUG nova.virt.xenapi.vm_utils [req-8c9a44a8-3bf9-4d79-be70-4e9431339f25 tempest-VolumesV2NegativeTest-1064344268 tempest-VolumesV2NegativeTest-1543094192] Cloned VDI OpaqueRef:c5fd7f02-e8d2-e703-b9b9-8fe97c766056 from VDI OpaqueRef:f392c64e-7886-a0cc-d3a0-780674167307 _clone_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:571 2015-08-07 18:02:28.959 13318 DEBUG nova.network.base_api [-] [instance: 914582e2-004f-4e79-af9c-a6a8290b66ea] Updating instance_info_cache with network_info: [VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.4'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:29:13:ad', 'active': False, 'type': u'bridge', 'id': u'1a5b182c-71db-4de7-a684-2ed50dc2bf86', 'qbg_params': None})] update_instance_cache_with_nw_info /opt/stack/new/nova/nova/network/base_api.py:43 2015-08-07 18:02:28.991 13318 DEBUG oslo_concurrency.lockutils [-] Releasing semaphore "refresh_cache-914582e2-004f-4e79-af9c-a6a8290b66ea" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:210 2015-08-07 18:02:28.992 13318 DEBUG nova.compute.manager [-] [instance: 914582e2-004f-4e79-af9c-a6a8290b66ea] Instance network_info: |[VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.4'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': 
u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:29:13:ad', 'active': False, 'type': u'bridge', 'id': u'1a5b182c-71db-4de7-a684-2ed50dc2bf86', 'qbg_params': None})]| _allocate_network_async /opt/stack/new/nova/nova/compute/manager.py:1531 2015-08-07 18:02:31.403 DEBUG oslo_concurrency.lockutils [req-8c9a44a8-3bf9-4d79-be70-4e9431339f25 tempest-VolumesV2NegativeTest-1064344268 tempest-VolumesV2NegativeTest-1543094192] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" released by "_create_cached_image_impl" :: held 6.545s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 18:02:31.404 INFO nova.virt.xenapi.vm_utils [req-8c9a44a8-3bf9-4d79-be70-4e9431339f25 tempest-VolumesV2NegativeTest-1064344268 tempest-VolumesV2NegativeTest-1543094192] Image creation data, cacheable: True, downloaded: False duration: 6.58 secs for image a69d8d55-1745-492f-be26-cc75d64fc94d 2015-08-07 18:02:32.451 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.07 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:02:37.092 DEBUG nova.virt.xenapi.vmops [req-8c9a44a8-3bf9-4d79-be70-4e9431339f25 tempest-VolumesV2NegativeTest-1064344268 tempest-VolumesV2NegativeTest-1543094192] [instance: 914582e2-004f-4e79-af9c-a6a8290b66ea] Updating progress to 20 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 18:02:37.465 DEBUG nova.virt.xenapi.vmops [req-8c9a44a8-3bf9-4d79-be70-4e9431339f25 tempest-VolumesV2NegativeTest-1064344268 tempest-VolumesV2NegativeTest-1543094192] [instance: 914582e2-004f-4e79-af9c-a6a8290b66ea] Updating progress to 30 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 18:02:37.745 DEBUG nova.virt.xenapi.vmops [req-8c9a44a8-3bf9-4d79-be70-4e9431339f25 tempest-VolumesV2NegativeTest-1064344268 tempest-VolumesV2NegativeTest-1543094192] [instance: 914582e2-004f-4e79-af9c-a6a8290b66ea] Using PV kernel: True _create_vm_record /opt/stack/new/nova/nova/virt/xenapi/vmops.py:679 2015-08-07 18:02:37.757 DEBUG nova.virt.xenapi.vm_utils [req-8c9a44a8-3bf9-4d79-be70-4e9431339f25 tempest-VolumesV2NegativeTest-1064344268 tempest-VolumesV2NegativeTest-1543094192] [instance: 914582e2-004f-4e79-af9c-a6a8290b66ea] Created VM create_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:312 2015-08-07 18:02:37.759 DEBUG nova.virt.xenapi.vmops [req-8c9a44a8-3bf9-4d79-be70-4e9431339f25 tempest-VolumesV2NegativeTest-1064344268 tempest-VolumesV2NegativeTest-1543094192] [instance: 914582e2-004f-4e79-af9c-a6a8290b66ea] Updating progress to 40 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 18:02:38.167 DEBUG nova.virt.xenapi.vm_utils [req-8c9a44a8-3bf9-4d79-be70-4e9431339f25 tempest-VolumesV2NegativeTest-1064344268 tempest-VolumesV2NegativeTest-1543094192] Creating disk-type VBD for VM OpaqueRef:103fed54-eae2-b2e1-4c47-785173697a04, VDI OpaqueRef:c5fd7f02-e8d2-e703-b9b9-8fe97c766056 ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 18:02:38.175 DEBUG nova.virt.xenapi.vm_utils [req-8c9a44a8-3bf9-4d79-be70-4e9431339f25 tempest-VolumesV2NegativeTest-1064344268 tempest-VolumesV2NegativeTest-1543094192] Created VBD OpaqueRef:9a40c34e-5707-3852-4419-c3c8d0914da7 for VM OpaqueRef:103fed54-eae2-b2e1-4c47-785173697a04, VDI OpaqueRef:c5fd7f02-e8d2-e703-b9b9-8fe97c766056. 
create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 18:02:38.865 DEBUG nova.virt.xenapi.vm_utils [req-8c9a44a8-3bf9-4d79-be70-4e9431339f25 tempest-VolumesV2NegativeTest-1064344268 tempest-VolumesV2NegativeTest-1543094192] Created VDI OpaqueRef:93ade0b1-0c8f-d10f-4e6c-3c6657521710 (config-2, 67108864, False) on OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99. create_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:509 2015-08-07 18:02:38.913 DEBUG nova.virt.xenapi.vm_utils [req-8c9a44a8-3bf9-4d79-be70-4e9431339f25 tempest-VolumesV2NegativeTest-1064344268 tempest-VolumesV2NegativeTest-1543094192] Creating disk-type VBD for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:93ade0b1-0c8f-d10f-4e6c-3c6657521710 ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 18:02:38.936 DEBUG nova.virt.xenapi.vm_utils [req-8c9a44a8-3bf9-4d79-be70-4e9431339f25 tempest-VolumesV2NegativeTest-1064344268 tempest-VolumesV2NegativeTest-1543094192] Created VBD OpaqueRef:d93b13fa-e51d-ae6b-8716-b44665888536 for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:93ade0b1-0c8f-d10f-4e6c-3c6657521710. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 18:02:38.937 DEBUG nova.virt.xenapi.vm_utils [req-8c9a44a8-3bf9-4d79-be70-4e9431339f25 tempest-VolumesV2NegativeTest-1064344268 tempest-VolumesV2NegativeTest-1543094192] Plugging VBD OpaqueRef:d93b13fa-e51d-ae6b-8716-b44665888536 ... vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2191 2015-08-07 18:02:38.938 DEBUG oslo_concurrency.lockutils [req-8c9a44a8-3bf9-4d79-be70-4e9431339f25 tempest-VolumesV2NegativeTest-1064344268 tempest-VolumesV2NegativeTest-1543094192] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_plug" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 18:02:41.939 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.58 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:02:42.125 DEBUG oslo_concurrency.lockutils [req-8c9a44a8-3bf9-4d79-be70-4e9431339f25 tempest-VolumesV2NegativeTest-1064344268 tempest-VolumesV2NegativeTest-1543094192] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_plug" :: held 3.187s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 18:02:42.126 DEBUG nova.virt.xenapi.vm_utils [req-8c9a44a8-3bf9-4d79-be70-4e9431339f25 tempest-VolumesV2NegativeTest-1064344268 tempest-VolumesV2NegativeTest-1543094192] Plugging VBD OpaqueRef:d93b13fa-e51d-ae6b-8716-b44665888536 done. vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2194 2015-08-07 18:02:42.130 DEBUG nova.virt.xenapi.vm_utils [req-8c9a44a8-3bf9-4d79-be70-4e9431339f25 tempest-VolumesV2NegativeTest-1064344268 tempest-VolumesV2NegativeTest-1543094192] VBD OpaqueRef:d93b13fa-e51d-ae6b-8716-b44665888536 plugged as xvdd vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2197 2015-08-07 18:02:42.247 WARNING nova.virt.configdrive [req-8c9a44a8-3bf9-4d79-be70-4e9431339f25 tempest-VolumesV2NegativeTest-1064344268 tempest-VolumesV2NegativeTest-1543094192] The setting "always" will be deprecated in the Liberty version. 
Please use "True" instead 2015-08-07 18:02:42.249 DEBUG nova.objects.instance [req-8c9a44a8-3bf9-4d79-be70-4e9431339f25 tempest-VolumesV2NegativeTest-1064344268 tempest-VolumesV2NegativeTest-1543094192] Lazy-loading `ec2_ids' on Instance uuid 914582e2-004f-4e79-af9c-a6a8290b66ea obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 18:02:42.375 DEBUG oslo_concurrency.processutils [req-8c9a44a8-3bf9-4d79-be70-4e9431339f25 tempest-VolumesV2NegativeTest-1064344268 tempest-VolumesV2NegativeTest-1543094192] Running cmd (subprocess): genisoimage -o /tmp/tmpm_R0C4/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmpIKOWxJ execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 18:02:42.587 DEBUG oslo_concurrency.processutils [req-8c9a44a8-3bf9-4d79-be70-4e9431339f25 tempest-VolumesV2NegativeTest-1064344268 tempest-VolumesV2NegativeTest-1543094192] CMD "genisoimage -o /tmp/tmpm_R0C4/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmpIKOWxJ" returned: 0 in 0.212s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 18:02:42.594 DEBUG oslo_concurrency.processutils [req-8c9a44a8-3bf9-4d79-be70-4e9431339f25 tempest-VolumesV2NegativeTest-1064344268 tempest-VolumesV2NegativeTest-1543094192] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmpm_R0C4/configdrive of=/dev/xvdd oflag=direct,sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 18:02:45.710 DEBUG oslo_concurrency.processutils [req-4e081328-97da-4726-be0a-a3394f02cfdf tempest-VolumesV1NegativeTest-1182726137 tempest-VolumesV1NegativeTest-811196696] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmppcO9Fr/configdrive of=/dev/xvdc oflag=direct,sync" returned: 0 in 21.404s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 18:02:45.714 DEBUG oslo_concurrency.processutils [req-4e081328-97da-4726-be0a-a3394f02cfdf tempest-VolumesV1NegativeTest-1182726137 tempest-VolumesV1NegativeTest-811196696] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 18:02:46.762 DEBUG oslo_concurrency.processutils [req-4e081328-97da-4726-be0a-a3394f02cfdf tempest-VolumesV1NegativeTest-1182726137 tempest-VolumesV1NegativeTest-811196696] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf sync" returned: 0 in 1.047s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 18:02:46.765 DEBUG nova.virt.xenapi.vm_utils [req-4e081328-97da-4726-be0a-a3394f02cfdf tempest-VolumesV1NegativeTest-1182726137 tempest-VolumesV1NegativeTest-811196696] Destroying VBD for VDI OpaqueRef:b77844d9-bb55-74fc-59a6-900b88f19bf1 ... 
vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2207 2015-08-07 18:02:46.767 DEBUG oslo_concurrency.lockutils [req-4e081328-97da-4726-be0a-a3394f02cfdf tempest-VolumesV1NegativeTest-1182726137 tempest-VolumesV1NegativeTest-811196696] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_unplug" :: waited 0.001s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 18:02:50.808 DEBUG oslo_concurrency.lockutils [req-4e081328-97da-4726-be0a-a3394f02cfdf tempest-VolumesV1NegativeTest-1182726137 tempest-VolumesV1NegativeTest-811196696] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_unplug" :: held 4.041s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 18:02:51.050 DEBUG nova.virt.xenapi.vm_utils [req-4e081328-97da-4726-be0a-a3394f02cfdf tempest-VolumesV1NegativeTest-1182726137 tempest-VolumesV1NegativeTest-811196696] Destroying VBD for VDI OpaqueRef:b77844d9-bb55-74fc-59a6-900b88f19bf1 done. vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2215 2015-08-07 18:02:51.050 DEBUG nova.virt.xenapi.vm_utils [req-4e081328-97da-4726-be0a-a3394f02cfdf tempest-VolumesV1NegativeTest-1182726137 tempest-VolumesV1NegativeTest-811196696] Creating disk-type VBD for VM OpaqueRef:0f69563f-a2d8-b1b3-0132-d4388beb8ae6, VDI OpaqueRef:b77844d9-bb55-74fc-59a6-900b88f19bf1 ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 18:02:51.330 DEBUG nova.virt.xenapi.vm_utils [req-4e081328-97da-4726-be0a-a3394f02cfdf tempest-VolumesV1NegativeTest-1182726137 tempest-VolumesV1NegativeTest-811196696] Created VBD OpaqueRef:b9a8ebf7-5a5d-1e5f-2624-fddb60c2bd1d for VM OpaqueRef:0f69563f-a2d8-b1b3-0132-d4388beb8ae6, VDI OpaqueRef:b77844d9-bb55-74fc-59a6-900b88f19bf1. 
create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 18:02:51.331 DEBUG nova.objects.instance [req-4e081328-97da-4726-be0a-a3394f02cfdf tempest-VolumesV1NegativeTest-1182726137 tempest-VolumesV1NegativeTest-811196696] Lazy-loading `pci_devices' on Instance uuid 22d8a4e4-aaa0-48f9-8c03-7dfd930df25b obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 18:02:51.633 DEBUG nova.virt.xenapi.vmops [req-4e081328-97da-4726-be0a-a3394f02cfdf tempest-VolumesV1NegativeTest-1182726137 tempest-VolumesV1NegativeTest-811196696] [instance: 22d8a4e4-aaa0-48f9-8c03-7dfd930df25b] Updating progress to 50 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 18:02:52.177 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.38 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:02:52.606 DEBUG oslo_concurrency.lockutils [req-4e081328-97da-4726-be0a-a3394f02cfdf tempest-VolumesV1NegativeTest-1182726137 tempest-VolumesV1NegativeTest-811196696] Lock "xenstore-22d8a4e4-aaa0-48f9-8c03-7dfd930df25b" acquired by "store_meta" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 18:02:52.608 DEBUG oslo_concurrency.lockutils [req-4e081328-97da-4726-be0a-a3394f02cfdf tempest-VolumesV1NegativeTest-1182726137 tempest-VolumesV1NegativeTest-811196696] Lock "xenstore-22d8a4e4-aaa0-48f9-8c03-7dfd930df25b" released by "store_meta" :: held 0.002s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 18:02:52.609 DEBUG oslo_concurrency.lockutils [req-4e081328-97da-4726-be0a-a3394f02cfdf tempest-VolumesV1NegativeTest-1182726137 tempest-VolumesV1NegativeTest-811196696] Lock "xenstore-22d8a4e4-aaa0-48f9-8c03-7dfd930df25b" acquired by "store_auto_disk_config" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 18:02:52.810 DEBUG oslo_concurrency.lockutils [req-4e081328-97da-4726-be0a-a3394f02cfdf tempest-VolumesV1NegativeTest-1182726137 tempest-VolumesV1NegativeTest-811196696] Lock "xenstore-22d8a4e4-aaa0-48f9-8c03-7dfd930df25b" released by "store_auto_disk_config" :: held 0.201s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 18:02:52.811 DEBUG nova.virt.xenapi.vmops [req-4e081328-97da-4726-be0a-a3394f02cfdf tempest-VolumesV1NegativeTest-1182726137 tempest-VolumesV1NegativeTest-811196696] [instance: 22d8a4e4-aaa0-48f9-8c03-7dfd930df25b] Injecting hostname (tempest-instance-1760018168) into xenstore _inject_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1940 2015-08-07 18:02:52.811 DEBUG oslo_concurrency.lockutils [req-4e081328-97da-4726-be0a-a3394f02cfdf tempest-VolumesV1NegativeTest-1182726137 tempest-VolumesV1NegativeTest-811196696] Lock "xenstore-22d8a4e4-aaa0-48f9-8c03-7dfd930df25b" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 18:02:52.866 DEBUG oslo_concurrency.lockutils [req-4e081328-97da-4726-be0a-a3394f02cfdf tempest-VolumesV1NegativeTest-1182726137 tempest-VolumesV1NegativeTest-811196696] Lock "xenstore-22d8a4e4-aaa0-48f9-8c03-7dfd930df25b" released by "update_hostname" :: held 0.055s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 18:02:52.867 DEBUG nova.virt.xenapi.vmops [req-4e081328-97da-4726-be0a-a3394f02cfdf 
tempest-VolumesV1NegativeTest-1182726137 tempest-VolumesV1NegativeTest-811196696] [instance: 22d8a4e4-aaa0-48f9-8c03-7dfd930df25b] Injecting network info to xenstore inject_network_info /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1869 2015-08-07 18:02:52.868 DEBUG oslo_concurrency.lockutils [req-4e081328-97da-4726-be0a-a3394f02cfdf tempest-VolumesV1NegativeTest-1182726137 tempest-VolumesV1NegativeTest-811196696] Lock "xenstore-22d8a4e4-aaa0-48f9-8c03-7dfd930df25b" acquired by "update_nwinfo" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 18:02:54.252 DEBUG oslo_concurrency.lockutils [req-4e081328-97da-4726-be0a-a3394f02cfdf tempest-VolumesV1NegativeTest-1182726137 tempest-VolumesV1NegativeTest-811196696] Lock "xenstore-22d8a4e4-aaa0-48f9-8c03-7dfd930df25b" released by "update_nwinfo" :: held 1.384s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 18:02:54.253 DEBUG nova.virt.xenapi.vmops [req-4e081328-97da-4726-be0a-a3394f02cfdf tempest-VolumesV1NegativeTest-1182726137 tempest-VolumesV1NegativeTest-811196696] [instance: 22d8a4e4-aaa0-48f9-8c03-7dfd930df25b] Updating progress to 60 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 18:02:54.850 DEBUG nova.virt.xenapi.vmops [req-4e081328-97da-4726-be0a-a3394f02cfdf tempest-VolumesV1NegativeTest-1182726137 tempest-VolumesV1NegativeTest-811196696] [instance: 22d8a4e4-aaa0-48f9-8c03-7dfd930df25b] Creating vifs _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1892 2015-08-07 18:02:54.878 DEBUG nova.virt.xenapi.vmops [req-4e081328-97da-4726-be0a-a3394f02cfdf tempest-VolumesV1NegativeTest-1182726137 tempest-VolumesV1NegativeTest-811196696] [instance: 22d8a4e4-aaa0-48f9-8c03-7dfd930df25b] Creating VIF for network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1902 2015-08-07 18:02:54.893 DEBUG nova.virt.xenapi.vmops [req-4e081328-97da-4726-be0a-a3394f02cfdf tempest-VolumesV1NegativeTest-1182726137 tempest-VolumesV1NegativeTest-811196696] [instance: 22d8a4e4-aaa0-48f9-8c03-7dfd930df25b] Created VIF OpaqueRef:73f6d4f9-0e7e-1970-643f-3b4184db6fcd, network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1906 2015-08-07 18:02:54.894 DEBUG nova.virt.xenapi.vmops [req-4e081328-97da-4726-be0a-a3394f02cfdf tempest-VolumesV1NegativeTest-1182726137 tempest-VolumesV1NegativeTest-811196696] [instance: 22d8a4e4-aaa0-48f9-8c03-7dfd930df25b] Updating progress to 70 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 18:02:55.436 DEBUG nova.virt.xenapi.vmops [req-4e081328-97da-4726-be0a-a3394f02cfdf tempest-VolumesV1NegativeTest-1182726137 tempest-VolumesV1NegativeTest-811196696] [instance: 22d8a4e4-aaa0-48f9-8c03-7dfd930df25b] Starting instance _start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:327 2015-08-07 18:02:58.518 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:02:58.519 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 0.01 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:02:58.526 DEBUG oslo_service.periodic_task 
[req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:02:58.527 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:02:58.527 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 1.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:02:59.525 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:02:59.525 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Starting heal instance info cache _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5315 2015-08-07 18:02:59.526 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Rebuilding the list of instances to heal _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5319 2015-08-07 18:02:59.595 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: 22d8a4e4-aaa0-48f9-8c03-7dfd930df25b] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5329 2015-08-07 18:02:59.596 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: 914582e2-004f-4e79-af9c-a6a8290b66ea] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5329 2015-08-07 18:02:59.597 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Didn't find any instances for network info cache update. 
_heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5386 2015-08-07 18:02:59.598 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 8.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:03:03.348 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 8.23 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:03:05.016 DEBUG oslo_concurrency.processutils [req-8c9a44a8-3bf9-4d79-be70-4e9431339f25 tempest-VolumesV2NegativeTest-1064344268 tempest-VolumesV2NegativeTest-1543094192] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmpm_R0C4/configdrive of=/dev/xvdd oflag=direct,sync" returned: 0 in 22.423s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 18:03:05.019 DEBUG oslo_concurrency.processutils [req-8c9a44a8-3bf9-4d79-be70-4e9431339f25 tempest-VolumesV2NegativeTest-1064344268 tempest-VolumesV2NegativeTest-1543094192] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 18:03:07.022 DEBUG oslo_concurrency.processutils [req-8c9a44a8-3bf9-4d79-be70-4e9431339f25 tempest-VolumesV2NegativeTest-1064344268 tempest-VolumesV2NegativeTest-1543094192] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf sync" returned: 0 in 2.004s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 18:03:07.027 DEBUG nova.virt.xenapi.vm_utils [req-8c9a44a8-3bf9-4d79-be70-4e9431339f25 tempest-VolumesV2NegativeTest-1064344268 tempest-VolumesV2NegativeTest-1543094192] Destroying VBD for VDI OpaqueRef:93ade0b1-0c8f-d10f-4e6c-3c6657521710 ... vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2207 2015-08-07 18:03:07.029 DEBUG oslo_concurrency.lockutils [req-8c9a44a8-3bf9-4d79-be70-4e9431339f25 tempest-VolumesV2NegativeTest-1064344268 tempest-VolumesV2NegativeTest-1543094192] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_unplug" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 18:03:07.600 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:03:07.601 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /opt/stack/new/nova/nova/compute/manager.py:5976 2015-08-07 18:03:07.602 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 2.92 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:03:08.661 DEBUG oslo_concurrency.lockutils [req-8c9a44a8-3bf9-4d79-be70-4e9431339f25 tempest-VolumesV2NegativeTest-1064344268 tempest-VolumesV2NegativeTest-1543094192] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_unplug" :: held 1.632s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 18:03:08.670 DEBUG nova.virt.xenapi.vm_utils [req-8c9a44a8-3bf9-4d79-be70-4e9431339f25 tempest-VolumesV2NegativeTest-1064344268 tempest-VolumesV2NegativeTest-1543094192] Destroying VBD for VDI OpaqueRef:93ade0b1-0c8f-d10f-4e6c-3c6657521710 done. vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2215 2015-08-07 18:03:08.671 DEBUG nova.virt.xenapi.vm_utils [req-8c9a44a8-3bf9-4d79-be70-4e9431339f25 tempest-VolumesV2NegativeTest-1064344268 tempest-VolumesV2NegativeTest-1543094192] Creating disk-type VBD for VM OpaqueRef:103fed54-eae2-b2e1-4c47-785173697a04, VDI OpaqueRef:93ade0b1-0c8f-d10f-4e6c-3c6657521710 ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 18:03:08.686 DEBUG nova.virt.xenapi.vm_utils [req-8c9a44a8-3bf9-4d79-be70-4e9431339f25 tempest-VolumesV2NegativeTest-1064344268 tempest-VolumesV2NegativeTest-1543094192] Created VBD OpaqueRef:00d630cc-6682-335e-0d72-0d144bf52fcd for VM OpaqueRef:103fed54-eae2-b2e1-4c47-785173697a04, VDI OpaqueRef:93ade0b1-0c8f-d10f-4e6c-3c6657521710. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 18:03:08.687 DEBUG nova.objects.instance [req-8c9a44a8-3bf9-4d79-be70-4e9431339f25 tempest-VolumesV2NegativeTest-1064344268 tempest-VolumesV2NegativeTest-1543094192] Lazy-loading `pci_devices' on Instance uuid 914582e2-004f-4e79-af9c-a6a8290b66ea obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 18:03:08.887 DEBUG nova.virt.xenapi.vmops [req-8c9a44a8-3bf9-4d79-be70-4e9431339f25 tempest-VolumesV2NegativeTest-1064344268 tempest-VolumesV2NegativeTest-1543094192] [instance: 914582e2-004f-4e79-af9c-a6a8290b66ea] Updating progress to 50 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 18:03:09.173 DEBUG oslo_concurrency.lockutils [req-8c9a44a8-3bf9-4d79-be70-4e9431339f25 tempest-VolumesV2NegativeTest-1064344268 tempest-VolumesV2NegativeTest-1543094192] Lock "xenstore-914582e2-004f-4e79-af9c-a6a8290b66ea" acquired by "store_meta" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 18:03:09.174 DEBUG oslo_concurrency.lockutils [req-8c9a44a8-3bf9-4d79-be70-4e9431339f25 tempest-VolumesV2NegativeTest-1064344268 tempest-VolumesV2NegativeTest-1543094192] Lock "xenstore-914582e2-004f-4e79-af9c-a6a8290b66ea" released by "store_meta" :: held 0.001s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 18:03:09.175 DEBUG oslo_concurrency.lockutils [req-8c9a44a8-3bf9-4d79-be70-4e9431339f25 tempest-VolumesV2NegativeTest-1064344268 tempest-VolumesV2NegativeTest-1543094192] Lock "xenstore-914582e2-004f-4e79-af9c-a6a8290b66ea" acquired by "store_auto_disk_config" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 
18:03:09.204 DEBUG oslo_concurrency.lockutils [req-8c9a44a8-3bf9-4d79-be70-4e9431339f25 tempest-VolumesV2NegativeTest-1064344268 tempest-VolumesV2NegativeTest-1543094192] Lock "xenstore-914582e2-004f-4e79-af9c-a6a8290b66ea" released by "store_auto_disk_config" :: held 0.029s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 18:03:09.205 DEBUG nova.virt.xenapi.vmops [req-8c9a44a8-3bf9-4d79-be70-4e9431339f25 tempest-VolumesV2NegativeTest-1064344268 tempest-VolumesV2NegativeTest-1543094192] [instance: 914582e2-004f-4e79-af9c-a6a8290b66ea] Injecting hostname (tempest-instance-1535989844) into xenstore _inject_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1940 2015-08-07 18:03:09.206 DEBUG oslo_concurrency.lockutils [req-8c9a44a8-3bf9-4d79-be70-4e9431339f25 tempest-VolumesV2NegativeTest-1064344268 tempest-VolumesV2NegativeTest-1543094192] Lock "xenstore-914582e2-004f-4e79-af9c-a6a8290b66ea" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 18:03:09.216 DEBUG oslo_concurrency.lockutils [req-8c9a44a8-3bf9-4d79-be70-4e9431339f25 tempest-VolumesV2NegativeTest-1064344268 tempest-VolumesV2NegativeTest-1543094192] Lock "xenstore-914582e2-004f-4e79-af9c-a6a8290b66ea" released by "update_hostname" :: held 0.011s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 18:03:09.218 DEBUG nova.virt.xenapi.vmops [req-8c9a44a8-3bf9-4d79-be70-4e9431339f25 tempest-VolumesV2NegativeTest-1064344268 tempest-VolumesV2NegativeTest-1543094192] [instance: 914582e2-004f-4e79-af9c-a6a8290b66ea] Injecting network info to xenstore inject_network_info /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1869 2015-08-07 18:03:09.218 DEBUG oslo_concurrency.lockutils [req-8c9a44a8-3bf9-4d79-be70-4e9431339f25 tempest-VolumesV2NegativeTest-1064344268 tempest-VolumesV2NegativeTest-1543094192] Lock "xenstore-914582e2-004f-4e79-af9c-a6a8290b66ea" acquired by "update_nwinfo" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 18:03:09.766 DEBUG oslo_concurrency.lockutils [req-8c9a44a8-3bf9-4d79-be70-4e9431339f25 tempest-VolumesV2NegativeTest-1064344268 tempest-VolumesV2NegativeTest-1543094192] Lock "xenstore-914582e2-004f-4e79-af9c-a6a8290b66ea" released by "update_nwinfo" :: held 0.548s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 18:03:09.782 DEBUG nova.virt.xenapi.vmops [req-8c9a44a8-3bf9-4d79-be70-4e9431339f25 tempest-VolumesV2NegativeTest-1064344268 tempest-VolumesV2NegativeTest-1543094192] [instance: 914582e2-004f-4e79-af9c-a6a8290b66ea] Updating progress to 60 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 18:03:10.138 DEBUG nova.virt.xenapi.vmops [req-8c9a44a8-3bf9-4d79-be70-4e9431339f25 tempest-VolumesV2NegativeTest-1064344268 tempest-VolumesV2NegativeTest-1543094192] [instance: 914582e2-004f-4e79-af9c-a6a8290b66ea] Creating vifs _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1892 2015-08-07 18:03:10.145 DEBUG nova.virt.xenapi.vmops [req-8c9a44a8-3bf9-4d79-be70-4e9431339f25 tempest-VolumesV2NegativeTest-1064344268 tempest-VolumesV2NegativeTest-1543094192] [instance: 914582e2-004f-4e79-af9c-a6a8290b66ea] Creating VIF for network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1902 2015-08-07 18:03:10.155 DEBUG nova.virt.xenapi.vmops 
[req-8c9a44a8-3bf9-4d79-be70-4e9431339f25 tempest-VolumesV2NegativeTest-1064344268 tempest-VolumesV2NegativeTest-1543094192] [instance: 914582e2-004f-4e79-af9c-a6a8290b66ea] Created VIF OpaqueRef:80aa200b-357e-65a9-d328-bbd72e8a45d1, network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1906 2015-08-07 18:03:10.156 DEBUG nova.virt.xenapi.vmops [req-8c9a44a8-3bf9-4d79-be70-4e9431339f25 tempest-VolumesV2NegativeTest-1064344268 tempest-VolumesV2NegativeTest-1543094192] [instance: 914582e2-004f-4e79-af9c-a6a8290b66ea] Updating progress to 70 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 18:03:10.525 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:03:10.526 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 2.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:03:10.631 DEBUG nova.virt.xenapi.vmops [req-8c9a44a8-3bf9-4d79-be70-4e9431339f25 tempest-VolumesV2NegativeTest-1064344268 tempest-VolumesV2NegativeTest-1543094192] [instance: 914582e2-004f-4e79-af9c-a6a8290b66ea] Starting instance _start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:327 2015-08-07 18:03:11.738 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.84 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:03:12.525 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:03:12.526 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 1.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:03:13.526 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:03:13.630 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Auditing locally available compute resources for node localhost.localdomain 2015-08-07 18:03:13.630 DEBUG nova.virt.xenapi.host [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Updating host stats update_status /opt/stack/new/nova/nova/virt/xenapi/host.py:234 2015-08-07 18:03:16.335 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" acquired by "do_scan" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 18:03:16.336 DEBUG nova.virt.xenapi.vm_utils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Scanning SR OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99 do_scan /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1839 2015-08-07 18:03:20.020 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" released by 
"do_scan" :: held 3.684s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 18:03:21.755 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor: free VCPUs: 2 _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:524 2015-08-07 18:03:22.345 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor: assignable PCI devices: [] _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:532 2015-08-07 18:03:22.346 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor/Node resource view: name=localhost.localdomain free_ram=855MB free_disk=12GB free_vcpus=2 pci_devices=[] _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:546 2015-08-07 18:03:22.347 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "compute_resources" acquired by "_update_available_resource" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 18:03:22.747 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Total usable vcpus: 8, total allocated vcpus: 6 2015-08-07 18:03:22.748 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Final resource view: name=localhost.localdomain phys_ram=8187MB used_ram=650MB phys_disk=27GB used_disk=0GB total_vcpus=8 used_vcpus=6 pci_stats=None 2015-08-07 18:03:22.763 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 8.82 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:03:22.945 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Compute_service record updated for devstack:localhost.localdomain 2015-08-07 18:03:22.946 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "compute_resources" released by "_update_available_resource" :: held 0.600s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 18:03:22.948 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 11.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:03:30.406 DEBUG nova.virt.xenapi.vmops [req-4e081328-97da-4726-be0a-a3394f02cfdf tempest-VolumesV1NegativeTest-1182726137 tempest-VolumesV1NegativeTest-811196696] [instance: 22d8a4e4-aaa0-48f9-8c03-7dfd930df25b] Waiting for instance state to become running _wait_for_instance_to_start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:777 2015-08-07 18:03:30.641 DEBUG nova.virt.xenapi.vmops [req-4e081328-97da-4726-be0a-a3394f02cfdf tempest-VolumesV1NegativeTest-1182726137 tempest-VolumesV1NegativeTest-811196696] [instance: 22d8a4e4-aaa0-48f9-8c03-7dfd930df25b] Updating progress to 80 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 18:03:31.883 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.70 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:03:32.480 DEBUG nova.virt.xenapi.vmops [req-4e081328-97da-4726-be0a-a3394f02cfdf tempest-VolumesV1NegativeTest-1182726137 tempest-VolumesV1NegativeTest-811196696] [instance: 
22d8a4e4-aaa0-48f9-8c03-7dfd930df25b] Skip agent setup, not enabled. _configure_new_instance_with_agent /opt/stack/new/nova/nova/virt/xenapi/vmops.py:788 2015-08-07 18:03:32.481 DEBUG nova.virt.xenapi.vmops [req-4e081328-97da-4726-be0a-a3394f02cfdf tempest-VolumesV1NegativeTest-1182726137 tempest-VolumesV1NegativeTest-811196696] [instance: 22d8a4e4-aaa0-48f9-8c03-7dfd930df25b] Removing hostname from xenstore _remove_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1949 2015-08-07 18:03:32.482 DEBUG oslo_concurrency.lockutils [req-4e081328-97da-4726-be0a-a3394f02cfdf tempest-VolumesV1NegativeTest-1182726137 tempest-VolumesV1NegativeTest-811196696] Lock "xenstore-22d8a4e4-aaa0-48f9-8c03-7dfd930df25b" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 18:03:32.490 DEBUG oslo_concurrency.lockutils [req-4e081328-97da-4726-be0a-a3394f02cfdf tempest-VolumesV1NegativeTest-1182726137 tempest-VolumesV1NegativeTest-811196696] Lock "xenstore-22d8a4e4-aaa0-48f9-8c03-7dfd930df25b" released by "update_hostname" :: held 0.008s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 18:03:32.491 DEBUG nova.virt.xenapi.vmops [req-4e081328-97da-4726-be0a-a3394f02cfdf tempest-VolumesV1NegativeTest-1182726137 tempest-VolumesV1NegativeTest-811196696] [instance: 22d8a4e4-aaa0-48f9-8c03-7dfd930df25b] Updating progress to 90 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 18:03:33.029 DEBUG nova.virt.xenapi.vmops [req-4e081328-97da-4726-be0a-a3394f02cfdf tempest-VolumesV1NegativeTest-1182726137 tempest-VolumesV1NegativeTest-811196696] [instance: 22d8a4e4-aaa0-48f9-8c03-7dfd930df25b] Updating progress to 100 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 18:03:33.431 DEBUG nova.compute.manager [req-4e081328-97da-4726-be0a-a3394f02cfdf tempest-VolumesV1NegativeTest-1182726137 tempest-VolumesV1NegativeTest-811196696] [instance: 22d8a4e4-aaa0-48f9-8c03-7dfd930df25b] Checking state _get_power_state /opt/stack/new/nova/nova/compute/manager.py:1296 2015-08-07 18:03:33.947 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:03:33.948 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 24.58 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:03:34.235 DEBUG oslo_concurrency.lockutils [req-4e081328-97da-4726-be0a-a3394f02cfdf tempest-VolumesV1NegativeTest-1182726137 tempest-VolumesV1NegativeTest-811196696] Lock "22d8a4e4-aaa0-48f9-8c03-7dfd930df25b" released by "_locked_do_build_and_run_instance" :: held 88.705s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 18:03:35.269 DEBUG oslo_concurrency.lockutils [req-52be5a2c-6089-405f-92a2-da071c07e583 tempest-VolumesV1NegativeTest-1182726137 tempest-VolumesV1NegativeTest-811196696] Lock "22d8a4e4-aaa0-48f9-8c03-7dfd930df25b" acquired by "do_terminate_instance" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 18:03:35.270 DEBUG oslo_concurrency.lockutils [req-52be5a2c-6089-405f-92a2-da071c07e583 tempest-VolumesV1NegativeTest-1182726137 
tempest-VolumesV1NegativeTest-811196696] Lock "22d8a4e4-aaa0-48f9-8c03-7dfd930df25b-events" acquired by "_clear_events" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 18:03:35.271 DEBUG oslo_concurrency.lockutils [req-52be5a2c-6089-405f-92a2-da071c07e583 tempest-VolumesV1NegativeTest-1182726137 tempest-VolumesV1NegativeTest-811196696] Lock "22d8a4e4-aaa0-48f9-8c03-7dfd930df25b-events" released by "_clear_events" :: held 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 18:03:35.273 INFO nova.compute.manager [req-52be5a2c-6089-405f-92a2-da071c07e583 tempest-VolumesV1NegativeTest-1182726137 tempest-VolumesV1NegativeTest-811196696] [instance: 22d8a4e4-aaa0-48f9-8c03-7dfd930df25b] Terminating instance 2015-08-07 18:03:35.276 INFO nova.virt.xenapi.vmops [req-52be5a2c-6089-405f-92a2-da071c07e583 tempest-VolumesV1NegativeTest-1182726137 tempest-VolumesV1NegativeTest-811196696] [instance: 22d8a4e4-aaa0-48f9-8c03-7dfd930df25b] Destroying VM 2015-08-07 18:03:35.309 DEBUG nova.virt.xenapi.vm_utils [req-52be5a2c-6089-405f-92a2-da071c07e583 tempest-VolumesV1NegativeTest-1182726137 tempest-VolumesV1NegativeTest-811196696] [instance: 22d8a4e4-aaa0-48f9-8c03-7dfd930df25b] Shutting down VM (hard) hard_shutdown_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:348 2015-08-07 18:03:42.302 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.28 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:03:42.434 DEBUG nova.virt.xenapi.vmops [req-8c9a44a8-3bf9-4d79-be70-4e9431339f25 tempest-VolumesV2NegativeTest-1064344268 tempest-VolumesV2NegativeTest-1543094192] [instance: 914582e2-004f-4e79-af9c-a6a8290b66ea] Waiting for instance state to become running _wait_for_instance_to_start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:777 2015-08-07 18:03:42.570 DEBUG nova.virt.xenapi.vmops [req-8c9a44a8-3bf9-4d79-be70-4e9431339f25 tempest-VolumesV2NegativeTest-1064344268 tempest-VolumesV2NegativeTest-1543094192] [instance: 914582e2-004f-4e79-af9c-a6a8290b66ea] Updating progress to 80 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 18:03:42.820 DEBUG nova.virt.xenapi.vmops [req-52be5a2c-6089-405f-92a2-da071c07e583 tempest-VolumesV1NegativeTest-1182726137 tempest-VolumesV1NegativeTest-811196696] [instance: 22d8a4e4-aaa0-48f9-8c03-7dfd930df25b] Destroying VDIs _destroy_vdis /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1428 2015-08-07 18:03:42.897 DEBUG nova.virt.xenapi.vm_utils [req-52be5a2c-6089-405f-92a2-da071c07e583 tempest-VolumesV1NegativeTest-1182726137 tempest-VolumesV1NegativeTest-811196696] VDI 365acefc-35cc-48a8-8a3a-3e56d5ac9a5b is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 18:03:42.950 DEBUG nova.virt.xenapi.vm_utils [req-52be5a2c-6089-405f-92a2-da071c07e583 tempest-VolumesV1NegativeTest-1182726137 tempest-VolumesV1NegativeTest-811196696] VDI 861e4cf4-8641-4e59-81f9-cded23a75e65 is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 18:03:43.351 DEBUG nova.virt.xenapi.vmops [req-8c9a44a8-3bf9-4d79-be70-4e9431339f25 tempest-VolumesV2NegativeTest-1064344268 tempest-VolumesV2NegativeTest-1543094192] [instance: 914582e2-004f-4e79-af9c-a6a8290b66ea] Skip agent setup, not enabled. 
_configure_new_instance_with_agent /opt/stack/new/nova/nova/virt/xenapi/vmops.py:788 2015-08-07 18:03:43.352 DEBUG nova.virt.xenapi.vmops [req-8c9a44a8-3bf9-4d79-be70-4e9431339f25 tempest-VolumesV2NegativeTest-1064344268 tempest-VolumesV2NegativeTest-1543094192] [instance: 914582e2-004f-4e79-af9c-a6a8290b66ea] Removing hostname from xenstore _remove_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1949 2015-08-07 18:03:43.353 DEBUG oslo_concurrency.lockutils [req-8c9a44a8-3bf9-4d79-be70-4e9431339f25 tempest-VolumesV2NegativeTest-1064344268 tempest-VolumesV2NegativeTest-1543094192] Lock "xenstore-914582e2-004f-4e79-af9c-a6a8290b66ea" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 18:03:43.366 DEBUG oslo_concurrency.lockutils [req-8c9a44a8-3bf9-4d79-be70-4e9431339f25 tempest-VolumesV2NegativeTest-1064344268 tempest-VolumesV2NegativeTest-1543094192] Lock "xenstore-914582e2-004f-4e79-af9c-a6a8290b66ea" released by "update_hostname" :: held 0.013s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 18:03:43.366 DEBUG nova.virt.xenapi.vmops [req-8c9a44a8-3bf9-4d79-be70-4e9431339f25 tempest-VolumesV2NegativeTest-1064344268 tempest-VolumesV2NegativeTest-1543094192] [instance: 914582e2-004f-4e79-af9c-a6a8290b66ea] Updating progress to 90 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 18:03:43.986 DEBUG nova.virt.xenapi.vmops [req-8c9a44a8-3bf9-4d79-be70-4e9431339f25 tempest-VolumesV2NegativeTest-1064344268 tempest-VolumesV2NegativeTest-1543094192] [instance: 914582e2-004f-4e79-af9c-a6a8290b66ea] Updating progress to 100 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 18:03:44.659 DEBUG nova.compute.manager [req-8c9a44a8-3bf9-4d79-be70-4e9431339f25 tempest-VolumesV2NegativeTest-1064344268 tempest-VolumesV2NegativeTest-1543094192] [instance: 914582e2-004f-4e79-af9c-a6a8290b66ea] Checking state _get_power_state /opt/stack/new/nova/nova/compute/manager.py:1296 2015-08-07 18:03:45.277 DEBUG nova.virt.xenapi.vmops [req-52be5a2c-6089-405f-92a2-da071c07e583 tempest-VolumesV1NegativeTest-1182726137 tempest-VolumesV1NegativeTest-811196696] [instance: 22d8a4e4-aaa0-48f9-8c03-7dfd930df25b] Using RAW or VHD, skipping kernel and ramdisk deletion _destroy_kernel_ramdisk /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1456 2015-08-07 18:03:45.291 DEBUG nova.virt.xenapi.vm_utils [req-52be5a2c-6089-405f-92a2-da071c07e583 tempest-VolumesV1NegativeTest-1182726137 tempest-VolumesV1NegativeTest-811196696] [instance: 22d8a4e4-aaa0-48f9-8c03-7dfd930df25b] VM destroyed destroy_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:324 2015-08-07 18:03:45.292 DEBUG nova.compute.manager [req-52be5a2c-6089-405f-92a2-da071c07e583 tempest-VolumesV1NegativeTest-1182726137 tempest-VolumesV1NegativeTest-811196696] [instance: 22d8a4e4-aaa0-48f9-8c03-7dfd930df25b] Deallocating network for instance _deallocate_network /opt/stack/new/nova/nova/compute/manager.py:1792 2015-08-07 18:03:45.595 DEBUG oslo_concurrency.lockutils [req-8c9a44a8-3bf9-4d79-be70-4e9431339f25 tempest-VolumesV2NegativeTest-1064344268 tempest-VolumesV2NegativeTest-1543094192] Lock "914582e2-004f-4e79-af9c-a6a8290b66ea" released by "_locked_do_build_and_run_instance" :: held 85.630s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 18:03:47.436 DEBUG oslo_concurrency.lockutils [req-4913a883-484d-405b-9e24-8cac3e012532 
tempest-VolumesV2NegativeTest-1064344268 tempest-VolumesV2NegativeTest-1543094192] Lock "914582e2-004f-4e79-af9c-a6a8290b66ea" acquired by "do_terminate_instance" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 18:03:47.438 DEBUG oslo_concurrency.lockutils [req-4913a883-484d-405b-9e24-8cac3e012532 tempest-VolumesV2NegativeTest-1064344268 tempest-VolumesV2NegativeTest-1543094192] Lock "914582e2-004f-4e79-af9c-a6a8290b66ea-events" acquired by "_clear_events" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 18:03:47.439 DEBUG oslo_concurrency.lockutils [req-4913a883-484d-405b-9e24-8cac3e012532 tempest-VolumesV2NegativeTest-1064344268 tempest-VolumesV2NegativeTest-1543094192] Lock "914582e2-004f-4e79-af9c-a6a8290b66ea-events" released by "_clear_events" :: held 0.001s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 18:03:47.443 INFO nova.compute.manager [req-4913a883-484d-405b-9e24-8cac3e012532 tempest-VolumesV2NegativeTest-1064344268 tempest-VolumesV2NegativeTest-1543094192] [instance: 914582e2-004f-4e79-af9c-a6a8290b66ea] Terminating instance 2015-08-07 18:03:47.445 INFO nova.virt.xenapi.vmops [req-4913a883-484d-405b-9e24-8cac3e012532 tempest-VolumesV2NegativeTest-1064344268 tempest-VolumesV2NegativeTest-1543094192] [instance: 914582e2-004f-4e79-af9c-a6a8290b66ea] Destroying VM 2015-08-07 18:03:47.610 DEBUG nova.virt.xenapi.vm_utils [req-4913a883-484d-405b-9e24-8cac3e012532 tempest-VolumesV2NegativeTest-1064344268 tempest-VolumesV2NegativeTest-1543094192] [instance: 914582e2-004f-4e79-af9c-a6a8290b66ea] Shutting down VM (hard) hard_shutdown_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:348 2015-08-07 18:03:50.882 DEBUG nova.compute.manager [req-52be5a2c-6089-405f-92a2-da071c07e583 tempest-VolumesV1NegativeTest-1182726137 tempest-VolumesV1NegativeTest-811196696] [instance: 22d8a4e4-aaa0-48f9-8c03-7dfd930df25b] terminating bdm BlockDeviceMapping(boot_index=0,connection_info=None,created_at=2015-08-07T18:02:04Z,delete_on_termination=True,deleted=False,deleted_at=None,destination_type='local',device_name='/dev/xvda',device_type='disk',disk_bus=None,guest_format=None,id=85,image_id='a69d8d55-1745-492f-be26-cc75d64fc94d',instance=,instance_uuid=22d8a4e4-aaa0-48f9-8c03-7dfd930df25b,no_device=False,snapshot_id=None,source_type='image',updated_at=2015-08-07T18:02:07Z,volume_id=None,volume_size=None) _cleanup_volumes /opt/stack/new/nova/nova/compute/manager.py:2272 2015-08-07 18:03:51.228 DEBUG oslo_concurrency.lockutils [req-52be5a2c-6089-405f-92a2-da071c07e583 tempest-VolumesV1NegativeTest-1182726137 tempest-VolumesV1NegativeTest-811196696] Lock "compute_resources" acquired by "update_usage" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 18:03:51.229 DEBUG nova.objects.instance [req-52be5a2c-6089-405f-92a2-da071c07e583 tempest-VolumesV1NegativeTest-1182726137 tempest-VolumesV1NegativeTest-811196696] Lazy-loading `numa_topology' on Instance uuid 22d8a4e4-aaa0-48f9-8c03-7dfd930df25b obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 18:03:51.633 DEBUG oslo_concurrency.lockutils [req-52be5a2c-6089-405f-92a2-da071c07e583 tempest-VolumesV1NegativeTest-1182726137 tempest-VolumesV1NegativeTest-811196696] Lock "compute_resources" released by "update_usage" :: held 0.404s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 
18:03:52.269 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.32 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:03:53.049 DEBUG oslo_concurrency.lockutils [req-52be5a2c-6089-405f-92a2-da071c07e583 tempest-VolumesV1NegativeTest-1182726137 tempest-VolumesV1NegativeTest-811196696] Lock "22d8a4e4-aaa0-48f9-8c03-7dfd930df25b" released by "do_terminate_instance" :: held 17.779s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 18:03:57.101 DEBUG nova.virt.xenapi.vmops [req-4913a883-484d-405b-9e24-8cac3e012532 tempest-VolumesV2NegativeTest-1064344268 tempest-VolumesV2NegativeTest-1543094192] [instance: 914582e2-004f-4e79-af9c-a6a8290b66ea] Destroying VDIs _destroy_vdis /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1428 2015-08-07 18:03:57.154 DEBUG nova.virt.xenapi.vm_utils [req-4913a883-484d-405b-9e24-8cac3e012532 tempest-VolumesV2NegativeTest-1064344268 tempest-VolumesV2NegativeTest-1543094192] VDI cabc71bf-6b11-4f60-827b-38483cb1880f is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 18:03:57.179 DEBUG nova.virt.xenapi.vm_utils [req-4913a883-484d-405b-9e24-8cac3e012532 tempest-VolumesV2NegativeTest-1064344268 tempest-VolumesV2NegativeTest-1543094192] VDI 433796dd-d33f-4d41-a2cb-5462d3a0083b is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 18:03:58.525 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:03:58.526 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 0.99 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:03:58.785 DEBUG nova.virt.xenapi.vmops [req-4913a883-484d-405b-9e24-8cac3e012532 tempest-VolumesV2NegativeTest-1064344268 tempest-VolumesV2NegativeTest-1543094192] [instance: 914582e2-004f-4e79-af9c-a6a8290b66ea] Using RAW or VHD, skipping kernel and ramdisk deletion _destroy_kernel_ramdisk /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1456 2015-08-07 18:03:58.805 DEBUG nova.virt.xenapi.vm_utils [req-4913a883-484d-405b-9e24-8cac3e012532 tempest-VolumesV2NegativeTest-1064344268 tempest-VolumesV2NegativeTest-1543094192] [instance: 914582e2-004f-4e79-af9c-a6a8290b66ea] VM destroyed destroy_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:324 2015-08-07 18:03:58.805 DEBUG nova.compute.manager [req-4913a883-484d-405b-9e24-8cac3e012532 tempest-VolumesV2NegativeTest-1064344268 tempest-VolumesV2NegativeTest-1543094192] [instance: 914582e2-004f-4e79-af9c-a6a8290b66ea] Deallocating network for instance _deallocate_network /opt/stack/new/nova/nova/compute/manager.py:1792 2015-08-07 18:03:59.520 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:03:59.521 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 0.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:03:59.525 DEBUG oslo_service.periodic_task 
[req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:03:59.526 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:03:59.526 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Starting heal instance info cache _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5315 2015-08-07 18:03:59.527 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Rebuilding the list of instances to heal _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5319 2015-08-07 18:03:59.664 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: 914582e2-004f-4e79-af9c-a6a8290b66ea] Skipping network cache update for instance because it is being deleted. _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5333 2015-08-07 18:03:59.665 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Didn't find any instances for network info cache update. _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5386 2015-08-07 18:03:59.666 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 10.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:04:01.653 DEBUG nova.compute.manager [req-4913a883-484d-405b-9e24-8cac3e012532 tempest-VolumesV2NegativeTest-1064344268 tempest-VolumesV2NegativeTest-1543094192] [instance: 914582e2-004f-4e79-af9c-a6a8290b66ea] terminating bdm BlockDeviceMapping(boot_index=0,connection_info=None,created_at=2015-08-07T18:02:19Z,delete_on_termination=True,deleted=False,deleted_at=None,destination_type='local',device_name='/dev/xvda',device_type='disk',disk_bus=None,guest_format=None,id=86,image_id='a69d8d55-1745-492f-be26-cc75d64fc94d',instance=,instance_uuid=914582e2-004f-4e79-af9c-a6a8290b66ea,no_device=False,snapshot_id=None,source_type='image',updated_at=2015-08-07T18:02:22Z,volume_id=None,volume_size=None) _cleanup_volumes /opt/stack/new/nova/nova/compute/manager.py:2272 2015-08-07 18:04:01.681 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.91 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:04:01.949 DEBUG oslo_concurrency.lockutils [req-4913a883-484d-405b-9e24-8cac3e012532 tempest-VolumesV2NegativeTest-1064344268 tempest-VolumesV2NegativeTest-1543094192] Lock "compute_resources" acquired by "update_usage" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 18:04:01.951 DEBUG nova.objects.instance [req-4913a883-484d-405b-9e24-8cac3e012532 tempest-VolumesV2NegativeTest-1064344268 tempest-VolumesV2NegativeTest-1543094192] Lazy-loading `numa_topology' on Instance uuid 914582e2-004f-4e79-af9c-a6a8290b66ea obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 18:04:02.229 DEBUG oslo_concurrency.lockutils [req-4913a883-484d-405b-9e24-8cac3e012532 tempest-VolumesV2NegativeTest-1064344268 tempest-VolumesV2NegativeTest-1543094192] Lock "compute_resources" released by "update_usage" 
:: held 0.280s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 18:04:02.875 DEBUG oslo_concurrency.lockutils [req-4913a883-484d-405b-9e24-8cac3e012532 tempest-VolumesV2NegativeTest-1064344268 tempest-VolumesV2NegativeTest-1543094192] Lock "914582e2-004f-4e79-af9c-a6a8290b66ea" released by "do_terminate_instance" :: held 15.440s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 18:04:09.668 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:04:09.680 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /opt/stack/new/nova/nova/compute/manager.py:5976 2015-08-07 18:04:09.682 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 2.84 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:04:11.717 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.87 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:04:12.525 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:04:12.526 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 1.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:04:13.525 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:04:13.526 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 1.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:04:14.531 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:04:14.583 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Auditing locally available compute resources for node localhost.localdomain 2015-08-07 18:04:14.583 DEBUG nova.virt.xenapi.host [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Updating host stats update_status /opt/stack/new/nova/nova/virt/xenapi/host.py:234 2015-08-07 18:04:14.942 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" acquired by "do_scan" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 18:04:14.943 DEBUG nova.virt.xenapi.vm_utils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Scanning SR OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99 do_scan 
/opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1839 2015-08-07 18:04:15.422 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" released by "do_scan" :: held 0.480s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 18:04:15.724 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor: free VCPUs: 2 _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:524 2015-08-07 18:04:15.724 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor: assignable PCI devices: [] _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:532 2015-08-07 18:04:15.725 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor/Node resource view: name=localhost.localdomain free_ram=989MB free_disk=12GB free_vcpus=2 pci_devices=[] _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:546 2015-08-07 18:04:15.725 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "compute_resources" acquired by "_update_available_resource" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 18:04:15.926 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Total usable vcpus: 8, total allocated vcpus: 6 2015-08-07 18:04:15.927 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Final resource view: name=localhost.localdomain phys_ram=8187MB used_ram=512MB phys_disk=27GB used_disk=0GB total_vcpus=8 used_vcpus=6 pci_stats=None 2015-08-07 18:04:16.045 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Compute_service record updated for devstack:localhost.localdomain 2015-08-07 18:04:16.045 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "compute_resources" released by "_update_available_resource" :: held 0.320s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 18:04:16.047 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 0.47 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:04:16.518 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:04:16.613 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 10.01 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:04:21.672 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.92 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:04:26.620 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:04:26.621 DEBUG 
oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 32.90 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:04:31.764 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.84 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:04:41.722 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.89 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:04:52.386 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.26 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:04:59.610 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:04:59.611 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:04:59.611 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Starting heal instance info cache _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5315 2015-08-07 18:04:59.612 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Rebuilding the list of instances to heal _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5319 2015-08-07 18:04:59.876 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Didn't find any instances for network info cache update. 
_heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5386 2015-08-07 18:04:59.876 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 0.95 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:05:00.832 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:05:00.833 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 0.69 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:05:01.585 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:05:01.586 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 9.94 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:05:06.036 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 6.30 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:05:11.526 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:05:11.526 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /opt/stack/new/nova/nova/compute/manager.py:5976 2015-08-07 18:05:11.527 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 3.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:05:12.438 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.92 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:05:14.525 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:05:14.526 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:05:14.526 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 1.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:05:15.527 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:05:15.578 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Auditing locally available compute resources for node localhost.localdomain 2015-08-07 18:05:15.579 DEBUG nova.virt.xenapi.host [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Updating host stats update_status /opt/stack/new/nova/nova/virt/xenapi/host.py:234 2015-08-07 18:05:16.796 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" acquired by "do_scan" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 18:05:16.797 DEBUG nova.virt.xenapi.vm_utils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Scanning SR OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99 do_scan /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1839 2015-08-07 18:05:17.436 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" released by "do_scan" :: held 0.640s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 18:05:18.084 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor: free VCPUs: 2 _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:524 2015-08-07 18:05:18.085 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor: assignable PCI devices: [] _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:532 2015-08-07 18:05:18.085 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor/Node resource view: name=localhost.localdomain free_ram=989MB free_disk=12GB free_vcpus=2 pci_devices=[] _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:546 2015-08-07 
18:05:18.086 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "compute_resources" acquired by "_update_available_resource" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 18:05:18.305 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Total usable vcpus: 8, total allocated vcpus: 6 2015-08-07 18:05:18.305 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Final resource view: name=localhost.localdomain phys_ram=8187MB used_ram=512MB phys_disk=27GB used_disk=0GB total_vcpus=8 used_vcpus=6 pci_stats=None 2015-08-07 18:05:18.442 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Compute_service record updated for devstack:localhost.localdomain 2015-08-07 18:05:18.442 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "compute_resources" released by "_update_available_resource" :: held 0.356s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 18:05:18.443 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 11.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:05:22.717 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.66 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:05:24.093 DEBUG oslo_concurrency.lockutils [req-952b5223-a732-40ee-a39b-bb131065d05b tempest-VolumesV1SnapshotTestJSON-894380077 tempest-VolumesV1SnapshotTestJSON-276236588] Lock "29af7281-35f5-4346-b22f-da966069cfed" acquired by "_locked_do_build_and_run_instance" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 18:05:24.242 INFO nova.compute.manager [req-952b5223-a732-40ee-a39b-bb131065d05b tempest-VolumesV1SnapshotTestJSON-894380077 tempest-VolumesV1SnapshotTestJSON-276236588] [instance: 29af7281-35f5-4346-b22f-da966069cfed] Starting instance... 
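The 'Lock ... acquired by ... :: waited' / 'released by ... :: held' pairs above (for example "compute_resources" around _update_available_resource, and the per-instance lock around _locked_do_build_and_run_instance) are emitted by oslo.concurrency's lockutils when a callable is wrapped with its synchronized decorator, which is what the inner lockutils.py:251 / lockutils.py:262 references point at. A minimal, self-contained sketch of that pattern follows; it assumes oslo.concurrency is installed, and the guarded function is a hypothetical stand-in rather than actual Nova code.

    import logging
    import time

    from oslo_concurrency import lockutils

    # DEBUG-level logging is required to see the acquired/released
    # messages that appear throughout the log above.
    logging.basicConfig(level=logging.DEBUG)

    @lockutils.synchronized('compute_resources')
    def update_usage_example():
        # Work performed while the internal semaphore is held; the
        # decorator logs how long the caller waited for the lock and
        # how long it was held once the function returns.
        time.sleep(0.1)

    if __name__ == '__main__':
        update_usage_example()

Two such calls issued from concurrent green threads serialize on the shared semaphore, which is why the log shows non-zero "held" durations but "waited 0.000s" whenever the lock is uncontended.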
2015-08-07 18:05:24.705 DEBUG oslo_concurrency.lockutils [req-952b5223-a732-40ee-a39b-bb131065d05b tempest-VolumesV1SnapshotTestJSON-894380077 tempest-VolumesV1SnapshotTestJSON-276236588] Lock "compute_resources" acquired by "instance_claim" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 18:05:24.706 DEBUG nova.compute.resource_tracker [req-952b5223-a732-40ee-a39b-bb131065d05b tempest-VolumesV1SnapshotTestJSON-894380077 tempest-VolumesV1SnapshotTestJSON-276236588] Memory overhead for 64 MB instance; 5 MB instance_claim /opt/stack/new/nova/nova/compute/resource_tracker.py:126 2015-08-07 18:05:24.720 INFO nova.compute.claims [req-952b5223-a732-40ee-a39b-bb131065d05b tempest-VolumesV1SnapshotTestJSON-894380077 tempest-VolumesV1SnapshotTestJSON-276236588] [instance: 29af7281-35f5-4346-b22f-da966069cfed] Attempting claim: memory 69 MB, disk 0 GB 2015-08-07 18:05:24.720 INFO nova.compute.claims [req-952b5223-a732-40ee-a39b-bb131065d05b tempest-VolumesV1SnapshotTestJSON-894380077 tempest-VolumesV1SnapshotTestJSON-276236588] [instance: 29af7281-35f5-4346-b22f-da966069cfed] Total memory: 8187 MB, used: 512.00 MB 2015-08-07 18:05:24.721 INFO nova.compute.claims [req-952b5223-a732-40ee-a39b-bb131065d05b tempest-VolumesV1SnapshotTestJSON-894380077 tempest-VolumesV1SnapshotTestJSON-276236588] [instance: 29af7281-35f5-4346-b22f-da966069cfed] memory limit: 12280.50 MB, free: 11768.50 MB 2015-08-07 18:05:24.721 INFO nova.compute.claims [req-952b5223-a732-40ee-a39b-bb131065d05b tempest-VolumesV1SnapshotTestJSON-894380077 tempest-VolumesV1SnapshotTestJSON-276236588] [instance: 29af7281-35f5-4346-b22f-da966069cfed] Total disk: 27 GB, used: 0.00 GB 2015-08-07 18:05:24.722 INFO nova.compute.claims [req-952b5223-a732-40ee-a39b-bb131065d05b tempest-VolumesV1SnapshotTestJSON-894380077 tempest-VolumesV1SnapshotTestJSON-276236588] [instance: 29af7281-35f5-4346-b22f-da966069cfed] disk limit not specified, defaulting to unlimited 2015-08-07 18:05:24.762 DEBUG nova.compute.resources.vcpu [req-952b5223-a732-40ee-a39b-bb131065d05b tempest-VolumesV1SnapshotTestJSON-894380077 tempest-VolumesV1SnapshotTestJSON-276236588] Total CPUs: 8 VCPUs, used: 0.00 VCPUs test /opt/stack/new/nova/nova/compute/resources/vcpu.py:52 2015-08-07 18:05:24.762 DEBUG nova.compute.resources.vcpu [req-952b5223-a732-40ee-a39b-bb131065d05b tempest-VolumesV1SnapshotTestJSON-894380077 tempest-VolumesV1SnapshotTestJSON-276236588] CPUs limit not specified, defaulting to unlimited test /opt/stack/new/nova/nova/compute/resources/vcpu.py:56 2015-08-07 18:05:24.768 INFO nova.compute.claims [req-952b5223-a732-40ee-a39b-bb131065d05b tempest-VolumesV1SnapshotTestJSON-894380077 tempest-VolumesV1SnapshotTestJSON-276236588] [instance: 29af7281-35f5-4346-b22f-da966069cfed] Claim successful 2015-08-07 18:05:25.591 DEBUG oslo_concurrency.lockutils [req-952b5223-a732-40ee-a39b-bb131065d05b tempest-VolumesV1SnapshotTestJSON-894380077 tempest-VolumesV1SnapshotTestJSON-276236588] Lock "compute_resources" released by "instance_claim" :: held 0.886s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 18:05:25.922 DEBUG oslo_concurrency.lockutils [req-952b5223-a732-40ee-a39b-bb131065d05b tempest-VolumesV1SnapshotTestJSON-894380077 tempest-VolumesV1SnapshotTestJSON-276236588] Lock "compute_resources" acquired by "update_usage" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 18:05:26.089 DEBUG oslo_concurrency.lockutils 
[req-952b5223-a732-40ee-a39b-bb131065d05b tempest-VolumesV1SnapshotTestJSON-894380077 tempest-VolumesV1SnapshotTestJSON-276236588] Lock "compute_resources" released by "update_usage" :: held 0.166s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 18:05:26.090 DEBUG nova.compute.utils [req-952b5223-a732-40ee-a39b-bb131065d05b tempest-VolumesV1SnapshotTestJSON-894380077 tempest-VolumesV1SnapshotTestJSON-276236588] Using /dev/xvd instead of None get_next_device_name /opt/stack/new/nova/nova/compute/utils.py:163 2015-08-07 18:05:26.097 13318 DEBUG nova.compute.manager [-] [instance: 29af7281-35f5-4346-b22f-da966069cfed] Allocating IP information in the background. _allocate_network_async /opt/stack/new/nova/nova/compute/manager.py:1513 2015-08-07 18:05:26.098 13318 DEBUG oslo_concurrency.lockutils [-] Acquired semaphore "refresh_cache-29af7281-35f5-4346-b22f-da966069cfed" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:197 2015-08-07 18:05:27.444 DEBUG nova.virt.xenapi.vmops [req-952b5223-a732-40ee-a39b-bb131065d05b tempest-VolumesV1SnapshotTestJSON-894380077 tempest-VolumesV1SnapshotTestJSON-276236588] [instance: 29af7281-35f5-4346-b22f-da966069cfed] Block device information present: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} spawn /opt/stack/new/nova/nova/virt/xenapi/vmops.py:408 2015-08-07 18:05:27.459 DEBUG nova.virt.xenapi.vm_utils [req-952b5223-a732-40ee-a39b-bb131065d05b tempest-VolumesV1SnapshotTestJSON-894380077 tempest-VolumesV1SnapshotTestJSON-276236588] Detected vhd format for image a69d8d55-1745-492f-be26-cc75d64fc94d determine_disk_image_type /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1633 2015-08-07 18:05:27.460 DEBUG nova.virt.xenapi.vmops [req-952b5223-a732-40ee-a39b-bb131065d05b tempest-VolumesV1SnapshotTestJSON-894380077 tempest-VolumesV1SnapshotTestJSON-276236588] [instance: 29af7281-35f5-4346-b22f-da966069cfed] Updating progress to 10 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 18:05:27.896 DEBUG nova.virt.xenapi.vmops [req-952b5223-a732-40ee-a39b-bb131065d05b tempest-VolumesV1SnapshotTestJSON-894380077 tempest-VolumesV1SnapshotTestJSON-276236588] [instance: 29af7281-35f5-4346-b22f-da966069cfed] block device info: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} _connect_cinder_volumes /opt/stack/new/nova/nova/virt/xenapi/vmops.py:376 2015-08-07 18:05:27.933 DEBUG oslo_concurrency.lockutils [req-952b5223-a732-40ee-a39b-bb131065d05b tempest-VolumesV1SnapshotTestJSON-894380077 tempest-VolumesV1SnapshotTestJSON-276236588] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" acquired by "_create_cached_image_impl" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 18:05:29.455 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:05:29.456 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 24.07 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:05:30.966 DEBUG nova.virt.xenapi.vm_utils [req-952b5223-a732-40ee-a39b-bb131065d05b tempest-VolumesV1SnapshotTestJSON-894380077 
tempest-VolumesV1SnapshotTestJSON-276236588] Cloned VDI OpaqueRef:f4c1f316-b834-f8b0-7c65-34dd340c7367 from VDI OpaqueRef:f392c64e-7886-a0cc-d3a0-780674167307 _clone_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:571 2015-08-07 18:05:33.173 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.26 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:05:34.781 DEBUG oslo_concurrency.lockutils [req-952b5223-a732-40ee-a39b-bb131065d05b tempest-VolumesV1SnapshotTestJSON-894380077 tempest-VolumesV1SnapshotTestJSON-276236588] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" released by "_create_cached_image_impl" :: held 6.848s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 18:05:34.782 INFO nova.virt.xenapi.vm_utils [req-952b5223-a732-40ee-a39b-bb131065d05b tempest-VolumesV1SnapshotTestJSON-894380077 tempest-VolumesV1SnapshotTestJSON-276236588] Image creation data, cacheable: True, downloaded: False duration: 6.89 secs for image a69d8d55-1745-492f-be26-cc75d64fc94d 2015-08-07 18:05:35.892 13318 DEBUG nova.network.base_api [-] [instance: 29af7281-35f5-4346-b22f-da966069cfed] Updating instance_info_cache with network_info: [VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.4'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:98:96:46', 'active': False, 'type': u'bridge', 'id': u'171bebb2-68ce-4513-ad61-5b5d57da2e60', 'qbg_params': None})] update_instance_cache_with_nw_info /opt/stack/new/nova/nova/network/base_api.py:43 2015-08-07 18:05:36.484 13318 DEBUG oslo_concurrency.lockutils [-] Releasing semaphore "refresh_cache-29af7281-35f5-4346-b22f-da966069cfed" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:210 2015-08-07 18:05:36.484 13318 DEBUG nova.compute.manager [-] [instance: 29af7281-35f5-4346-b22f-da966069cfed] Instance network_info: |[VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.4'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': 
None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:98:96:46', 'active': False, 'type': u'bridge', 'id': u'171bebb2-68ce-4513-ad61-5b5d57da2e60', 'qbg_params': None})]| _allocate_network_async /opt/stack/new/nova/nova/compute/manager.py:1531 2015-08-07 18:05:37.244 DEBUG nova.virt.xenapi.vmops [req-952b5223-a732-40ee-a39b-bb131065d05b tempest-VolumesV1SnapshotTestJSON-894380077 tempest-VolumesV1SnapshotTestJSON-276236588] [instance: 29af7281-35f5-4346-b22f-da966069cfed] Updating progress to 20 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 18:05:37.633 DEBUG nova.virt.xenapi.vmops [req-952b5223-a732-40ee-a39b-bb131065d05b tempest-VolumesV1SnapshotTestJSON-894380077 tempest-VolumesV1SnapshotTestJSON-276236588] [instance: 29af7281-35f5-4346-b22f-da966069cfed] Updating progress to 30 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 18:05:37.951 DEBUG nova.virt.xenapi.vmops [req-952b5223-a732-40ee-a39b-bb131065d05b tempest-VolumesV1SnapshotTestJSON-894380077 tempest-VolumesV1SnapshotTestJSON-276236588] [instance: 29af7281-35f5-4346-b22f-da966069cfed] Using PV kernel: True _create_vm_record /opt/stack/new/nova/nova/virt/xenapi/vmops.py:679 2015-08-07 18:05:37.968 DEBUG nova.virt.xenapi.vm_utils [req-952b5223-a732-40ee-a39b-bb131065d05b tempest-VolumesV1SnapshotTestJSON-894380077 tempest-VolumesV1SnapshotTestJSON-276236588] [instance: 29af7281-35f5-4346-b22f-da966069cfed] Created VM create_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:312 2015-08-07 18:05:37.969 DEBUG nova.virt.xenapi.vmops [req-952b5223-a732-40ee-a39b-bb131065d05b tempest-VolumesV1SnapshotTestJSON-894380077 tempest-VolumesV1SnapshotTestJSON-276236588] [instance: 29af7281-35f5-4346-b22f-da966069cfed] Updating progress to 40 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 18:05:38.237 DEBUG nova.virt.xenapi.vm_utils [req-952b5223-a732-40ee-a39b-bb131065d05b tempest-VolumesV1SnapshotTestJSON-894380077 tempest-VolumesV1SnapshotTestJSON-276236588] Creating disk-type VBD for VM OpaqueRef:e1dc7b63-7b60-3e17-bc3d-2e3c3986196d, VDI OpaqueRef:f4c1f316-b834-f8b0-7c65-34dd340c7367 ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 18:05:38.246 DEBUG nova.virt.xenapi.vm_utils [req-952b5223-a732-40ee-a39b-bb131065d05b tempest-VolumesV1SnapshotTestJSON-894380077 tempest-VolumesV1SnapshotTestJSON-276236588] Created VBD OpaqueRef:9d7e55e0-13f3-ace5-68a6-b9b6d6f4e636 for VM OpaqueRef:e1dc7b63-7b60-3e17-bc3d-2e3c3986196d, VDI OpaqueRef:f4c1f316-b834-f8b0-7c65-34dd340c7367. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 18:05:38.819 DEBUG nova.virt.xenapi.vm_utils [req-952b5223-a732-40ee-a39b-bb131065d05b tempest-VolumesV1SnapshotTestJSON-894380077 tempest-VolumesV1SnapshotTestJSON-276236588] Created VDI OpaqueRef:1508f174-5ac8-41ad-67de-442fcf6560c1 (config-2, 67108864, False) on OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99. 
create_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:509 2015-08-07 18:05:38.826 DEBUG nova.virt.xenapi.vm_utils [req-952b5223-a732-40ee-a39b-bb131065d05b tempest-VolumesV1SnapshotTestJSON-894380077 tempest-VolumesV1SnapshotTestJSON-276236588] Creating disk-type VBD for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:1508f174-5ac8-41ad-67de-442fcf6560c1 ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 18:05:38.842 DEBUG nova.virt.xenapi.vm_utils [req-952b5223-a732-40ee-a39b-bb131065d05b tempest-VolumesV1SnapshotTestJSON-894380077 tempest-VolumesV1SnapshotTestJSON-276236588] Created VBD OpaqueRef:a05461f5-e268-620a-9e29-e6be5506ffa1 for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:1508f174-5ac8-41ad-67de-442fcf6560c1. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 18:05:38.842 DEBUG nova.virt.xenapi.vm_utils [req-952b5223-a732-40ee-a39b-bb131065d05b tempest-VolumesV1SnapshotTestJSON-894380077 tempest-VolumesV1SnapshotTestJSON-276236588] Plugging VBD OpaqueRef:a05461f5-e268-620a-9e29-e6be5506ffa1 ... vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2191 2015-08-07 18:05:38.843 DEBUG oslo_concurrency.lockutils [req-952b5223-a732-40ee-a39b-bb131065d05b tempest-VolumesV1SnapshotTestJSON-894380077 tempest-VolumesV1SnapshotTestJSON-276236588] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_plug" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 18:05:41.116 DEBUG oslo_concurrency.lockutils [req-952b5223-a732-40ee-a39b-bb131065d05b tempest-VolumesV1SnapshotTestJSON-894380077 tempest-VolumesV1SnapshotTestJSON-276236588] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_plug" :: held 2.273s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 18:05:41.124 DEBUG nova.virt.xenapi.vm_utils [req-952b5223-a732-40ee-a39b-bb131065d05b tempest-VolumesV1SnapshotTestJSON-894380077 tempest-VolumesV1SnapshotTestJSON-276236588] Plugging VBD OpaqueRef:a05461f5-e268-620a-9e29-e6be5506ffa1 done. vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2194 2015-08-07 18:05:41.130 DEBUG nova.virt.xenapi.vm_utils [req-952b5223-a732-40ee-a39b-bb131065d05b tempest-VolumesV1SnapshotTestJSON-894380077 tempest-VolumesV1SnapshotTestJSON-276236588] VBD OpaqueRef:a05461f5-e268-620a-9e29-e6be5506ffa1 plugged as xvdc vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2197 2015-08-07 18:05:41.258 WARNING nova.virt.configdrive [req-952b5223-a732-40ee-a39b-bb131065d05b tempest-VolumesV1SnapshotTestJSON-894380077 tempest-VolumesV1SnapshotTestJSON-276236588] The setting "always" will be deprecated in the Liberty version. 
Please use "True" instead 2015-08-07 18:05:41.259 DEBUG nova.objects.instance [req-952b5223-a732-40ee-a39b-bb131065d05b tempest-VolumesV1SnapshotTestJSON-894380077 tempest-VolumesV1SnapshotTestJSON-276236588] Lazy-loading `ec2_ids' on Instance uuid 29af7281-35f5-4346-b22f-da966069cfed obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 18:05:41.338 DEBUG oslo_concurrency.processutils [req-952b5223-a732-40ee-a39b-bb131065d05b tempest-VolumesV1SnapshotTestJSON-894380077 tempest-VolumesV1SnapshotTestJSON-276236588] Running cmd (subprocess): genisoimage -o /tmp/tmpUQeorw/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmpztdNI7 execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 18:05:41.584 DEBUG oslo_concurrency.processutils [req-952b5223-a732-40ee-a39b-bb131065d05b tempest-VolumesV1SnapshotTestJSON-894380077 tempest-VolumesV1SnapshotTestJSON-276236588] CMD "genisoimage -o /tmp/tmpUQeorw/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmpztdNI7" returned: 0 in 0.245s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 18:05:41.590 DEBUG oslo_concurrency.processutils [req-952b5223-a732-40ee-a39b-bb131065d05b tempest-VolumesV1SnapshotTestJSON-894380077 tempest-VolumesV1SnapshotTestJSON-276236588] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmpUQeorw/configdrive of=/dev/xvdc oflag=direct,sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 18:05:42.504 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.93 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:05:52.664 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.77 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:05:53.024 DEBUG oslo_concurrency.processutils [req-952b5223-a732-40ee-a39b-bb131065d05b tempest-VolumesV1SnapshotTestJSON-894380077 tempest-VolumesV1SnapshotTestJSON-276236588] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmpUQeorw/configdrive of=/dev/xvdc oflag=direct,sync" returned: 0 in 11.434s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 18:05:53.027 DEBUG oslo_concurrency.processutils [req-952b5223-a732-40ee-a39b-bb131065d05b tempest-VolumesV1SnapshotTestJSON-894380077 tempest-VolumesV1SnapshotTestJSON-276236588] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 18:05:53.566 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_bandwidth_usage run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:05:53.568 INFO nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Updating bandwidth usage cache 2015-08-07 18:05:54.405 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 5.11 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:05:54.579 DEBUG oslo_concurrency.processutils 
[req-952b5223-a732-40ee-a39b-bb131065d05b tempest-VolumesV1SnapshotTestJSON-894380077 tempest-VolumesV1SnapshotTestJSON-276236588] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf sync" returned: 0 in 1.552s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 18:05:54.581 DEBUG nova.virt.xenapi.vm_utils [req-952b5223-a732-40ee-a39b-bb131065d05b tempest-VolumesV1SnapshotTestJSON-894380077 tempest-VolumesV1SnapshotTestJSON-276236588] Destroying VBD for VDI OpaqueRef:1508f174-5ac8-41ad-67de-442fcf6560c1 ... vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2207 2015-08-07 18:05:54.582 DEBUG oslo_concurrency.lockutils [req-952b5223-a732-40ee-a39b-bb131065d05b tempest-VolumesV1SnapshotTestJSON-894380077 tempest-VolumesV1SnapshotTestJSON-276236588] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_unplug" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 18:05:57.861 DEBUG oslo_concurrency.lockutils [req-952b5223-a732-40ee-a39b-bb131065d05b tempest-VolumesV1SnapshotTestJSON-894380077 tempest-VolumesV1SnapshotTestJSON-276236588] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_unplug" :: held 3.279s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 18:05:57.875 DEBUG nova.virt.xenapi.vm_utils [req-952b5223-a732-40ee-a39b-bb131065d05b tempest-VolumesV1SnapshotTestJSON-894380077 tempest-VolumesV1SnapshotTestJSON-276236588] Destroying VBD for VDI OpaqueRef:1508f174-5ac8-41ad-67de-442fcf6560c1 done. vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2215 2015-08-07 18:05:57.879 DEBUG nova.virt.xenapi.vm_utils [req-952b5223-a732-40ee-a39b-bb131065d05b tempest-VolumesV1SnapshotTestJSON-894380077 tempest-VolumesV1SnapshotTestJSON-276236588] Creating disk-type VBD for VM OpaqueRef:e1dc7b63-7b60-3e17-bc3d-2e3c3986196d, VDI OpaqueRef:1508f174-5ac8-41ad-67de-442fcf6560c1 ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 18:05:57.911 DEBUG nova.virt.xenapi.vm_utils [req-952b5223-a732-40ee-a39b-bb131065d05b tempest-VolumesV1SnapshotTestJSON-894380077 tempest-VolumesV1SnapshotTestJSON-276236588] Created VBD OpaqueRef:0dce3c2f-56a2-f6c6-71ed-a5b957306302 for VM OpaqueRef:e1dc7b63-7b60-3e17-bc3d-2e3c3986196d, VDI OpaqueRef:1508f174-5ac8-41ad-67de-442fcf6560c1. 
create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 18:05:57.912 DEBUG nova.objects.instance [req-952b5223-a732-40ee-a39b-bb131065d05b tempest-VolumesV1SnapshotTestJSON-894380077 tempest-VolumesV1SnapshotTestJSON-276236588] Lazy-loading `pci_devices' on Instance uuid 29af7281-35f5-4346-b22f-da966069cfed obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 18:05:58.091 DEBUG nova.virt.xenapi.vmops [req-952b5223-a732-40ee-a39b-bb131065d05b tempest-VolumesV1SnapshotTestJSON-894380077 tempest-VolumesV1SnapshotTestJSON-276236588] [instance: 29af7281-35f5-4346-b22f-da966069cfed] Updating progress to 50 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 18:05:58.715 DEBUG oslo_concurrency.lockutils [req-952b5223-a732-40ee-a39b-bb131065d05b tempest-VolumesV1SnapshotTestJSON-894380077 tempest-VolumesV1SnapshotTestJSON-276236588] Lock "xenstore-29af7281-35f5-4346-b22f-da966069cfed" acquired by "store_meta" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 18:05:58.716 DEBUG oslo_concurrency.lockutils [req-952b5223-a732-40ee-a39b-bb131065d05b tempest-VolumesV1SnapshotTestJSON-894380077 tempest-VolumesV1SnapshotTestJSON-276236588] Lock "xenstore-29af7281-35f5-4346-b22f-da966069cfed" released by "store_meta" :: held 0.001s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 18:05:58.716 DEBUG oslo_concurrency.lockutils [req-952b5223-a732-40ee-a39b-bb131065d05b tempest-VolumesV1SnapshotTestJSON-894380077 tempest-VolumesV1SnapshotTestJSON-276236588] Lock "xenstore-29af7281-35f5-4346-b22f-da966069cfed" acquired by "store_auto_disk_config" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 18:05:58.747 DEBUG oslo_concurrency.lockutils [req-952b5223-a732-40ee-a39b-bb131065d05b tempest-VolumesV1SnapshotTestJSON-894380077 tempest-VolumesV1SnapshotTestJSON-276236588] Lock "xenstore-29af7281-35f5-4346-b22f-da966069cfed" released by "store_auto_disk_config" :: held 0.031s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 18:05:58.748 DEBUG nova.virt.xenapi.vmops [req-952b5223-a732-40ee-a39b-bb131065d05b tempest-VolumesV1SnapshotTestJSON-894380077 tempest-VolumesV1SnapshotTestJSON-276236588] [instance: 29af7281-35f5-4346-b22f-da966069cfed] Injecting hostname (tempest-instance-137618156) into xenstore _inject_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1940 2015-08-07 18:05:58.749 DEBUG oslo_concurrency.lockutils [req-952b5223-a732-40ee-a39b-bb131065d05b tempest-VolumesV1SnapshotTestJSON-894380077 tempest-VolumesV1SnapshotTestJSON-276236588] Lock "xenstore-29af7281-35f5-4346-b22f-da966069cfed" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 18:05:58.774 DEBUG oslo_concurrency.lockutils [req-952b5223-a732-40ee-a39b-bb131065d05b tempest-VolumesV1SnapshotTestJSON-894380077 tempest-VolumesV1SnapshotTestJSON-276236588] Lock "xenstore-29af7281-35f5-4346-b22f-da966069cfed" released by "update_hostname" :: held 0.025s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 18:05:58.775 DEBUG nova.virt.xenapi.vmops [req-952b5223-a732-40ee-a39b-bb131065d05b tempest-VolumesV1SnapshotTestJSON-894380077 tempest-VolumesV1SnapshotTestJSON-276236588] [instance: 29af7281-35f5-4346-b22f-da966069cfed] Injecting network info to xenstore 
inject_network_info /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1869 2015-08-07 18:05:58.775 DEBUG oslo_concurrency.lockutils [req-952b5223-a732-40ee-a39b-bb131065d05b tempest-VolumesV1SnapshotTestJSON-894380077 tempest-VolumesV1SnapshotTestJSON-276236588] Lock "xenstore-29af7281-35f5-4346-b22f-da966069cfed" acquired by "update_nwinfo" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 18:05:59.146 DEBUG oslo_concurrency.lockutils [req-952b5223-a732-40ee-a39b-bb131065d05b tempest-VolumesV1SnapshotTestJSON-894380077 tempest-VolumesV1SnapshotTestJSON-276236588] Lock "xenstore-29af7281-35f5-4346-b22f-da966069cfed" released by "update_nwinfo" :: held 0.371s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 18:05:59.147 DEBUG nova.virt.xenapi.vmops [req-952b5223-a732-40ee-a39b-bb131065d05b tempest-VolumesV1SnapshotTestJSON-894380077 tempest-VolumesV1SnapshotTestJSON-276236588] [instance: 29af7281-35f5-4346-b22f-da966069cfed] Updating progress to 60 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 18:05:59.518 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:05:59.519 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 2.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:05:59.780 DEBUG nova.virt.xenapi.vmops [req-952b5223-a732-40ee-a39b-bb131065d05b tempest-VolumesV1SnapshotTestJSON-894380077 tempest-VolumesV1SnapshotTestJSON-276236588] [instance: 29af7281-35f5-4346-b22f-da966069cfed] Creating vifs _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1892 2015-08-07 18:05:59.791 DEBUG nova.virt.xenapi.vmops [req-952b5223-a732-40ee-a39b-bb131065d05b tempest-VolumesV1SnapshotTestJSON-894380077 tempest-VolumesV1SnapshotTestJSON-276236588] [instance: 29af7281-35f5-4346-b22f-da966069cfed] Creating VIF for network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1902 2015-08-07 18:05:59.811 DEBUG nova.virt.xenapi.vmops [req-952b5223-a732-40ee-a39b-bb131065d05b tempest-VolumesV1SnapshotTestJSON-894380077 tempest-VolumesV1SnapshotTestJSON-276236588] [instance: 29af7281-35f5-4346-b22f-da966069cfed] Created VIF OpaqueRef:7b29f610-518d-343d-bfbc-8f8f8173f6bd, network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1906 2015-08-07 18:05:59.812 DEBUG nova.virt.xenapi.vmops [req-952b5223-a732-40ee-a39b-bb131065d05b tempest-VolumesV1SnapshotTestJSON-894380077 tempest-VolumesV1SnapshotTestJSON-276236588] [instance: 29af7281-35f5-4346-b22f-da966069cfed] Updating progress to 70 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 18:06:00.156 DEBUG nova.virt.xenapi.vmops [req-952b5223-a732-40ee-a39b-bb131065d05b tempest-VolumesV1SnapshotTestJSON-894380077 tempest-VolumesV1SnapshotTestJSON-276236588] [instance: 29af7281-35f5-4346-b22f-da966069cfed] Starting instance _start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:327 2015-08-07 18:06:01.528 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_rebooting_instances 
run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:06:01.529 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:06:01.530 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Starting heal instance info cache _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5315 2015-08-07 18:06:01.530 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Rebuilding the list of instances to heal _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5319 2015-08-07 18:06:01.680 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: 29af7281-35f5-4346-b22f-da966069cfed] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5329 2015-08-07 18:06:01.681 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Didn't find any instances for network info cache update. _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5386 2015-08-07 18:06:01.682 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 2.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:06:02.786 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.67 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:06:03.678 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:06:03.679 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 8.85 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:06:12.538 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:06:12.539 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /opt/stack/new/nova/nova/compute/manager.py:5976 2015-08-07 18:06:12.539 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 2.98 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:06:12.639 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.82 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:06:15.525 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:06:15.527 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 0.99 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:06:16.518 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:06:16.705 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 0.01 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:06:16.730 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:06:16.738 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 0.79 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:06:17.534 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:06:18.082 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Auditing locally available compute resources for node localhost.localdomain 2015-08-07 18:06:18.083 DEBUG nova.virt.xenapi.host [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Updating host stats update_status /opt/stack/new/nova/nova/virt/xenapi/host.py:234 2015-08-07 18:06:21.226 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" acquired by "do_scan" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 18:06:21.669 DEBUG nova.virt.xenapi.vm_utils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Scanning SR OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99 do_scan /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1839 2015-08-07 18:06:23.610 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 8.85 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:06:24.516 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock 
"sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" released by "do_scan" :: held 3.289s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 18:06:25.591 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor: free VCPUs: 2 _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:524 2015-08-07 18:06:25.592 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor: assignable PCI devices: [] _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:532 2015-08-07 18:06:25.592 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor/Node resource view: name=localhost.localdomain free_ram=922MB free_disk=12GB free_vcpus=2 pci_devices=[] _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:546 2015-08-07 18:06:25.593 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "compute_resources" acquired by "_update_available_resource" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 18:06:25.891 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Total usable vcpus: 8, total allocated vcpus: 6 2015-08-07 18:06:25.892 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Final resource view: name=localhost.localdomain phys_ram=8187MB used_ram=581MB phys_disk=27GB used_disk=0GB total_vcpus=8 used_vcpus=6 pci_stats=None 2015-08-07 18:06:26.135 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Compute_service record updated for devstack:localhost.localdomain 2015-08-07 18:06:26.136 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "compute_resources" released by "_update_available_resource" :: held 0.543s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 18:06:26.137 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 9.99 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:06:29.373 DEBUG nova.virt.xenapi.vmops [req-952b5223-a732-40ee-a39b-bb131065d05b tempest-VolumesV1SnapshotTestJSON-894380077 tempest-VolumesV1SnapshotTestJSON-276236588] [instance: 29af7281-35f5-4346-b22f-da966069cfed] Waiting for instance state to become running _wait_for_instance_to_start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:777 2015-08-07 18:06:29.415 DEBUG nova.virt.xenapi.vmops [req-952b5223-a732-40ee-a39b-bb131065d05b tempest-VolumesV1SnapshotTestJSON-894380077 tempest-VolumesV1SnapshotTestJSON-276236588] [instance: 29af7281-35f5-4346-b22f-da966069cfed] Updating progress to 80 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 18:06:30.513 DEBUG nova.virt.xenapi.vmops [req-952b5223-a732-40ee-a39b-bb131065d05b tempest-VolumesV1SnapshotTestJSON-894380077 tempest-VolumesV1SnapshotTestJSON-276236588] [instance: 29af7281-35f5-4346-b22f-da966069cfed] Skip agent setup, not enabled. 
_configure_new_instance_with_agent /opt/stack/new/nova/nova/virt/xenapi/vmops.py:788 2015-08-07 18:06:30.514 DEBUG nova.virt.xenapi.vmops [req-952b5223-a732-40ee-a39b-bb131065d05b tempest-VolumesV1SnapshotTestJSON-894380077 tempest-VolumesV1SnapshotTestJSON-276236588] [instance: 29af7281-35f5-4346-b22f-da966069cfed] Removing hostname from xenstore _remove_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1949 2015-08-07 18:06:30.514 DEBUG oslo_concurrency.lockutils [req-952b5223-a732-40ee-a39b-bb131065d05b tempest-VolumesV1SnapshotTestJSON-894380077 tempest-VolumesV1SnapshotTestJSON-276236588] Lock "xenstore-29af7281-35f5-4346-b22f-da966069cfed" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 18:06:30.520 DEBUG oslo_concurrency.lockutils [req-952b5223-a732-40ee-a39b-bb131065d05b tempest-VolumesV1SnapshotTestJSON-894380077 tempest-VolumesV1SnapshotTestJSON-276236588] Lock "xenstore-29af7281-35f5-4346-b22f-da966069cfed" released by "update_hostname" :: held 0.006s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 18:06:30.521 DEBUG nova.virt.xenapi.vmops [req-952b5223-a732-40ee-a39b-bb131065d05b tempest-VolumesV1SnapshotTestJSON-894380077 tempest-VolumesV1SnapshotTestJSON-276236588] [instance: 29af7281-35f5-4346-b22f-da966069cfed] Updating progress to 90 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 18:06:30.948 DEBUG nova.virt.xenapi.vmops [req-952b5223-a732-40ee-a39b-bb131065d05b tempest-VolumesV1SnapshotTestJSON-894380077 tempest-VolumesV1SnapshotTestJSON-276236588] [instance: 29af7281-35f5-4346-b22f-da966069cfed] Updating progress to 100 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 18:06:31.206 DEBUG nova.compute.manager [req-952b5223-a732-40ee-a39b-bb131065d05b tempest-VolumesV1SnapshotTestJSON-894380077 tempest-VolumesV1SnapshotTestJSON-276236588] [instance: 29af7281-35f5-4346-b22f-da966069cfed] Checking state _get_power_state /opt/stack/new/nova/nova/compute/manager.py:1296 2015-08-07 18:06:31.888 DEBUG oslo_concurrency.lockutils [req-952b5223-a732-40ee-a39b-bb131065d05b tempest-VolumesV1SnapshotTestJSON-894380077 tempest-VolumesV1SnapshotTestJSON-276236588] Lock "29af7281-35f5-4346-b22f-da966069cfed" released by "_locked_do_build_and_run_instance" :: held 67.795s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 18:06:32.530 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.93 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:06:33.396 DEBUG oslo_concurrency.lockutils [req-b1cb2d90-4a91-4f6a-94e6-6eba10017db9 tempest-VolumesV1SnapshotTestJSON-894380077 tempest-VolumesV1SnapshotTestJSON-276236588] Lock "29af7281-35f5-4346-b22f-da966069cfed" acquired by "do_reserve" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 18:06:33.434 DEBUG nova.compute.utils [req-b1cb2d90-4a91-4f6a-94e6-6eba10017db9 tempest-VolumesV1SnapshotTestJSON-894380077 tempest-VolumesV1SnapshotTestJSON-276236588] Using /dev/xvd instead of /dev/vd get_next_device_name /opt/stack/new/nova/nova/compute/utils.py:163 2015-08-07 18:06:33.561 DEBUG oslo_concurrency.lockutils [req-b1cb2d90-4a91-4f6a-94e6-6eba10017db9 tempest-VolumesV1SnapshotTestJSON-894380077 tempest-VolumesV1SnapshotTestJSON-276236588] Lock 
"29af7281-35f5-4346-b22f-da966069cfed" released by "do_reserve" :: held 0.165s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 18:06:36.086 DEBUG oslo_concurrency.lockutils [req-b1cb2d90-4a91-4f6a-94e6-6eba10017db9 tempest-VolumesV1SnapshotTestJSON-894380077 tempest-VolumesV1SnapshotTestJSON-276236588] Lock "29af7281-35f5-4346-b22f-da966069cfed" acquired by "do_attach_volume" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 18:06:36.088 INFO nova.compute.manager [req-b1cb2d90-4a91-4f6a-94e6-6eba10017db9 tempest-VolumesV1SnapshotTestJSON-894380077 tempest-VolumesV1SnapshotTestJSON-276236588] [instance: 29af7281-35f5-4346-b22f-da966069cfed] Attaching volume 38ab1153-6e76-4eed-9736-027426d53e6c to /dev/xvdb 2015-08-07 18:06:36.090 DEBUG keystoneclient.session [req-b1cb2d90-4a91-4f6a-94e6-6eba10017db9 tempest-VolumesV1SnapshotTestJSON-894380077 tempest-VolumesV1SnapshotTestJSON-276236588] REQ: curl -g -i -X GET http://192.168.33.1:8776/v2/57677ee7690f4b6a896b8f91b95b5eed/volumes/38ab1153-6e76-4eed-9736-027426d53e6c -H "User-Agent: python-cinderclient" -H "Accept: application/json" -H "X-Auth-Token: {SHA1}b0b494001503d0e909aa8c3af511794cadd062cd" _http_log_request /usr/local/lib/python2.7/dist-packages/keystoneclient/session.py:195 2015-08-07 18:06:36.129 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:06:36.130 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 16.40 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:06:37.926 DEBUG keystoneclient.session [req-b1cb2d90-4a91-4f6a-94e6-6eba10017db9 tempest-VolumesV1SnapshotTestJSON-894380077 tempest-VolumesV1SnapshotTestJSON-276236588] RESP: [200] content-length: 941 x-compute-request-id: req-1f39cfb0-e797-49a1-a81d-aa67c02c9b79 connection: keep-alive date: Fri, 07 Aug 2015 18:06:37 GMT content-type: application/json x-openstack-request-id: req-1f39cfb0-e797-49a1-a81d-aa67c02c9b79 RESP BODY: {"volume": {"attachments": [], "links": [{"href": "http://192.168.33.1:8776/v2/57677ee7690f4b6a896b8f91b95b5eed/volumes/38ab1153-6e76-4eed-9736-027426d53e6c", "rel": "self"}, {"href": "http://192.168.33.1:8776/57677ee7690f4b6a896b8f91b95b5eed/volumes/38ab1153-6e76-4eed-9736-027426d53e6c", "rel": "bookmark"}], "availability_zone": "nova", "encrypted": false, "os-volume-replication:extended_status": null, "volume_type": "lvmdriver-1", "snapshot_id": null, "id": "38ab1153-6e76-4eed-9736-027426d53e6c", "size": 1, "user_id": "b2fdbaf1c445460e9af56927b6288a51", "os-vol-tenant-attr:tenant_id": "57677ee7690f4b6a896b8f91b95b5eed", "metadata": {}, "status": "attaching", "description": null, "multiattach": false, "source_volid": null, "consistencygroup_id": null, "name": "tempest-Volume-762400285", "bootable": "false", "created_at": "2015-08-07T18:04:21.000000", "os-volume-replication:driver_data": null, "replication_status": "disabled"}} _http_log_response /usr/local/lib/python2.7/dist-packages/keystoneclient/session.py:224 2015-08-07 18:06:37.928 DEBUG keystoneclient.session [req-b1cb2d90-4a91-4f6a-94e6-6eba10017db9 tempest-VolumesV1SnapshotTestJSON-894380077 tempest-VolumesV1SnapshotTestJSON-276236588] REQ: curl -g -i -X POST 
http://192.168.33.1:8776/v2/57677ee7690f4b6a896b8f91b95b5eed/volumes/38ab1153-6e76-4eed-9736-027426d53e6c/action -H "User-Agent: python-cinderclient" -H "Content-Type: application/json" -H "Accept: application/json" -H "X-Auth-Token: {SHA1}b0b494001503d0e909aa8c3af511794cadd062cd" -d '{"os-initialize_connection": {"connector": {"ip": "192.168.33.2", "initiator": "iqn.2015-08.com.example:3b035c79", "host": "localhost.localdomain"}}}' _http_log_request /usr/local/lib/python2.7/dist-packages/keystoneclient/session.py:195 2015-08-07 18:06:42.720 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.74 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:06:46.737 DEBUG keystoneclient.session [req-b1cb2d90-4a91-4f6a-94e6-6eba10017db9 tempest-VolumesV1SnapshotTestJSON-894380077 tempest-VolumesV1SnapshotTestJSON-276236588] RESP: [200] content-length: 449 x-compute-request-id: req-d091e6d5-b004-4eea-94c2-50a39b15ac22 connection: keep-alive date: Fri, 07 Aug 2015 18:06:46 GMT content-type: application/json x-openstack-request-id: req-d091e6d5-b004-4eea-94c2-50a39b15ac22 RESP BODY: {"connection_info": {"driver_volume_type": "iscsi", "data": {"auth_password": "65eNBBUsd2LmfUwU", "target_discovered": false, "encrypted": false, "qos_specs": null, "target_iqn": "iqn.2010-10.org.openstack:volume-38ab1153-6e76-4eed-9736-027426d53e6c", "target_portal": "104.130.119.114:3260", "volume_id": "38ab1153-6e76-4eed-9736-027426d53e6c", "target_lun": 1, "access_mode": "rw", "auth_username": "T9qFkFi7CUarj25DdatC", "auth_method": "CHAP"}}} _http_log_response /usr/local/lib/python2.7/dist-packages/keystoneclient/session.py:224 2015-08-07 18:06:46.746 DEBUG nova.virt.xenapi.volume_utils [req-b1cb2d90-4a91-4f6a-94e6-6eba10017db9 tempest-VolumesV1SnapshotTestJSON-894380077 tempest-VolumesV1SnapshotTestJSON-276236588] (vol_id,host,port,iqn): (38ab1153-6e76-4eed-9736-027426d53e6c,104.130.119.114,3260,iqn.2010-10.org.openstack:volume-38ab1153-6e76-4eed-9736-027426d53e6c) _parse_volume_info /opt/stack/new/nova/nova/virt/xenapi/volume_utils.py:80 2015-08-07 18:06:46.750 DEBUG nova.virt.xenapi.volume_utils [req-b1cb2d90-4a91-4f6a-94e6-6eba10017db9 tempest-VolumesV1SnapshotTestJSON-894380077 tempest-VolumesV1SnapshotTestJSON-276236588] Introducing SR tempSR-38ab1153-6e76-4eed-9736-027426d53e6c introduce_sr /opt/stack/new/nova/nova/virt/xenapi/volume_utils.py:119 2015-08-07 18:06:46.756 DEBUG nova.virt.xenapi.volume_utils [req-b1cb2d90-4a91-4f6a-94e6-6eba10017db9 tempest-VolumesV1SnapshotTestJSON-894380077 tempest-VolumesV1SnapshotTestJSON-276236588] Creating PBD for SR introduce_sr /opt/stack/new/nova/nova/virt/xenapi/volume_utils.py:126 2015-08-07 18:06:46.777 DEBUG nova.virt.xenapi.volume_utils [req-b1cb2d90-4a91-4f6a-94e6-6eba10017db9 tempest-VolumesV1SnapshotTestJSON-894380077 tempest-VolumesV1SnapshotTestJSON-276236588] Plugging SR introduce_sr /opt/stack/new/nova/nova/virt/xenapi/volume_utils.py:129 2015-08-07 18:06:52.527 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:06:52.528 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Cleaning up deleted instances _run_pending_deletes /opt/stack/new/nova/nova/compute/manager.py:6276 2015-08-07 18:06:52.546 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping 
call > sleeping for 9.92 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:06:52.654 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] There are 2 instances to clean _run_pending_deletes /opt/stack/new/nova/nova/compute/manager.py:6285 2015-08-07 18:06:52.654 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: 914582e2-004f-4e79-af9c-a6a8290b66ea] Instance has had 0 of 5 cleanup attempts _run_pending_deletes /opt/stack/new/nova/nova/compute/manager.py:6293 2015-08-07 18:06:53.253 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: 22d8a4e4-aaa0-48f9-8c03-7dfd930df25b] Instance has had 0 of 5 cleanup attempts _run_pending_deletes /opt/stack/new/nova/nova/compute/manager.py:6293 2015-08-07 18:06:53.587 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 5.93 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:06:59.519 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:06:59.520 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 2.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:07:01.525 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:07:01.526 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Starting heal instance info cache _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5315 2015-08-07 18:07:01.527 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Rebuilding the list of instances to heal _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5319 2015-08-07 18:07:01.593 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Acquired semaphore "refresh_cache-29af7281-35f5-4346-b22f-da966069cfed" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:197 2015-08-07 18:07:01.594 DEBUG nova.objects.instance [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lazy-loading `flavor' on Instance uuid 29af7281-35f5-4346-b22f-da966069cfed obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 18:07:02.101 DEBUG nova.network.base_api [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: 29af7281-35f5-4346-b22f-da966069cfed] Updating instance_info_cache with network_info: [VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.4'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 
'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:98:96:46', 'active': False, 'type': u'bridge', 'id': u'171bebb2-68ce-4513-ad61-5b5d57da2e60', 'qbg_params': None})] update_instance_cache_with_nw_info /opt/stack/new/nova/nova/network/base_api.py:43 2015-08-07 18:07:02.140 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Releasing semaphore "refresh_cache-29af7281-35f5-4346-b22f-da966069cfed" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:210 2015-08-07 18:07:02.140 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: 29af7281-35f5-4346-b22f-da966069cfed] Updated the network info_cache for instance _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5376 2015-08-07 18:07:02.141 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 2.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:07:02.559 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.91 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:07:04.148 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:07:04.152 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 1.37 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:07:05.534 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:07:05.535 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 8.99 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:07:12.552 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.91 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:07:14.527 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:07:14.528 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /opt/stack/new/nova/nova/compute/manager.py:5976 2015-08-07 18:07:14.529 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 3.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:07:17.584 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:07:17.806 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Auditing locally available compute resources for node localhost.localdomain 2015-08-07 18:07:17.806 DEBUG nova.virt.xenapi.host [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Updating host stats update_status /opt/stack/new/nova/nova/virt/xenapi/host.py:234 2015-08-07 18:07:19.989 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" acquired by "do_scan" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 18:07:19.990 DEBUG nova.virt.xenapi.vm_utils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Scanning SR OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99 do_scan /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1839 2015-08-07 18:07:21.218 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" released by "do_scan" :: held 1.271s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 18:07:21.645 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor: free VCPUs: 1 _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:524 2015-08-07 18:07:21.673 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor: assignable PCI devices: [] _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:532 2015-08-07 18:07:21.674 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor/Node resource view: name=localhost.localdomain free_ram=922MB free_disk=12GB free_vcpus=1 pci_devices=[] _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:546 2015-08-07 18:07:21.675 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "compute_resources" acquired by "_update_available_resource" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 18:07:21.944 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Total usable vcpus: 8, total allocated vcpus: 7 2015-08-07 18:07:21.945 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Final resource view: name=localhost.localdomain phys_ram=8187MB used_ram=581MB phys_disk=27GB used_disk=0GB total_vcpus=8 used_vcpus=7 pci_stats=None 2015-08-07 18:07:22.102 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Compute_service record updated for devstack:localhost.localdomain 2015-08-07 18:07:22.103 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock 
"compute_resources" released by "_update_available_resource" :: held 0.428s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 18:07:22.103 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:07:22.104 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:07:22.104 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 9.94 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:07:22.528 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.94 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:07:22.989 DEBUG nova.virt.xenapi.volumeops [req-b1cb2d90-4a91-4f6a-94e6-6eba10017db9 tempest-VolumesV1SnapshotTestJSON-894380077 tempest-VolumesV1SnapshotTestJSON-276236588] Connect volume to hypervisor: {u'access_mode': u'rw', u'target_discovered': False, u'encrypted': False, u'qos_specs': None, u'target_iqn': u'iqn.2010-10.org.openstack:volume-38ab1153-6e76-4eed-9736-027426d53e6c', u'target_portal': u'104.130.119.114:3260', u'volume_id': u'38ab1153-6e76-4eed-9736-027426d53e6c', u'target_lun': 1, u'auth_password': u'65eNBBUsd2LmfUwU', u'auth_username': u'T9qFkFi7CUarj25DdatC', u'auth_method': u'CHAP'} _connect_hypervisor_to_volume /opt/stack/new/nova/nova/virt/xenapi/volumeops.py:94 2015-08-07 18:07:23.035 DEBUG nova.virt.xenapi.volume_utils [req-b1cb2d90-4a91-4f6a-94e6-6eba10017db9 tempest-VolumesV1SnapshotTestJSON-894380077 tempest-VolumesV1SnapshotTestJSON-276236588] {'sm_config': {'LUNid': '1', 'SCSIid': '33000000100000001'}, 'managed': False, 'snapshots': [], 'allowed_operations': ['forget', 'destroy', 'copy', 'snapshot'], 'on_boot': 'persist', 'name_description': '', 'read_only': False, 'uuid': 'e55110f1-56b1-d008-537d-c3081511f123', 'storage_lock': False, 'name_label': '', 'tags': [], 'location': 'e55110f1-56b1-d008-537d-c3081511f123', 'metadata_of_pool': 'OpaqueRef:NULL', 'type': 'user', 'sharable': False, 'snapshot_time': , 'parent': 'OpaqueRef:NULL', 'missing': False, 'xenstore_data': {}, 'crash_dumps': [], 'virtual_size': '1073741824', 'is_a_snapshot': False, 'current_operations': {}, 'snapshot_of': 'OpaqueRef:NULL', 'SR': 'OpaqueRef:a0853cb2-ce76-f86c-8ecc-ef222bdca4af', 'other_config': {}, 'physical_utilisation': '0', 'allow_caching': False, 'metadata_latest': False, 'VBDs': []} introduce_vdi /opt/stack/new/nova/nova/virt/xenapi/volume_utils.py:176 2015-08-07 18:07:24.842 INFO nova.virt.xenapi.volumeops [req-b1cb2d90-4a91-4f6a-94e6-6eba10017db9 tempest-VolumesV1SnapshotTestJSON-894380077 tempest-VolumesV1SnapshotTestJSON-276236588] Connected volume (vdi_uuid): e55110f1-56b1-d008-537d-c3081511f123 2015-08-07 18:07:24.843 DEBUG nova.virt.xenapi.volumeops [req-b1cb2d90-4a91-4f6a-94e6-6eba10017db9 tempest-VolumesV1SnapshotTestJSON-894380077 tempest-VolumesV1SnapshotTestJSON-276236588] Attach_volume vdi: OpaqueRef:aa1f3783-1540-64a8-2d83-96793203598a vm: OpaqueRef:e1dc7b63-7b60-3e17-bc3d-2e3c3986196d _attach_volume_to_vm 
/opt/stack/new/nova/nova/virt/xenapi/volumeops.py:114 2015-08-07 18:07:24.843 DEBUG nova.virt.xenapi.vm_utils [req-b1cb2d90-4a91-4f6a-94e6-6eba10017db9 tempest-VolumesV1SnapshotTestJSON-894380077 tempest-VolumesV1SnapshotTestJSON-276236588] Creating disk-type VBD for VM OpaqueRef:e1dc7b63-7b60-3e17-bc3d-2e3c3986196d, VDI OpaqueRef:aa1f3783-1540-64a8-2d83-96793203598a ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 18:07:24.859 DEBUG nova.virt.xenapi.vm_utils [req-b1cb2d90-4a91-4f6a-94e6-6eba10017db9 tempest-VolumesV1SnapshotTestJSON-894380077 tempest-VolumesV1SnapshotTestJSON-276236588] Created VBD OpaqueRef:78f6dbd4-da38-2499-a2c4-171d322947a9 for VM OpaqueRef:e1dc7b63-7b60-3e17-bc3d-2e3c3986196d, VDI OpaqueRef:aa1f3783-1540-64a8-2d83-96793203598a. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 18:07:24.868 DEBUG nova.virt.xenapi.volumeops [req-b1cb2d90-4a91-4f6a-94e6-6eba10017db9 tempest-VolumesV1SnapshotTestJSON-894380077 tempest-VolumesV1SnapshotTestJSON-276236588] Plugging VBD: OpaqueRef:78f6dbd4-da38-2499-a2c4-171d322947a9 _attach_volume_to_vm /opt/stack/new/nova/nova/virt/xenapi/volumeops.py:124 2015-08-07 18:07:24.868 DEBUG oslo_concurrency.lockutils [req-b1cb2d90-4a91-4f6a-94e6-6eba10017db9 tempest-VolumesV1SnapshotTestJSON-894380077 tempest-VolumesV1SnapshotTestJSON-276236588] Lock "xenapi-vbd-OpaqueRef:e1dc7b63-7b60-3e17-bc3d-2e3c3986196d" acquired by "synchronized_plug" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 18:07:28.474 DEBUG oslo_concurrency.lockutils [req-b1cb2d90-4a91-4f6a-94e6-6eba10017db9 tempest-VolumesV1SnapshotTestJSON-894380077 tempest-VolumesV1SnapshotTestJSON-276236588] Lock "xenapi-vbd-OpaqueRef:e1dc7b63-7b60-3e17-bc3d-2e3c3986196d" released by "synchronized_plug" :: held 3.606s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 18:07:28.475 INFO nova.virt.xenapi.volumeops [req-b1cb2d90-4a91-4f6a-94e6-6eba10017db9 tempest-VolumesV1SnapshotTestJSON-894380077 tempest-VolumesV1SnapshotTestJSON-276236588] Dev 1 attached to instance instance-00000054 2015-08-07 18:07:32.045 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:07:32.046 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 13.45 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:07:45.499 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:07:45.907 WARNING nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] While synchronizing instance power states, found 1 instances in the database and 2 instances on the hypervisor. 
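
The volume attach traced above (do_reserve, os-initialize_connection, introduce SR, plug VBD, os-attach) is driven by ordinary POSTs to Cinder's volume-actions endpoint; the keystoneclient.session REQ/RESP lines show the exact requests the compute node sends. Below is a minimal sketch, not nova's own code, that replays the os-initialize_connection call from this log with the requests library. The endpoint, tenant, volume ID and connector values are copied from the log entries above; the X-Auth-Token is a placeholder and must be a valid Keystone token.

    # Sketch only: replay the os-initialize_connection POST seen in the log.
    import json
    import requests

    CINDER = "http://192.168.33.1:8776/v2/57677ee7690f4b6a896b8f91b95b5eed"
    VOLUME_ID = "38ab1153-6e76-4eed-9736-027426d53e6c"
    TOKEN = "<keystone-token>"  # placeholder, not the SHA1 digest shown in the log

    body = {
        "os-initialize_connection": {
            "connector": {
                "ip": "192.168.33.2",
                "initiator": "iqn.2015-08.com.example:3b035c79",
                "host": "localhost.localdomain",
            }
        }
    }

    resp = requests.post(
        "%s/volumes/%s/action" % (CINDER, VOLUME_ID),
        headers={
            "User-Agent": "python-cinderclient",
            "Content-Type": "application/json",
            "Accept": "application/json",
            "X-Auth-Token": TOKEN,
        },
        data=json.dumps(body),
    )
    resp.raise_for_status()

    # The 200 response body (logged at 18:06:46) carries the iSCSI
    # connection_info: target_iqn, target_portal, target_lun and CHAP
    # credentials, which volume_utils then splits into (vol_id, host, port,
    # iqn) before introducing the temporary SR and plugging the VBD.
    print(resp.json()["connection_info"]["data"]["target_iqn"])

The "Lock ... acquired by ... :: waited" / "released ... :: held" pairs that bracket each step come from oslo_concurrency.lockutils (lockutils.py:251/262 in this log), i.e. the do_reserve, do_attach_volume and synchronized_plug bodies run under per-instance and per-VM named locks.
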
2015-08-07 18:07:45.908 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Triggering sync for uuid 29af7281-35f5-4346-b22f-da966069cfed _sync_power_states /opt/stack/new/nova/nova/compute/manager.py:5780 2015-08-07 18:07:45.909 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 14.61 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:07:48.384 DEBUG keystoneclient.session [req-b1cb2d90-4a91-4f6a-94e6-6eba10017db9 tempest-VolumesV1SnapshotTestJSON-894380077 tempest-VolumesV1SnapshotTestJSON-276236588] REQ: curl -g -i -X POST http://192.168.33.1:8776/v2/57677ee7690f4b6a896b8f91b95b5eed/volumes/38ab1153-6e76-4eed-9736-027426d53e6c/action -H "User-Agent: python-cinderclient" -H "Content-Type: application/json" -H "Accept: application/json" -H "X-Auth-Token: {SHA1}b0b494001503d0e909aa8c3af511794cadd062cd" -d '{"os-attach": {"instance_uuid": "29af7281-35f5-4346-b22f-da966069cfed", "mountpoint": "/dev/xvdb", "mode": "rw"}}' _http_log_request /usr/local/lib/python2.7/dist-packages/keystoneclient/session.py:195 2015-08-07 18:07:50.398 13318 WARNING oslo_service.loopingcall [-] Function > run outlasted interval by 7.93 sec 2015-08-07 18:07:50.399 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 0.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:07:50.614 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.78 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:07:53.363 DEBUG keystoneclient.session [req-b1cb2d90-4a91-4f6a-94e6-6eba10017db9 tempest-VolumesV1SnapshotTestJSON-894380077 tempest-VolumesV1SnapshotTestJSON-276236588] RESP: [202] date: Fri, 07 Aug 2015 18:07:53 GMT connection: keep-alive content-type: text/html; charset=UTF-8 content-length: 0 x-openstack-request-id: req-7f082edb-ce88-4c65-9cd3-e15543b6fef6 _http_log_response /usr/local/lib/python2.7/dist-packages/keystoneclient/session.py:224 2015-08-07 18:07:53.643 DEBUG oslo_concurrency.lockutils [req-b1cb2d90-4a91-4f6a-94e6-6eba10017db9 tempest-VolumesV1SnapshotTestJSON-894380077 tempest-VolumesV1SnapshotTestJSON-276236588] Lock "29af7281-35f5-4346-b22f-da966069cfed" released by "do_attach_volume" :: held 77.556s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 18:07:53.644 13318 DEBUG oslo_concurrency.lockutils [-] Lock "29af7281-35f5-4346-b22f-da966069cfed" acquired by "query_driver_power_state_and_sync" :: waited 7.735s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 18:07:54.188 13318 DEBUG oslo_concurrency.lockutils [-] Lock "29af7281-35f5-4346-b22f-da966069cfed" released by "query_driver_power_state_and_sync" :: held 0.544s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 18:08:00.581 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:08:00.581 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 0.94 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 
18:08:00.729 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.85 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:08:01.561 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:08:01.562 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Starting heal instance info cache _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5315 2015-08-07 18:08:01.562 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Rebuilding the list of instances to heal _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5319 2015-08-07 18:08:01.842 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Acquired semaphore "refresh_cache-29af7281-35f5-4346-b22f-da966069cfed" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:197 2015-08-07 18:08:01.842 DEBUG nova.objects.instance [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lazy-loading `flavor' on Instance uuid 29af7281-35f5-4346-b22f-da966069cfed obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 18:08:04.565 DEBUG nova.network.base_api [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: 29af7281-35f5-4346-b22f-da966069cfed] Updating instance_info_cache with network_info: [VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.4'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:98:96:46', 'active': False, 'type': u'bridge', 'id': u'171bebb2-68ce-4513-ad61-5b5d57da2e60', 'qbg_params': None})] update_instance_cache_with_nw_info /opt/stack/new/nova/nova/network/base_api.py:43 2015-08-07 18:08:05.009 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Releasing semaphore "refresh_cache-29af7281-35f5-4346-b22f-da966069cfed" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:210 2015-08-07 18:08:05.105 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: 29af7281-35f5-4346-b22f-da966069cfed] Updated the network info_cache for instance _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5376 2015-08-07 18:08:05.107 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 1.96 seconds _run_loop 
/usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:08:07.300 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:08:07.301 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:08:07.302 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 7.22 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:08:11.575 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:08:14.526 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:08:14.527 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /opt/stack/new/nova/nova/compute/manager.py:5976 2015-08-07 18:08:14.527 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 4.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:08:18.529 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:08:18.532 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:08:18.533 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 0.98 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:08:19.566 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:08:19.920 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Auditing locally available compute resources for node localhost.localdomain 2015-08-07 18:08:19.921 DEBUG nova.virt.xenapi.host [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Updating host stats update_status /opt/stack/new/nova/nova/virt/xenapi/host.py:234 2015-08-07 18:08:21.389 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.39 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:08:21.506 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock 
"sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" acquired by "do_scan" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 18:08:21.507 DEBUG nova.virt.xenapi.vm_utils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Scanning SR OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99 do_scan /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1839 2015-08-07 18:08:23.030 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" released by "do_scan" :: held 1.524s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 18:08:23.464 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor: free VCPUs: 1 _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:524 2015-08-07 18:08:23.465 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor: assignable PCI devices: [] _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:532 2015-08-07 18:08:23.465 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor/Node resource view: name=localhost.localdomain free_ram=922MB free_disk=12GB free_vcpus=1 pci_devices=[] _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:546 2015-08-07 18:08:23.466 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "compute_resources" acquired by "_update_available_resource" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 18:08:23.741 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Total usable vcpus: 8, total allocated vcpus: 7 2015-08-07 18:08:23.742 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Final resource view: name=localhost.localdomain phys_ram=8187MB used_ram=581MB phys_disk=27GB used_disk=0GB total_vcpus=8 used_vcpus=7 pci_stats=None 2015-08-07 18:08:23.890 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Compute_service record updated for devstack:localhost.localdomain 2015-08-07 18:08:23.891 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "compute_resources" released by "_update_available_resource" :: held 0.425s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 18:08:23.891 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:08:24.109 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 8.96 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:08:24.917 INFO nova.compute.manager [req-ef19662e-e60e-46de-bca3-ff87548b0642 tempest-VolumesV1SnapshotTestJSON-894380077 tempest-VolumesV1SnapshotTestJSON-276236588] [instance: 29af7281-35f5-4346-b22f-da966069cfed] Detach volume 38ab1153-6e76-4eed-9736-027426d53e6c from mountpoint /dev/xvdb 2015-08-07 18:08:24.922 DEBUG nova.virt.xenapi.volumeops [req-ef19662e-e60e-46de-bca3-ff87548b0642 
tempest-VolumesV1SnapshotTestJSON-894380077 tempest-VolumesV1SnapshotTestJSON-276236588] Detach_volume: instance-00000054, /dev/xvdb detach_volume /opt/stack/new/nova/nova/virt/xenapi/volumeops.py:134 2015-08-07 18:08:24.948 DEBUG oslo_concurrency.lockutils [req-ef19662e-e60e-46de-bca3-ff87548b0642 tempest-VolumesV1SnapshotTestJSON-894380077 tempest-VolumesV1SnapshotTestJSON-276236588] Lock "xenapi-vbd-OpaqueRef:e1dc7b63-7b60-3e17-bc3d-2e3c3986196d" acquired by "synchronized_unplug" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 18:08:27.471 DEBUG oslo_concurrency.lockutils [req-ef19662e-e60e-46de-bca3-ff87548b0642 tempest-VolumesV1SnapshotTestJSON-894380077 tempest-VolumesV1SnapshotTestJSON-276236588] Lock "xenapi-vbd-OpaqueRef:e1dc7b63-7b60-3e17-bc3d-2e3c3986196d" released by "synchronized_unplug" :: held 2.523s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 18:08:27.485 DEBUG nova.virt.xenapi.volume_utils [req-ef19662e-e60e-46de-bca3-ff87548b0642 tempest-VolumesV1SnapshotTestJSON-894380077 tempest-VolumesV1SnapshotTestJSON-276236588] Forgetting SR... forget_sr /opt/stack/new/nova/nova/virt/xenapi/volume_utils.py:237 2015-08-07 18:08:29.338 INFO nova.virt.xenapi.volumeops [req-ef19662e-e60e-46de-bca3-ff87548b0642 tempest-VolumesV1SnapshotTestJSON-894380077 tempest-VolumesV1SnapshotTestJSON-276236588] Mountpoint /dev/xvdb detached from instance instance-00000054 2015-08-07 18:08:29.340 DEBUG keystoneclient.session [req-ef19662e-e60e-46de-bca3-ff87548b0642 tempest-VolumesV1SnapshotTestJSON-894380077 tempest-VolumesV1SnapshotTestJSON-276236588] REQ: curl -g -i -X POST http://192.168.33.1:8776/v2/57677ee7690f4b6a896b8f91b95b5eed/volumes/38ab1153-6e76-4eed-9736-027426d53e6c/action -H "User-Agent: python-cinderclient" -H "Content-Type: application/json" -H "Accept: application/json" -H "X-Auth-Token: {SHA1}b0b494001503d0e909aa8c3af511794cadd062cd" -d '{"os-terminate_connection": {"connector": {"ip": "192.168.33.2", "initiator": "iqn.2015-08.com.example:3b035c79", "host": "localhost.localdomain"}}}' _http_log_request /usr/local/lib/python2.7/dist-packages/keystoneclient/session.py:195 2015-08-07 18:08:30.150 DEBUG keystoneclient.session [req-ef19662e-e60e-46de-bca3-ff87548b0642 tempest-VolumesV1SnapshotTestJSON-894380077 tempest-VolumesV1SnapshotTestJSON-276236588] RESP: [202] date: Fri, 07 Aug 2015 18:08:30 GMT connection: keep-alive content-type: text/html; charset=UTF-8 content-length: 0 x-openstack-request-id: req-6c602fb5-ace6-4f12-b320-2711a986b1a1 _http_log_response /usr/local/lib/python2.7/dist-packages/keystoneclient/session.py:224 2015-08-07 18:08:30.329 DEBUG keystoneclient.session [req-ef19662e-e60e-46de-bca3-ff87548b0642 tempest-VolumesV1SnapshotTestJSON-894380077 tempest-VolumesV1SnapshotTestJSON-276236588] REQ: curl -g -i -X POST http://192.168.33.1:8776/v2/57677ee7690f4b6a896b8f91b95b5eed/volumes/38ab1153-6e76-4eed-9736-027426d53e6c/action -H "User-Agent: python-cinderclient" -H "Content-Type: application/json" -H "Accept: application/json" -H "X-Auth-Token: {SHA1}b0b494001503d0e909aa8c3af511794cadd062cd" -d '{"os-detach": {"attachment_id": null}}' _http_log_request /usr/local/lib/python2.7/dist-packages/keystoneclient/session.py:195 2015-08-07 18:08:30.857 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.94 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:08:33.069 DEBUG 
oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:08:33.070 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 28.45 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:08:34.565 DEBUG oslo_concurrency.lockutils [req-e6d018d5-8187-40fe-9afa-f0299dad4b53 tempest-VolumesV1SnapshotTestJSON-894380077 tempest-VolumesV1SnapshotTestJSON-276236588] Lock "29af7281-35f5-4346-b22f-da966069cfed" acquired by "do_terminate_instance" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 18:08:34.566 DEBUG oslo_concurrency.lockutils [req-e6d018d5-8187-40fe-9afa-f0299dad4b53 tempest-VolumesV1SnapshotTestJSON-894380077 tempest-VolumesV1SnapshotTestJSON-276236588] Lock "29af7281-35f5-4346-b22f-da966069cfed-events" acquired by "_clear_events" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 18:08:34.567 DEBUG oslo_concurrency.lockutils [req-e6d018d5-8187-40fe-9afa-f0299dad4b53 tempest-VolumesV1SnapshotTestJSON-894380077 tempest-VolumesV1SnapshotTestJSON-276236588] Lock "29af7281-35f5-4346-b22f-da966069cfed-events" released by "_clear_events" :: held 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 18:08:34.569 INFO nova.compute.manager [req-e6d018d5-8187-40fe-9afa-f0299dad4b53 tempest-VolumesV1SnapshotTestJSON-894380077 tempest-VolumesV1SnapshotTestJSON-276236588] [instance: 29af7281-35f5-4346-b22f-da966069cfed] Terminating instance 2015-08-07 18:08:34.570 INFO nova.virt.xenapi.vmops [req-e6d018d5-8187-40fe-9afa-f0299dad4b53 tempest-VolumesV1SnapshotTestJSON-894380077 tempest-VolumesV1SnapshotTestJSON-276236588] [instance: 29af7281-35f5-4346-b22f-da966069cfed] Destroying VM 2015-08-07 18:08:34.728 DEBUG nova.virt.xenapi.vm_utils [req-e6d018d5-8187-40fe-9afa-f0299dad4b53 tempest-VolumesV1SnapshotTestJSON-894380077 tempest-VolumesV1SnapshotTestJSON-276236588] [instance: 29af7281-35f5-4346-b22f-da966069cfed] Shutting down VM (hard) hard_shutdown_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:348 2015-08-07 18:08:42.001 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 8.81 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:08:50.113 DEBUG keystoneclient.session [req-ef19662e-e60e-46de-bca3-ff87548b0642 tempest-VolumesV1SnapshotTestJSON-894380077 tempest-VolumesV1SnapshotTestJSON-276236588] RESP: [202] date: Fri, 07 Aug 2015 18:08:50 GMT connection: keep-alive content-type: text/html; charset=UTF-8 content-length: 0 x-openstack-request-id: req-88c6f55b-f13f-447f-b6f3-00afbe9fdc89 _http_log_response /usr/local/lib/python2.7/dist-packages/keystoneclient/session.py:224 2015-08-07 18:08:51.568 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.33 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:09:01.605 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:09:01.606 
DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:09:01.606 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Starting heal instance info cache _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5315 2015-08-07 18:09:01.607 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Rebuilding the list of instances to heal _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5319 2015-08-07 18:09:02.760 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: 29af7281-35f5-4346-b22f-da966069cfed] Skipping network cache update for instance because it is being deleted. _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5333 2015-08-07 18:09:02.761 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Didn't find any instances for network info cache update. _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5386 2015-08-07 18:09:02.761 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 3.92 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:09:11.224 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:09:11.224 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:09:11.225 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 3.30 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:09:13.229 13318 WARNING oslo_service.loopingcall [-] Function > run outlasted interval by 2.27 sec 2015-08-07 18:09:13.230 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 0.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:09:13.551 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.68 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:09:13.928 DEBUG nova.virt.xenapi.vmops [req-e6d018d5-8187-40fe-9afa-f0299dad4b53 tempest-VolumesV1SnapshotTestJSON-894380077 tempest-VolumesV1SnapshotTestJSON-276236588] [instance: 29af7281-35f5-4346-b22f-da966069cfed] Destroying VDIs _destroy_vdis /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1428 2015-08-07 18:09:13.946 DEBUG nova.virt.xenapi.vm_utils [req-e6d018d5-8187-40fe-9afa-f0299dad4b53 tempest-VolumesV1SnapshotTestJSON-894380077 tempest-VolumesV1SnapshotTestJSON-276236588] VDI 83756c22-c576-4426-92eb-b6b9706a49a6 is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 18:09:13.955 DEBUG nova.virt.xenapi.vm_utils [req-e6d018d5-8187-40fe-9afa-f0299dad4b53 tempest-VolumesV1SnapshotTestJSON-894380077 
tempest-VolumesV1SnapshotTestJSON-276236588] VDI fcf6831f-8282-4b77-acd7-dd829f4b0a3d is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 18:09:14.526 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:09:14.527 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /opt/stack/new/nova/nova/compute/manager.py:5976 2015-08-07 18:09:14.528 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 4.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:09:18.526 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:09:18.527 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 2.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:09:20.207 DEBUG nova.virt.xenapi.vmops [req-e6d018d5-8187-40fe-9afa-f0299dad4b53 tempest-VolumesV1SnapshotTestJSON-894380077 tempest-VolumesV1SnapshotTestJSON-276236588] [instance: 29af7281-35f5-4346-b22f-da966069cfed] Using RAW or VHD, skipping kernel and ramdisk deletion _destroy_kernel_ramdisk /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1456 2015-08-07 18:09:20.327 DEBUG nova.virt.xenapi.vm_utils [req-e6d018d5-8187-40fe-9afa-f0299dad4b53 tempest-VolumesV1SnapshotTestJSON-894380077 tempest-VolumesV1SnapshotTestJSON-276236588] [instance: 29af7281-35f5-4346-b22f-da966069cfed] VM destroyed destroy_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:324 2015-08-07 18:09:20.712 DEBUG nova.compute.manager [req-e6d018d5-8187-40fe-9afa-f0299dad4b53 tempest-VolumesV1SnapshotTestJSON-894380077 tempest-VolumesV1SnapshotTestJSON-276236588] [instance: 29af7281-35f5-4346-b22f-da966069cfed] Deallocating network for instance _deallocate_network /opt/stack/new/nova/nova/compute/manager.py:1792 2015-08-07 18:09:20.723 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:09:20.724 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 0.80 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:09:21.563 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:09:21.639 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Auditing locally available compute resources for node localhost.localdomain 2015-08-07 18:09:22.088 DEBUG nova.virt.xenapi.host [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Updating host stats update_status 
/opt/stack/new/nova/nova/virt/xenapi/host.py:234 2015-08-07 18:09:23.610 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.62 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:09:24.203 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" acquired by "do_scan" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 18:09:24.203 DEBUG nova.virt.xenapi.vm_utils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Scanning SR OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99 do_scan /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1839 2015-08-07 18:09:28.682 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" released by "do_scan" :: held 4.479s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 18:09:30.700 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor: free VCPUs: 2 _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:524 2015-08-07 18:09:30.701 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor: assignable PCI devices: [] _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:532 2015-08-07 18:09:30.701 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor/Node resource view: name=localhost.localdomain free_ram=989MB free_disk=12GB free_vcpus=2 pci_devices=[] _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:546 2015-08-07 18:09:30.726 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "compute_resources" acquired by "_update_available_resource" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 18:09:35.466 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Total usable vcpus: 8, total allocated vcpus: 6 2015-08-07 18:09:35.466 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Final resource view: name=localhost.localdomain phys_ram=8187MB used_ram=581MB phys_disk=27GB used_disk=0GB total_vcpus=8 used_vcpus=6 pci_stats=None 2015-08-07 18:09:35.512 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 7.76 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:09:36.077 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Compute_service record updated for devstack:localhost.localdomain 2015-08-07 18:09:36.079 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "compute_resources" released by "_update_available_resource" :: held 5.353s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 18:09:36.081 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 8.96 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:09:43.109 DEBUG nova.compute.manager [req-e6d018d5-8187-40fe-9afa-f0299dad4b53 
tempest-VolumesV1SnapshotTestJSON-894380077 tempest-VolumesV1SnapshotTestJSON-276236588] [instance: 29af7281-35f5-4346-b22f-da966069cfed] terminating bdm BlockDeviceMapping(boot_index=0,connection_info=None,created_at=2015-08-07T18:05:23Z,delete_on_termination=True,deleted=False,deleted_at=None,destination_type='local',device_name='/dev/xvda',device_type='disk',disk_bus=None,guest_format=None,id=87,image_id='a69d8d55-1745-492f-be26-cc75d64fc94d',instance=,instance_uuid=29af7281-35f5-4346-b22f-da966069cfed,no_device=False,snapshot_id=None,source_type='image',updated_at=2015-08-07T18:05:26Z,volume_id=None,volume_size=None) _cleanup_volumes /opt/stack/new/nova/nova/compute/manager.py:2272 2015-08-07 18:09:45.104 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:09:45.105 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 17.41 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:09:45.352 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 7.93 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:09:45.402 DEBUG oslo_concurrency.lockutils [req-e6d018d5-8187-40fe-9afa-f0299dad4b53 tempest-VolumesV1SnapshotTestJSON-894380077 tempest-VolumesV1SnapshotTestJSON-276236588] Lock "compute_resources" acquired by "update_usage" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 18:09:45.403 DEBUG nova.objects.instance [req-e6d018d5-8187-40fe-9afa-f0299dad4b53 tempest-VolumesV1SnapshotTestJSON-894380077 tempest-VolumesV1SnapshotTestJSON-276236588] Lazy-loading `numa_topology' on Instance uuid 29af7281-35f5-4346-b22f-da966069cfed obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 18:09:46.563 DEBUG oslo_concurrency.lockutils [req-e6d018d5-8187-40fe-9afa-f0299dad4b53 tempest-VolumesV1SnapshotTestJSON-894380077 tempest-VolumesV1SnapshotTestJSON-276236588] Lock "compute_resources" released by "update_usage" :: held 1.161s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 18:09:48.280 DEBUG oslo_concurrency.lockutils [req-e6d018d5-8187-40fe-9afa-f0299dad4b53 tempest-VolumesV1SnapshotTestJSON-894380077 tempest-VolumesV1SnapshotTestJSON-276236588] Lock "29af7281-35f5-4346-b22f-da966069cfed" released by "do_terminate_instance" :: held 73.714s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 18:09:54.213 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.19 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:10:03.516 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:10:03.527 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:10:03.528 DEBUG nova.compute.manager 
[req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Starting heal instance info cache _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5315 2015-08-07 18:10:03.528 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Rebuilding the list of instances to heal _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5319 2015-08-07 18:10:05.039 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Didn't find any instances for network info cache update. _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5386 2015-08-07 18:10:06.479 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 4.01 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:10:10.562 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:10:10.563 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:10:10.563 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 4.96 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:10:15.526 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:10:15.527 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /opt/stack/new/nova/nova/compute/manager.py:5976 2015-08-07 18:10:15.528 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 3.99 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:10:16.456 13318 WARNING oslo_service.loopingcall [-] Function > run outlasted interval by 2.93 sec 2015-08-07 18:10:16.456 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 0.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:10:18.830 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 7.63 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:10:19.523 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:10:19.762 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 0.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:10:19.765 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:10:19.766 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 1.76 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:10:21.527 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:10:21.576 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Auditing locally available compute resources for node localhost.localdomain 2015-08-07 18:10:21.577 DEBUG nova.virt.xenapi.host [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Updating host stats update_status /opt/stack/new/nova/nova/virt/xenapi/host.py:234 2015-08-07 18:10:22.035 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" acquired by "do_scan" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 18:10:22.035 DEBUG nova.virt.xenapi.vm_utils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Scanning SR OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99 do_scan /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1839 2015-08-07 18:10:25.042 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" released by "do_scan" :: held 3.008s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 18:10:26.774 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor: free VCPUs: 2 _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:524 2015-08-07 
18:10:26.777 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor: assignable PCI devices: [] _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:532 2015-08-07 18:10:26.778 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor/Node resource view: name=localhost.localdomain free_ram=989MB free_disk=12GB free_vcpus=2 pci_devices=[] _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:546 2015-08-07 18:10:26.780 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "compute_resources" acquired by "_update_available_resource" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 18:10:27.162 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.31 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:10:27.632 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Total usable vcpus: 8, total allocated vcpus: 6 2015-08-07 18:10:27.633 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Final resource view: name=localhost.localdomain phys_ram=8187MB used_ram=512MB phys_disk=27GB used_disk=0GB total_vcpus=8 used_vcpus=6 pci_stats=None 2015-08-07 18:10:28.137 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Compute_service record updated for devstack:localhost.localdomain 2015-08-07 18:10:28.138 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "compute_resources" released by "_update_available_resource" :: held 1.358s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 18:10:28.138 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:10:28.139 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 10.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:10:37.035 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.65 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:10:38.139 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:10:38.140 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 25.38 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:11:00.257 13318 WARNING oslo_service.loopingcall [-] Function > run outlasted interval by 3.57 sec 2015-08-07 18:11:00.258 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 0.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:11:00.518 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.74 seconds 
_run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:11:03.592 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:11:03.592 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:11:03.593 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Starting heal instance info cache _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5315 2015-08-07 18:11:03.602 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Rebuilding the list of instances to heal _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5319 2015-08-07 18:11:03.722 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Didn't find any instances for network info cache update. _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5386 2015-08-07 18:11:03.723 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 3.93 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:11:07.658 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:11:07.659 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 3.87 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:11:10.554 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.71 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:11:11.550 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:11:11.551 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 5.97 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:11:17.605 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:11:17.606 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /opt/stack/new/nova/nova/compute/manager.py:5976 2015-08-07 18:11:17.607 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 3.92 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:11:20.398 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.86 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:11:21.526 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:11:21.527 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 2.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:11:23.528 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:11:23.639 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Auditing locally available compute resources for node localhost.localdomain 2015-08-07 18:11:23.640 DEBUG nova.virt.xenapi.host [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Updating host stats update_status /opt/stack/new/nova/nova/virt/xenapi/host.py:234 2015-08-07 18:11:24.435 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" acquired by "do_scan" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 18:11:24.435 DEBUG nova.virt.xenapi.vm_utils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Scanning SR OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99 do_scan /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1839 2015-08-07 18:11:25.576 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" released by "do_scan" :: held 1.141s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 18:11:25.802 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor: free VCPUs: 2 _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:524 2015-08-07 18:11:25.802 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor: assignable PCI devices: [] _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:532 2015-08-07 18:11:25.803 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor/Node resource view: name=localhost.localdomain free_ram=989MB free_disk=12GB free_vcpus=2 pci_devices=[] _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:546 2015-08-07 18:11:25.803 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "compute_resources" acquired by "_update_available_resource" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 
2015-08-07 18:11:26.002 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Total usable vcpus: 8, total allocated vcpus: 6 2015-08-07 18:11:26.008 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Final resource view: name=localhost.localdomain phys_ram=8187MB used_ram=512MB phys_disk=27GB used_disk=0GB total_vcpus=8 used_vcpus=6 pci_stats=None 2015-08-07 18:11:26.131 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Compute_service record updated for devstack:localhost.localdomain 2015-08-07 18:11:26.132 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "compute_resources" released by "_update_available_resource" :: held 0.329s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 18:11:26.135 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:11:26.136 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 10.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:11:30.420 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.84 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:11:36.134 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:11:36.135 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 19.39 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:11:40.542 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.72 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:11:50.361 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.91 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:11:55.540 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:11:55.541 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Cleaning up deleted instances _run_pending_deletes /opt/stack/new/nova/nova/compute/manager.py:6276 2015-08-07 18:11:56.700 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] There are 1 instances to clean _run_pending_deletes /opt/stack/new/nova/nova/compute/manager.py:6285 2015-08-07 18:11:56.841 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: 29af7281-35f5-4346-b22f-da966069cfed] Instance has had 0 of 5 cleanup attempts _run_pending_deletes /opt/stack/new/nova/nova/compute/manager.py:6293 2015-08-07 18:11:57.557 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval 
looping call > sleeping for 5.97 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:12:00.925 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.35 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:12:03.527 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:12:03.527 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Starting heal instance info cache _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5315 2015-08-07 18:12:03.528 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Rebuilding the list of instances to heal _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5319 2015-08-07 18:12:04.250 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Didn't find any instances for network info cache update. _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5386 2015-08-07 18:12:04.251 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 0.99 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:12:05.418 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:12:05.419 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 3.11 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:12:08.532 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:12:08.533 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 4.99 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:12:12.027 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 8.28 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:12:13.607 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:12:13.609 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 3.92 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:12:17.527 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:12:17.528 DEBUG nova.compute.manager 
[req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /opt/stack/new/nova/nova/compute/manager.py:5976 2015-08-07 18:12:17.528 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 2.99 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:12:20.511 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.80 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:12:20.522 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:12:20.718 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 3.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:12:23.725 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:12:23.829 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Auditing locally available compute resources for node localhost.localdomain 2015-08-07 18:12:23.830 DEBUG nova.virt.xenapi.host [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Updating host stats update_status /opt/stack/new/nova/nova/virt/xenapi/host.py:234 2015-08-07 18:12:24.395 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" acquired by "do_scan" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 18:12:24.396 DEBUG nova.virt.xenapi.vm_utils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Scanning SR OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99 do_scan /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1839 2015-08-07 18:12:25.345 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" released by "do_scan" :: held 0.950s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 18:12:27.266 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor: free VCPUs: 2 _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:524 2015-08-07 18:12:27.300 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor: assignable PCI devices: [] _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:532 2015-08-07 18:12:27.301 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor/Node resource view: name=localhost.localdomain free_ram=989MB free_disk=12GB free_vcpus=2 pci_devices=[] _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:546 2015-08-07 18:12:27.302 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "compute_resources" acquired by "_update_available_resource" 
:: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 18:12:29.154 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Total usable vcpus: 8, total allocated vcpus: 6 2015-08-07 18:12:29.155 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Final resource view: name=localhost.localdomain phys_ram=8187MB used_ram=512MB phys_disk=27GB used_disk=0GB total_vcpus=8 used_vcpus=6 pci_stats=None 2015-08-07 18:12:29.520 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Compute_service record updated for devstack:localhost.localdomain 2015-08-07 18:12:29.521 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "compute_resources" released by "_update_available_resource" :: held 2.219s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 18:12:29.522 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:12:29.522 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:12:29.523 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 11.80 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:12:30.397 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.92 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:12:40.052 DEBUG oslo_concurrency.lockutils [req-bda9225e-86a8-4183-8875-f04c26dd0a12 tempest-VolumesV2SnapshotTestJSON-526069488 tempest-VolumesV2SnapshotTestJSON-524980494] Lock "91cc5760-bab9-4ce6-8d1c-5acc7304e662" acquired by "_locked_do_build_and_run_instance" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 18:12:40.263 INFO nova.compute.manager [req-bda9225e-86a8-4183-8875-f04c26dd0a12 tempest-VolumesV2SnapshotTestJSON-526069488 tempest-VolumesV2SnapshotTestJSON-524980494] [instance: 91cc5760-bab9-4ce6-8d1c-5acc7304e662] Starting instance... 
2015-08-07 18:12:40.428 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.89 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:12:40.751 DEBUG oslo_concurrency.lockutils [req-bda9225e-86a8-4183-8875-f04c26dd0a12 tempest-VolumesV2SnapshotTestJSON-526069488 tempest-VolumesV2SnapshotTestJSON-524980494] Lock "compute_resources" acquired by "instance_claim" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 18:12:40.752 DEBUG nova.compute.resource_tracker [req-bda9225e-86a8-4183-8875-f04c26dd0a12 tempest-VolumesV2SnapshotTestJSON-526069488 tempest-VolumesV2SnapshotTestJSON-524980494] Memory overhead for 64 MB instance; 5 MB instance_claim /opt/stack/new/nova/nova/compute/resource_tracker.py:126 2015-08-07 18:12:40.762 INFO nova.compute.claims [req-bda9225e-86a8-4183-8875-f04c26dd0a12 tempest-VolumesV2SnapshotTestJSON-526069488 tempest-VolumesV2SnapshotTestJSON-524980494] [instance: 91cc5760-bab9-4ce6-8d1c-5acc7304e662] Attempting claim: memory 69 MB, disk 0 GB 2015-08-07 18:12:40.763 INFO nova.compute.claims [req-bda9225e-86a8-4183-8875-f04c26dd0a12 tempest-VolumesV2SnapshotTestJSON-526069488 tempest-VolumesV2SnapshotTestJSON-524980494] [instance: 91cc5760-bab9-4ce6-8d1c-5acc7304e662] Total memory: 8187 MB, used: 512.00 MB 2015-08-07 18:12:40.763 INFO nova.compute.claims [req-bda9225e-86a8-4183-8875-f04c26dd0a12 tempest-VolumesV2SnapshotTestJSON-526069488 tempest-VolumesV2SnapshotTestJSON-524980494] [instance: 91cc5760-bab9-4ce6-8d1c-5acc7304e662] memory limit: 12280.50 MB, free: 11768.50 MB 2015-08-07 18:12:40.763 INFO nova.compute.claims [req-bda9225e-86a8-4183-8875-f04c26dd0a12 tempest-VolumesV2SnapshotTestJSON-526069488 tempest-VolumesV2SnapshotTestJSON-524980494] [instance: 91cc5760-bab9-4ce6-8d1c-5acc7304e662] Total disk: 27 GB, used: 0.00 GB 2015-08-07 18:12:40.764 INFO nova.compute.claims [req-bda9225e-86a8-4183-8875-f04c26dd0a12 tempest-VolumesV2SnapshotTestJSON-526069488 tempest-VolumesV2SnapshotTestJSON-524980494] [instance: 91cc5760-bab9-4ce6-8d1c-5acc7304e662] disk limit not specified, defaulting to unlimited 2015-08-07 18:12:40.994 DEBUG nova.compute.resources.vcpu [req-bda9225e-86a8-4183-8875-f04c26dd0a12 tempest-VolumesV2SnapshotTestJSON-526069488 tempest-VolumesV2SnapshotTestJSON-524980494] Total CPUs: 8 VCPUs, used: 0.00 VCPUs test /opt/stack/new/nova/nova/compute/resources/vcpu.py:52 2015-08-07 18:12:40.994 DEBUG nova.compute.resources.vcpu [req-bda9225e-86a8-4183-8875-f04c26dd0a12 tempest-VolumesV2SnapshotTestJSON-526069488 tempest-VolumesV2SnapshotTestJSON-524980494] CPUs limit not specified, defaulting to unlimited test /opt/stack/new/nova/nova/compute/resources/vcpu.py:56 2015-08-07 18:12:40.995 INFO nova.compute.claims [req-bda9225e-86a8-4183-8875-f04c26dd0a12 tempest-VolumesV2SnapshotTestJSON-526069488 tempest-VolumesV2SnapshotTestJSON-524980494] [instance: 91cc5760-bab9-4ce6-8d1c-5acc7304e662] Claim successful 2015-08-07 18:12:41.328 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:12:41.329 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 22.20 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 
18:12:41.700 DEBUG oslo_concurrency.lockutils [req-bda9225e-86a8-4183-8875-f04c26dd0a12 tempest-VolumesV2SnapshotTestJSON-526069488 tempest-VolumesV2SnapshotTestJSON-524980494] Lock "compute_resources" released by "instance_claim" :: held 0.948s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 18:12:42.217 DEBUG oslo_concurrency.lockutils [req-bda9225e-86a8-4183-8875-f04c26dd0a12 tempest-VolumesV2SnapshotTestJSON-526069488 tempest-VolumesV2SnapshotTestJSON-524980494] Lock "compute_resources" acquired by "update_usage" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 18:12:42.533 DEBUG oslo_concurrency.lockutils [req-bda9225e-86a8-4183-8875-f04c26dd0a12 tempest-VolumesV2SnapshotTestJSON-526069488 tempest-VolumesV2SnapshotTestJSON-524980494] Lock "compute_resources" released by "update_usage" :: held 0.316s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 18:12:42.534 DEBUG nova.compute.utils [req-bda9225e-86a8-4183-8875-f04c26dd0a12 tempest-VolumesV2SnapshotTestJSON-526069488 tempest-VolumesV2SnapshotTestJSON-524980494] Using /dev/xvd instead of None get_next_device_name /opt/stack/new/nova/nova/compute/utils.py:163 2015-08-07 18:12:42.541 13318 DEBUG nova.compute.manager [-] [instance: 91cc5760-bab9-4ce6-8d1c-5acc7304e662] Allocating IP information in the background. _allocate_network_async /opt/stack/new/nova/nova/compute/manager.py:1513 2015-08-07 18:12:42.542 13318 DEBUG oslo_concurrency.lockutils [-] Acquired semaphore "refresh_cache-91cc5760-bab9-4ce6-8d1c-5acc7304e662" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:197 2015-08-07 18:12:44.141 DEBUG nova.virt.xenapi.vmops [req-bda9225e-86a8-4183-8875-f04c26dd0a12 tempest-VolumesV2SnapshotTestJSON-526069488 tempest-VolumesV2SnapshotTestJSON-524980494] [instance: 91cc5760-bab9-4ce6-8d1c-5acc7304e662] Block device information present: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} spawn /opt/stack/new/nova/nova/virt/xenapi/vmops.py:408 2015-08-07 18:12:44.162 DEBUG nova.virt.xenapi.vm_utils [req-bda9225e-86a8-4183-8875-f04c26dd0a12 tempest-VolumesV2SnapshotTestJSON-526069488 tempest-VolumesV2SnapshotTestJSON-524980494] Detected vhd format for image a69d8d55-1745-492f-be26-cc75d64fc94d determine_disk_image_type /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1633 2015-08-07 18:12:44.171 DEBUG nova.virt.xenapi.vmops [req-bda9225e-86a8-4183-8875-f04c26dd0a12 tempest-VolumesV2SnapshotTestJSON-526069488 tempest-VolumesV2SnapshotTestJSON-524980494] [instance: 91cc5760-bab9-4ce6-8d1c-5acc7304e662] Updating progress to 10 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 18:12:44.467 DEBUG nova.virt.xenapi.vmops [req-bda9225e-86a8-4183-8875-f04c26dd0a12 tempest-VolumesV2SnapshotTestJSON-526069488 tempest-VolumesV2SnapshotTestJSON-524980494] [instance: 91cc5760-bab9-4ce6-8d1c-5acc7304e662] block device info: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} _connect_cinder_volumes /opt/stack/new/nova/nova/virt/xenapi/vmops.py:376 2015-08-07 18:12:44.541 DEBUG oslo_concurrency.lockutils [req-bda9225e-86a8-4183-8875-f04c26dd0a12 tempest-VolumesV2SnapshotTestJSON-526069488 tempest-VolumesV2SnapshotTestJSON-524980494] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" acquired by "_create_cached_image_impl" :: waited 0.000s inner 
/usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 18:12:48.814 DEBUG nova.virt.xenapi.vm_utils [req-bda9225e-86a8-4183-8875-f04c26dd0a12 tempest-VolumesV2SnapshotTestJSON-526069488 tempest-VolumesV2SnapshotTestJSON-524980494] Cloned VDI OpaqueRef:7e4a1e02-78cb-4efc-ed1b-7a46db17b26b from VDI OpaqueRef:f392c64e-7886-a0cc-d3a0-780674167307 _clone_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:571 2015-08-07 18:12:51.112 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.20 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:12:51.864 DEBUG oslo_concurrency.lockutils [req-bda9225e-86a8-4183-8875-f04c26dd0a12 tempest-VolumesV2SnapshotTestJSON-526069488 tempest-VolumesV2SnapshotTestJSON-524980494] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" released by "_create_cached_image_impl" :: held 7.323s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 18:12:51.865 INFO nova.virt.xenapi.vm_utils [req-bda9225e-86a8-4183-8875-f04c26dd0a12 tempest-VolumesV2SnapshotTestJSON-526069488 tempest-VolumesV2SnapshotTestJSON-524980494] Image creation data, cacheable: True, downloaded: False duration: 7.40 secs for image a69d8d55-1745-492f-be26-cc75d64fc94d 2015-08-07 18:12:52.483 13318 DEBUG nova.network.base_api [-] [instance: 91cc5760-bab9-4ce6-8d1c-5acc7304e662] Updating instance_info_cache with network_info: [VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.4'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:28:88:fb', 'active': False, 'type': u'bridge', 'id': u'b0b4fb21-a972-4003-9323-316ac94f960c', 'qbg_params': None})] update_instance_cache_with_nw_info /opt/stack/new/nova/nova/network/base_api.py:43 2015-08-07 18:12:52.532 13318 DEBUG oslo_concurrency.lockutils [-] Releasing semaphore "refresh_cache-91cc5760-bab9-4ce6-8d1c-5acc7304e662" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:210 2015-08-07 18:12:52.533 13318 DEBUG nova.compute.manager [-] [instance: 91cc5760-bab9-4ce6-8d1c-5acc7304e662] Instance network_info: |[VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.4'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), 
Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:28:88:fb', 'active': False, 'type': u'bridge', 'id': u'b0b4fb21-a972-4003-9323-316ac94f960c', 'qbg_params': None})]| _allocate_network_async /opt/stack/new/nova/nova/compute/manager.py:1531 2015-08-07 18:12:53.171 DEBUG nova.virt.xenapi.vmops [req-bda9225e-86a8-4183-8875-f04c26dd0a12 tempest-VolumesV2SnapshotTestJSON-526069488 tempest-VolumesV2SnapshotTestJSON-524980494] [instance: 91cc5760-bab9-4ce6-8d1c-5acc7304e662] Updating progress to 20 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 18:12:53.789 DEBUG nova.virt.xenapi.vmops [req-bda9225e-86a8-4183-8875-f04c26dd0a12 tempest-VolumesV2SnapshotTestJSON-526069488 tempest-VolumesV2SnapshotTestJSON-524980494] [instance: 91cc5760-bab9-4ce6-8d1c-5acc7304e662] Updating progress to 30 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 18:12:54.701 DEBUG nova.virt.xenapi.vmops [req-bda9225e-86a8-4183-8875-f04c26dd0a12 tempest-VolumesV2SnapshotTestJSON-526069488 tempest-VolumesV2SnapshotTestJSON-524980494] [instance: 91cc5760-bab9-4ce6-8d1c-5acc7304e662] Using PV kernel: True _create_vm_record /opt/stack/new/nova/nova/virt/xenapi/vmops.py:679 2015-08-07 18:12:54.726 DEBUG nova.virt.xenapi.vm_utils [req-bda9225e-86a8-4183-8875-f04c26dd0a12 tempest-VolumesV2SnapshotTestJSON-526069488 tempest-VolumesV2SnapshotTestJSON-524980494] [instance: 91cc5760-bab9-4ce6-8d1c-5acc7304e662] Created VM create_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:312 2015-08-07 18:12:54.727 DEBUG nova.virt.xenapi.vmops [req-bda9225e-86a8-4183-8875-f04c26dd0a12 tempest-VolumesV2SnapshotTestJSON-526069488 tempest-VolumesV2SnapshotTestJSON-524980494] [instance: 91cc5760-bab9-4ce6-8d1c-5acc7304e662] Updating progress to 40 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 18:12:55.212 DEBUG nova.virt.xenapi.vm_utils [req-bda9225e-86a8-4183-8875-f04c26dd0a12 tempest-VolumesV2SnapshotTestJSON-526069488 tempest-VolumesV2SnapshotTestJSON-524980494] Creating disk-type VBD for VM OpaqueRef:b35f7e70-1205-7996-9ace-4e3fcfda6f15, VDI OpaqueRef:7e4a1e02-78cb-4efc-ed1b-7a46db17b26b ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 18:12:55.241 DEBUG nova.virt.xenapi.vm_utils [req-bda9225e-86a8-4183-8875-f04c26dd0a12 tempest-VolumesV2SnapshotTestJSON-526069488 tempest-VolumesV2SnapshotTestJSON-524980494] Created VBD OpaqueRef:13893e09-62e4-11c6-4db7-66a79a78a5de for VM OpaqueRef:b35f7e70-1205-7996-9ace-4e3fcfda6f15, VDI OpaqueRef:7e4a1e02-78cb-4efc-ed1b-7a46db17b26b. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 18:12:56.724 DEBUG nova.virt.xenapi.vm_utils [req-bda9225e-86a8-4183-8875-f04c26dd0a12 tempest-VolumesV2SnapshotTestJSON-526069488 tempest-VolumesV2SnapshotTestJSON-524980494] Created VDI OpaqueRef:99fd8082-18b8-e8a1-d37a-d44cf2676f08 (config-2, 67108864, False) on OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99. 
create_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:509 2015-08-07 18:12:56.729 DEBUG nova.virt.xenapi.vm_utils [req-bda9225e-86a8-4183-8875-f04c26dd0a12 tempest-VolumesV2SnapshotTestJSON-526069488 tempest-VolumesV2SnapshotTestJSON-524980494] Creating disk-type VBD for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:99fd8082-18b8-e8a1-d37a-d44cf2676f08 ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 18:12:56.742 DEBUG nova.virt.xenapi.vm_utils [req-bda9225e-86a8-4183-8875-f04c26dd0a12 tempest-VolumesV2SnapshotTestJSON-526069488 tempest-VolumesV2SnapshotTestJSON-524980494] Created VBD OpaqueRef:f1b759d8-cc10-acab-57e5-e8207eeaaff3 for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:99fd8082-18b8-e8a1-d37a-d44cf2676f08. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 18:12:56.743 DEBUG nova.virt.xenapi.vm_utils [req-bda9225e-86a8-4183-8875-f04c26dd0a12 tempest-VolumesV2SnapshotTestJSON-526069488 tempest-VolumesV2SnapshotTestJSON-524980494] Plugging VBD OpaqueRef:f1b759d8-cc10-acab-57e5-e8207eeaaff3 ... vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2191 2015-08-07 18:12:56.743 DEBUG oslo_concurrency.lockutils [req-bda9225e-86a8-4183-8875-f04c26dd0a12 tempest-VolumesV2SnapshotTestJSON-526069488 tempest-VolumesV2SnapshotTestJSON-524980494] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_plug" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 18:13:00.043 DEBUG oslo_concurrency.lockutils [req-bda9225e-86a8-4183-8875-f04c26dd0a12 tempest-VolumesV2SnapshotTestJSON-526069488 tempest-VolumesV2SnapshotTestJSON-524980494] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_plug" :: held 3.299s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 18:13:00.043 DEBUG nova.virt.xenapi.vm_utils [req-bda9225e-86a8-4183-8875-f04c26dd0a12 tempest-VolumesV2SnapshotTestJSON-526069488 tempest-VolumesV2SnapshotTestJSON-524980494] Plugging VBD OpaqueRef:f1b759d8-cc10-acab-57e5-e8207eeaaff3 done. vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2194 2015-08-07 18:13:00.048 DEBUG nova.virt.xenapi.vm_utils [req-bda9225e-86a8-4183-8875-f04c26dd0a12 tempest-VolumesV2SnapshotTestJSON-526069488 tempest-VolumesV2SnapshotTestJSON-524980494] VBD OpaqueRef:f1b759d8-cc10-acab-57e5-e8207eeaaff3 plugged as xvdc vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2197 2015-08-07 18:13:00.146 WARNING nova.virt.configdrive [req-bda9225e-86a8-4183-8875-f04c26dd0a12 tempest-VolumesV2SnapshotTestJSON-526069488 tempest-VolumesV2SnapshotTestJSON-524980494] The setting "always" will be deprecated in the Liberty version. 
Please use "True" instead 2015-08-07 18:13:00.147 DEBUG nova.objects.instance [req-bda9225e-86a8-4183-8875-f04c26dd0a12 tempest-VolumesV2SnapshotTestJSON-526069488 tempest-VolumesV2SnapshotTestJSON-524980494] Lazy-loading `ec2_ids' on Instance uuid 91cc5760-bab9-4ce6-8d1c-5acc7304e662 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 18:13:00.292 DEBUG oslo_concurrency.processutils [req-bda9225e-86a8-4183-8875-f04c26dd0a12 tempest-VolumesV2SnapshotTestJSON-526069488 tempest-VolumesV2SnapshotTestJSON-524980494] Running cmd (subprocess): genisoimage -o /tmp/tmpsx8q1w/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmpZcvJVb execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 18:13:00.867 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.85 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:13:00.898 DEBUG oslo_concurrency.processutils [req-bda9225e-86a8-4183-8875-f04c26dd0a12 tempest-VolumesV2SnapshotTestJSON-526069488 tempest-VolumesV2SnapshotTestJSON-524980494] CMD "genisoimage -o /tmp/tmpsx8q1w/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmpZcvJVb" returned: 0 in 0.606s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 18:13:00.902 DEBUG oslo_concurrency.processutils [req-bda9225e-86a8-4183-8875-f04c26dd0a12 tempest-VolumesV2SnapshotTestJSON-526069488 tempest-VolumesV2SnapshotTestJSON-524980494] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmpsx8q1w/configdrive of=/dev/xvdc oflag=direct,sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 18:13:03.530 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:13:03.532 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Starting heal instance info cache _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5315 2015-08-07 18:13:03.570 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Rebuilding the list of instances to heal _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5319 2015-08-07 18:13:03.985 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: 91cc5760-bab9-4ce6-8d1c-5acc7304e662] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5329 2015-08-07 18:13:03.986 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Didn't find any instances for network info cache update. 
_heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5386 2015-08-07 18:13:03.987 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 1.99 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:13:05.977 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:13:05.978 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 4.55 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:13:10.528 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:13:10.529 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 5.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:13:11.037 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.68 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:13:15.326 DEBUG oslo_concurrency.processutils [req-bda9225e-86a8-4183-8875-f04c26dd0a12 tempest-VolumesV2SnapshotTestJSON-526069488 tempest-VolumesV2SnapshotTestJSON-524980494] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmpsx8q1w/configdrive of=/dev/xvdc oflag=direct,sync" returned: 0 in 14.424s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 18:13:15.329 DEBUG oslo_concurrency.processutils [req-bda9225e-86a8-4183-8875-f04c26dd0a12 tempest-VolumesV2SnapshotTestJSON-526069488 tempest-VolumesV2SnapshotTestJSON-524980494] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 18:13:15.687 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:13:15.690 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 1.84 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:13:17.021 DEBUG oslo_concurrency.processutils [req-bda9225e-86a8-4183-8875-f04c26dd0a12 tempest-VolumesV2SnapshotTestJSON-526069488 tempest-VolumesV2SnapshotTestJSON-524980494] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf sync" returned: 0 in 1.691s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 18:13:17.022 DEBUG nova.virt.xenapi.vm_utils [req-bda9225e-86a8-4183-8875-f04c26dd0a12 tempest-VolumesV2SnapshotTestJSON-526069488 tempest-VolumesV2SnapshotTestJSON-524980494] Destroying VBD for VDI OpaqueRef:99fd8082-18b8-e8a1-d37a-d44cf2676f08 ... 
vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2207 2015-08-07 18:13:17.024 DEBUG oslo_concurrency.lockutils [req-bda9225e-86a8-4183-8875-f04c26dd0a12 tempest-VolumesV2SnapshotTestJSON-526069488 tempest-VolumesV2SnapshotTestJSON-524980494] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_unplug" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 18:13:17.527 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:13:17.528 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /opt/stack/new/nova/nova/compute/manager.py:5976 2015-08-07 18:13:17.529 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 6.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:13:18.453 DEBUG oslo_concurrency.lockutils [req-bda9225e-86a8-4183-8875-f04c26dd0a12 tempest-VolumesV2SnapshotTestJSON-526069488 tempest-VolumesV2SnapshotTestJSON-524980494] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_unplug" :: held 1.429s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 18:13:18.474 DEBUG nova.virt.xenapi.vm_utils [req-bda9225e-86a8-4183-8875-f04c26dd0a12 tempest-VolumesV2SnapshotTestJSON-526069488 tempest-VolumesV2SnapshotTestJSON-524980494] Destroying VBD for VDI OpaqueRef:99fd8082-18b8-e8a1-d37a-d44cf2676f08 done. vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2215 2015-08-07 18:13:18.475 DEBUG nova.virt.xenapi.vm_utils [req-bda9225e-86a8-4183-8875-f04c26dd0a12 tempest-VolumesV2SnapshotTestJSON-526069488 tempest-VolumesV2SnapshotTestJSON-524980494] Creating disk-type VBD for VM OpaqueRef:b35f7e70-1205-7996-9ace-4e3fcfda6f15, VDI OpaqueRef:99fd8082-18b8-e8a1-d37a-d44cf2676f08 ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 18:13:18.485 DEBUG nova.virt.xenapi.vm_utils [req-bda9225e-86a8-4183-8875-f04c26dd0a12 tempest-VolumesV2SnapshotTestJSON-526069488 tempest-VolumesV2SnapshotTestJSON-524980494] Created VBD OpaqueRef:2831dd2f-889e-67b0-36fa-e4a98ef90299 for VM OpaqueRef:b35f7e70-1205-7996-9ace-4e3fcfda6f15, VDI OpaqueRef:99fd8082-18b8-e8a1-d37a-d44cf2676f08. 
create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 18:13:18.487 DEBUG nova.objects.instance [req-bda9225e-86a8-4183-8875-f04c26dd0a12 tempest-VolumesV2SnapshotTestJSON-526069488 tempest-VolumesV2SnapshotTestJSON-524980494] Lazy-loading `pci_devices' on Instance uuid 91cc5760-bab9-4ce6-8d1c-5acc7304e662 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 18:13:18.675 DEBUG nova.virt.xenapi.vmops [req-bda9225e-86a8-4183-8875-f04c26dd0a12 tempest-VolumesV2SnapshotTestJSON-526069488 tempest-VolumesV2SnapshotTestJSON-524980494] [instance: 91cc5760-bab9-4ce6-8d1c-5acc7304e662] Updating progress to 50 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 18:13:19.031 DEBUG oslo_concurrency.lockutils [req-bda9225e-86a8-4183-8875-f04c26dd0a12 tempest-VolumesV2SnapshotTestJSON-526069488 tempest-VolumesV2SnapshotTestJSON-524980494] Lock "xenstore-91cc5760-bab9-4ce6-8d1c-5acc7304e662" acquired by "store_meta" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 18:13:19.032 DEBUG oslo_concurrency.lockutils [req-bda9225e-86a8-4183-8875-f04c26dd0a12 tempest-VolumesV2SnapshotTestJSON-526069488 tempest-VolumesV2SnapshotTestJSON-524980494] Lock "xenstore-91cc5760-bab9-4ce6-8d1c-5acc7304e662" released by "store_meta" :: held 0.001s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 18:13:19.032 DEBUG oslo_concurrency.lockutils [req-bda9225e-86a8-4183-8875-f04c26dd0a12 tempest-VolumesV2SnapshotTestJSON-526069488 tempest-VolumesV2SnapshotTestJSON-524980494] Lock "xenstore-91cc5760-bab9-4ce6-8d1c-5acc7304e662" acquired by "store_auto_disk_config" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 18:13:19.041 DEBUG oslo_concurrency.lockutils [req-bda9225e-86a8-4183-8875-f04c26dd0a12 tempest-VolumesV2SnapshotTestJSON-526069488 tempest-VolumesV2SnapshotTestJSON-524980494] Lock "xenstore-91cc5760-bab9-4ce6-8d1c-5acc7304e662" released by "store_auto_disk_config" :: held 0.009s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 18:13:19.042 DEBUG nova.virt.xenapi.vmops [req-bda9225e-86a8-4183-8875-f04c26dd0a12 tempest-VolumesV2SnapshotTestJSON-526069488 tempest-VolumesV2SnapshotTestJSON-524980494] [instance: 91cc5760-bab9-4ce6-8d1c-5acc7304e662] Injecting hostname (tempest-instance-407454135) into xenstore _inject_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1940 2015-08-07 18:13:19.043 DEBUG oslo_concurrency.lockutils [req-bda9225e-86a8-4183-8875-f04c26dd0a12 tempest-VolumesV2SnapshotTestJSON-526069488 tempest-VolumesV2SnapshotTestJSON-524980494] Lock "xenstore-91cc5760-bab9-4ce6-8d1c-5acc7304e662" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 18:13:19.063 DEBUG oslo_concurrency.lockutils [req-bda9225e-86a8-4183-8875-f04c26dd0a12 tempest-VolumesV2SnapshotTestJSON-526069488 tempest-VolumesV2SnapshotTestJSON-524980494] Lock "xenstore-91cc5760-bab9-4ce6-8d1c-5acc7304e662" released by "update_hostname" :: held 0.020s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 18:13:19.064 DEBUG nova.virt.xenapi.vmops [req-bda9225e-86a8-4183-8875-f04c26dd0a12 tempest-VolumesV2SnapshotTestJSON-526069488 tempest-VolumesV2SnapshotTestJSON-524980494] [instance: 91cc5760-bab9-4ce6-8d1c-5acc7304e662] Injecting network info to xenstore 
inject_network_info /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1869 2015-08-07 18:13:19.065 DEBUG oslo_concurrency.lockutils [req-bda9225e-86a8-4183-8875-f04c26dd0a12 tempest-VolumesV2SnapshotTestJSON-526069488 tempest-VolumesV2SnapshotTestJSON-524980494] Lock "xenstore-91cc5760-bab9-4ce6-8d1c-5acc7304e662" acquired by "update_nwinfo" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 18:13:20.132 DEBUG oslo_concurrency.lockutils [req-bda9225e-86a8-4183-8875-f04c26dd0a12 tempest-VolumesV2SnapshotTestJSON-526069488 tempest-VolumesV2SnapshotTestJSON-524980494] Lock "xenstore-91cc5760-bab9-4ce6-8d1c-5acc7304e662" released by "update_nwinfo" :: held 1.067s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 18:13:20.136 DEBUG nova.virt.xenapi.vmops [req-bda9225e-86a8-4183-8875-f04c26dd0a12 tempest-VolumesV2SnapshotTestJSON-526069488 tempest-VolumesV2SnapshotTestJSON-524980494] [instance: 91cc5760-bab9-4ce6-8d1c-5acc7304e662] Updating progress to 60 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 18:13:20.800 DEBUG nova.virt.xenapi.vmops [req-bda9225e-86a8-4183-8875-f04c26dd0a12 tempest-VolumesV2SnapshotTestJSON-526069488 tempest-VolumesV2SnapshotTestJSON-524980494] [instance: 91cc5760-bab9-4ce6-8d1c-5acc7304e662] Creating vifs _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1892 2015-08-07 18:13:20.826 DEBUG nova.virt.xenapi.vmops [req-bda9225e-86a8-4183-8875-f04c26dd0a12 tempest-VolumesV2SnapshotTestJSON-526069488 tempest-VolumesV2SnapshotTestJSON-524980494] [instance: 91cc5760-bab9-4ce6-8d1c-5acc7304e662] Creating VIF for network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1902 2015-08-07 18:13:20.875 DEBUG nova.virt.xenapi.vmops [req-bda9225e-86a8-4183-8875-f04c26dd0a12 tempest-VolumesV2SnapshotTestJSON-526069488 tempest-VolumesV2SnapshotTestJSON-524980494] [instance: 91cc5760-bab9-4ce6-8d1c-5acc7304e662] Created VIF OpaqueRef:e2669d53-bfea-8b07-9525-9aa3b43edb36, network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1906 2015-08-07 18:13:20.876 DEBUG nova.virt.xenapi.vmops [req-bda9225e-86a8-4183-8875-f04c26dd0a12 tempest-VolumesV2SnapshotTestJSON-526069488 tempest-VolumesV2SnapshotTestJSON-524980494] [instance: 91cc5760-bab9-4ce6-8d1c-5acc7304e662] Updating progress to 70 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 18:13:21.187 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.58 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:13:22.145 DEBUG nova.virt.xenapi.vmops [req-bda9225e-86a8-4183-8875-f04c26dd0a12 tempest-VolumesV2SnapshotTestJSON-526069488 tempest-VolumesV2SnapshotTestJSON-524980494] [instance: 91cc5760-bab9-4ce6-8d1c-5acc7304e662] Starting instance _start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:327 2015-08-07 18:13:23.527 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:13:23.587 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Auditing locally available compute resources for node localhost.localdomain 2015-08-07 18:13:23.587 DEBUG 
nova.virt.xenapi.host [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Updating host stats update_status /opt/stack/new/nova/nova/virt/xenapi/host.py:234 2015-08-07 18:13:25.192 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" acquired by "do_scan" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 18:13:25.236 DEBUG nova.virt.xenapi.vm_utils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Scanning SR OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99 do_scan /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1839 2015-08-07 18:13:26.829 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" released by "do_scan" :: held 1.638s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 18:13:27.636 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor: free VCPUs: 2 _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:524 2015-08-07 18:13:27.637 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor: assignable PCI devices: [] _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:532 2015-08-07 18:13:27.637 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor/Node resource view: name=localhost.localdomain free_ram=922MB free_disk=12GB free_vcpus=2 pci_devices=[] _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:546 2015-08-07 18:13:27.638 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "compute_resources" acquired by "_update_available_resource" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 18:13:28.038 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Total usable vcpus: 8, total allocated vcpus: 6 2015-08-07 18:13:28.039 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Final resource view: name=localhost.localdomain phys_ram=8187MB used_ram=581MB phys_disk=27GB used_disk=0GB total_vcpus=8 used_vcpus=6 pci_stats=None 2015-08-07 18:13:28.420 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Compute_service record updated for devstack:localhost.localdomain 2015-08-07 18:13:28.420 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "compute_resources" released by "_update_available_resource" :: held 0.782s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 18:13:28.421 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:13:28.421 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:13:28.422 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval 
looping call > sleeping for 14.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:13:31.137 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.63 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:13:41.072 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.74 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:13:42.497 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:13:42.498 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 21.03 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:13:48.534 DEBUG nova.virt.xenapi.vmops [req-bda9225e-86a8-4183-8875-f04c26dd0a12 tempest-VolumesV2SnapshotTestJSON-526069488 tempest-VolumesV2SnapshotTestJSON-524980494] [instance: 91cc5760-bab9-4ce6-8d1c-5acc7304e662] Waiting for instance state to become running _wait_for_instance_to_start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:777 2015-08-07 18:13:48.633 DEBUG nova.virt.xenapi.vmops [req-bda9225e-86a8-4183-8875-f04c26dd0a12 tempest-VolumesV2SnapshotTestJSON-526069488 tempest-VolumesV2SnapshotTestJSON-524980494] [instance: 91cc5760-bab9-4ce6-8d1c-5acc7304e662] Updating progress to 80 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 18:13:49.015 DEBUG nova.virt.xenapi.vmops [req-bda9225e-86a8-4183-8875-f04c26dd0a12 tempest-VolumesV2SnapshotTestJSON-526069488 tempest-VolumesV2SnapshotTestJSON-524980494] [instance: 91cc5760-bab9-4ce6-8d1c-5acc7304e662] Skip agent setup, not enabled. 
_configure_new_instance_with_agent /opt/stack/new/nova/nova/virt/xenapi/vmops.py:788 2015-08-07 18:13:49.016 DEBUG nova.virt.xenapi.vmops [req-bda9225e-86a8-4183-8875-f04c26dd0a12 tempest-VolumesV2SnapshotTestJSON-526069488 tempest-VolumesV2SnapshotTestJSON-524980494] [instance: 91cc5760-bab9-4ce6-8d1c-5acc7304e662] Removing hostname from xenstore _remove_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1949 2015-08-07 18:13:49.016 DEBUG oslo_concurrency.lockutils [req-bda9225e-86a8-4183-8875-f04c26dd0a12 tempest-VolumesV2SnapshotTestJSON-526069488 tempest-VolumesV2SnapshotTestJSON-524980494] Lock "xenstore-91cc5760-bab9-4ce6-8d1c-5acc7304e662" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 18:13:49.022 DEBUG oslo_concurrency.lockutils [req-bda9225e-86a8-4183-8875-f04c26dd0a12 tempest-VolumesV2SnapshotTestJSON-526069488 tempest-VolumesV2SnapshotTestJSON-524980494] Lock "xenstore-91cc5760-bab9-4ce6-8d1c-5acc7304e662" released by "update_hostname" :: held 0.005s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 18:13:49.022 DEBUG nova.virt.xenapi.vmops [req-bda9225e-86a8-4183-8875-f04c26dd0a12 tempest-VolumesV2SnapshotTestJSON-526069488 tempest-VolumesV2SnapshotTestJSON-524980494] [instance: 91cc5760-bab9-4ce6-8d1c-5acc7304e662] Updating progress to 90 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 18:13:49.350 DEBUG nova.virt.xenapi.vmops [req-bda9225e-86a8-4183-8875-f04c26dd0a12 tempest-VolumesV2SnapshotTestJSON-526069488 tempest-VolumesV2SnapshotTestJSON-524980494] [instance: 91cc5760-bab9-4ce6-8d1c-5acc7304e662] Updating progress to 100 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 18:13:49.643 DEBUG nova.compute.manager [req-bda9225e-86a8-4183-8875-f04c26dd0a12 tempest-VolumesV2SnapshotTestJSON-526069488 tempest-VolumesV2SnapshotTestJSON-524980494] [instance: 91cc5760-bab9-4ce6-8d1c-5acc7304e662] Checking state _get_power_state /opt/stack/new/nova/nova/compute/manager.py:1296 2015-08-07 18:13:50.142 DEBUG oslo_concurrency.lockutils [req-bda9225e-86a8-4183-8875-f04c26dd0a12 tempest-VolumesV2SnapshotTestJSON-526069488 tempest-VolumesV2SnapshotTestJSON-524980494] Lock "91cc5760-bab9-4ce6-8d1c-5acc7304e662" released by "_locked_do_build_and_run_instance" :: held 70.090s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 18:13:50.873 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.94 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:13:51.165 DEBUG oslo_concurrency.lockutils [req-69bc819d-99da-46b8-a9e3-0ddb637917e6 tempest-VolumesV2SnapshotTestJSON-526069488 tempest-VolumesV2SnapshotTestJSON-524980494] Lock "91cc5760-bab9-4ce6-8d1c-5acc7304e662" acquired by "do_reserve" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 18:13:51.294 DEBUG nova.compute.utils [req-69bc819d-99da-46b8-a9e3-0ddb637917e6 tempest-VolumesV2SnapshotTestJSON-526069488 tempest-VolumesV2SnapshotTestJSON-524980494] Using /dev/xvd instead of /dev/vd get_next_device_name /opt/stack/new/nova/nova/compute/utils.py:163 2015-08-07 18:13:51.360 DEBUG oslo_concurrency.lockutils [req-69bc819d-99da-46b8-a9e3-0ddb637917e6 tempest-VolumesV2SnapshotTestJSON-526069488 tempest-VolumesV2SnapshotTestJSON-524980494] Lock 
"91cc5760-bab9-4ce6-8d1c-5acc7304e662" released by "do_reserve" :: held 0.195s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 18:13:52.285 DEBUG oslo_concurrency.lockutils [req-69bc819d-99da-46b8-a9e3-0ddb637917e6 tempest-VolumesV2SnapshotTestJSON-526069488 tempest-VolumesV2SnapshotTestJSON-524980494] Lock "91cc5760-bab9-4ce6-8d1c-5acc7304e662" acquired by "do_attach_volume" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 18:13:52.287 INFO nova.compute.manager [req-69bc819d-99da-46b8-a9e3-0ddb637917e6 tempest-VolumesV2SnapshotTestJSON-526069488 tempest-VolumesV2SnapshotTestJSON-524980494] [instance: 91cc5760-bab9-4ce6-8d1c-5acc7304e662] Attaching volume ae3bc85d-1712-4fe8-bc17-3b13014010dd to /dev/xvdb 2015-08-07 18:13:52.290 DEBUG keystoneclient.session [req-69bc819d-99da-46b8-a9e3-0ddb637917e6 tempest-VolumesV2SnapshotTestJSON-526069488 tempest-VolumesV2SnapshotTestJSON-524980494] REQ: curl -g -i -X GET http://192.168.33.1:8776/v2/89f39e033a0246f688848ba1b0d0ec87/volumes/ae3bc85d-1712-4fe8-bc17-3b13014010dd -H "User-Agent: python-cinderclient" -H "Accept: application/json" -H "X-Auth-Token: {SHA1}9342c31b0c59cf6ea33d6187ca6b217c0186601b" _http_log_request /usr/local/lib/python2.7/dist-packages/keystoneclient/session.py:195 2015-08-07 18:13:53.284 DEBUG keystoneclient.session [req-69bc819d-99da-46b8-a9e3-0ddb637917e6 tempest-VolumesV2SnapshotTestJSON-526069488 tempest-VolumesV2SnapshotTestJSON-524980494] RESP: [200] content-length: 941 x-compute-request-id: req-6c023f66-9478-4b61-9b9e-ab72a2702fa6 connection: keep-alive date: Fri, 07 Aug 2015 18:13:53 GMT content-type: application/json x-openstack-request-id: req-6c023f66-9478-4b61-9b9e-ab72a2702fa6 RESP BODY: {"volume": {"attachments": [], "links": [{"href": "http://192.168.33.1:8776/v2/89f39e033a0246f688848ba1b0d0ec87/volumes/ae3bc85d-1712-4fe8-bc17-3b13014010dd", "rel": "self"}, {"href": "http://192.168.33.1:8776/89f39e033a0246f688848ba1b0d0ec87/volumes/ae3bc85d-1712-4fe8-bc17-3b13014010dd", "rel": "bookmark"}], "availability_zone": "nova", "encrypted": false, "os-volume-replication:extended_status": null, "volume_type": "lvmdriver-1", "snapshot_id": null, "id": "ae3bc85d-1712-4fe8-bc17-3b13014010dd", "size": 1, "user_id": "6ed7a1c2ae48487ba5e21dfae0866d89", "os-vol-tenant-attr:tenant_id": "89f39e033a0246f688848ba1b0d0ec87", "metadata": {}, "status": "attaching", "description": null, "multiattach": false, "source_volid": null, "consistencygroup_id": null, "name": "tempest-Volume-283556212", "bootable": "false", "created_at": "2015-08-07T18:11:35.000000", "os-volume-replication:driver_data": null, "replication_status": "disabled"}} _http_log_response /usr/local/lib/python2.7/dist-packages/keystoneclient/session.py:224 2015-08-07 18:13:53.287 DEBUG keystoneclient.session [req-69bc819d-99da-46b8-a9e3-0ddb637917e6 tempest-VolumesV2SnapshotTestJSON-526069488 tempest-VolumesV2SnapshotTestJSON-524980494] REQ: curl -g -i -X POST http://192.168.33.1:8776/v2/89f39e033a0246f688848ba1b0d0ec87/volumes/ae3bc85d-1712-4fe8-bc17-3b13014010dd/action -H "User-Agent: python-cinderclient" -H "Content-Type: application/json" -H "Accept: application/json" -H "X-Auth-Token: {SHA1}9342c31b0c59cf6ea33d6187ca6b217c0186601b" -d '{"os-initialize_connection": {"connector": {"ip": "192.168.33.2", "initiator": "iqn.2015-08.com.example:3b035c79", "host": "localhost.localdomain"}}}' _http_log_request /usr/local/lib/python2.7/dist-packages/keystoneclient/session.py:195 
2015-08-07 18:13:59.476 DEBUG keystoneclient.session [req-69bc819d-99da-46b8-a9e3-0ddb637917e6 tempest-VolumesV2SnapshotTestJSON-526069488 tempest-VolumesV2SnapshotTestJSON-524980494] RESP: [200] content-length: 449 x-compute-request-id: req-24a6dd15-221d-4d52-9860-96d393b2b914 connection: keep-alive date: Fri, 07 Aug 2015 18:13:59 GMT content-type: application/json x-openstack-request-id: req-24a6dd15-221d-4d52-9860-96d393b2b914 RESP BODY: {"connection_info": {"driver_volume_type": "iscsi", "data": {"auth_password": "dSKxcA8Y2r7xWDVv", "target_discovered": false, "encrypted": false, "qos_specs": null, "target_iqn": "iqn.2010-10.org.openstack:volume-ae3bc85d-1712-4fe8-bc17-3b13014010dd", "target_portal": "104.130.119.114:3260", "volume_id": "ae3bc85d-1712-4fe8-bc17-3b13014010dd", "target_lun": 1, "access_mode": "rw", "auth_username": "iZM7U8zETh2HnYmeeLWh", "auth_method": "CHAP"}}} _http_log_response /usr/local/lib/python2.7/dist-packages/keystoneclient/session.py:224 2015-08-07 18:13:59.483 DEBUG nova.virt.xenapi.volume_utils [req-69bc819d-99da-46b8-a9e3-0ddb637917e6 tempest-VolumesV2SnapshotTestJSON-526069488 tempest-VolumesV2SnapshotTestJSON-524980494] (vol_id,host,port,iqn): (ae3bc85d-1712-4fe8-bc17-3b13014010dd,104.130.119.114,3260,iqn.2010-10.org.openstack:volume-ae3bc85d-1712-4fe8-bc17-3b13014010dd) _parse_volume_info /opt/stack/new/nova/nova/virt/xenapi/volume_utils.py:80 2015-08-07 18:13:59.495 DEBUG nova.virt.xenapi.volume_utils [req-69bc819d-99da-46b8-a9e3-0ddb637917e6 tempest-VolumesV2SnapshotTestJSON-526069488 tempest-VolumesV2SnapshotTestJSON-524980494] Introducing SR tempSR-ae3bc85d-1712-4fe8-bc17-3b13014010dd introduce_sr /opt/stack/new/nova/nova/virt/xenapi/volume_utils.py:119 2015-08-07 18:13:59.504 DEBUG nova.virt.xenapi.volume_utils [req-69bc819d-99da-46b8-a9e3-0ddb637917e6 tempest-VolumesV2SnapshotTestJSON-526069488 tempest-VolumesV2SnapshotTestJSON-524980494] Creating PBD for SR introduce_sr /opt/stack/new/nova/nova/virt/xenapi/volume_utils.py:126 2015-08-07 18:13:59.519 DEBUG nova.virt.xenapi.volume_utils [req-69bc819d-99da-46b8-a9e3-0ddb637917e6 tempest-VolumesV2SnapshotTestJSON-526069488 tempest-VolumesV2SnapshotTestJSON-524980494] Plugging SR introduce_sr /opt/stack/new/nova/nova/virt/xenapi/volume_utils.py:129 2015-08-07 18:14:00.892 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.92 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:14:03.527 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:14:03.528 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Starting heal instance info cache _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5315 2015-08-07 18:14:03.528 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Rebuilding the list of instances to heal _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5319 2015-08-07 18:14:03.590 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Acquired semaphore "refresh_cache-91cc5760-bab9-4ce6-8d1c-5acc7304e662" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:197 2015-08-07 18:14:03.607 DEBUG nova.objects.instance [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lazy-loading 
`flavor' on Instance uuid 91cc5760-bab9-4ce6-8d1c-5acc7304e662 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 18:14:04.084 DEBUG nova.network.base_api [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: 91cc5760-bab9-4ce6-8d1c-5acc7304e662] Updating instance_info_cache with network_info: [VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.4'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:28:88:fb', 'active': False, 'type': u'bridge', 'id': u'b0b4fb21-a972-4003-9323-316ac94f960c', 'qbg_params': None})] update_instance_cache_with_nw_info /opt/stack/new/nova/nova/network/base_api.py:43 2015-08-07 18:14:04.118 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Releasing semaphore "refresh_cache-91cc5760-bab9-4ce6-8d1c-5acc7304e662" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:210 2015-08-07 18:14:04.118 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: 91cc5760-bab9-4ce6-8d1c-5acc7304e662] Updated the network info_cache for instance _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5376 2015-08-07 18:14:04.119 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 3.99 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:14:08.127 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:14:08.130 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 3.40 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:14:10.868 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.95 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:14:11.529 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:14:11.530 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 4.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:14:15.527 
DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:14:15.528 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 3.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:14:17.727 DEBUG nova.virt.xenapi.volumeops [req-69bc819d-99da-46b8-a9e3-0ddb637917e6 tempest-VolumesV2SnapshotTestJSON-526069488 tempest-VolumesV2SnapshotTestJSON-524980494] Connect volume to hypervisor: {u'access_mode': u'rw', u'target_discovered': False, u'encrypted': False, u'qos_specs': None, u'target_iqn': u'iqn.2010-10.org.openstack:volume-ae3bc85d-1712-4fe8-bc17-3b13014010dd', u'target_portal': u'104.130.119.114:3260', u'volume_id': u'ae3bc85d-1712-4fe8-bc17-3b13014010dd', u'target_lun': 1, u'auth_password': u'dSKxcA8Y2r7xWDVv', u'auth_username': u'iZM7U8zETh2HnYmeeLWh', u'auth_method': u'CHAP'} _connect_hypervisor_to_volume /opt/stack/new/nova/nova/virt/xenapi/volumeops.py:94 2015-08-07 18:14:17.748 DEBUG nova.virt.xenapi.volume_utils [req-69bc819d-99da-46b8-a9e3-0ddb637917e6 tempest-VolumesV2SnapshotTestJSON-526069488 tempest-VolumesV2SnapshotTestJSON-524980494] {'sm_config': {'LUNid': '1', 'SCSIid': '33000000100000001'}, 'managed': False, 'snapshots': [], 'allowed_operations': ['forget', 'destroy', 'copy', 'snapshot'], 'on_boot': 'persist', 'name_description': '', 'read_only': False, 'uuid': 'e55110f1-56b1-d008-537d-c3081511f123', 'storage_lock': False, 'name_label': '', 'tags': [], 'location': 'e55110f1-56b1-d008-537d-c3081511f123', 'metadata_of_pool': 'OpaqueRef:NULL', 'type': 'user', 'sharable': False, 'snapshot_time': , 'parent': 'OpaqueRef:NULL', 'missing': False, 'xenstore_data': {}, 'crash_dumps': [], 'virtual_size': '1073741824', 'is_a_snapshot': False, 'current_operations': {}, 'snapshot_of': 'OpaqueRef:NULL', 'SR': 'OpaqueRef:f9dbb390-675d-68ac-dc61-e2be3a0243c7', 'other_config': {}, 'physical_utilisation': '0', 'allow_caching': False, 'metadata_latest': False, 'VBDs': []} introduce_vdi /opt/stack/new/nova/nova/virt/xenapi/volume_utils.py:176 2015-08-07 18:14:18.527 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:14:18.528 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /opt/stack/new/nova/nova/compute/manager.py:5976 2015-08-07 18:14:18.529 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 4.99 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:14:18.536 INFO nova.virt.xenapi.volumeops [req-69bc819d-99da-46b8-a9e3-0ddb637917e6 tempest-VolumesV2SnapshotTestJSON-526069488 tempest-VolumesV2SnapshotTestJSON-524980494] Connected volume (vdi_uuid): e55110f1-56b1-d008-537d-c3081511f123 2015-08-07 18:14:18.537 DEBUG nova.virt.xenapi.volumeops [req-69bc819d-99da-46b8-a9e3-0ddb637917e6 tempest-VolumesV2SnapshotTestJSON-526069488 tempest-VolumesV2SnapshotTestJSON-524980494] Attach_volume vdi: OpaqueRef:e1c8108b-9655-3c26-e704-330e728ee0d9 vm: OpaqueRef:b35f7e70-1205-7996-9ace-4e3fcfda6f15 _attach_volume_to_vm /opt/stack/new/nova/nova/virt/xenapi/volumeops.py:114 2015-08-07 18:14:18.537 DEBUG nova.virt.xenapi.vm_utils [req-69bc819d-99da-46b8-a9e3-0ddb637917e6 tempest-VolumesV2SnapshotTestJSON-526069488 tempest-VolumesV2SnapshotTestJSON-524980494] Creating disk-type VBD for VM OpaqueRef:b35f7e70-1205-7996-9ace-4e3fcfda6f15, VDI OpaqueRef:e1c8108b-9655-3c26-e704-330e728ee0d9 ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 18:14:18.558 DEBUG nova.virt.xenapi.vm_utils [req-69bc819d-99da-46b8-a9e3-0ddb637917e6 tempest-VolumesV2SnapshotTestJSON-526069488 tempest-VolumesV2SnapshotTestJSON-524980494] Created VBD OpaqueRef:6db22c9a-9a95-355a-93a7-4add98d68f74 for VM OpaqueRef:b35f7e70-1205-7996-9ace-4e3fcfda6f15, VDI OpaqueRef:e1c8108b-9655-3c26-e704-330e728ee0d9. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 18:14:18.580 DEBUG nova.virt.xenapi.volumeops [req-69bc819d-99da-46b8-a9e3-0ddb637917e6 tempest-VolumesV2SnapshotTestJSON-526069488 tempest-VolumesV2SnapshotTestJSON-524980494] Plugging VBD: OpaqueRef:6db22c9a-9a95-355a-93a7-4add98d68f74 _attach_volume_to_vm /opt/stack/new/nova/nova/virt/xenapi/volumeops.py:124 2015-08-07 18:14:18.581 DEBUG oslo_concurrency.lockutils [req-69bc819d-99da-46b8-a9e3-0ddb637917e6 tempest-VolumesV2SnapshotTestJSON-526069488 tempest-VolumesV2SnapshotTestJSON-524980494] Lock "xenapi-vbd-OpaqueRef:b35f7e70-1205-7996-9ace-4e3fcfda6f15" acquired by "synchronized_plug" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 18:14:20.224 DEBUG oslo_concurrency.lockutils [req-69bc819d-99da-46b8-a9e3-0ddb637917e6 tempest-VolumesV2SnapshotTestJSON-526069488 tempest-VolumesV2SnapshotTestJSON-524980494] Lock "xenapi-vbd-OpaqueRef:b35f7e70-1205-7996-9ace-4e3fcfda6f15" released by "synchronized_plug" :: held 1.643s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 18:14:20.225 INFO nova.virt.xenapi.volumeops [req-69bc819d-99da-46b8-a9e3-0ddb637917e6 tempest-VolumesV2SnapshotTestJSON-526069488 tempest-VolumesV2SnapshotTestJSON-524980494] Dev 1 attached to instance instance-00000055 2015-08-07 18:14:20.317 DEBUG keystoneclient.session [req-69bc819d-99da-46b8-a9e3-0ddb637917e6 tempest-VolumesV2SnapshotTestJSON-526069488 tempest-VolumesV2SnapshotTestJSON-524980494] REQ: curl -g -i -X POST http://192.168.33.1:8776/v2/89f39e033a0246f688848ba1b0d0ec87/volumes/ae3bc85d-1712-4fe8-bc17-3b13014010dd/action -H "User-Agent: python-cinderclient" -H "Content-Type: application/json" -H "Accept: application/json" -H "X-Auth-Token: 
{SHA1}9342c31b0c59cf6ea33d6187ca6b217c0186601b" -d '{"os-attach": {"instance_uuid": "91cc5760-bab9-4ce6-8d1c-5acc7304e662", "mountpoint": "/dev/xvdb", "mode": "rw"}}' _http_log_request /usr/local/lib/python2.7/dist-packages/keystoneclient/session.py:195 2015-08-07 18:14:20.904 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.91 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:14:23.031 DEBUG keystoneclient.session [req-69bc819d-99da-46b8-a9e3-0ddb637917e6 tempest-VolumesV2SnapshotTestJSON-526069488 tempest-VolumesV2SnapshotTestJSON-524980494] RESP: [202] date: Fri, 07 Aug 2015 18:14:23 GMT connection: keep-alive content-type: text/html; charset=UTF-8 content-length: 0 x-openstack-request-id: req-7a8ed1cd-019d-4655-9ec1-9a3eddc8ec4e _http_log_response /usr/local/lib/python2.7/dist-packages/keystoneclient/session.py:224 2015-08-07 18:14:23.108 DEBUG oslo_concurrency.lockutils [req-69bc819d-99da-46b8-a9e3-0ddb637917e6 tempest-VolumesV2SnapshotTestJSON-526069488 tempest-VolumesV2SnapshotTestJSON-524980494] Lock "91cc5760-bab9-4ce6-8d1c-5acc7304e662" released by "do_attach_volume" :: held 30.823s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 18:14:23.519 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:14:23.599 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 1.01 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:14:24.607 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:14:24.645 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Auditing locally available compute resources for node localhost.localdomain 2015-08-07 18:14:24.646 DEBUG nova.virt.xenapi.host [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Updating host stats update_status /opt/stack/new/nova/nova/virt/xenapi/host.py:234 2015-08-07 18:14:24.883 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" acquired by "do_scan" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 18:14:24.884 DEBUG nova.virt.xenapi.vm_utils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Scanning SR OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99 do_scan /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1839 2015-08-07 18:14:25.436 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" released by "do_scan" :: held 0.553s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 18:14:25.869 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor: free VCPUs: 1 _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:524 2015-08-07 18:14:25.870 DEBUG nova.compute.resource_tracker 
[req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor: assignable PCI devices: [] _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:532 2015-08-07 18:14:25.870 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor/Node resource view: name=localhost.localdomain free_ram=922MB free_disk=12GB free_vcpus=1 pci_devices=[] _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:546 2015-08-07 18:14:25.871 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "compute_resources" acquired by "_update_available_resource" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 18:14:26.221 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Total usable vcpus: 8, total allocated vcpus: 7 2015-08-07 18:14:26.222 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Final resource view: name=localhost.localdomain phys_ram=8187MB used_ram=581MB phys_disk=27GB used_disk=0GB total_vcpus=8 used_vcpus=7 pci_stats=None 2015-08-07 18:14:26.663 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Compute_service record updated for devstack:localhost.localdomain 2015-08-07 18:14:26.663 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "compute_resources" released by "_update_available_resource" :: held 0.792s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 18:14:26.664 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:14:26.664 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 0.86 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:14:27.528 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:14:27.531 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 10.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:14:30.942 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.87 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:14:34.748 INFO nova.compute.manager [req-43e426ab-2997-44ca-80cb-5b4eb54dc582 tempest-VolumesV2SnapshotTestJSON-526069488 tempest-VolumesV2SnapshotTestJSON-524980494] [instance: 91cc5760-bab9-4ce6-8d1c-5acc7304e662] Detach volume ae3bc85d-1712-4fe8-bc17-3b13014010dd from mountpoint /dev/xvdb 2015-08-07 18:14:34.752 DEBUG nova.virt.xenapi.volumeops [req-43e426ab-2997-44ca-80cb-5b4eb54dc582 tempest-VolumesV2SnapshotTestJSON-526069488 tempest-VolumesV2SnapshotTestJSON-524980494] Detach_volume: instance-00000055, /dev/xvdb detach_volume /opt/stack/new/nova/nova/virt/xenapi/volumeops.py:134 2015-08-07 18:14:34.775 DEBUG oslo_concurrency.lockutils 
[req-43e426ab-2997-44ca-80cb-5b4eb54dc582 tempest-VolumesV2SnapshotTestJSON-526069488 tempest-VolumesV2SnapshotTestJSON-524980494] Lock "xenapi-vbd-OpaqueRef:b35f7e70-1205-7996-9ace-4e3fcfda6f15" acquired by "synchronized_unplug" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 18:14:36.164 DEBUG oslo_concurrency.lockutils [req-43e426ab-2997-44ca-80cb-5b4eb54dc582 tempest-VolumesV2SnapshotTestJSON-526069488 tempest-VolumesV2SnapshotTestJSON-524980494] Lock "xenapi-vbd-OpaqueRef:b35f7e70-1205-7996-9ace-4e3fcfda6f15" released by "synchronized_unplug" :: held 1.389s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 18:14:36.197 DEBUG nova.virt.xenapi.volume_utils [req-43e426ab-2997-44ca-80cb-5b4eb54dc582 tempest-VolumesV2SnapshotTestJSON-526069488 tempest-VolumesV2SnapshotTestJSON-524980494] Forgetting SR... forget_sr /opt/stack/new/nova/nova/virt/xenapi/volume_utils.py:237 2015-08-07 18:14:37.529 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:14:37.530 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 27.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:14:37.535 INFO nova.virt.xenapi.volumeops [req-43e426ab-2997-44ca-80cb-5b4eb54dc582 tempest-VolumesV2SnapshotTestJSON-526069488 tempest-VolumesV2SnapshotTestJSON-524980494] Mountpoint /dev/xvdb detached from instance instance-00000055 2015-08-07 18:14:37.536 DEBUG keystoneclient.session [req-43e426ab-2997-44ca-80cb-5b4eb54dc582 tempest-VolumesV2SnapshotTestJSON-526069488 tempest-VolumesV2SnapshotTestJSON-524980494] REQ: curl -g -i -X POST http://192.168.33.1:8776/v2/89f39e033a0246f688848ba1b0d0ec87/volumes/ae3bc85d-1712-4fe8-bc17-3b13014010dd/action -H "User-Agent: python-cinderclient" -H "Content-Type: application/json" -H "Accept: application/json" -H "X-Auth-Token: {SHA1}9342c31b0c59cf6ea33d6187ca6b217c0186601b" -d '{"os-terminate_connection": {"connector": {"ip": "192.168.33.2", "initiator": "iqn.2015-08.com.example:3b035c79", "host": "localhost.localdomain"}}}' _http_log_request /usr/local/lib/python2.7/dist-packages/keystoneclient/session.py:195 2015-08-07 18:14:38.468 DEBUG keystoneclient.session [req-43e426ab-2997-44ca-80cb-5b4eb54dc582 tempest-VolumesV2SnapshotTestJSON-526069488 tempest-VolumesV2SnapshotTestJSON-524980494] RESP: [202] date: Fri, 07 Aug 2015 18:14:38 GMT connection: keep-alive content-type: text/html; charset=UTF-8 content-length: 0 x-openstack-request-id: req-d0de5b8f-19c1-4919-9317-30a8e2f9529e _http_log_response /usr/local/lib/python2.7/dist-packages/keystoneclient/session.py:224 2015-08-07 18:14:38.568 DEBUG keystoneclient.session [req-43e426ab-2997-44ca-80cb-5b4eb54dc582 tempest-VolumesV2SnapshotTestJSON-526069488 tempest-VolumesV2SnapshotTestJSON-524980494] REQ: curl -g -i -X POST http://192.168.33.1:8776/v2/89f39e033a0246f688848ba1b0d0ec87/volumes/ae3bc85d-1712-4fe8-bc17-3b13014010dd/action -H "User-Agent: python-cinderclient" -H "Content-Type: application/json" -H "Accept: application/json" -H "X-Auth-Token: {SHA1}9342c31b0c59cf6ea33d6187ca6b217c0186601b" -d '{"os-detach": {"attachment_id": null}}' _http_log_request 
/usr/local/lib/python2.7/dist-packages/keystoneclient/session.py:195 2015-08-07 18:14:40.299 DEBUG oslo_concurrency.lockutils [req-9b25afd2-09d0-4bd5-9c9b-b7d4dd246feb tempest-VolumesV2SnapshotTestJSON-526069488 tempest-VolumesV2SnapshotTestJSON-524980494] Lock "91cc5760-bab9-4ce6-8d1c-5acc7304e662" acquired by "do_terminate_instance" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 18:14:40.300 DEBUG oslo_concurrency.lockutils [req-9b25afd2-09d0-4bd5-9c9b-b7d4dd246feb tempest-VolumesV2SnapshotTestJSON-526069488 tempest-VolumesV2SnapshotTestJSON-524980494] Lock "91cc5760-bab9-4ce6-8d1c-5acc7304e662-events" acquired by "_clear_events" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 18:14:40.301 DEBUG oslo_concurrency.lockutils [req-9b25afd2-09d0-4bd5-9c9b-b7d4dd246feb tempest-VolumesV2SnapshotTestJSON-526069488 tempest-VolumesV2SnapshotTestJSON-524980494] Lock "91cc5760-bab9-4ce6-8d1c-5acc7304e662-events" released by "_clear_events" :: held 0.001s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 18:14:40.330 INFO nova.compute.manager [req-9b25afd2-09d0-4bd5-9c9b-b7d4dd246feb tempest-VolumesV2SnapshotTestJSON-526069488 tempest-VolumesV2SnapshotTestJSON-524980494] [instance: 91cc5760-bab9-4ce6-8d1c-5acc7304e662] Terminating instance 2015-08-07 18:14:40.332 INFO nova.virt.xenapi.vmops [req-9b25afd2-09d0-4bd5-9c9b-b7d4dd246feb tempest-VolumesV2SnapshotTestJSON-526069488 tempest-VolumesV2SnapshotTestJSON-524980494] [instance: 91cc5760-bab9-4ce6-8d1c-5acc7304e662] Destroying VM 2015-08-07 18:14:40.354 DEBUG nova.virt.xenapi.vm_utils [req-9b25afd2-09d0-4bd5-9c9b-b7d4dd246feb tempest-VolumesV2SnapshotTestJSON-526069488 tempest-VolumesV2SnapshotTestJSON-524980494] [instance: 91cc5760-bab9-4ce6-8d1c-5acc7304e662] Shutting down VM (hard) hard_shutdown_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:348 2015-08-07 18:14:40.919 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.90 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:14:46.250 DEBUG keystoneclient.session [req-43e426ab-2997-44ca-80cb-5b4eb54dc582 tempest-VolumesV2SnapshotTestJSON-526069488 tempest-VolumesV2SnapshotTestJSON-524980494] RESP: [202] date: Fri, 07 Aug 2015 18:14:46 GMT connection: keep-alive content-type: text/html; charset=UTF-8 content-length: 0 x-openstack-request-id: req-f2115f12-1e3e-4dd9-a120-dec2eca54e99 _http_log_response /usr/local/lib/python2.7/dist-packages/keystoneclient/session.py:224 2015-08-07 18:14:51.479 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.39 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:14:55.660 DEBUG nova.virt.xenapi.vmops [req-9b25afd2-09d0-4bd5-9c9b-b7d4dd246feb tempest-VolumesV2SnapshotTestJSON-526069488 tempest-VolumesV2SnapshotTestJSON-524980494] [instance: 91cc5760-bab9-4ce6-8d1c-5acc7304e662] Destroying VDIs _destroy_vdis /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1428 2015-08-07 18:14:55.689 DEBUG nova.virt.xenapi.vm_utils [req-9b25afd2-09d0-4bd5-9c9b-b7d4dd246feb tempest-VolumesV2SnapshotTestJSON-526069488 tempest-VolumesV2SnapshotTestJSON-524980494] VDI 217255ae-03b3-4f75-b486-e030d1031916 is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 18:14:55.728 DEBUG nova.virt.xenapi.vm_utils 
[req-9b25afd2-09d0-4bd5-9c9b-b7d4dd246feb tempest-VolumesV2SnapshotTestJSON-526069488 tempest-VolumesV2SnapshotTestJSON-524980494] VDI 3e30b9eb-70b2-4e8f-920f-442ef6e2fc2f is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 18:14:56.759 DEBUG nova.virt.xenapi.vmops [req-9b25afd2-09d0-4bd5-9c9b-b7d4dd246feb tempest-VolumesV2SnapshotTestJSON-526069488 tempest-VolumesV2SnapshotTestJSON-524980494] [instance: 91cc5760-bab9-4ce6-8d1c-5acc7304e662] Using RAW or VHD, skipping kernel and ramdisk deletion _destroy_kernel_ramdisk /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1456 2015-08-07 18:14:56.774 DEBUG nova.virt.xenapi.vm_utils [req-9b25afd2-09d0-4bd5-9c9b-b7d4dd246feb tempest-VolumesV2SnapshotTestJSON-526069488 tempest-VolumesV2SnapshotTestJSON-524980494] [instance: 91cc5760-bab9-4ce6-8d1c-5acc7304e662] VM destroyed destroy_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:324 2015-08-07 18:14:56.776 DEBUG nova.compute.manager [req-9b25afd2-09d0-4bd5-9c9b-b7d4dd246feb tempest-VolumesV2SnapshotTestJSON-526069488 tempest-VolumesV2SnapshotTestJSON-524980494] [instance: 91cc5760-bab9-4ce6-8d1c-5acc7304e662] Deallocating network for instance _deallocate_network /opt/stack/new/nova/nova/compute/manager.py:1792 2015-08-07 18:14:59.059 DEBUG nova.compute.manager [req-9b25afd2-09d0-4bd5-9c9b-b7d4dd246feb tempest-VolumesV2SnapshotTestJSON-526069488 tempest-VolumesV2SnapshotTestJSON-524980494] [instance: 91cc5760-bab9-4ce6-8d1c-5acc7304e662] terminating bdm BlockDeviceMapping(boot_index=0,connection_info=None,created_at=2015-08-07T18:12:38Z,delete_on_termination=True,deleted=False,deleted_at=None,destination_type='local',device_name='/dev/xvda',device_type='disk',disk_bus=None,guest_format=None,id=89,image_id='a69d8d55-1745-492f-be26-cc75d64fc94d',instance=,instance_uuid=91cc5760-bab9-4ce6-8d1c-5acc7304e662,no_device=False,snapshot_id=None,source_type='image',updated_at=2015-08-07T18:12:42Z,volume_id=None,volume_size=None) _cleanup_volumes /opt/stack/new/nova/nova/compute/manager.py:2272 2015-08-07 18:14:59.280 DEBUG oslo_concurrency.lockutils [req-9b25afd2-09d0-4bd5-9c9b-b7d4dd246feb tempest-VolumesV2SnapshotTestJSON-526069488 tempest-VolumesV2SnapshotTestJSON-524980494] Lock "compute_resources" acquired by "update_usage" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 18:14:59.283 DEBUG nova.objects.instance [req-9b25afd2-09d0-4bd5-9c9b-b7d4dd246feb tempest-VolumesV2SnapshotTestJSON-526069488 tempest-VolumesV2SnapshotTestJSON-524980494] Lazy-loading `numa_topology' on Instance uuid 91cc5760-bab9-4ce6-8d1c-5acc7304e662 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 18:14:59.409 DEBUG oslo_concurrency.lockutils [req-9b25afd2-09d0-4bd5-9c9b-b7d4dd246feb tempest-VolumesV2SnapshotTestJSON-526069488 tempest-VolumesV2SnapshotTestJSON-524980494] Lock "compute_resources" released by "update_usage" :: held 0.128s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 18:15:00.092 DEBUG oslo_concurrency.lockutils [req-9b25afd2-09d0-4bd5-9c9b-b7d4dd246feb tempest-VolumesV2SnapshotTestJSON-526069488 tempest-VolumesV2SnapshotTestJSON-524980494] Lock "91cc5760-bab9-4ce6-8d1c-5acc7304e662" released by "do_terminate_instance" :: held 19.793s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 18:15:01.536 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.33 seconds _run_loop 
/usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:15:04.608 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:15:04.609 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Starting heal instance info cache _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5315 2015-08-07 18:15:04.609 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Rebuilding the list of instances to heal _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5319 2015-08-07 18:15:05.798 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Didn't find any instances for network info cache update. _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5386 2015-08-07 18:15:05.799 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 2.91 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:15:09.139 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:15:09.140 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 4.39 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:15:13.636 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:15:13.637 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 1.89 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:15:14.849 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 6.09 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:15:15.528 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:15:15.529 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._run_image_cache_manager_pass run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:15:15.529 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 4.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:15:19.528 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:15:19.529 DEBUG 
nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /opt/stack/new/nova/nova/compute/manager.py:5976 2015-08-07 18:15:19.530 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 6.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:15:21.158 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.92 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:15:22.873 DEBUG oslo_concurrency.lockutils [req-dde6ab66-0676-43f8-b8eb-e407ab4a2aa2 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] Lock "4b91aa49-9d80-4533-a63e-04debcc5884a" acquired by "_locked_do_build_and_run_instance" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 18:15:22.943 INFO nova.compute.manager [req-dde6ab66-0676-43f8-b8eb-e407ab4a2aa2 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] [instance: 4b91aa49-9d80-4533-a63e-04debcc5884a] Starting instance... 2015-08-07 18:15:23.392 DEBUG oslo_concurrency.lockutils [req-dde6ab66-0676-43f8-b8eb-e407ab4a2aa2 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] Lock "compute_resources" acquired by "instance_claim" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 18:15:23.393 DEBUG nova.compute.resource_tracker [req-dde6ab66-0676-43f8-b8eb-e407ab4a2aa2 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] Memory overhead for 64 MB instance; 5 MB instance_claim /opt/stack/new/nova/nova/compute/resource_tracker.py:126 2015-08-07 18:15:23.405 INFO nova.compute.claims [req-dde6ab66-0676-43f8-b8eb-e407ab4a2aa2 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] [instance: 4b91aa49-9d80-4533-a63e-04debcc5884a] Attempting claim: memory 69 MB, disk 0 GB 2015-08-07 18:15:23.406 INFO nova.compute.claims [req-dde6ab66-0676-43f8-b8eb-e407ab4a2aa2 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] [instance: 4b91aa49-9d80-4533-a63e-04debcc5884a] Total memory: 8187 MB, used: 512.00 MB 2015-08-07 18:15:23.406 INFO nova.compute.claims [req-dde6ab66-0676-43f8-b8eb-e407ab4a2aa2 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] [instance: 4b91aa49-9d80-4533-a63e-04debcc5884a] memory limit: 12280.50 MB, free: 11768.50 MB 2015-08-07 18:15:23.407 INFO nova.compute.claims [req-dde6ab66-0676-43f8-b8eb-e407ab4a2aa2 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] [instance: 4b91aa49-9d80-4533-a63e-04debcc5884a] Total disk: 27 GB, used: 0.00 GB 2015-08-07 18:15:23.407 INFO nova.compute.claims [req-dde6ab66-0676-43f8-b8eb-e407ab4a2aa2 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] [instance: 4b91aa49-9d80-4533-a63e-04debcc5884a] disk limit not specified, defaulting to unlimited 2015-08-07 18:15:23.440 DEBUG nova.compute.resources.vcpu [req-dde6ab66-0676-43f8-b8eb-e407ab4a2aa2 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] Total CPUs: 8 VCPUs, used: 0.00 VCPUs test /opt/stack/new/nova/nova/compute/resources/vcpu.py:52 2015-08-07 18:15:23.441 DEBUG nova.compute.resources.vcpu 
[req-dde6ab66-0676-43f8-b8eb-e407ab4a2aa2 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] CPUs limit not specified, defaulting to unlimited test /opt/stack/new/nova/nova/compute/resources/vcpu.py:56 2015-08-07 18:15:23.441 INFO nova.compute.claims [req-dde6ab66-0676-43f8-b8eb-e407ab4a2aa2 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] [instance: 4b91aa49-9d80-4533-a63e-04debcc5884a] Claim successful 2015-08-07 18:15:23.802 DEBUG oslo_concurrency.lockutils [req-dde6ab66-0676-43f8-b8eb-e407ab4a2aa2 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] Lock "compute_resources" released by "instance_claim" :: held 0.409s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 18:15:24.206 DEBUG oslo_concurrency.lockutils [req-dde6ab66-0676-43f8-b8eb-e407ab4a2aa2 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] Lock "compute_resources" acquired by "update_usage" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 18:15:24.341 DEBUG oslo_concurrency.lockutils [req-dde6ab66-0676-43f8-b8eb-e407ab4a2aa2 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] Lock "compute_resources" released by "update_usage" :: held 0.135s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 18:15:24.342 DEBUG nova.compute.utils [req-dde6ab66-0676-43f8-b8eb-e407ab4a2aa2 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] Using /dev/xvd instead of None get_next_device_name /opt/stack/new/nova/nova/compute/utils.py:163 2015-08-07 18:15:24.346 13318 DEBUG nova.compute.manager [-] [instance: 4b91aa49-9d80-4533-a63e-04debcc5884a] Allocating IP information in the background. 
_allocate_network_async /opt/stack/new/nova/nova/compute/manager.py:1513 2015-08-07 18:15:24.348 13318 DEBUG oslo_concurrency.lockutils [-] Acquired semaphore "refresh_cache-4b91aa49-9d80-4533-a63e-04debcc5884a" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:197 2015-08-07 18:15:25.529 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:15:25.533 DEBUG nova.virt.xenapi.vmops [req-dde6ab66-0676-43f8-b8eb-e407ab4a2aa2 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] [instance: 4b91aa49-9d80-4533-a63e-04debcc5884a] Block device information present: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} spawn /opt/stack/new/nova/nova/virt/xenapi/vmops.py:408 2015-08-07 18:15:25.566 DEBUG nova.virt.xenapi.vm_utils [req-dde6ab66-0676-43f8-b8eb-e407ab4a2aa2 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] Detected vhd format for image a69d8d55-1745-492f-be26-cc75d64fc94d determine_disk_image_type /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1633 2015-08-07 18:15:25.566 DEBUG nova.virt.xenapi.vmops [req-dde6ab66-0676-43f8-b8eb-e407ab4a2aa2 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] [instance: 4b91aa49-9d80-4533-a63e-04debcc5884a] Updating progress to 10 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 18:15:25.596 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Auditing locally available compute resources for node localhost.localdomain 2015-08-07 18:15:25.597 DEBUG nova.virt.xenapi.host [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Updating host stats update_status /opt/stack/new/nova/nova/virt/xenapi/host.py:234 2015-08-07 18:15:25.922 DEBUG nova.virt.xenapi.vmops [req-dde6ab66-0676-43f8-b8eb-e407ab4a2aa2 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] [instance: 4b91aa49-9d80-4533-a63e-04debcc5884a] block device info: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} _connect_cinder_volumes /opt/stack/new/nova/nova/virt/xenapi/vmops.py:376 2015-08-07 18:15:25.982 DEBUG oslo_concurrency.lockutils [req-dde6ab66-0676-43f8-b8eb-e407ab4a2aa2 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" acquired by "_create_cached_image_impl" :: waited 0.002s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 18:15:26.011 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" acquired by "do_scan" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 18:15:26.012 DEBUG nova.virt.xenapi.vm_utils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Scanning SR OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99 do_scan /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1839 2015-08-07 18:15:28.032 13318 DEBUG nova.network.base_api [-] [instance: 4b91aa49-9d80-4533-a63e-04debcc5884a] Updating instance_info_cache with network_info: [VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': 
Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.2'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:70:61:ff', 'active': False, 'type': u'bridge', 'id': u'ad6a5ba7-38fa-4095-9614-5ff5d309d974', 'qbg_params': None})] update_instance_cache_with_nw_info /opt/stack/new/nova/nova/network/base_api.py:43 2015-08-07 18:15:28.086 13318 DEBUG oslo_concurrency.lockutils [-] Releasing semaphore "refresh_cache-4b91aa49-9d80-4533-a63e-04debcc5884a" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:210 2015-08-07 18:15:28.086 13318 DEBUG nova.compute.manager [-] [instance: 4b91aa49-9d80-4533-a63e-04debcc5884a] Instance network_info: |[VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.2'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:70:61:ff', 'active': False, 'type': u'bridge', 'id': u'ad6a5ba7-38fa-4095-9614-5ff5d309d974', 'qbg_params': None})]| _allocate_network_async /opt/stack/new/nova/nova/compute/manager.py:1531 2015-08-07 18:15:28.128 DEBUG nova.virt.xenapi.vm_utils [req-dde6ab66-0676-43f8-b8eb-e407ab4a2aa2 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] Cloned VDI OpaqueRef:8079a8fe-0552-edf5-7fa7-9116be4320c3 from VDI OpaqueRef:f392c64e-7886-a0cc-d3a0-780674167307 _clone_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:571 2015-08-07 18:15:29.372 DEBUG oslo_concurrency.lockutils [req-dde6ab66-0676-43f8-b8eb-e407ab4a2aa2 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" released by "_create_cached_image_impl" :: held 3.390s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 18:15:29.373 INFO nova.virt.xenapi.vm_utils [req-dde6ab66-0676-43f8-b8eb-e407ab4a2aa2 tempest-TestServerAdvancedOps-875032165 
tempest-TestServerAdvancedOps-1109115352] Image creation data, cacheable: True, downloaded: False duration: 3.45 secs for image a69d8d55-1745-492f-be26-cc75d64fc94d 2015-08-07 18:15:30.204 DEBUG nova.virt.xenapi.vmops [req-dde6ab66-0676-43f8-b8eb-e407ab4a2aa2 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] [instance: 4b91aa49-9d80-4533-a63e-04debcc5884a] Updating progress to 20 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 18:15:31.659 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" released by "do_scan" :: held 5.648s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 18:15:32.100 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor: free VCPUs: 2 _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:524 2015-08-07 18:15:32.101 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor: assignable PCI devices: [] _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:532 2015-08-07 18:15:32.101 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor/Node resource view: name=localhost.localdomain free_ram=989MB free_disk=12GB free_vcpus=2 pci_devices=[] _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:546 2015-08-07 18:15:32.102 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "compute_resources" acquired by "_update_available_resource" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 18:15:32.328 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Total usable vcpus: 8, total allocated vcpus: 6 2015-08-07 18:15:32.329 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Final resource view: name=localhost.localdomain phys_ram=8187MB used_ram=581MB phys_disk=27GB used_disk=0GB total_vcpus=8 used_vcpus=6 pci_stats=None 2015-08-07 18:15:48.888 13318 WARNING oslo_service.loopingcall [-] Function > run outlasted interval by 7.81 sec 2015-08-07 18:15:48.889 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 0.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:15:48.916 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Compute_service record updated for devstack:localhost.localdomain 2015-08-07 18:15:48.917 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "compute_resources" released by "_update_available_resource" :: held 16.815s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 18:15:48.917 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:15:48.918 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 
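The 'Lock "compute_resources" ... :: held 16.815s' accounting above, and the 'Acquired semaphore' / 'Releasing semaphore' pairs that recur throughout this log, are emitted by oslo.concurrency's lockutils helpers. A minimal sketch of how such messages are typically produced, assuming the standard lockutils API; the lock names and function bodies below are placeholders, not code recovered from this log:

    from oslo_concurrency import lockutils

    @lockutils.synchronized('compute_resources')
    def _update_available_resource():
        # Concurrent callers block here; the decorator logs how long each
        # caller waited for the lock and how long it was held, which is where
        # the ':: waited N.NNNs' / ':: held N.NNNs' figures come from.
        pass

    def refresh_instance_cache(instance_uuid):
        # The context-manager form logs the 'Acquired semaphore "..."' /
        # 'Releasing semaphore "..."' pairs seen around the network-info
        # cache updates for instance 4b91aa49-... in this log.
        with lockutils.lock('refresh_cache-%s' % instance_uuid):
            pass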
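The 'Running periodic task ComputeManager...' lines and the fixed/dynamic 'interval looping call ... sleeping for N seconds' lines (including the 'run outlasted interval by 7.81 sec' warning just above) come from oslo.service. A minimal sketch of the registration pattern behind them, assuming the standard oslo.service API; the manager, task name, and 10-second spacing are examples only:

    from oslo_service import loopingcall, periodic_task

    class ExampleManager(periodic_task.PeriodicTasks):
        @periodic_task.periodic_task(spacing=10)
        def _poll_something(self, context):
            # Each run is logged as
            # 'Running periodic task ExampleManager._poll_something'.
            pass

    def _report_state():
        pass

    # A fixed-interval looping call drives recurring work such as the service
    # heartbeat; when one run takes longer than its interval it logs the
    # 'run outlasted interval by N sec' warning and then sleeps for 0.00 seconds,
    # as seen above.
    timer = loopingcall.FixedIntervalLoopingCall(_report_state)
    timer.start(interval=10)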
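A few entries further down, the config drive for this instance is built with genisoimage and copied onto the plugged VBD (/dev/xvdc) through nova-rootwrap; the 'Running cmd (subprocess): ...' and 'CMD "..." returned: 0 in N.NNNs' lines are oslo.concurrency's process helper logging each command. A minimal sketch of that call pattern, assuming the standard processutils API; the helper function and its arguments are placeholders, not code from this log:

    from oslo_concurrency import processutils

    def write_config_drive(content_dir, iso_path, device):
        # Build the ISO9660 config drive (volume label 'config-2').
        processutils.execute(
            'genisoimage', '-o', iso_path,
            '-ldots', '-allow-lowercase', '-allow-multidot', '-l',
            '-publisher', 'OpenStack Nova 12.0.0',
            '-quiet', '-J', '-r', '-V', 'config-2', content_dir)
        # Copy the image onto the attached block device; with run_as_root the
        # command is prefixed with the root helper, which is how the
        # 'sudo nova-rootwrap ... dd ...' command line seen later arises.
        processutils.execute(
            'dd', 'if=%s' % iso_path, 'of=%s' % device, 'oflag=direct,sync',
            run_as_root=True,
            root_helper='sudo nova-rootwrap /etc/nova/rootwrap.conf')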
2015-08-07 18:15:48.918 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 12.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:15:49.003 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.89 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:15:49.028 DEBUG nova.virt.xenapi.vmops [req-dde6ab66-0676-43f8-b8eb-e407ab4a2aa2 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] [instance: 4b91aa49-9d80-4533-a63e-04debcc5884a] Updating progress to 30 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 18:15:49.309 DEBUG nova.virt.xenapi.vmops [req-dde6ab66-0676-43f8-b8eb-e407ab4a2aa2 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] [instance: 4b91aa49-9d80-4533-a63e-04debcc5884a] Using PV kernel: True _create_vm_record /opt/stack/new/nova/nova/virt/xenapi/vmops.py:679 2015-08-07 18:15:49.327 DEBUG nova.virt.xenapi.vm_utils [req-dde6ab66-0676-43f8-b8eb-e407ab4a2aa2 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] [instance: 4b91aa49-9d80-4533-a63e-04debcc5884a] Created VM create_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:312 2015-08-07 18:15:49.328 DEBUG nova.virt.xenapi.vmops [req-dde6ab66-0676-43f8-b8eb-e407ab4a2aa2 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] [instance: 4b91aa49-9d80-4533-a63e-04debcc5884a] Updating progress to 40 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 18:15:49.643 DEBUG nova.virt.xenapi.vm_utils [req-dde6ab66-0676-43f8-b8eb-e407ab4a2aa2 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] Creating disk-type VBD for VM OpaqueRef:588655bc-f412-6e46-e738-34f4239168c1, VDI OpaqueRef:8079a8fe-0552-edf5-7fa7-9116be4320c3 ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 18:15:49.655 DEBUG nova.virt.xenapi.vm_utils [req-dde6ab66-0676-43f8-b8eb-e407ab4a2aa2 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] Created VBD OpaqueRef:37959cdb-3723-c161-1950-f3e5ac2c0977 for VM OpaqueRef:588655bc-f412-6e46-e738-34f4239168c1, VDI OpaqueRef:8079a8fe-0552-edf5-7fa7-9116be4320c3. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 18:15:50.163 DEBUG nova.virt.xenapi.vm_utils [req-dde6ab66-0676-43f8-b8eb-e407ab4a2aa2 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] Created VDI OpaqueRef:59b5f73a-414b-1105-9a22-450d75b08f97 (config-2, 67108864, False) on OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99. create_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:509 2015-08-07 18:15:50.168 DEBUG nova.virt.xenapi.vm_utils [req-dde6ab66-0676-43f8-b8eb-e407ab4a2aa2 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] Creating disk-type VBD for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:59b5f73a-414b-1105-9a22-450d75b08f97 ... 
create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 18:15:50.181 DEBUG nova.virt.xenapi.vm_utils [req-dde6ab66-0676-43f8-b8eb-e407ab4a2aa2 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] Created VBD OpaqueRef:a5f29636-a68f-ddec-d8d9-3b523f21c8f1 for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:59b5f73a-414b-1105-9a22-450d75b08f97. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 18:15:50.191 DEBUG nova.virt.xenapi.vm_utils [req-dde6ab66-0676-43f8-b8eb-e407ab4a2aa2 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] Plugging VBD OpaqueRef:a5f29636-a68f-ddec-d8d9-3b523f21c8f1 ... vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2191 2015-08-07 18:15:50.192 DEBUG oslo_concurrency.lockutils [req-dde6ab66-0676-43f8-b8eb-e407ab4a2aa2 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_plug" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 18:15:52.494 DEBUG oslo_concurrency.lockutils [req-dde6ab66-0676-43f8-b8eb-e407ab4a2aa2 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_plug" :: held 2.303s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 18:15:52.495 DEBUG nova.virt.xenapi.vm_utils [req-dde6ab66-0676-43f8-b8eb-e407ab4a2aa2 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] Plugging VBD OpaqueRef:a5f29636-a68f-ddec-d8d9-3b523f21c8f1 done. vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2194 2015-08-07 18:15:52.501 DEBUG nova.virt.xenapi.vm_utils [req-dde6ab66-0676-43f8-b8eb-e407ab4a2aa2 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] VBD OpaqueRef:a5f29636-a68f-ddec-d8d9-3b523f21c8f1 plugged as xvdc vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2197 2015-08-07 18:15:52.603 WARNING nova.virt.configdrive [req-dde6ab66-0676-43f8-b8eb-e407ab4a2aa2 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] The setting "always" will be deprecated in the Liberty version. 
Please use "True" instead 2015-08-07 18:15:52.610 DEBUG nova.objects.instance [req-dde6ab66-0676-43f8-b8eb-e407ab4a2aa2 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] Lazy-loading `ec2_ids' on Instance uuid 4b91aa49-9d80-4533-a63e-04debcc5884a obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 18:15:52.654 DEBUG oslo_concurrency.processutils [req-dde6ab66-0676-43f8-b8eb-e407ab4a2aa2 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] Running cmd (subprocess): genisoimage -o /tmp/tmphvTK7f/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmpx9T_j9 execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 18:15:52.883 DEBUG oslo_concurrency.processutils [req-dde6ab66-0676-43f8-b8eb-e407ab4a2aa2 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] CMD "genisoimage -o /tmp/tmphvTK7f/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmpx9T_j9" returned: 0 in 0.229s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 18:15:52.890 DEBUG oslo_concurrency.processutils [req-dde6ab66-0676-43f8-b8eb-e407ab4a2aa2 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmphvTK7f/configdrive of=/dev/xvdc oflag=direct,sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 18:15:59.986 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 8.91 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:16:00.968 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:16:00.969 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 4.56 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:16:05.546 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:16:05.547 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Starting heal instance info cache _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5315 2015-08-07 18:16:05.547 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Rebuilding the list of instances to heal _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5319 2015-08-07 18:16:05.699 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: 4b91aa49-9d80-4533-a63e-04debcc5884a] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5329 2015-08-07 18:16:05.700 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Didn't find any instances for network info cache update. 
_heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5386 2015-08-07 18:16:05.700 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 1.97 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:16:07.674 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:16:07.675 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 2.85 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:16:10.308 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 8.65 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:16:10.528 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_bandwidth_usage run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:16:10.529 INFO nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Updating bandwidth usage cache 2015-08-07 18:16:11.249 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 3.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:16:14.349 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:16:14.351 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 2.18 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:16:16.537 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:16:16.538 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 2.99 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:16:18.129 DEBUG oslo_concurrency.processutils [req-dde6ab66-0676-43f8-b8eb-e407ab4a2aa2 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmphvTK7f/configdrive of=/dev/xvdc oflag=direct,sync" returned: 0 in 25.239s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 18:16:18.132 DEBUG oslo_concurrency.processutils [req-dde6ab66-0676-43f8-b8eb-e407ab4a2aa2 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 18:16:19.296 DEBUG oslo_concurrency.processutils [req-dde6ab66-0676-43f8-b8eb-e407ab4a2aa2 
tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf sync" returned: 0 in 1.164s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 18:16:19.299 DEBUG nova.virt.xenapi.vm_utils [req-dde6ab66-0676-43f8-b8eb-e407ab4a2aa2 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] Destroying VBD for VDI OpaqueRef:59b5f73a-414b-1105-9a22-450d75b08f97 ... vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2207 2015-08-07 18:16:19.300 DEBUG oslo_concurrency.lockutils [req-dde6ab66-0676-43f8-b8eb-e407ab4a2aa2 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_unplug" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 18:16:19.520 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.45 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:16:19.530 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:16:19.531 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /opt/stack/new/nova/nova/compute/manager.py:5976 2015-08-07 18:16:19.532 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 5.99 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:16:21.243 DEBUG oslo_concurrency.lockutils [req-dde6ab66-0676-43f8-b8eb-e407ab4a2aa2 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_unplug" :: held 1.943s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 18:16:21.251 DEBUG nova.virt.xenapi.vm_utils [req-dde6ab66-0676-43f8-b8eb-e407ab4a2aa2 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] Destroying VBD for VDI OpaqueRef:59b5f73a-414b-1105-9a22-450d75b08f97 done. vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2215 2015-08-07 18:16:21.251 DEBUG nova.virt.xenapi.vm_utils [req-dde6ab66-0676-43f8-b8eb-e407ab4a2aa2 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] Creating disk-type VBD for VM OpaqueRef:588655bc-f412-6e46-e738-34f4239168c1, VDI OpaqueRef:59b5f73a-414b-1105-9a22-450d75b08f97 ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 18:16:21.268 DEBUG nova.virt.xenapi.vm_utils [req-dde6ab66-0676-43f8-b8eb-e407ab4a2aa2 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] Created VBD OpaqueRef:67b17b70-9910-7310-c652-6e60264b27fc for VM OpaqueRef:588655bc-f412-6e46-e738-34f4239168c1, VDI OpaqueRef:59b5f73a-414b-1105-9a22-450d75b08f97. 
create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 18:16:21.269 DEBUG nova.objects.instance [req-dde6ab66-0676-43f8-b8eb-e407ab4a2aa2 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] Lazy-loading `pci_devices' on Instance uuid 4b91aa49-9d80-4533-a63e-04debcc5884a obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 18:16:21.417 DEBUG nova.virt.xenapi.vmops [req-dde6ab66-0676-43f8-b8eb-e407ab4a2aa2 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] [instance: 4b91aa49-9d80-4533-a63e-04debcc5884a] Updating progress to 50 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 18:16:21.799 DEBUG oslo_concurrency.lockutils [req-dde6ab66-0676-43f8-b8eb-e407ab4a2aa2 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] Lock "xenstore-4b91aa49-9d80-4533-a63e-04debcc5884a" acquired by "store_meta" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 18:16:21.799 DEBUG oslo_concurrency.lockutils [req-dde6ab66-0676-43f8-b8eb-e407ab4a2aa2 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] Lock "xenstore-4b91aa49-9d80-4533-a63e-04debcc5884a" released by "store_meta" :: held 0.001s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 18:16:21.800 DEBUG oslo_concurrency.lockutils [req-dde6ab66-0676-43f8-b8eb-e407ab4a2aa2 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] Lock "xenstore-4b91aa49-9d80-4533-a63e-04debcc5884a" acquired by "store_auto_disk_config" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 18:16:21.850 DEBUG oslo_concurrency.lockutils [req-dde6ab66-0676-43f8-b8eb-e407ab4a2aa2 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] Lock "xenstore-4b91aa49-9d80-4533-a63e-04debcc5884a" released by "store_auto_disk_config" :: held 0.050s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 18:16:21.851 DEBUG nova.virt.xenapi.vmops [req-dde6ab66-0676-43f8-b8eb-e407ab4a2aa2 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] [instance: 4b91aa49-9d80-4533-a63e-04debcc5884a] Injecting hostname (tempest-testserveradvancedops-1783495231) into xenstore _inject_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1940 2015-08-07 18:16:21.852 DEBUG oslo_concurrency.lockutils [req-dde6ab66-0676-43f8-b8eb-e407ab4a2aa2 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] Lock "xenstore-4b91aa49-9d80-4533-a63e-04debcc5884a" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 18:16:21.894 DEBUG oslo_concurrency.lockutils [req-dde6ab66-0676-43f8-b8eb-e407ab4a2aa2 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] Lock "xenstore-4b91aa49-9d80-4533-a63e-04debcc5884a" released by "update_hostname" :: held 0.042s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 18:16:21.895 DEBUG nova.virt.xenapi.vmops [req-dde6ab66-0676-43f8-b8eb-e407ab4a2aa2 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] [instance: 4b91aa49-9d80-4533-a63e-04debcc5884a] Injecting network info to xenstore inject_network_info 
/opt/stack/new/nova/nova/virt/xenapi/vmops.py:1869 2015-08-07 18:16:21.895 DEBUG oslo_concurrency.lockutils [req-dde6ab66-0676-43f8-b8eb-e407ab4a2aa2 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] Lock "xenstore-4b91aa49-9d80-4533-a63e-04debcc5884a" acquired by "update_nwinfo" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 18:16:22.309 DEBUG oslo_concurrency.lockutils [req-dde6ab66-0676-43f8-b8eb-e407ab4a2aa2 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] Lock "xenstore-4b91aa49-9d80-4533-a63e-04debcc5884a" released by "update_nwinfo" :: held 0.413s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 18:16:22.310 DEBUG nova.virt.xenapi.vmops [req-dde6ab66-0676-43f8-b8eb-e407ab4a2aa2 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] [instance: 4b91aa49-9d80-4533-a63e-04debcc5884a] Updating progress to 60 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 18:16:22.812 DEBUG nova.virt.xenapi.vmops [req-dde6ab66-0676-43f8-b8eb-e407ab4a2aa2 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] [instance: 4b91aa49-9d80-4533-a63e-04debcc5884a] Creating vifs _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1892 2015-08-07 18:16:22.823 DEBUG nova.virt.xenapi.vmops [req-dde6ab66-0676-43f8-b8eb-e407ab4a2aa2 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] [instance: 4b91aa49-9d80-4533-a63e-04debcc5884a] Creating VIF for network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1902 2015-08-07 18:16:22.835 DEBUG nova.virt.xenapi.vmops [req-dde6ab66-0676-43f8-b8eb-e407ab4a2aa2 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] [instance: 4b91aa49-9d80-4533-a63e-04debcc5884a] Created VIF OpaqueRef:549101a0-511d-33eb-8c6e-2a6cb9fe7187, network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1906 2015-08-07 18:16:22.836 DEBUG nova.virt.xenapi.vmops [req-dde6ab66-0676-43f8-b8eb-e407ab4a2aa2 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] [instance: 4b91aa49-9d80-4533-a63e-04debcc5884a] Updating progress to 70 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 18:16:23.471 DEBUG nova.virt.xenapi.vmops [req-dde6ab66-0676-43f8-b8eb-e407ab4a2aa2 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] [instance: 4b91aa49-9d80-4533-a63e-04debcc5884a] Starting instance _start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:327 2015-08-07 18:16:25.528 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:16:25.569 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Auditing locally available compute resources for node localhost.localdomain 2015-08-07 18:16:25.569 DEBUG nova.virt.xenapi.host [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Updating host stats update_status /opt/stack/new/nova/nova/virt/xenapi/host.py:234 2015-08-07 18:16:26.109 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock 
"sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" acquired by "do_scan" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 18:16:26.109 DEBUG nova.virt.xenapi.vm_utils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Scanning SR OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99 do_scan /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1839 2015-08-07 18:16:27.001 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" released by "do_scan" :: held 0.892s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 18:16:27.298 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor: free VCPUs: 2 _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:524 2015-08-07 18:16:27.299 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor: assignable PCI devices: [] _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:532 2015-08-07 18:16:27.299 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor/Node resource view: name=localhost.localdomain free_ram=922MB free_disk=12GB free_vcpus=2 pci_devices=[] _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:546 2015-08-07 18:16:27.300 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "compute_resources" acquired by "_update_available_resource" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 18:16:27.512 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Total usable vcpus: 8, total allocated vcpus: 6 2015-08-07 18:16:27.512 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Final resource view: name=localhost.localdomain phys_ram=8187MB used_ram=581MB phys_disk=27GB used_disk=0GB total_vcpus=8 used_vcpus=6 pci_stats=None 2015-08-07 18:16:27.639 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Compute_service record updated for devstack:localhost.localdomain 2015-08-07 18:16:27.640 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "compute_resources" released by "_update_available_resource" :: held 0.340s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 18:16:27.640 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:16:27.641 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:16:27.713 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 2.89 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:16:29.048 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.92 seconds _run_loop 
/usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:16:30.600 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:16:30.601 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 6.93 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:16:32.951 DEBUG nova.virt.xenapi.vmops [req-dde6ab66-0676-43f8-b8eb-e407ab4a2aa2 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] [instance: 4b91aa49-9d80-4533-a63e-04debcc5884a] Waiting for instance state to become running _wait_for_instance_to_start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:777 2015-08-07 18:16:32.980 DEBUG nova.virt.xenapi.vmops [req-dde6ab66-0676-43f8-b8eb-e407ab4a2aa2 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] [instance: 4b91aa49-9d80-4533-a63e-04debcc5884a] Updating progress to 80 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 18:16:33.295 DEBUG nova.virt.xenapi.vmops [req-dde6ab66-0676-43f8-b8eb-e407ab4a2aa2 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] [instance: 4b91aa49-9d80-4533-a63e-04debcc5884a] Skip agent setup, not enabled. _configure_new_instance_with_agent /opt/stack/new/nova/nova/virt/xenapi/vmops.py:788 2015-08-07 18:16:33.296 DEBUG nova.virt.xenapi.vmops [req-dde6ab66-0676-43f8-b8eb-e407ab4a2aa2 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] [instance: 4b91aa49-9d80-4533-a63e-04debcc5884a] Removing hostname from xenstore _remove_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1949 2015-08-07 18:16:33.297 DEBUG oslo_concurrency.lockutils [req-dde6ab66-0676-43f8-b8eb-e407ab4a2aa2 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] Lock "xenstore-4b91aa49-9d80-4533-a63e-04debcc5884a" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 18:16:33.305 DEBUG oslo_concurrency.lockutils [req-dde6ab66-0676-43f8-b8eb-e407ab4a2aa2 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] Lock "xenstore-4b91aa49-9d80-4533-a63e-04debcc5884a" released by "update_hostname" :: held 0.008s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 18:16:33.306 DEBUG nova.virt.xenapi.vmops [req-dde6ab66-0676-43f8-b8eb-e407ab4a2aa2 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] [instance: 4b91aa49-9d80-4533-a63e-04debcc5884a] Updating progress to 90 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 18:16:33.572 DEBUG nova.virt.xenapi.vmops [req-dde6ab66-0676-43f8-b8eb-e407ab4a2aa2 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] [instance: 4b91aa49-9d80-4533-a63e-04debcc5884a] Updating progress to 100 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 18:16:33.843 DEBUG nova.compute.manager [req-dde6ab66-0676-43f8-b8eb-e407ab4a2aa2 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] [instance: 4b91aa49-9d80-4533-a63e-04debcc5884a] Checking state _get_power_state 
/opt/stack/new/nova/nova/compute/manager.py:1296 2015-08-07 18:16:34.548 DEBUG oslo_concurrency.lockutils [req-dde6ab66-0676-43f8-b8eb-e407ab4a2aa2 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] Lock "4b91aa49-9d80-4533-a63e-04debcc5884a" released by "_locked_do_build_and_run_instance" :: held 71.675s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 18:16:36.263 DEBUG nova.compute.manager [req-59105190-d04e-4c73-9c8c-835b29b014b5 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] [instance: 4b91aa49-9d80-4533-a63e-04debcc5884a] Stashing vm_state: active _prep_resize /opt/stack/new/nova/nova/compute/manager.py:3514 2015-08-07 18:16:36.582 DEBUG oslo_concurrency.lockutils [req-59105190-d04e-4c73-9c8c-835b29b014b5 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] Lock "compute_resources" acquired by "resize_claim" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 18:16:36.582 DEBUG nova.compute.resource_tracker [req-59105190-d04e-4c73-9c8c-835b29b014b5 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] Memory overhead for 128 MB instance; 6 MB resize_claim /opt/stack/new/nova/nova/compute/resource_tracker.py:174 2015-08-07 18:16:36.591 INFO nova.compute.claims [req-59105190-d04e-4c73-9c8c-835b29b014b5 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] [instance: 4b91aa49-9d80-4533-a63e-04debcc5884a] Attempting claim: memory 134 MB, disk 0 GB 2015-08-07 18:16:36.592 INFO nova.compute.claims [req-59105190-d04e-4c73-9c8c-835b29b014b5 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] [instance: 4b91aa49-9d80-4533-a63e-04debcc5884a] Total memory: 8187 MB, used: 581.00 MB 2015-08-07 18:16:36.592 INFO nova.compute.claims [req-59105190-d04e-4c73-9c8c-835b29b014b5 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] [instance: 4b91aa49-9d80-4533-a63e-04debcc5884a] memory limit: 12280.50 MB, free: 11699.50 MB 2015-08-07 18:16:36.593 INFO nova.compute.claims [req-59105190-d04e-4c73-9c8c-835b29b014b5 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] [instance: 4b91aa49-9d80-4533-a63e-04debcc5884a] Total disk: 27 GB, used: 0.00 GB 2015-08-07 18:16:36.594 INFO nova.compute.claims [req-59105190-d04e-4c73-9c8c-835b29b014b5 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] [instance: 4b91aa49-9d80-4533-a63e-04debcc5884a] disk limit not specified, defaulting to unlimited 2015-08-07 18:16:36.623 DEBUG nova.compute.resources.vcpu [req-59105190-d04e-4c73-9c8c-835b29b014b5 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] Total CPUs: 8 VCPUs, used: 1.00 VCPUs test /opt/stack/new/nova/nova/compute/resources/vcpu.py:52 2015-08-07 18:16:36.625 DEBUG nova.compute.resources.vcpu [req-59105190-d04e-4c73-9c8c-835b29b014b5 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] CPUs limit not specified, defaulting to unlimited test /opt/stack/new/nova/nova/compute/resources/vcpu.py:56 2015-08-07 18:16:36.625 INFO nova.compute.claims [req-59105190-d04e-4c73-9c8c-835b29b014b5 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] [instance: 4b91aa49-9d80-4533-a63e-04debcc5884a] Claim successful 2015-08-07 18:16:36.688 INFO nova.compute.resource_tracker 
[req-59105190-d04e-4c73-9c8c-835b29b014b5 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] Updating from migration 4b91aa49-9d80-4533-a63e-04debcc5884a 2015-08-07 18:16:36.800 DEBUG oslo_concurrency.lockutils [req-59105190-d04e-4c73-9c8c-835b29b014b5 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] Lock "compute_resources" released by "resize_claim" :: held 0.218s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 18:16:36.800 INFO nova.compute.manager [req-59105190-d04e-4c73-9c8c-835b29b014b5 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] [instance: 4b91aa49-9d80-4533-a63e-04debcc5884a] Migrating 2015-08-07 18:16:36.886 DEBUG oslo_concurrency.lockutils [req-59105190-d04e-4c73-9c8c-835b29b014b5 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] Acquired semaphore "refresh_cache-4b91aa49-9d80-4533-a63e-04debcc5884a" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:197 2015-08-07 18:16:37.059 DEBUG nova.network.base_api [req-59105190-d04e-4c73-9c8c-835b29b014b5 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] [instance: 4b91aa49-9d80-4533-a63e-04debcc5884a] Updating instance_info_cache with network_info: [VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.2'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:70:61:ff', 'active': False, 'type': u'bridge', 'id': u'ad6a5ba7-38fa-4095-9614-5ff5d309d974', 'qbg_params': None})] update_instance_cache_with_nw_info /opt/stack/new/nova/nova/network/base_api.py:43 2015-08-07 18:16:37.089 DEBUG oslo_concurrency.lockutils [req-59105190-d04e-4c73-9c8c-835b29b014b5 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] Releasing semaphore "refresh_cache-4b91aa49-9d80-4533-a63e-04debcc5884a" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:210 2015-08-07 18:16:37.457 DEBUG nova.virt.xenapi.vmops [req-59105190-d04e-4c73-9c8c-835b29b014b5 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] [instance: 4b91aa49-9d80-4533-a63e-04debcc5884a] Updating progress to 0 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 18:16:37.534 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:16:37.547 DEBUG oslo_service.loopingcall 
[req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 25.98 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:16:37.759 DEBUG nova.virt.xenapi.vm_utils [req-59105190-d04e-4c73-9c8c-835b29b014b5 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] [instance: 4b91aa49-9d80-4533-a63e-04debcc5884a] Starting snapshot for VM _snapshot_attached_here_impl /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:714 2015-08-07 18:16:37.806 DEBUG oslo_concurrency.lockutils [req-59105190-d04e-4c73-9c8c-835b29b014b5 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" acquired by "do_scan" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 18:16:37.807 DEBUG nova.virt.xenapi.vm_utils [req-59105190-d04e-4c73-9c8c-835b29b014b5 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] Scanning SR OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99 do_scan /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1839 2015-08-07 18:16:38.297 DEBUG oslo_concurrency.lockutils [req-59105190-d04e-4c73-9c8c-835b29b014b5 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" released by "do_scan" :: held 0.491s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 18:16:38.310 DEBUG nova.virt.xenapi.vm_utils [req-59105190-d04e-4c73-9c8c-835b29b014b5 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] VHD 7d903684-08cf-4025-9e09-43d6f6f9f7e9 has parent 4027f457-a9bb-499a-8844-79fc67f11377 _get_vhd_parent_uuid /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2006 2015-08-07 18:16:38.337 DEBUG nova.virt.xenapi.vm_utils [req-59105190-d04e-4c73-9c8c-835b29b014b5 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] VHD 7d903684-08cf-4025-9e09-43d6f6f9f7e9 has parent 4027f457-a9bb-499a-8844-79fc67f11377 _get_vhd_parent_uuid /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2006 2015-08-07 18:16:38.350 DEBUG nova.virt.xenapi.vm_utils [req-59105190-d04e-4c73-9c8c-835b29b014b5 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] VHD fee7cc36-2e44-486b-ba4a-d8bb59b182ea has parent 5071f77f-c3bb-4eba-870a-799c1f5f7e19 _get_vhd_parent_uuid /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2006 2015-08-07 18:16:38.366 DEBUG nova.virt.xenapi.vm_utils [req-59105190-d04e-4c73-9c8c-835b29b014b5 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] VHD 88050801-2d71-415c-83d9-dd9349e75f0c has parent eef80aa6-f16e-4dc7-9b05-68cae1b6f304 _get_vhd_parent_uuid /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2006 2015-08-07 18:16:38.380 DEBUG nova.virt.xenapi.vm_utils [req-59105190-d04e-4c73-9c8c-835b29b014b5 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] VHD adc68f5c-f9ed-4dfe-862f-8738fb519f84 has parent 4027f457-a9bb-499a-8844-79fc67f11377 _get_vhd_parent_uuid /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2006 2015-08-07 18:16:38.391 DEBUG nova.virt.xenapi.vm_utils [req-59105190-d04e-4c73-9c8c-835b29b014b5 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] [instance: 4b91aa49-9d80-4533-a63e-04debcc5884a] No snapshots to remove. 
_delete_snapshots_in_vdi_chain /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:677 2015-08-07 18:16:39.417 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.55 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:16:41.813 DEBUG nova.virt.xenapi.vm_utils [req-59105190-d04e-4c73-9c8c-835b29b014b5 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] [instance: 4b91aa49-9d80-4533-a63e-04debcc5884a] Parent has other children, coalesce is unlikely. _wait_for_vhd_coalesce /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2092 2015-08-07 18:16:41.826 DEBUG oslo_concurrency.lockutils [req-59105190-d04e-4c73-9c8c-835b29b014b5 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" acquired by "do_scan" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 18:16:41.827 DEBUG nova.virt.xenapi.vm_utils [req-59105190-d04e-4c73-9c8c-835b29b014b5 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] Scanning SR OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99 do_scan /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1839 2015-08-07 18:16:42.412 DEBUG oslo_concurrency.lockutils [req-59105190-d04e-4c73-9c8c-835b29b014b5 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" released by "do_scan" :: held 0.585s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 18:16:42.424 DEBUG nova.virt.xenapi.vm_utils [req-59105190-d04e-4c73-9c8c-835b29b014b5 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] VHD 72fcc9ff-5d3c-4e6f-9320-502cb32de785 has parent e3cdb805-29a7-4312-90d1-07823d483c60 _get_vhd_parent_uuid /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2006 2015-08-07 18:16:42.434 DEBUG nova.virt.xenapi.vm_utils [req-59105190-d04e-4c73-9c8c-835b29b014b5 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] VHD e3cdb805-29a7-4312-90d1-07823d483c60 has parent 4027f457-a9bb-499a-8844-79fc67f11377 _get_vhd_parent_uuid /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2006 2015-08-07 18:16:42.444 DEBUG nova.virt.xenapi.vmops [req-59105190-d04e-4c73-9c8c-835b29b014b5 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] [instance: 4b91aa49-9d80-4533-a63e-04debcc5884a] Updating progress to 20 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 18:16:42.743 DEBUG nova.virt.xenapi.vm_utils [req-59105190-d04e-4c73-9c8c-835b29b014b5 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] [instance: 4b91aa49-9d80-4533-a63e-04debcc5884a] Migrating VHD 'e3cdb805-29a7-4312-90d1-07823d483c60' with seq_num 1 migrate_vhd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2559 2015-08-07 18:16:43.171 DEBUG nova.virt.xenapi.vm_utils [req-59105190-d04e-4c73-9c8c-835b29b014b5 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] [instance: 4b91aa49-9d80-4533-a63e-04debcc5884a] Migrating VHD '4027f457-a9bb-499a-8844-79fc67f11377' with seq_num 2 migrate_vhd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2559 2015-08-07 18:16:44.563 DEBUG nova.virt.xenapi.vmops [req-59105190-d04e-4c73-9c8c-835b29b014b5 tempest-TestServerAdvancedOps-875032165 
tempest-TestServerAdvancedOps-1109115352] [instance: 4b91aa49-9d80-4533-a63e-04debcc5884a] Migrated root base vhds transfer_immutable_vhds /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1036 2015-08-07 18:16:44.564 DEBUG nova.virt.xenapi.vmops [req-59105190-d04e-4c73-9c8c-835b29b014b5 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] [instance: 4b91aa49-9d80-4533-a63e-04debcc5884a] Updating progress to 40 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 18:16:44.953 DEBUG nova.virt.xenapi.vmops [req-59105190-d04e-4c73-9c8c-835b29b014b5 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] [instance: 4b91aa49-9d80-4533-a63e-04debcc5884a] Migrated all base vhds. _process_ephemeral_chain_recursive /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1051 2015-08-07 18:16:44.962 DEBUG nova.virt.xenapi.vm_utils [req-59105190-d04e-4c73-9c8c-835b29b014b5 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] [instance: 4b91aa49-9d80-4533-a63e-04debcc5884a] Shutting down VM (cleanly) clean_shutdown_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:333 2015-08-07 18:16:49.324 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.64 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:16:50.942 DEBUG nova.virt.xenapi.vm_utils [req-59105190-d04e-4c73-9c8c-835b29b014b5 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] [instance: 4b91aa49-9d80-4533-a63e-04debcc5884a] Migrating VHD '7d903684-08cf-4025-9e09-43d6f6f9f7e9' with seq_num 0 migrate_vhd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2559 2015-08-07 18:16:52.142 DEBUG nova.virt.xenapi.vmops [req-59105190-d04e-4c73-9c8c-835b29b014b5 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] [instance: 4b91aa49-9d80-4533-a63e-04debcc5884a] Updating progress to 60 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 18:16:52.367 DEBUG nova.virt.xenapi.vmops [req-59105190-d04e-4c73-9c8c-835b29b014b5 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] [instance: 4b91aa49-9d80-4533-a63e-04debcc5884a] Updating progress to 80 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 18:16:54.010 DEBUG oslo_concurrency.lockutils [req-59105190-d04e-4c73-9c8c-835b29b014b5 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] Lock "4b91aa49-9d80-4533-a63e-04debcc5884a-events" acquired by "_clear_events" :: waited 0.001s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 18:16:54.011 DEBUG oslo_concurrency.lockutils [req-59105190-d04e-4c73-9c8c-835b29b014b5 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] Lock "4b91aa49-9d80-4533-a63e-04debcc5884a-events" released by "_clear_events" :: held 0.001s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 18:16:56.966 DEBUG oslo_concurrency.lockutils [req-2aa6191b-900a-4bd4-a687-84b6c3ad55be tempest-TestServerBasicOps-171890207 tempest-TestServerBasicOps-1821359557] Lock "fb80dcfb-2f23-42c0-a68f-ac9ff1404d68" acquired by "_locked_do_build_and_run_instance" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 18:16:57.052 DEBUG oslo_concurrency.lockutils 
[req-59105190-d04e-4c73-9c8c-835b29b014b5 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] Acquired semaphore "refresh_cache-4b91aa49-9d80-4533-a63e-04debcc5884a" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:197 2015-08-07 18:16:57.056 INFO nova.compute.manager [req-2aa6191b-900a-4bd4-a687-84b6c3ad55be tempest-TestServerBasicOps-171890207 tempest-TestServerBasicOps-1821359557] [instance: fb80dcfb-2f23-42c0-a68f-ac9ff1404d68] Starting instance... 2015-08-07 18:16:57.240 DEBUG nova.network.base_api [req-59105190-d04e-4c73-9c8c-835b29b014b5 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] [instance: 4b91aa49-9d80-4533-a63e-04debcc5884a] Updating instance_info_cache with network_info: [VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.2'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:70:61:ff', 'active': False, 'type': u'bridge', 'id': u'ad6a5ba7-38fa-4095-9614-5ff5d309d974', 'qbg_params': None})] update_instance_cache_with_nw_info /opt/stack/new/nova/nova/network/base_api.py:43 2015-08-07 18:16:57.282 DEBUG oslo_concurrency.lockutils [req-59105190-d04e-4c73-9c8c-835b29b014b5 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] Releasing semaphore "refresh_cache-4b91aa49-9d80-4533-a63e-04debcc5884a" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:210 2015-08-07 18:16:57.328 DEBUG oslo_concurrency.lockutils [req-2aa6191b-900a-4bd4-a687-84b6c3ad55be tempest-TestServerBasicOps-171890207 tempest-TestServerBasicOps-1821359557] Lock "compute_resources" acquired by "instance_claim" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 18:16:57.330 DEBUG nova.compute.resource_tracker [req-2aa6191b-900a-4bd4-a687-84b6c3ad55be tempest-TestServerBasicOps-171890207 tempest-TestServerBasicOps-1821359557] Memory overhead for 64 MB instance; 5 MB instance_claim /opt/stack/new/nova/nova/compute/resource_tracker.py:126 2015-08-07 18:16:57.338 INFO nova.compute.claims [req-2aa6191b-900a-4bd4-a687-84b6c3ad55be tempest-TestServerBasicOps-171890207 tempest-TestServerBasicOps-1821359557] [instance: fb80dcfb-2f23-42c0-a68f-ac9ff1404d68] Attempting claim: memory 69 MB, disk 0 GB 2015-08-07 18:16:57.338 INFO nova.compute.claims [req-2aa6191b-900a-4bd4-a687-84b6c3ad55be tempest-TestServerBasicOps-171890207 tempest-TestServerBasicOps-1821359557] [instance: fb80dcfb-2f23-42c0-a68f-ac9ff1404d68] Total memory: 8187 MB, used: 715.00 MB 2015-08-07 18:16:57.339 INFO nova.compute.claims [req-2aa6191b-900a-4bd4-a687-84b6c3ad55be 
tempest-TestServerBasicOps-171890207 tempest-TestServerBasicOps-1821359557] [instance: fb80dcfb-2f23-42c0-a68f-ac9ff1404d68] memory limit: 12280.50 MB, free: 11565.50 MB 2015-08-07 18:16:57.339 INFO nova.compute.claims [req-2aa6191b-900a-4bd4-a687-84b6c3ad55be tempest-TestServerBasicOps-171890207 tempest-TestServerBasicOps-1821359557] [instance: fb80dcfb-2f23-42c0-a68f-ac9ff1404d68] Total disk: 27 GB, used: 0.00 GB 2015-08-07 18:16:57.340 INFO nova.compute.claims [req-2aa6191b-900a-4bd4-a687-84b6c3ad55be tempest-TestServerBasicOps-171890207 tempest-TestServerBasicOps-1821359557] [instance: fb80dcfb-2f23-42c0-a68f-ac9ff1404d68] disk limit not specified, defaulting to unlimited 2015-08-07 18:16:57.384 DEBUG nova.compute.resources.vcpu [req-2aa6191b-900a-4bd4-a687-84b6c3ad55be tempest-TestServerBasicOps-171890207 tempest-TestServerBasicOps-1821359557] Total CPUs: 8 VCPUs, used: 2.00 VCPUs test /opt/stack/new/nova/nova/compute/resources/vcpu.py:52 2015-08-07 18:16:57.385 DEBUG nova.compute.resources.vcpu [req-2aa6191b-900a-4bd4-a687-84b6c3ad55be tempest-TestServerBasicOps-171890207 tempest-TestServerBasicOps-1821359557] CPUs limit not specified, defaulting to unlimited test /opt/stack/new/nova/nova/compute/resources/vcpu.py:56 2015-08-07 18:16:57.386 INFO nova.compute.claims [req-2aa6191b-900a-4bd4-a687-84b6c3ad55be tempest-TestServerBasicOps-171890207 tempest-TestServerBasicOps-1821359557] [instance: fb80dcfb-2f23-42c0-a68f-ac9ff1404d68] Claim successful 2015-08-07 18:16:57.652 DEBUG nova.virt.xenapi.vm_utils [req-59105190-d04e-4c73-9c8c-835b29b014b5 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] Detected vhd format for image None determine_disk_image_type /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1633 2015-08-07 18:16:57.653 DEBUG nova.virt.xenapi.vmops [req-59105190-d04e-4c73-9c8c-835b29b014b5 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] [instance: 4b91aa49-9d80-4533-a63e-04debcc5884a] Block device information present: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} create_disks_step /opt/stack/new/nova/nova/virt/xenapi/vmops.py:278 2015-08-07 18:16:57.763 DEBUG oslo_concurrency.lockutils [req-2aa6191b-900a-4bd4-a687-84b6c3ad55be tempest-TestServerBasicOps-171890207 tempest-TestServerBasicOps-1821359557] Lock "compute_resources" released by "instance_claim" :: held 0.435s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 18:16:58.026 DEBUG oslo_concurrency.lockutils [req-2aa6191b-900a-4bd4-a687-84b6c3ad55be tempest-TestServerBasicOps-171890207 tempest-TestServerBasicOps-1821359557] Lock "compute_resources" acquired by "update_usage" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 18:16:58.196 DEBUG oslo_concurrency.lockutils [req-2aa6191b-900a-4bd4-a687-84b6c3ad55be tempest-TestServerBasicOps-171890207 tempest-TestServerBasicOps-1821359557] Lock "compute_resources" released by "update_usage" :: held 0.170s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 18:16:58.197 DEBUG nova.compute.utils [req-2aa6191b-900a-4bd4-a687-84b6c3ad55be tempest-TestServerBasicOps-171890207 tempest-TestServerBasicOps-1821359557] Using /dev/xvd instead of None get_next_device_name /opt/stack/new/nova/nova/compute/utils.py:163 2015-08-07 18:16:58.203 13318 DEBUG nova.compute.manager [-] [instance: fb80dcfb-2f23-42c0-a68f-ac9ff1404d68] Allocating IP 
information in the background. _allocate_network_async /opt/stack/new/nova/nova/compute/manager.py:1513 2015-08-07 18:16:58.204 13318 DEBUG oslo_concurrency.lockutils [-] Acquired semaphore "refresh_cache-fb80dcfb-2f23-42c0-a68f-ac9ff1404d68" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:197 2015-08-07 18:16:58.655 DEBUG oslo_concurrency.lockutils [req-59105190-d04e-4c73-9c8c-835b29b014b5 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" acquired by "do_scan" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 18:16:58.655 DEBUG nova.virt.xenapi.vm_utils [req-59105190-d04e-4c73-9c8c-835b29b014b5 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] Scanning SR OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99 do_scan /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1839 2015-08-07 18:16:58.930 DEBUG nova.virt.xenapi.vmops [req-2aa6191b-900a-4bd4-a687-84b6c3ad55be tempest-TestServerBasicOps-171890207 tempest-TestServerBasicOps-1821359557] [instance: fb80dcfb-2f23-42c0-a68f-ac9ff1404d68] Block device information present: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} spawn /opt/stack/new/nova/nova/virt/xenapi/vmops.py:408 2015-08-07 18:16:58.963 DEBUG nova.virt.xenapi.vm_utils [req-2aa6191b-900a-4bd4-a687-84b6c3ad55be tempest-TestServerBasicOps-171890207 tempest-TestServerBasicOps-1821359557] Detected vhd format for image a69d8d55-1745-492f-be26-cc75d64fc94d determine_disk_image_type /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1633 2015-08-07 18:16:58.968 DEBUG nova.virt.xenapi.vmops [req-2aa6191b-900a-4bd4-a687-84b6c3ad55be tempest-TestServerBasicOps-171890207 tempest-TestServerBasicOps-1821359557] [instance: fb80dcfb-2f23-42c0-a68f-ac9ff1404d68] Updating progress to 10 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 18:16:59.073 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.90 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:16:59.400 DEBUG nova.virt.xenapi.vmops [req-2aa6191b-900a-4bd4-a687-84b6c3ad55be tempest-TestServerBasicOps-171890207 tempest-TestServerBasicOps-1821359557] [instance: fb80dcfb-2f23-42c0-a68f-ac9ff1404d68] block device info: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} _connect_cinder_volumes /opt/stack/new/nova/nova/virt/xenapi/vmops.py:376 2015-08-07 18:16:59.419 DEBUG oslo_concurrency.lockutils [req-2aa6191b-900a-4bd4-a687-84b6c3ad55be tempest-TestServerBasicOps-171890207 tempest-TestServerBasicOps-1821359557] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" acquired by "_create_cached_image_impl" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 18:16:59.529 DEBUG oslo_concurrency.lockutils [req-59105190-d04e-4c73-9c8c-835b29b014b5 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" released by "do_scan" :: held 0.874s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 18:17:00.434 DEBUG nova.virt.xenapi.vmops [req-59105190-d04e-4c73-9c8c-835b29b014b5 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] 
[instance: 4b91aa49-9d80-4533-a63e-04debcc5884a] Using PV kernel: True _create_vm_record /opt/stack/new/nova/nova/virt/xenapi/vmops.py:679 2015-08-07 18:17:00.449 DEBUG nova.virt.xenapi.vm_utils [req-59105190-d04e-4c73-9c8c-835b29b014b5 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] [instance: 4b91aa49-9d80-4533-a63e-04debcc5884a] Created VM create_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:312 2015-08-07 18:17:00.450 DEBUG nova.virt.xenapi.vm_utils [req-59105190-d04e-4c73-9c8c-835b29b014b5 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] Creating disk-type VBD for VM OpaqueRef:4948c829-6739-4ce5-4ecf-ef2986fdd6dc, VDI OpaqueRef:defe0957-efb3-6523-16af-bf7abb277df7 ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 18:17:00.461 DEBUG nova.virt.xenapi.vm_utils [req-59105190-d04e-4c73-9c8c-835b29b014b5 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] Created VBD OpaqueRef:84032448-dcac-6e2c-2117-b3f38026c3d2 for VM OpaqueRef:4948c829-6739-4ce5-4ecf-ef2986fdd6dc, VDI OpaqueRef:defe0957-efb3-6523-16af-bf7abb277df7. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 18:17:01.635 DEBUG nova.virt.xenapi.vm_utils [req-59105190-d04e-4c73-9c8c-835b29b014b5 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] Created VDI OpaqueRef:7f2ea0d3-3f7c-27e1-2f22-a26a772c203d (config-2, 67108864, False) on OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99. create_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:509 2015-08-07 18:17:01.667 DEBUG nova.virt.xenapi.vm_utils [req-59105190-d04e-4c73-9c8c-835b29b014b5 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] Creating disk-type VBD for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:7f2ea0d3-3f7c-27e1-2f22-a26a772c203d ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 18:17:01.767 DEBUG nova.virt.xenapi.vm_utils [req-59105190-d04e-4c73-9c8c-835b29b014b5 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] Created VBD OpaqueRef:6303336f-44b4-536c-8f85-c61fb5efdb63 for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:7f2ea0d3-3f7c-27e1-2f22-a26a772c203d. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 18:17:01.768 DEBUG nova.virt.xenapi.vm_utils [req-59105190-d04e-4c73-9c8c-835b29b014b5 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] Plugging VBD OpaqueRef:6303336f-44b4-536c-8f85-c61fb5efdb63 ... 
vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2191 2015-08-07 18:17:01.768 DEBUG oslo_concurrency.lockutils [req-59105190-d04e-4c73-9c8c-835b29b014b5 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_plug" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 18:17:03.135 13318 DEBUG nova.network.base_api [-] [instance: fb80dcfb-2f23-42c0-a68f-ac9ff1404d68] Updating instance_info_cache with network_info: [VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.4'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:55:4f:9b', 'active': False, 'type': u'bridge', 'id': u'e613de7f-12d9-487c-a5f8-a8b11d9d180e', 'qbg_params': None})] update_instance_cache_with_nw_info /opt/stack/new/nova/nova/network/base_api.py:43 2015-08-07 18:17:03.167 13318 DEBUG oslo_concurrency.lockutils [-] Releasing semaphore "refresh_cache-fb80dcfb-2f23-42c0-a68f-ac9ff1404d68" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:210 2015-08-07 18:17:03.167 13318 DEBUG nova.compute.manager [-] [instance: fb80dcfb-2f23-42c0-a68f-ac9ff1404d68] Instance network_info: |[VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.4'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:55:4f:9b', 'active': False, 'type': u'bridge', 'id': u'e613de7f-12d9-487c-a5f8-a8b11d9d180e', 'qbg_params': None})]| _allocate_network_async /opt/stack/new/nova/nova/compute/manager.py:1531 2015-08-07 18:17:03.529 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks 
/usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:17:03.530 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Cleaning up deleted instances _run_pending_deletes /opt/stack/new/nova/nova/compute/manager.py:6276 2015-08-07 18:17:03.617 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] There are 1 instances to clean _run_pending_deletes /opt/stack/new/nova/nova/compute/manager.py:6285 2015-08-07 18:17:03.617 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: 91cc5760-bab9-4ce6-8d1c-5acc7304e662] Instance has had 0 of 5 cleanup attempts _run_pending_deletes /opt/stack/new/nova/nova/compute/manager.py:6293 2015-08-07 18:17:04.146 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 2.38 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:17:06.527 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:17:06.528 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Starting heal instance info cache _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5315 2015-08-07 18:17:06.529 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Rebuilding the list of instances to heal _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5319 2015-08-07 18:17:06.603 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: fb80dcfb-2f23-42c0-a68f-ac9ff1404d68] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5329 2015-08-07 18:17:06.604 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Acquired semaphore "refresh_cache-4b91aa49-9d80-4533-a63e-04debcc5884a" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:197 2015-08-07 18:17:06.604 DEBUG nova.objects.instance [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lazy-loading `flavor' on Instance uuid 4b91aa49-9d80-4533-a63e-04debcc5884a obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 18:17:06.636 DEBUG oslo_concurrency.lockutils [req-59105190-d04e-4c73-9c8c-835b29b014b5 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_plug" :: held 4.868s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 18:17:06.637 DEBUG nova.virt.xenapi.vm_utils [req-59105190-d04e-4c73-9c8c-835b29b014b5 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] Plugging VBD OpaqueRef:6303336f-44b4-536c-8f85-c61fb5efdb63 done. 
vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2194 2015-08-07 18:17:06.642 DEBUG nova.virt.xenapi.vm_utils [req-59105190-d04e-4c73-9c8c-835b29b014b5 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] VBD OpaqueRef:6303336f-44b4-536c-8f85-c61fb5efdb63 plugged as xvdc vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2197 2015-08-07 18:17:06.745 WARNING nova.virt.configdrive [req-59105190-d04e-4c73-9c8c-835b29b014b5 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] The setting "always" will be deprecated in the Liberty version. Please use "True" instead 2015-08-07 18:17:06.746 DEBUG nova.objects.instance [req-59105190-d04e-4c73-9c8c-835b29b014b5 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] Lazy-loading `ec2_ids' on Instance uuid 4b91aa49-9d80-4533-a63e-04debcc5884a obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 18:17:06.834 DEBUG oslo_concurrency.processutils [req-59105190-d04e-4c73-9c8c-835b29b014b5 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] Running cmd (subprocess): genisoimage -o /tmp/tmpUdzwfq/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmpxQId_k execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 18:17:06.989 DEBUG nova.virt.xenapi.vm_utils [req-2aa6191b-900a-4bd4-a687-84b6c3ad55be tempest-TestServerBasicOps-171890207 tempest-TestServerBasicOps-1821359557] Cloned VDI OpaqueRef:1c548db2-373f-8d21-f1eb-de551a293dcd from VDI OpaqueRef:f392c64e-7886-a0cc-d3a0-780674167307 _clone_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:571 2015-08-07 18:17:07.118 DEBUG nova.network.base_api [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: 4b91aa49-9d80-4533-a63e-04debcc5884a] Updating instance_info_cache with network_info: [VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.2'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:70:61:ff', 'active': False, 'type': u'bridge', 'id': u'ad6a5ba7-38fa-4095-9614-5ff5d309d974', 'qbg_params': None})] update_instance_cache_with_nw_info /opt/stack/new/nova/nova/network/base_api.py:43 2015-08-07 18:17:07.138 DEBUG oslo_concurrency.processutils [req-59105190-d04e-4c73-9c8c-835b29b014b5 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] CMD "genisoimage -o /tmp/tmpUdzwfq/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmpxQId_k" returned: 0 in 0.304s execute 
/usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 18:17:07.142 DEBUG oslo_concurrency.processutils [req-59105190-d04e-4c73-9c8c-835b29b014b5 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmpUdzwfq/configdrive of=/dev/xvdc oflag=direct,sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 18:17:07.267 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Releasing semaphore "refresh_cache-4b91aa49-9d80-4533-a63e-04debcc5884a" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:210 2015-08-07 18:17:07.269 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: 4b91aa49-9d80-4533-a63e-04debcc5884a] Updated the network info_cache for instance _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5376 2015-08-07 18:17:07.273 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 1.99 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:17:08.234 DEBUG oslo_concurrency.lockutils [req-2aa6191b-900a-4bd4-a687-84b6c3ad55be tempest-TestServerBasicOps-171890207 tempest-TestServerBasicOps-1821359557] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" released by "_create_cached_image_impl" :: held 8.815s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 18:17:08.236 INFO nova.virt.xenapi.vm_utils [req-2aa6191b-900a-4bd4-a687-84b6c3ad55be tempest-TestServerBasicOps-171890207 tempest-TestServerBasicOps-1821359557] Image creation data, cacheable: True, downloaded: False duration: 8.84 secs for image a69d8d55-1745-492f-be26-cc75d64fc94d 2015-08-07 18:17:09.098 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.88 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:17:09.266 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:17:09.267 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 5.26 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:17:09.658 DEBUG nova.virt.xenapi.vmops [req-2aa6191b-900a-4bd4-a687-84b6c3ad55be tempest-TestServerBasicOps-171890207 tempest-TestServerBasicOps-1821359557] [instance: fb80dcfb-2f23-42c0-a68f-ac9ff1404d68] Updating progress to 20 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 18:17:10.243 DEBUG nova.virt.xenapi.vmops [req-2aa6191b-900a-4bd4-a687-84b6c3ad55be tempest-TestServerBasicOps-171890207 tempest-TestServerBasicOps-1821359557] [instance: fb80dcfb-2f23-42c0-a68f-ac9ff1404d68] Updating progress to 30 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 18:17:10.494 DEBUG nova.virt.xenapi.vmops [req-2aa6191b-900a-4bd4-a687-84b6c3ad55be tempest-TestServerBasicOps-171890207 tempest-TestServerBasicOps-1821359557] [instance: fb80dcfb-2f23-42c0-a68f-ac9ff1404d68] Using PV kernel: True _create_vm_record 
/opt/stack/new/nova/nova/virt/xenapi/vmops.py:679 2015-08-07 18:17:10.509 DEBUG nova.virt.xenapi.vm_utils [req-2aa6191b-900a-4bd4-a687-84b6c3ad55be tempest-TestServerBasicOps-171890207 tempest-TestServerBasicOps-1821359557] [instance: fb80dcfb-2f23-42c0-a68f-ac9ff1404d68] Created VM create_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:312 2015-08-07 18:17:10.510 DEBUG nova.virt.xenapi.vmops [req-2aa6191b-900a-4bd4-a687-84b6c3ad55be tempest-TestServerBasicOps-171890207 tempest-TestServerBasicOps-1821359557] [instance: fb80dcfb-2f23-42c0-a68f-ac9ff1404d68] Updating progress to 40 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 18:17:10.771 DEBUG nova.virt.xenapi.vm_utils [req-2aa6191b-900a-4bd4-a687-84b6c3ad55be tempest-TestServerBasicOps-171890207 tempest-TestServerBasicOps-1821359557] Creating disk-type VBD for VM OpaqueRef:ea2087ff-0e7a-783e-5364-ee8eca407b67, VDI OpaqueRef:1c548db2-373f-8d21-f1eb-de551a293dcd ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 18:17:10.780 DEBUG nova.virt.xenapi.vm_utils [req-2aa6191b-900a-4bd4-a687-84b6c3ad55be tempest-TestServerBasicOps-171890207 tempest-TestServerBasicOps-1821359557] Created VBD OpaqueRef:727fb0f2-a3b4-6016-d8e7-09bafebbb31e for VM OpaqueRef:ea2087ff-0e7a-783e-5364-ee8eca407b67, VDI OpaqueRef:1c548db2-373f-8d21-f1eb-de551a293dcd. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 18:17:11.299 DEBUG nova.virt.xenapi.vm_utils [req-2aa6191b-900a-4bd4-a687-84b6c3ad55be tempest-TestServerBasicOps-171890207 tempest-TestServerBasicOps-1821359557] Created VDI OpaqueRef:46a14eca-86b7-a6db-d5b5-ad409d232fb4 (config-2, 67108864, False) on OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99. create_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:509 2015-08-07 18:17:11.306 DEBUG nova.virt.xenapi.vm_utils [req-2aa6191b-900a-4bd4-a687-84b6c3ad55be tempest-TestServerBasicOps-171890207 tempest-TestServerBasicOps-1821359557] Creating disk-type VBD for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:46a14eca-86b7-a6db-d5b5-ad409d232fb4 ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 18:17:11.320 DEBUG nova.virt.xenapi.vm_utils [req-2aa6191b-900a-4bd4-a687-84b6c3ad55be tempest-TestServerBasicOps-171890207 tempest-TestServerBasicOps-1821359557] Created VBD OpaqueRef:af3852a0-aad4-983d-a18c-ce62d5c91834 for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:46a14eca-86b7-a6db-d5b5-ad409d232fb4. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 18:17:11.320 DEBUG nova.virt.xenapi.vm_utils [req-2aa6191b-900a-4bd4-a687-84b6c3ad55be tempest-TestServerBasicOps-171890207 tempest-TestServerBasicOps-1821359557] Plugging VBD OpaqueRef:af3852a0-aad4-983d-a18c-ce62d5c91834 ... 
vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2191 2015-08-07 18:17:11.323 DEBUG oslo_concurrency.lockutils [req-2aa6191b-900a-4bd4-a687-84b6c3ad55be tempest-TestServerBasicOps-171890207 tempest-TestServerBasicOps-1821359557] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_plug" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 18:17:14.006 DEBUG oslo_concurrency.lockutils [req-2aa6191b-900a-4bd4-a687-84b6c3ad55be tempest-TestServerBasicOps-171890207 tempest-TestServerBasicOps-1821359557] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_plug" :: held 2.683s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 18:17:14.007 DEBUG nova.virt.xenapi.vm_utils [req-2aa6191b-900a-4bd4-a687-84b6c3ad55be tempest-TestServerBasicOps-171890207 tempest-TestServerBasicOps-1821359557] Plugging VBD OpaqueRef:af3852a0-aad4-983d-a18c-ce62d5c91834 done. vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2194 2015-08-07 18:17:14.012 DEBUG nova.virt.xenapi.vm_utils [req-2aa6191b-900a-4bd4-a687-84b6c3ad55be tempest-TestServerBasicOps-171890207 tempest-TestServerBasicOps-1821359557] VBD OpaqueRef:af3852a0-aad4-983d-a18c-ce62d5c91834 plugged as xvdd vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2197 2015-08-07 18:17:14.109 WARNING nova.virt.configdrive [req-2aa6191b-900a-4bd4-a687-84b6c3ad55be tempest-TestServerBasicOps-171890207 tempest-TestServerBasicOps-1821359557] The setting "always" will be deprecated in the Liberty version. Please use "True" instead 2015-08-07 18:17:14.110 DEBUG nova.objects.instance [req-2aa6191b-900a-4bd4-a687-84b6c3ad55be tempest-TestServerBasicOps-171890207 tempest-TestServerBasicOps-1821359557] Lazy-loading `ec2_ids' on Instance uuid fb80dcfb-2f23-42c0-a68f-ac9ff1404d68 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 18:17:14.318 DEBUG oslo_concurrency.processutils [req-2aa6191b-900a-4bd4-a687-84b6c3ad55be tempest-TestServerBasicOps-171890207 tempest-TestServerBasicOps-1821359557] Running cmd (subprocess): genisoimage -o /tmp/tmpoudvyC/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmpviUCBE execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 18:17:14.486 DEBUG oslo_concurrency.processutils [req-2aa6191b-900a-4bd4-a687-84b6c3ad55be tempest-TestServerBasicOps-171890207 tempest-TestServerBasicOps-1821359557] CMD "genisoimage -o /tmp/tmpoudvyC/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmpviUCBE" returned: 0 in 0.168s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 18:17:14.494 DEBUG oslo_concurrency.processutils [req-2aa6191b-900a-4bd4-a687-84b6c3ad55be tempest-TestServerBasicOps-171890207 tempest-TestServerBasicOps-1821359557] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmpoudvyC/configdrive of=/dev/xvdd oflag=direct,sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 18:17:14.665 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks 
/usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:17:14.668 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 2.86 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:17:17.528 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:17:17.529 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 2.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:17:18.092 DEBUG oslo_concurrency.processutils [req-59105190-d04e-4c73-9c8c-835b29b014b5 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmpUdzwfq/configdrive of=/dev/xvdc oflag=direct,sync" returned: 0 in 10.951s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 18:17:18.095 DEBUG oslo_concurrency.processutils [req-59105190-d04e-4c73-9c8c-835b29b014b5 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 18:17:19.069 DEBUG oslo_concurrency.processutils [req-59105190-d04e-4c73-9c8c-835b29b014b5 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf sync" returned: 0 in 0.974s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 18:17:19.071 DEBUG nova.virt.xenapi.vm_utils [req-59105190-d04e-4c73-9c8c-835b29b014b5 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] Destroying VBD for VDI OpaqueRef:7f2ea0d3-3f7c-27e1-2f22-a26a772c203d ... 
vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2207 2015-08-07 18:17:19.072 DEBUG oslo_concurrency.lockutils [req-59105190-d04e-4c73-9c8c-835b29b014b5 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_unplug" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 18:17:19.081 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.90 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:17:19.258 DEBUG oslo_concurrency.lockutils [req-59105190-d04e-4c73-9c8c-835b29b014b5 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_unplug" :: held 0.186s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 18:17:19.259 INFO nova.virt.xenapi.vm_utils [req-59105190-d04e-4c73-9c8c-835b29b014b5 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] VBD OpaqueRef:6303336f-44b4-536c-8f85-c61fb5efdb63 unplug failed with "DEVICE_DETACH_REJECTED", attempt 1/11 2015-08-07 18:17:19.528 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:17:19.529 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /opt/stack/new/nova/nova/compute/manager.py:5976 2015-08-07 18:17:19.530 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 7.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:17:20.260 DEBUG oslo_concurrency.lockutils [req-59105190-d04e-4c73-9c8c-835b29b014b5 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_unplug" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 18:17:21.878 DEBUG oslo_concurrency.lockutils [req-59105190-d04e-4c73-9c8c-835b29b014b5 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_unplug" :: held 1.618s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 18:17:21.888 DEBUG nova.virt.xenapi.vm_utils [req-59105190-d04e-4c73-9c8c-835b29b014b5 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] Destroying VBD for VDI OpaqueRef:7f2ea0d3-3f7c-27e1-2f22-a26a772c203d done. vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2215 2015-08-07 18:17:21.889 DEBUG nova.virt.xenapi.vm_utils [req-59105190-d04e-4c73-9c8c-835b29b014b5 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] Creating disk-type VBD for VM OpaqueRef:4948c829-6739-4ce5-4ecf-ef2986fdd6dc, VDI OpaqueRef:7f2ea0d3-3f7c-27e1-2f22-a26a772c203d ... 
create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 18:17:21.898 DEBUG nova.virt.xenapi.vm_utils [req-59105190-d04e-4c73-9c8c-835b29b014b5 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] Created VBD OpaqueRef:f0b77e1b-0c39-90c8-2b58-3ec8e7da82de for VM OpaqueRef:4948c829-6739-4ce5-4ecf-ef2986fdd6dc, VDI OpaqueRef:7f2ea0d3-3f7c-27e1-2f22-a26a772c203d. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 18:17:21.900 DEBUG nova.objects.instance [req-59105190-d04e-4c73-9c8c-835b29b014b5 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] Lazy-loading `pci_devices' on Instance uuid 4b91aa49-9d80-4533-a63e-04debcc5884a obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 18:17:22.103 DEBUG oslo_concurrency.lockutils [req-59105190-d04e-4c73-9c8c-835b29b014b5 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] Lock "xenstore-4b91aa49-9d80-4533-a63e-04debcc5884a" acquired by "store_meta" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 18:17:22.104 DEBUG oslo_concurrency.lockutils [req-59105190-d04e-4c73-9c8c-835b29b014b5 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] Lock "xenstore-4b91aa49-9d80-4533-a63e-04debcc5884a" released by "store_meta" :: held 0.001s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 18:17:22.104 DEBUG oslo_concurrency.lockutils [req-59105190-d04e-4c73-9c8c-835b29b014b5 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] Lock "xenstore-4b91aa49-9d80-4533-a63e-04debcc5884a" acquired by "store_auto_disk_config" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 18:17:22.122 DEBUG oslo_concurrency.lockutils [req-59105190-d04e-4c73-9c8c-835b29b014b5 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] Lock "xenstore-4b91aa49-9d80-4533-a63e-04debcc5884a" released by "store_auto_disk_config" :: held 0.018s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 18:17:22.123 DEBUG nova.virt.xenapi.vmops [req-59105190-d04e-4c73-9c8c-835b29b014b5 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] [instance: 4b91aa49-9d80-4533-a63e-04debcc5884a] Injecting network info to xenstore inject_network_info /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1869 2015-08-07 18:17:22.124 DEBUG oslo_concurrency.lockutils [req-59105190-d04e-4c73-9c8c-835b29b014b5 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] Lock "xenstore-4b91aa49-9d80-4533-a63e-04debcc5884a" acquired by "update_nwinfo" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 18:17:22.428 DEBUG oslo_concurrency.lockutils [req-59105190-d04e-4c73-9c8c-835b29b014b5 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] Lock "xenstore-4b91aa49-9d80-4533-a63e-04debcc5884a" released by "update_nwinfo" :: held 0.304s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 18:17:22.428 DEBUG nova.virt.xenapi.vmops [req-59105190-d04e-4c73-9c8c-835b29b014b5 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] [instance: 4b91aa49-9d80-4533-a63e-04debcc5884a] Creating vifs _create_vifs 
/opt/stack/new/nova/nova/virt/xenapi/vmops.py:1892 2015-08-07 18:17:22.438 DEBUG nova.virt.xenapi.vmops [req-59105190-d04e-4c73-9c8c-835b29b014b5 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] [instance: 4b91aa49-9d80-4533-a63e-04debcc5884a] Creating VIF for network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1902 2015-08-07 18:17:22.449 DEBUG nova.virt.xenapi.vmops [req-59105190-d04e-4c73-9c8c-835b29b014b5 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] [instance: 4b91aa49-9d80-4533-a63e-04debcc5884a] Created VIF OpaqueRef:301919aa-8fa0-8596-8962-f18d432c3225, network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1906 2015-08-07 18:17:22.450 DEBUG nova.virt.xenapi.vmops [req-59105190-d04e-4c73-9c8c-835b29b014b5 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] [instance: 4b91aa49-9d80-4533-a63e-04debcc5884a] Starting instance _start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:327 2015-08-07 18:17:24.819 DEBUG oslo_concurrency.processutils [req-2aa6191b-900a-4bd4-a687-84b6c3ad55be tempest-TestServerBasicOps-171890207 tempest-TestServerBasicOps-1821359557] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmpoudvyC/configdrive of=/dev/xvdd oflag=direct,sync" returned: 0 in 10.325s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 18:17:24.821 DEBUG oslo_concurrency.processutils [req-2aa6191b-900a-4bd4-a687-84b6c3ad55be tempest-TestServerBasicOps-171890207 tempest-TestServerBasicOps-1821359557] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 18:17:25.685 DEBUG oslo_concurrency.processutils [req-2aa6191b-900a-4bd4-a687-84b6c3ad55be tempest-TestServerBasicOps-171890207 tempest-TestServerBasicOps-1821359557] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf sync" returned: 0 in 0.864s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 18:17:25.689 DEBUG nova.virt.xenapi.vm_utils [req-2aa6191b-900a-4bd4-a687-84b6c3ad55be tempest-TestServerBasicOps-171890207 tempest-TestServerBasicOps-1821359557] Destroying VBD for VDI OpaqueRef:46a14eca-86b7-a6db-d5b5-ad409d232fb4 ... 
vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2207 2015-08-07 18:17:25.690 DEBUG oslo_concurrency.lockutils [req-2aa6191b-900a-4bd4-a687-84b6c3ad55be tempest-TestServerBasicOps-171890207 tempest-TestServerBasicOps-1821359557] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_unplug" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 18:17:26.528 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:17:26.615 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Auditing locally available compute resources for node localhost.localdomain 2015-08-07 18:17:26.616 DEBUG nova.virt.xenapi.host [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Updating host stats update_status /opt/stack/new/nova/nova/virt/xenapi/host.py:234 2015-08-07 18:17:27.090 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" acquired by "do_scan" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 18:17:27.091 DEBUG nova.virt.xenapi.vm_utils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Scanning SR OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99 do_scan /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1839 2015-08-07 18:17:27.360 DEBUG oslo_concurrency.lockutils [req-2aa6191b-900a-4bd4-a687-84b6c3ad55be tempest-TestServerBasicOps-171890207 tempest-TestServerBasicOps-1821359557] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_unplug" :: held 1.670s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 18:17:27.374 DEBUG nova.virt.xenapi.vm_utils [req-2aa6191b-900a-4bd4-a687-84b6c3ad55be tempest-TestServerBasicOps-171890207 tempest-TestServerBasicOps-1821359557] Destroying VBD for VDI OpaqueRef:46a14eca-86b7-a6db-d5b5-ad409d232fb4 done. vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2215 2015-08-07 18:17:27.375 DEBUG nova.virt.xenapi.vm_utils [req-2aa6191b-900a-4bd4-a687-84b6c3ad55be tempest-TestServerBasicOps-171890207 tempest-TestServerBasicOps-1821359557] Creating disk-type VBD for VM OpaqueRef:ea2087ff-0e7a-783e-5364-ee8eca407b67, VDI OpaqueRef:46a14eca-86b7-a6db-d5b5-ad409d232fb4 ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 18:17:27.387 DEBUG nova.virt.xenapi.vm_utils [req-2aa6191b-900a-4bd4-a687-84b6c3ad55be tempest-TestServerBasicOps-171890207 tempest-TestServerBasicOps-1821359557] Created VBD OpaqueRef:b017e53d-04eb-acbe-146a-abc059c5fd8c for VM OpaqueRef:ea2087ff-0e7a-783e-5364-ee8eca407b67, VDI OpaqueRef:46a14eca-86b7-a6db-d5b5-ad409d232fb4. 
create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 18:17:27.388 DEBUG nova.objects.instance [req-2aa6191b-900a-4bd4-a687-84b6c3ad55be tempest-TestServerBasicOps-171890207 tempest-TestServerBasicOps-1821359557] Lazy-loading `pci_devices' on Instance uuid fb80dcfb-2f23-42c0-a68f-ac9ff1404d68 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 18:17:27.535 DEBUG nova.virt.xenapi.vmops [req-2aa6191b-900a-4bd4-a687-84b6c3ad55be tempest-TestServerBasicOps-171890207 tempest-TestServerBasicOps-1821359557] [instance: fb80dcfb-2f23-42c0-a68f-ac9ff1404d68] Updating progress to 50 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 18:17:27.926 DEBUG oslo_concurrency.lockutils [req-2aa6191b-900a-4bd4-a687-84b6c3ad55be tempest-TestServerBasicOps-171890207 tempest-TestServerBasicOps-1821359557] Lock "xenstore-fb80dcfb-2f23-42c0-a68f-ac9ff1404d68" acquired by "store_meta" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 18:17:27.927 DEBUG oslo_concurrency.lockutils [req-2aa6191b-900a-4bd4-a687-84b6c3ad55be tempest-TestServerBasicOps-171890207 tempest-TestServerBasicOps-1821359557] Lock "xenstore-fb80dcfb-2f23-42c0-a68f-ac9ff1404d68" released by "store_meta" :: held 0.001s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 18:17:27.927 DEBUG oslo_concurrency.lockutils [req-2aa6191b-900a-4bd4-a687-84b6c3ad55be tempest-TestServerBasicOps-171890207 tempest-TestServerBasicOps-1821359557] Lock "xenstore-fb80dcfb-2f23-42c0-a68f-ac9ff1404d68" acquired by "store_auto_disk_config" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 18:17:27.941 DEBUG oslo_concurrency.lockutils [req-2aa6191b-900a-4bd4-a687-84b6c3ad55be tempest-TestServerBasicOps-171890207 tempest-TestServerBasicOps-1821359557] Lock "xenstore-fb80dcfb-2f23-42c0-a68f-ac9ff1404d68" released by "store_auto_disk_config" :: held 0.014s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 18:17:27.942 DEBUG nova.virt.xenapi.vmops [req-2aa6191b-900a-4bd4-a687-84b6c3ad55be tempest-TestServerBasicOps-171890207 tempest-TestServerBasicOps-1821359557] [instance: fb80dcfb-2f23-42c0-a68f-ac9ff1404d68] Injecting hostname (tempest-testserverbasicops-1001009021) into xenstore _inject_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1940 2015-08-07 18:17:27.943 DEBUG oslo_concurrency.lockutils [req-2aa6191b-900a-4bd4-a687-84b6c3ad55be tempest-TestServerBasicOps-171890207 tempest-TestServerBasicOps-1821359557] Lock "xenstore-fb80dcfb-2f23-42c0-a68f-ac9ff1404d68" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 18:17:27.963 DEBUG oslo_concurrency.lockutils [req-2aa6191b-900a-4bd4-a687-84b6c3ad55be tempest-TestServerBasicOps-171890207 tempest-TestServerBasicOps-1821359557] Lock "xenstore-fb80dcfb-2f23-42c0-a68f-ac9ff1404d68" released by "update_hostname" :: held 0.020s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 18:17:27.963 DEBUG nova.virt.xenapi.vmops [req-2aa6191b-900a-4bd4-a687-84b6c3ad55be tempest-TestServerBasicOps-171890207 tempest-TestServerBasicOps-1821359557] [instance: fb80dcfb-2f23-42c0-a68f-ac9ff1404d68] Injecting network info to xenstore inject_network_info /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1869 2015-08-07 18:17:27.964 DEBUG 
oslo_concurrency.lockutils [req-2aa6191b-900a-4bd4-a687-84b6c3ad55be tempest-TestServerBasicOps-171890207 tempest-TestServerBasicOps-1821359557] Lock "xenstore-fb80dcfb-2f23-42c0-a68f-ac9ff1404d68" acquired by "update_nwinfo" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 18:17:27.990 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" released by "do_scan" :: held 0.900s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 18:17:28.341 DEBUG oslo_concurrency.lockutils [req-2aa6191b-900a-4bd4-a687-84b6c3ad55be tempest-TestServerBasicOps-171890207 tempest-TestServerBasicOps-1821359557] Lock "xenstore-fb80dcfb-2f23-42c0-a68f-ac9ff1404d68" released by "update_nwinfo" :: held 0.377s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 18:17:28.342 DEBUG nova.virt.xenapi.vmops [req-2aa6191b-900a-4bd4-a687-84b6c3ad55be tempest-TestServerBasicOps-171890207 tempest-TestServerBasicOps-1821359557] [instance: fb80dcfb-2f23-42c0-a68f-ac9ff1404d68] Updating progress to 60 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 18:17:28.392 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor: free VCPUs: 2 _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:524 2015-08-07 18:17:28.398 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor: assignable PCI devices: [] _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:532 2015-08-07 18:17:28.399 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor/Node resource view: name=localhost.localdomain free_ram=858MB free_disk=12GB free_vcpus=2 pci_devices=[] _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:546 2015-08-07 18:17:28.400 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "compute_resources" acquired by "_update_available_resource" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 18:17:28.683 DEBUG nova.virt.xenapi.vmops [req-2aa6191b-900a-4bd4-a687-84b6c3ad55be tempest-TestServerBasicOps-171890207 tempest-TestServerBasicOps-1821359557] [instance: fb80dcfb-2f23-42c0-a68f-ac9ff1404d68] Creating vifs _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1892 2015-08-07 18:17:28.702 DEBUG nova.virt.xenapi.vmops [req-2aa6191b-900a-4bd4-a687-84b6c3ad55be tempest-TestServerBasicOps-171890207 tempest-TestServerBasicOps-1821359557] [instance: fb80dcfb-2f23-42c0-a68f-ac9ff1404d68] Creating VIF for network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1902 2015-08-07 18:17:28.711 DEBUG nova.virt.xenapi.vmops [req-2aa6191b-900a-4bd4-a687-84b6c3ad55be tempest-TestServerBasicOps-171890207 tempest-TestServerBasicOps-1821359557] [instance: fb80dcfb-2f23-42c0-a68f-ac9ff1404d68] Created VIF OpaqueRef:a6b0fc58-55ea-ed20-b2d5-febd119f0c53, network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1906 2015-08-07 18:17:28.712 DEBUG nova.virt.xenapi.vmops [req-2aa6191b-900a-4bd4-a687-84b6c3ad55be tempest-TestServerBasicOps-171890207 
tempest-TestServerBasicOps-1821359557] [instance: fb80dcfb-2f23-42c0-a68f-ac9ff1404d68] Updating progress to 70 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 18:17:28.895 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Updating from migration 4b91aa49-9d80-4533-a63e-04debcc5884a 2015-08-07 18:17:28.897 DEBUG nova.objects.instance [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lazy-loading `old_flavor' on Instance uuid 4b91aa49-9d80-4533-a63e-04debcc5884a obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 18:17:29.024 DEBUG nova.virt.xenapi.vmops [req-2aa6191b-900a-4bd4-a687-84b6c3ad55be tempest-TestServerBasicOps-171890207 tempest-TestServerBasicOps-1821359557] [instance: fb80dcfb-2f23-42c0-a68f-ac9ff1404d68] Starting instance _start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:327 2015-08-07 18:17:29.164 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.85 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:17:29.228 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Total usable vcpus: 8, total allocated vcpus: 6 2015-08-07 18:17:29.228 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Final resource view: name=localhost.localdomain phys_ram=8187MB used_ram=784MB phys_disk=27GB used_disk=0GB total_vcpus=8 used_vcpus=6 pci_stats=None 2015-08-07 18:17:29.399 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Compute_service record updated for devstack:localhost.localdomain 2015-08-07 18:17:29.400 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "compute_resources" released by "_update_available_resource" :: held 1.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 18:17:29.401 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:17:29.402 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 3.13 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:17:32.529 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:17:32.530 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 6.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:17:34.259 DEBUG nova.virt.xenapi.vmops [req-59105190-d04e-4c73-9c8c-835b29b014b5 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] [instance: 4b91aa49-9d80-4533-a63e-04debcc5884a] Waiting for instance state to become running _wait_for_instance_to_start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:777 2015-08-07 18:17:34.289 DEBUG nova.virt.xenapi.vmops [req-59105190-d04e-4c73-9c8c-835b29b014b5 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] [instance: 
4b91aa49-9d80-4533-a63e-04debcc5884a] Updating progress to 100 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 18:17:36.039 DEBUG oslo_concurrency.lockutils [req-ff0c8230-124f-401d-bc81-6e6cc1e9528d tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] Lock "4b91aa49-9d80-4533-a63e-04debcc5884a" acquired by "do_confirm_resize" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 18:17:36.040 DEBUG nova.compute.manager [req-ff0c8230-124f-401d-bc81-6e6cc1e9528d tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] [instance: 4b91aa49-9d80-4533-a63e-04debcc5884a] Going to confirm migration 8 do_confirm_resize /opt/stack/new/nova/nova/compute/manager.py:3243 2015-08-07 18:17:37.749 DEBUG oslo_concurrency.lockutils [req-ff0c8230-124f-401d-bc81-6e6cc1e9528d tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] Acquired semaphore "refresh_cache-4b91aa49-9d80-4533-a63e-04debcc5884a" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:197 2015-08-07 18:17:38.022 DEBUG nova.network.base_api [req-ff0c8230-124f-401d-bc81-6e6cc1e9528d tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] [instance: 4b91aa49-9d80-4533-a63e-04debcc5884a] Updating instance_info_cache with network_info: [VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.2'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:70:61:ff', 'active': False, 'type': u'bridge', 'id': u'ad6a5ba7-38fa-4095-9614-5ff5d309d974', 'qbg_params': None})] update_instance_cache_with_nw_info /opt/stack/new/nova/nova/network/base_api.py:43 2015-08-07 18:17:38.053 DEBUG oslo_concurrency.lockutils [req-ff0c8230-124f-401d-bc81-6e6cc1e9528d tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] Releasing semaphore "refresh_cache-4b91aa49-9d80-4533-a63e-04debcc5884a" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:210 2015-08-07 18:17:38.063 WARNING nova.virt.xenapi.vm_utils [req-ff0c8230-124f-401d-bc81-6e6cc1e9528d tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] [instance: 4b91aa49-9d80-4533-a63e-04debcc5884a] VM already halted, skipping shutdown... 
2015-08-07 18:17:38.093 DEBUG nova.virt.xenapi.vmops [req-ff0c8230-124f-401d-bc81-6e6cc1e9528d tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] [instance: 4b91aa49-9d80-4533-a63e-04debcc5884a] Destroying VDIs _destroy_vdis /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1428 2015-08-07 18:17:38.105 DEBUG nova.virt.xenapi.vm_utils [req-ff0c8230-124f-401d-bc81-6e6cc1e9528d tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] VDI 7d903684-08cf-4025-9e09-43d6f6f9f7e9 is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 18:17:38.119 DEBUG nova.virt.xenapi.vm_utils [req-ff0c8230-124f-401d-bc81-6e6cc1e9528d tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] VDI 5c2486fe-665b-4afe-b33d-11bfaac77ab3 is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 18:17:38.528 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:17:38.529 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 27.97 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:17:39.182 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.84 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:17:39.353 DEBUG nova.virt.xenapi.vmops [req-ff0c8230-124f-401d-bc81-6e6cc1e9528d tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] [instance: 4b91aa49-9d80-4533-a63e-04debcc5884a] Using RAW or VHD, skipping kernel and ramdisk deletion _destroy_kernel_ramdisk /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1456 2015-08-07 18:17:39.367 DEBUG nova.virt.xenapi.vm_utils [req-ff0c8230-124f-401d-bc81-6e6cc1e9528d tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] [instance: 4b91aa49-9d80-4533-a63e-04debcc5884a] VM destroyed destroy_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:324 2015-08-07 18:17:39.454 DEBUG oslo_concurrency.lockutils [req-ff0c8230-124f-401d-bc81-6e6cc1e9528d tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] Lock "compute_resources" acquired by "drop_move_claim" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 18:17:39.541 DEBUG oslo_concurrency.lockutils [req-ff0c8230-124f-401d-bc81-6e6cc1e9528d tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] Lock "compute_resources" released by "drop_move_claim" :: held 0.086s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 18:17:39.873 DEBUG oslo_concurrency.lockutils [req-ff0c8230-124f-401d-bc81-6e6cc1e9528d tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] Lock "4b91aa49-9d80-4533-a63e-04debcc5884a" released by "do_confirm_resize" :: held 3.834s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 18:17:40.227 DEBUG nova.virt.xenapi.vmops [req-2aa6191b-900a-4bd4-a687-84b6c3ad55be tempest-TestServerBasicOps-171890207 tempest-TestServerBasicOps-1821359557] [instance: fb80dcfb-2f23-42c0-a68f-ac9ff1404d68] Waiting for instance 
state to become running _wait_for_instance_to_start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:777 2015-08-07 18:17:40.277 DEBUG nova.virt.xenapi.vmops [req-2aa6191b-900a-4bd4-a687-84b6c3ad55be tempest-TestServerBasicOps-171890207 tempest-TestServerBasicOps-1821359557] [instance: fb80dcfb-2f23-42c0-a68f-ac9ff1404d68] Updating progress to 80 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 18:17:40.620 DEBUG nova.virt.xenapi.vmops [req-2aa6191b-900a-4bd4-a687-84b6c3ad55be tempest-TestServerBasicOps-171890207 tempest-TestServerBasicOps-1821359557] [instance: fb80dcfb-2f23-42c0-a68f-ac9ff1404d68] Skip agent setup, not enabled. _configure_new_instance_with_agent /opt/stack/new/nova/nova/virt/xenapi/vmops.py:788 2015-08-07 18:17:40.621 DEBUG nova.virt.xenapi.vmops [req-2aa6191b-900a-4bd4-a687-84b6c3ad55be tempest-TestServerBasicOps-171890207 tempest-TestServerBasicOps-1821359557] [instance: fb80dcfb-2f23-42c0-a68f-ac9ff1404d68] Removing hostname from xenstore _remove_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1949 2015-08-07 18:17:40.622 DEBUG oslo_concurrency.lockutils [req-2aa6191b-900a-4bd4-a687-84b6c3ad55be tempest-TestServerBasicOps-171890207 tempest-TestServerBasicOps-1821359557] Lock "xenstore-fb80dcfb-2f23-42c0-a68f-ac9ff1404d68" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 18:17:40.628 DEBUG oslo_concurrency.lockutils [req-2aa6191b-900a-4bd4-a687-84b6c3ad55be tempest-TestServerBasicOps-171890207 tempest-TestServerBasicOps-1821359557] Lock "xenstore-fb80dcfb-2f23-42c0-a68f-ac9ff1404d68" released by "update_hostname" :: held 0.006s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 18:17:40.628 DEBUG nova.virt.xenapi.vmops [req-2aa6191b-900a-4bd4-a687-84b6c3ad55be tempest-TestServerBasicOps-171890207 tempest-TestServerBasicOps-1821359557] [instance: fb80dcfb-2f23-42c0-a68f-ac9ff1404d68] Updating progress to 90 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 18:17:40.926 DEBUG nova.virt.xenapi.vmops [req-2aa6191b-900a-4bd4-a687-84b6c3ad55be tempest-TestServerBasicOps-171890207 tempest-TestServerBasicOps-1821359557] [instance: fb80dcfb-2f23-42c0-a68f-ac9ff1404d68] Updating progress to 100 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 18:17:41.276 DEBUG nova.compute.manager [req-2aa6191b-900a-4bd4-a687-84b6c3ad55be tempest-TestServerBasicOps-171890207 tempest-TestServerBasicOps-1821359557] [instance: fb80dcfb-2f23-42c0-a68f-ac9ff1404d68] Checking state _get_power_state /opt/stack/new/nova/nova/compute/manager.py:1296 2015-08-07 18:17:41.712 DEBUG oslo_concurrency.lockutils [req-f76bbbfc-644f-45df-aa12-ea463d0aa0c5 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] Lock "4b91aa49-9d80-4533-a63e-04debcc5884a" acquired by "do_terminate_instance" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 18:17:41.713 DEBUG oslo_concurrency.lockutils [req-f76bbbfc-644f-45df-aa12-ea463d0aa0c5 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] Lock "4b91aa49-9d80-4533-a63e-04debcc5884a-events" acquired by "_clear_events" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 18:17:41.713 DEBUG oslo_concurrency.lockutils [req-f76bbbfc-644f-45df-aa12-ea463d0aa0c5 
tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] Lock "4b91aa49-9d80-4533-a63e-04debcc5884a-events" released by "_clear_events" :: held 0.001s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 18:17:41.715 INFO nova.compute.manager [req-f76bbbfc-644f-45df-aa12-ea463d0aa0c5 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] [instance: 4b91aa49-9d80-4533-a63e-04debcc5884a] Terminating instance 2015-08-07 18:17:41.718 INFO nova.virt.xenapi.vmops [req-f76bbbfc-644f-45df-aa12-ea463d0aa0c5 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] [instance: 4b91aa49-9d80-4533-a63e-04debcc5884a] Destroying VM 2015-08-07 18:17:41.737 DEBUG nova.virt.xenapi.vm_utils [req-f76bbbfc-644f-45df-aa12-ea463d0aa0c5 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] [instance: 4b91aa49-9d80-4533-a63e-04debcc5884a] Shutting down VM (hard) hard_shutdown_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:348 2015-08-07 18:17:41.967 DEBUG oslo_concurrency.lockutils [req-2aa6191b-900a-4bd4-a687-84b6c3ad55be tempest-TestServerBasicOps-171890207 tempest-TestServerBasicOps-1821359557] Lock "fb80dcfb-2f23-42c0-a68f-ac9ff1404d68" released by "_locked_do_build_and_run_instance" :: held 45.002s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 18:17:43.703 DEBUG oslo_concurrency.lockutils [req-750fe103-5d03-4d06-be0c-ad575f214f6b tempest-TestServerBasicOps-171890207 tempest-TestServerBasicOps-1821359557] Lock "fb80dcfb-2f23-42c0-a68f-ac9ff1404d68" acquired by "do_terminate_instance" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 18:17:43.704 DEBUG oslo_concurrency.lockutils [req-750fe103-5d03-4d06-be0c-ad575f214f6b tempest-TestServerBasicOps-171890207 tempest-TestServerBasicOps-1821359557] Lock "fb80dcfb-2f23-42c0-a68f-ac9ff1404d68-events" acquired by "_clear_events" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 18:17:43.704 DEBUG oslo_concurrency.lockutils [req-750fe103-5d03-4d06-be0c-ad575f214f6b tempest-TestServerBasicOps-171890207 tempest-TestServerBasicOps-1821359557] Lock "fb80dcfb-2f23-42c0-a68f-ac9ff1404d68-events" released by "_clear_events" :: held 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 18:17:43.706 INFO nova.compute.manager [req-750fe103-5d03-4d06-be0c-ad575f214f6b tempest-TestServerBasicOps-171890207 tempest-TestServerBasicOps-1821359557] [instance: fb80dcfb-2f23-42c0-a68f-ac9ff1404d68] Terminating instance 2015-08-07 18:17:43.707 INFO nova.virt.xenapi.vmops [req-750fe103-5d03-4d06-be0c-ad575f214f6b tempest-TestServerBasicOps-171890207 tempest-TestServerBasicOps-1821359557] [instance: fb80dcfb-2f23-42c0-a68f-ac9ff1404d68] Destroying VM 2015-08-07 18:17:43.762 DEBUG nova.virt.xenapi.vm_utils [req-750fe103-5d03-4d06-be0c-ad575f214f6b tempest-TestServerBasicOps-171890207 tempest-TestServerBasicOps-1821359557] [instance: fb80dcfb-2f23-42c0-a68f-ac9ff1404d68] Shutting down VM (hard) hard_shutdown_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:348 2015-08-07 18:17:45.907 DEBUG nova.virt.xenapi.vmops [req-f76bbbfc-644f-45df-aa12-ea463d0aa0c5 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] [instance: 4b91aa49-9d80-4533-a63e-04debcc5884a] Destroying VDIs _destroy_vdis 
/opt/stack/new/nova/nova/virt/xenapi/vmops.py:1428 2015-08-07 18:17:45.929 DEBUG nova.virt.xenapi.vm_utils [req-f76bbbfc-644f-45df-aa12-ea463d0aa0c5 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] VDI 0ccea983-0e12-4d3f-a524-87be1b8b472e is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 18:17:45.956 DEBUG nova.virt.xenapi.vm_utils [req-f76bbbfc-644f-45df-aa12-ea463d0aa0c5 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] VDI 7a4f5599-0c75-4dcc-a07f-5c5726eb1029 is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 18:17:47.239 DEBUG nova.virt.xenapi.vmops [req-f76bbbfc-644f-45df-aa12-ea463d0aa0c5 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] [instance: 4b91aa49-9d80-4533-a63e-04debcc5884a] Using RAW or VHD, skipping kernel and ramdisk deletion _destroy_kernel_ramdisk /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1456 2015-08-07 18:17:47.253 DEBUG nova.virt.xenapi.vm_utils [req-f76bbbfc-644f-45df-aa12-ea463d0aa0c5 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] [instance: 4b91aa49-9d80-4533-a63e-04debcc5884a] VM destroyed destroy_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:324 2015-08-07 18:17:47.254 DEBUG nova.compute.manager [req-f76bbbfc-644f-45df-aa12-ea463d0aa0c5 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] [instance: 4b91aa49-9d80-4533-a63e-04debcc5884a] Deallocating network for instance _deallocate_network /opt/stack/new/nova/nova/compute/manager.py:1792 2015-08-07 18:17:48.587 DEBUG nova.virt.xenapi.vmops [req-750fe103-5d03-4d06-be0c-ad575f214f6b tempest-TestServerBasicOps-171890207 tempest-TestServerBasicOps-1821359557] [instance: fb80dcfb-2f23-42c0-a68f-ac9ff1404d68] Destroying VDIs _destroy_vdis /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1428 2015-08-07 18:17:48.603 DEBUG nova.virt.xenapi.vm_utils [req-750fe103-5d03-4d06-be0c-ad575f214f6b tempest-TestServerBasicOps-171890207 tempest-TestServerBasicOps-1821359557] VDI 43a53cc5-0675-4262-b6ea-45a867b92b14 is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 18:17:48.611 DEBUG nova.virt.xenapi.vm_utils [req-750fe103-5d03-4d06-be0c-ad575f214f6b tempest-TestServerBasicOps-171890207 tempest-TestServerBasicOps-1821359557] VDI 08c40040-b5e9-4fb8-aba3-699a4f7e865f is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 18:17:49.105 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.91 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:17:49.317 DEBUG nova.compute.manager [req-f76bbbfc-644f-45df-aa12-ea463d0aa0c5 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] [instance: 4b91aa49-9d80-4533-a63e-04debcc5884a] terminating bdm BlockDeviceMapping(boot_index=0,connection_info=None,created_at=2015-08-07T18:15:22Z,delete_on_termination=True,deleted=False,deleted_at=None,destination_type='local',device_name='/dev/xvda',device_type='disk',disk_bus=None,guest_format=None,id=91,image_id='a69d8d55-1745-492f-be26-cc75d64fc94d',instance=,instance_uuid=4b91aa49-9d80-4533-a63e-04debcc5884a,no_device=False,snapshot_id=None,source_type='image',updated_at=2015-08-07T18:15:24Z,volume_id=None,volume_size=None) _cleanup_volumes /opt/stack/new/nova/nova/compute/manager.py:2272 
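[Editor's note] The paired 'Lock "..." acquired by "..." :: waited' / 'released by "..." :: held' lines throughout this log are emitted by oslo.concurrency's lockutils wrapper around the named critical sections (per-instance locks, "compute_resources", xenstore locks, and so on). A minimal sketch of guarding code the same way, assuming an in-process lock and made-up names; this is not how nova wires its decorators exactly:

    from oslo_concurrency import lockutils

    # Decorator form: concurrent calls are serialized on the named lock, and
    # lockutils logs the acquired/released timing lines seen above at DEBUG.
    @lockutils.synchronized('compute_resources')
    def update_usage():
        pass  # critical section body

    # Context-manager form, e.g. for a per-instance lock name.
    def terminate(instance_uuid):
        with lockutils.lock(instance_uuid):
            pass  # the terminate work would run while the lock is held
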
2015-08-07 18:17:49.475 DEBUG nova.virt.xenapi.vmops [req-750fe103-5d03-4d06-be0c-ad575f214f6b tempest-TestServerBasicOps-171890207 tempest-TestServerBasicOps-1821359557] [instance: fb80dcfb-2f23-42c0-a68f-ac9ff1404d68] Using RAW or VHD, skipping kernel and ramdisk deletion _destroy_kernel_ramdisk /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1456 2015-08-07 18:17:49.502 DEBUG nova.virt.xenapi.vm_utils [req-750fe103-5d03-4d06-be0c-ad575f214f6b tempest-TestServerBasicOps-171890207 tempest-TestServerBasicOps-1821359557] [instance: fb80dcfb-2f23-42c0-a68f-ac9ff1404d68] VM destroyed destroy_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:324 2015-08-07 18:17:49.503 DEBUG nova.compute.manager [req-750fe103-5d03-4d06-be0c-ad575f214f6b tempest-TestServerBasicOps-171890207 tempest-TestServerBasicOps-1821359557] [instance: fb80dcfb-2f23-42c0-a68f-ac9ff1404d68] Deallocating network for instance _deallocate_network /opt/stack/new/nova/nova/compute/manager.py:1792 2015-08-07 18:17:49.524 DEBUG oslo_concurrency.lockutils [req-f76bbbfc-644f-45df-aa12-ea463d0aa0c5 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] Lock "compute_resources" acquired by "update_usage" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 18:17:49.525 DEBUG nova.objects.instance [req-f76bbbfc-644f-45df-aa12-ea463d0aa0c5 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] Lazy-loading `numa_topology' on Instance uuid 4b91aa49-9d80-4533-a63e-04debcc5884a obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 18:17:49.611 DEBUG oslo_concurrency.lockutils [req-f76bbbfc-644f-45df-aa12-ea463d0aa0c5 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] Lock "compute_resources" released by "update_usage" :: held 0.087s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 18:17:49.938 DEBUG oslo_concurrency.lockutils [req-f76bbbfc-644f-45df-aa12-ea463d0aa0c5 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] Lock "4b91aa49-9d80-4533-a63e-04debcc5884a" released by "do_terminate_instance" :: held 8.227s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 18:17:52.572 DEBUG nova.compute.manager [req-750fe103-5d03-4d06-be0c-ad575f214f6b tempest-TestServerBasicOps-171890207 tempest-TestServerBasicOps-1821359557] [instance: fb80dcfb-2f23-42c0-a68f-ac9ff1404d68] terminating bdm BlockDeviceMapping(boot_index=0,connection_info=None,created_at=2015-08-07T18:16:56Z,delete_on_termination=True,deleted=False,deleted_at=None,destination_type='local',device_name='/dev/xvda',device_type='disk',disk_bus=None,guest_format=None,id=92,image_id='a69d8d55-1745-492f-be26-cc75d64fc94d',instance=,instance_uuid=fb80dcfb-2f23-42c0-a68f-ac9ff1404d68,no_device=False,snapshot_id=None,source_type='image',updated_at=2015-08-07T18:16:58Z,volume_id=None,volume_size=None) _cleanup_volumes /opt/stack/new/nova/nova/compute/manager.py:2272 2015-08-07 18:17:52.817 DEBUG oslo_concurrency.lockutils [req-750fe103-5d03-4d06-be0c-ad575f214f6b tempest-TestServerBasicOps-171890207 tempest-TestServerBasicOps-1821359557] Lock "compute_resources" acquired by "update_usage" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 18:17:52.818 DEBUG nova.objects.instance [req-750fe103-5d03-4d06-be0c-ad575f214f6b tempest-TestServerBasicOps-171890207 
tempest-TestServerBasicOps-1821359557] Lazy-loading `numa_topology' on Instance uuid fb80dcfb-2f23-42c0-a68f-ac9ff1404d68 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 18:17:52.935 DEBUG oslo_concurrency.lockutils [req-750fe103-5d03-4d06-be0c-ad575f214f6b tempest-TestServerBasicOps-171890207 tempest-TestServerBasicOps-1821359557] Lock "compute_resources" released by "update_usage" :: held 0.118s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 18:17:53.296 DEBUG oslo_concurrency.lockutils [req-750fe103-5d03-4d06-be0c-ad575f214f6b tempest-TestServerBasicOps-171890207 tempest-TestServerBasicOps-1821359557] Lock "fb80dcfb-2f23-42c0-a68f-ac9ff1404d68" released by "do_terminate_instance" :: held 9.594s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 18:17:56.168 DEBUG oslo_concurrency.lockutils [req-7be2059e-f05a-49cc-a37b-830bba12d438 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] Lock "661718a1-3aa7-4077-822b-7d12466ea624" acquired by "_locked_do_build_and_run_instance" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 18:17:56.424 INFO nova.compute.manager [req-7be2059e-f05a-49cc-a37b-830bba12d438 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] [instance: 661718a1-3aa7-4077-822b-7d12466ea624] Starting instance... 2015-08-07 18:17:56.714 DEBUG oslo_concurrency.lockutils [req-7be2059e-f05a-49cc-a37b-830bba12d438 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] Lock "compute_resources" acquired by "instance_claim" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 18:17:56.715 DEBUG nova.compute.resource_tracker [req-7be2059e-f05a-49cc-a37b-830bba12d438 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] Memory overhead for 64 MB instance; 5 MB instance_claim /opt/stack/new/nova/nova/compute/resource_tracker.py:126 2015-08-07 18:17:56.722 INFO nova.compute.claims [req-7be2059e-f05a-49cc-a37b-830bba12d438 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] [instance: 661718a1-3aa7-4077-822b-7d12466ea624] Attempting claim: memory 69 MB, disk 0 GB 2015-08-07 18:17:56.723 INFO nova.compute.claims [req-7be2059e-f05a-49cc-a37b-830bba12d438 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] [instance: 661718a1-3aa7-4077-822b-7d12466ea624] Total memory: 8187 MB, used: 512.00 MB 2015-08-07 18:17:56.723 INFO nova.compute.claims [req-7be2059e-f05a-49cc-a37b-830bba12d438 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] [instance: 661718a1-3aa7-4077-822b-7d12466ea624] memory limit: 12280.50 MB, free: 11768.50 MB 2015-08-07 18:17:56.724 INFO nova.compute.claims [req-7be2059e-f05a-49cc-a37b-830bba12d438 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] [instance: 661718a1-3aa7-4077-822b-7d12466ea624] Total disk: 27 GB, used: 0.00 GB 2015-08-07 18:17:56.724 INFO nova.compute.claims [req-7be2059e-f05a-49cc-a37b-830bba12d438 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] [instance: 661718a1-3aa7-4077-822b-7d12466ea624] disk limit not specified, defaulting to unlimited 2015-08-07 18:17:56.752 DEBUG nova.compute.resources.vcpu [req-7be2059e-f05a-49cc-a37b-830bba12d438 tempest-TestServerAdvancedOps-875032165 
tempest-TestServerAdvancedOps-1109115352] Total CPUs: 8 VCPUs, used: 0.00 VCPUs test /opt/stack/new/nova/nova/compute/resources/vcpu.py:52 2015-08-07 18:17:56.752 DEBUG nova.compute.resources.vcpu [req-7be2059e-f05a-49cc-a37b-830bba12d438 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] CPUs limit not specified, defaulting to unlimited test /opt/stack/new/nova/nova/compute/resources/vcpu.py:56 2015-08-07 18:17:56.753 INFO nova.compute.claims [req-7be2059e-f05a-49cc-a37b-830bba12d438 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] [instance: 661718a1-3aa7-4077-822b-7d12466ea624] Claim successful 2015-08-07 18:17:57.164 DEBUG oslo_concurrency.lockutils [req-7be2059e-f05a-49cc-a37b-830bba12d438 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] Lock "compute_resources" released by "instance_claim" :: held 0.450s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 18:17:57.446 DEBUG oslo_concurrency.lockutils [req-7be2059e-f05a-49cc-a37b-830bba12d438 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] Lock "compute_resources" acquired by "update_usage" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 18:17:57.576 DEBUG oslo_concurrency.lockutils [req-7be2059e-f05a-49cc-a37b-830bba12d438 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] Lock "compute_resources" released by "update_usage" :: held 0.130s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 18:17:57.578 DEBUG nova.compute.utils [req-7be2059e-f05a-49cc-a37b-830bba12d438 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] Using /dev/xvd instead of None get_next_device_name /opt/stack/new/nova/nova/compute/utils.py:163 2015-08-07 18:17:57.582 13318 DEBUG nova.compute.manager [-] [instance: 661718a1-3aa7-4077-822b-7d12466ea624] Allocating IP information in the background. 
_allocate_network_async /opt/stack/new/nova/nova/compute/manager.py:1513 2015-08-07 18:17:57.583 13318 DEBUG oslo_concurrency.lockutils [-] Acquired semaphore "refresh_cache-661718a1-3aa7-4077-822b-7d12466ea624" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:197 2015-08-07 18:17:58.398 DEBUG nova.virt.xenapi.vmops [req-7be2059e-f05a-49cc-a37b-830bba12d438 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] [instance: 661718a1-3aa7-4077-822b-7d12466ea624] Block device information present: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} spawn /opt/stack/new/nova/nova/virt/xenapi/vmops.py:408 2015-08-07 18:17:58.414 DEBUG nova.virt.xenapi.vm_utils [req-7be2059e-f05a-49cc-a37b-830bba12d438 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] Detected vhd format for image a69d8d55-1745-492f-be26-cc75d64fc94d determine_disk_image_type /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1633 2015-08-07 18:17:58.415 DEBUG nova.virt.xenapi.vmops [req-7be2059e-f05a-49cc-a37b-830bba12d438 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] [instance: 661718a1-3aa7-4077-822b-7d12466ea624] Updating progress to 10 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 18:17:58.726 DEBUG nova.virt.xenapi.vmops [req-7be2059e-f05a-49cc-a37b-830bba12d438 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] [instance: 661718a1-3aa7-4077-822b-7d12466ea624] block device info: {'swap': None, 'ephemerals': [], 'block_device_mapping': [], 'root_device_name': u'/dev/xvda'} _connect_cinder_volumes /opt/stack/new/nova/nova/virt/xenapi/vmops.py:376 2015-08-07 18:17:58.746 DEBUG oslo_concurrency.lockutils [req-7be2059e-f05a-49cc-a37b-830bba12d438 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" acquired by "_create_cached_image_impl" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 18:17:59.173 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.85 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:18:00.125 DEBUG nova.virt.xenapi.vm_utils [req-7be2059e-f05a-49cc-a37b-830bba12d438 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] Cloned VDI OpaqueRef:f4e1484f-43c5-63a6-a095-cf252725fc5e from VDI OpaqueRef:f392c64e-7886-a0cc-d3a0-780674167307 _clone_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:571 2015-08-07 18:18:00.989 13318 DEBUG nova.network.base_api [-] [instance: 661718a1-3aa7-4077-822b-7d12466ea624] Updating instance_info_cache with network_info: [VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.2'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 
'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:96:12:ef', 'active': False, 'type': u'bridge', 'id': u'c3c08e61-20cd-4e87-b1df-6b5a69146a86', 'qbg_params': None})] update_instance_cache_with_nw_info /opt/stack/new/nova/nova/network/base_api.py:43 2015-08-07 18:18:01.021 13318 DEBUG oslo_concurrency.lockutils [-] Releasing semaphore "refresh_cache-661718a1-3aa7-4077-822b-7d12466ea624" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:210 2015-08-07 18:18:01.022 13318 DEBUG nova.compute.manager [-] [instance: 661718a1-3aa7-4077-822b-7d12466ea624] Instance network_info: |[VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.2'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:96:12:ef', 'active': False, 'type': u'bridge', 'id': u'c3c08e61-20cd-4e87-b1df-6b5a69146a86', 'qbg_params': None})]| _allocate_network_async /opt/stack/new/nova/nova/compute/manager.py:1531 2015-08-07 18:18:01.043 DEBUG oslo_concurrency.lockutils [req-7be2059e-f05a-49cc-a37b-830bba12d438 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] Lock "xenapi-image-cachea69d8d55-1745-492f-be26-cc75d64fc94d" released by "_create_cached_image_impl" :: held 2.296s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 18:18:01.043 INFO nova.virt.xenapi.vm_utils [req-7be2059e-f05a-49cc-a37b-830bba12d438 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] Image creation data, cacheable: True, downloaded: False duration: 2.32 secs for image a69d8d55-1745-492f-be26-cc75d64fc94d 2015-08-07 18:18:02.475 DEBUG nova.virt.xenapi.vmops [req-7be2059e-f05a-49cc-a37b-830bba12d438 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] [instance: 661718a1-3aa7-4077-822b-7d12466ea624] Updating progress to 20 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 18:18:02.781 DEBUG nova.virt.xenapi.vmops [req-7be2059e-f05a-49cc-a37b-830bba12d438 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] [instance: 661718a1-3aa7-4077-822b-7d12466ea624] Updating progress to 30 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 18:18:03.045 DEBUG nova.virt.xenapi.vmops [req-7be2059e-f05a-49cc-a37b-830bba12d438 tempest-TestServerAdvancedOps-875032165 
tempest-TestServerAdvancedOps-1109115352] [instance: 661718a1-3aa7-4077-822b-7d12466ea624] Using PV kernel: True _create_vm_record /opt/stack/new/nova/nova/virt/xenapi/vmops.py:679 2015-08-07 18:18:03.057 DEBUG nova.virt.xenapi.vm_utils [req-7be2059e-f05a-49cc-a37b-830bba12d438 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] [instance: 661718a1-3aa7-4077-822b-7d12466ea624] Created VM create_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:312 2015-08-07 18:18:03.068 DEBUG nova.virt.xenapi.vmops [req-7be2059e-f05a-49cc-a37b-830bba12d438 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] [instance: 661718a1-3aa7-4077-822b-7d12466ea624] Updating progress to 40 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 18:18:03.353 DEBUG nova.virt.xenapi.vm_utils [req-7be2059e-f05a-49cc-a37b-830bba12d438 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] Creating disk-type VBD for VM OpaqueRef:c78ad8cd-aaed-31c0-c810-0516a502844b, VDI OpaqueRef:f4e1484f-43c5-63a6-a095-cf252725fc5e ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 18:18:03.364 DEBUG nova.virt.xenapi.vm_utils [req-7be2059e-f05a-49cc-a37b-830bba12d438 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] Created VBD OpaqueRef:5a53387c-7dd1-fc03-2977-68ade56aa33e for VM OpaqueRef:c78ad8cd-aaed-31c0-c810-0516a502844b, VDI OpaqueRef:f4e1484f-43c5-63a6-a095-cf252725fc5e. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 18:18:03.783 DEBUG nova.virt.xenapi.vm_utils [req-7be2059e-f05a-49cc-a37b-830bba12d438 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] Created VDI OpaqueRef:d850726f-536d-2944-08a2-cc4d76b88d07 (config-2, 67108864, False) on OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99. create_vdi /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:509 2015-08-07 18:18:03.788 DEBUG nova.virt.xenapi.vm_utils [req-7be2059e-f05a-49cc-a37b-830bba12d438 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] Creating disk-type VBD for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:d850726f-536d-2944-08a2-cc4d76b88d07 ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 18:18:03.806 DEBUG nova.virt.xenapi.vm_utils [req-7be2059e-f05a-49cc-a37b-830bba12d438 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] Created VBD OpaqueRef:5480b487-a3da-139e-f6e0-6b27e32381e4 for VM OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9, VDI OpaqueRef:d850726f-536d-2944-08a2-cc4d76b88d07. create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 18:18:03.807 DEBUG nova.virt.xenapi.vm_utils [req-7be2059e-f05a-49cc-a37b-830bba12d438 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] Plugging VBD OpaqueRef:5480b487-a3da-139e-f6e0-6b27e32381e4 ... 
vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2191 2015-08-07 18:18:03.807 DEBUG oslo_concurrency.lockutils [req-7be2059e-f05a-49cc-a37b-830bba12d438 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_plug" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 18:18:05.377 DEBUG oslo_concurrency.lockutils [req-7be2059e-f05a-49cc-a37b-830bba12d438 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_plug" :: held 1.570s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 18:18:05.378 DEBUG nova.virt.xenapi.vm_utils [req-7be2059e-f05a-49cc-a37b-830bba12d438 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] Plugging VBD OpaqueRef:5480b487-a3da-139e-f6e0-6b27e32381e4 done. vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2194 2015-08-07 18:18:05.383 DEBUG nova.virt.xenapi.vm_utils [req-7be2059e-f05a-49cc-a37b-830bba12d438 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] VBD OpaqueRef:5480b487-a3da-139e-f6e0-6b27e32381e4 plugged as xvdc vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2197 2015-08-07 18:18:05.496 WARNING nova.virt.configdrive [req-7be2059e-f05a-49cc-a37b-830bba12d438 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] The setting "always" will be deprecated in the Liberty version. Please use "True" instead 2015-08-07 18:18:05.497 DEBUG nova.objects.instance [req-7be2059e-f05a-49cc-a37b-830bba12d438 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] Lazy-loading `ec2_ids' on Instance uuid 661718a1-3aa7-4077-822b-7d12466ea624 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 18:18:05.588 DEBUG oslo_concurrency.processutils [req-7be2059e-f05a-49cc-a37b-830bba12d438 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] Running cmd (subprocess): genisoimage -o /tmp/tmp49_tyD/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmpJehcsy execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 18:18:05.778 DEBUG oslo_concurrency.processutils [req-7be2059e-f05a-49cc-a37b-830bba12d438 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] CMD "genisoimage -o /tmp/tmp49_tyD/configdrive -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Nova 12.0.0 -quiet -J -r -V config-2 /tmp/tmpJehcsy" returned: 0 in 0.190s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 18:18:05.784 DEBUG oslo_concurrency.processutils [req-7be2059e-f05a-49cc-a37b-830bba12d438 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmp49_tyD/configdrive of=/dev/xvdc oflag=direct,sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 18:18:06.503 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._sync_power_states run_periodic_tasks 
/usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:18:06.613 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Triggering sync for uuid 661718a1-3aa7-4077-822b-7d12466ea624 _sync_power_states /opt/stack/new/nova/nova/compute/manager.py:5780 2015-08-07 18:18:06.615 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 0.91 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:18:07.527 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:18:07.528 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Starting heal instance info cache _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5315 2015-08-07 18:18:07.529 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Rebuilding the list of instances to heal _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5319 2015-08-07 18:18:07.587 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: 661718a1-3aa7-4077-822b-7d12466ea624] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5329 2015-08-07 18:18:07.588 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Didn't find any instances for network info cache update. _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5386 2015-08-07 18:18:07.588 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 2.99 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:18:09.121 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.90 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:18:10.581 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:18:10.582 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 4.95 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:18:15.283 DEBUG oslo_concurrency.processutils [req-7be2059e-f05a-49cc-a37b-830bba12d438 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf dd if=/tmp/tmp49_tyD/configdrive of=/dev/xvdc oflag=direct,sync" returned: 0 in 9.499s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 18:18:15.286 DEBUG oslo_concurrency.processutils [req-7be2059e-f05a-49cc-a37b-830bba12d438 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] Running cmd (subprocess): sudo nova-rootwrap /etc/nova/rootwrap.conf sync execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:230 2015-08-07 18:18:15.530 DEBUG oslo_service.periodic_task 
[req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:18:15.532 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 1.99 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:18:15.815 DEBUG oslo_concurrency.processutils [req-7be2059e-f05a-49cc-a37b-830bba12d438 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] CMD "sudo nova-rootwrap /etc/nova/rootwrap.conf sync" returned: 0 in 0.529s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:260 2015-08-07 18:18:15.816 DEBUG nova.virt.xenapi.vm_utils [req-7be2059e-f05a-49cc-a37b-830bba12d438 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] Destroying VBD for VDI OpaqueRef:d850726f-536d-2944-08a2-cc4d76b88d07 ... vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2207 2015-08-07 18:18:15.817 DEBUG oslo_concurrency.lockutils [req-7be2059e-f05a-49cc-a37b-830bba12d438 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" acquired by "synchronized_unplug" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 18:18:16.863 DEBUG oslo_concurrency.lockutils [req-7be2059e-f05a-49cc-a37b-830bba12d438 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] Lock "xenapi-vbd-OpaqueRef:225f8c94-4904-b967-b19d-dc8b3c172ee9" released by "synchronized_unplug" :: held 1.046s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 18:18:16.874 DEBUG nova.virt.xenapi.vm_utils [req-7be2059e-f05a-49cc-a37b-830bba12d438 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] Destroying VBD for VDI OpaqueRef:d850726f-536d-2944-08a2-cc4d76b88d07 done. vdi_attached_here /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:2215 2015-08-07 18:18:16.876 DEBUG nova.virt.xenapi.vm_utils [req-7be2059e-f05a-49cc-a37b-830bba12d438 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] Creating disk-type VBD for VM OpaqueRef:c78ad8cd-aaed-31c0-c810-0516a502844b, VDI OpaqueRef:d850726f-536d-2944-08a2-cc4d76b88d07 ... create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:446 2015-08-07 18:18:16.886 DEBUG nova.virt.xenapi.vm_utils [req-7be2059e-f05a-49cc-a37b-830bba12d438 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] Created VBD OpaqueRef:d21706e5-f024-f8d9-0006-f9579600b9ad for VM OpaqueRef:c78ad8cd-aaed-31c0-c810-0516a502844b, VDI OpaqueRef:d850726f-536d-2944-08a2-cc4d76b88d07. 
create_vbd /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:450 2015-08-07 18:18:16.887 DEBUG nova.objects.instance [req-7be2059e-f05a-49cc-a37b-830bba12d438 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] Lazy-loading `pci_devices' on Instance uuid 661718a1-3aa7-4077-822b-7d12466ea624 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 18:18:17.032 DEBUG nova.virt.xenapi.vmops [req-7be2059e-f05a-49cc-a37b-830bba12d438 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] [instance: 661718a1-3aa7-4077-822b-7d12466ea624] Updating progress to 50 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 18:18:17.413 DEBUG oslo_concurrency.lockutils [req-7be2059e-f05a-49cc-a37b-830bba12d438 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] Lock "xenstore-661718a1-3aa7-4077-822b-7d12466ea624" acquired by "store_meta" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 18:18:17.414 DEBUG oslo_concurrency.lockutils [req-7be2059e-f05a-49cc-a37b-830bba12d438 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] Lock "xenstore-661718a1-3aa7-4077-822b-7d12466ea624" released by "store_meta" :: held 0.001s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 18:18:17.415 DEBUG oslo_concurrency.lockutils [req-7be2059e-f05a-49cc-a37b-830bba12d438 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] Lock "xenstore-661718a1-3aa7-4077-822b-7d12466ea624" acquired by "store_auto_disk_config" :: waited 0.001s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 18:18:17.429 DEBUG oslo_concurrency.lockutils [req-7be2059e-f05a-49cc-a37b-830bba12d438 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] Lock "xenstore-661718a1-3aa7-4077-822b-7d12466ea624" released by "store_auto_disk_config" :: held 0.014s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 18:18:17.430 DEBUG nova.virt.xenapi.vmops [req-7be2059e-f05a-49cc-a37b-830bba12d438 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] [instance: 661718a1-3aa7-4077-822b-7d12466ea624] Injecting hostname (tempest-testserveradvancedops-366290237) into xenstore _inject_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1940 2015-08-07 18:18:17.430 DEBUG oslo_concurrency.lockutils [req-7be2059e-f05a-49cc-a37b-830bba12d438 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] Lock "xenstore-661718a1-3aa7-4077-822b-7d12466ea624" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 18:18:17.439 DEBUG oslo_concurrency.lockutils [req-7be2059e-f05a-49cc-a37b-830bba12d438 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] Lock "xenstore-661718a1-3aa7-4077-822b-7d12466ea624" released by "update_hostname" :: held 0.009s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 18:18:17.440 DEBUG nova.virt.xenapi.vmops [req-7be2059e-f05a-49cc-a37b-830bba12d438 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] [instance: 661718a1-3aa7-4077-822b-7d12466ea624] Injecting network info to xenstore inject_network_info 
/opt/stack/new/nova/nova/virt/xenapi/vmops.py:1869 2015-08-07 18:18:17.441 DEBUG oslo_concurrency.lockutils [req-7be2059e-f05a-49cc-a37b-830bba12d438 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] Lock "xenstore-661718a1-3aa7-4077-822b-7d12466ea624" acquired by "update_nwinfo" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 18:18:17.529 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:18:17.530 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 4.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:18:17.675 DEBUG oslo_concurrency.lockutils [req-7be2059e-f05a-49cc-a37b-830bba12d438 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] Lock "xenstore-661718a1-3aa7-4077-822b-7d12466ea624" released by "update_nwinfo" :: held 0.234s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 18:18:17.675 DEBUG nova.virt.xenapi.vmops [req-7be2059e-f05a-49cc-a37b-830bba12d438 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] [instance: 661718a1-3aa7-4077-822b-7d12466ea624] Updating progress to 60 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 18:18:18.162 DEBUG nova.virt.xenapi.vmops [req-7be2059e-f05a-49cc-a37b-830bba12d438 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] [instance: 661718a1-3aa7-4077-822b-7d12466ea624] Creating vifs _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1892 2015-08-07 18:18:18.173 DEBUG nova.virt.xenapi.vmops [req-7be2059e-f05a-49cc-a37b-830bba12d438 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] [instance: 661718a1-3aa7-4077-822b-7d12466ea624] Creating VIF for network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1902 2015-08-07 18:18:18.183 DEBUG nova.virt.xenapi.vmops [req-7be2059e-f05a-49cc-a37b-830bba12d438 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] [instance: 661718a1-3aa7-4077-822b-7d12466ea624] Created VIF OpaqueRef:c617e04d-3efc-67b4-9d93-c1f11b235950, network OpaqueRef:fa91a786-d791-a343-2f0a-f9b91af7a749 _create_vifs /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1906 2015-08-07 18:18:18.184 DEBUG nova.virt.xenapi.vmops [req-7be2059e-f05a-49cc-a37b-830bba12d438 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] [instance: 661718a1-3aa7-4077-822b-7d12466ea624] Updating progress to 70 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 18:18:18.642 DEBUG nova.virt.xenapi.vmops [req-7be2059e-f05a-49cc-a37b-830bba12d438 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] [instance: 661718a1-3aa7-4077-822b-7d12466ea624] Starting instance _start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:327 2015-08-07 18:18:19.124 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.90 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:18:21.528 DEBUG oslo_service.periodic_task 
[req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:18:21.529 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /opt/stack/new/nova/nova/compute/manager.py:5976 2015-08-07 18:18:21.530 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 5.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:18:26.091 DEBUG nova.virt.xenapi.vmops [req-7be2059e-f05a-49cc-a37b-830bba12d438 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] [instance: 661718a1-3aa7-4077-822b-7d12466ea624] Waiting for instance state to become running _wait_for_instance_to_start /opt/stack/new/nova/nova/virt/xenapi/vmops.py:777 2015-08-07 18:18:26.151 DEBUG nova.virt.xenapi.vmops [req-7be2059e-f05a-49cc-a37b-830bba12d438 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] [instance: 661718a1-3aa7-4077-822b-7d12466ea624] Updating progress to 80 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 18:18:26.521 DEBUG nova.virt.xenapi.vmops [req-7be2059e-f05a-49cc-a37b-830bba12d438 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] [instance: 661718a1-3aa7-4077-822b-7d12466ea624] Skip agent setup, not enabled. _configure_new_instance_with_agent /opt/stack/new/nova/nova/virt/xenapi/vmops.py:788 2015-08-07 18:18:26.521 DEBUG nova.virt.xenapi.vmops [req-7be2059e-f05a-49cc-a37b-830bba12d438 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] [instance: 661718a1-3aa7-4077-822b-7d12466ea624] Removing hostname from xenstore _remove_hostname /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1949 2015-08-07 18:18:26.522 DEBUG oslo_concurrency.lockutils [req-7be2059e-f05a-49cc-a37b-830bba12d438 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] Lock "xenstore-661718a1-3aa7-4077-822b-7d12466ea624" acquired by "update_hostname" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 18:18:26.527 DEBUG oslo_concurrency.lockutils [req-7be2059e-f05a-49cc-a37b-830bba12d438 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] Lock "xenstore-661718a1-3aa7-4077-822b-7d12466ea624" released by "update_hostname" :: held 0.005s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 18:18:26.528 DEBUG nova.virt.xenapi.vmops [req-7be2059e-f05a-49cc-a37b-830bba12d438 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] [instance: 661718a1-3aa7-4077-822b-7d12466ea624] Updating progress to 90 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 18:18:26.539 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:18:26.541 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 0.98 seconds _run_loop 
/usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:18:26.803 DEBUG nova.virt.xenapi.vmops [req-7be2059e-f05a-49cc-a37b-830bba12d438 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] [instance: 661718a1-3aa7-4077-822b-7d12466ea624] Updating progress to 100 _update_instance_progress /opt/stack/new/nova/nova/virt/xenapi/vmops.py:918 2015-08-07 18:18:27.135 DEBUG nova.compute.manager [req-7be2059e-f05a-49cc-a37b-830bba12d438 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] [instance: 661718a1-3aa7-4077-822b-7d12466ea624] Checking state _get_power_state /opt/stack/new/nova/nova/compute/manager.py:1296 2015-08-07 18:18:27.520 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:18:27.578 DEBUG oslo_concurrency.lockutils [req-7be2059e-f05a-49cc-a37b-830bba12d438 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] Lock "661718a1-3aa7-4077-822b-7d12466ea624" released by "_locked_do_build_and_run_instance" :: held 31.410s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 18:18:27.580 13318 DEBUG oslo_concurrency.lockutils [-] Lock "661718a1-3aa7-4077-822b-7d12466ea624" acquired by "query_driver_power_state_and_sync" :: waited 20.965s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 18:18:27.580 13318 INFO nova.compute.manager [-] [instance: 661718a1-3aa7-4077-822b-7d12466ea624] During sync_power_state the instance has a pending task (spawning). Skip. 
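[Editor's note] The recurring 'Fixed interval looping call ... sleeping for N seconds' and 'Dynamic interval looping call ... sleeping' lines come from oslo.service's looping-call helpers that drive the compute manager's periodic tasks. A minimal sketch of the fixed-interval variant with a made-up callback, assuming a roughly 10-second interval to match the cadence in this log:

    from oslo_service import loopingcall

    def poll():
        # Stand-in for a periodic job such as a power-state sync.
        print('polling')

    # FixedIntervalLoopingCall reschedules the callback every `interval`
    # seconds and logs how long it sleeps between runs, as seen above.
    timer = loopingcall.FixedIntervalLoopingCall(poll)
    timer.start(interval=10, initial_delay=None)
    # timer.stop() / timer.wait() end the loop when the service shuts down.
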
2015-08-07 18:18:27.580 13318 DEBUG oslo_concurrency.lockutils [-] Lock "661718a1-3aa7-4077-822b-7d12466ea624" released by "query_driver_power_state_and_sync" :: held 0.001s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 18:18:27.601 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 0.01 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:18:27.610 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:18:27.658 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Auditing locally available compute resources for node localhost.localdomain 2015-08-07 18:18:27.659 DEBUG nova.virt.xenapi.host [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Updating host stats update_status /opt/stack/new/nova/nova/virt/xenapi/host.py:234 2015-08-07 18:18:27.940 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" acquired by "do_scan" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 18:18:27.941 DEBUG nova.virt.xenapi.vm_utils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Scanning SR OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99 do_scan /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1839 2015-08-07 18:18:28.504 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" released by "do_scan" :: held 0.564s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 18:18:28.832 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor: free VCPUs: 1 _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:524 2015-08-07 18:18:28.832 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor: assignable PCI devices: [] _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:532 2015-08-07 18:18:28.833 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor/Node resource view: name=localhost.localdomain free_ram=922MB free_disk=12GB free_vcpus=1 pci_devices=[] _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:546 2015-08-07 18:18:28.833 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "compute_resources" acquired by "_update_available_resource" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 18:18:29.185 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Total usable vcpus: 8, total allocated vcpus: 7 2015-08-07 18:18:29.186 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Final resource view: name=localhost.localdomain phys_ram=8187MB used_ram=581MB phys_disk=27GB used_disk=0GB total_vcpus=8 used_vcpus=7 pci_stats=None 2015-08-07 18:18:29.318 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.71 seconds 
_run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:18:29.331 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Compute_service record updated for devstack:localhost.localdomain 2015-08-07 18:18:29.332 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "compute_resources" released by "_update_available_resource" :: held 0.498s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 18:18:29.334 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 3.19 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:18:32.527 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:18:32.528 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 8.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:18:37.631 DEBUG nova.compute.manager [req-1534a310-2b35-4ff4-883b-6ad0bf975a74 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] [instance: 661718a1-3aa7-4077-822b-7d12466ea624] Checking state _get_power_state /opt/stack/new/nova/nova/compute/manager.py:1296 2015-08-07 18:18:39.132 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.89 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:18:39.536 INFO nova.compute.manager [req-0451a53d-ad65-4c6a-8fae-5fe16648b906 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] [instance: 661718a1-3aa7-4077-822b-7d12466ea624] Resuming 2015-08-07 18:18:39.539 DEBUG oslo_concurrency.lockutils [req-0451a53d-ad65-4c6a-8fae-5fe16648b906 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] Acquired semaphore "refresh_cache-661718a1-3aa7-4077-822b-7d12466ea624" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:197 2015-08-07 18:18:39.713 DEBUG nova.network.base_api [req-0451a53d-ad65-4c6a-8fae-5fe16648b906 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] [instance: 661718a1-3aa7-4077-822b-7d12466ea624] Updating instance_info_cache with network_info: [VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.2'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': 
u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:96:12:ef', 'active': False, 'type': u'bridge', 'id': u'c3c08e61-20cd-4e87-b1df-6b5a69146a86', 'qbg_params': None})] update_instance_cache_with_nw_info /opt/stack/new/nova/nova/network/base_api.py:43 2015-08-07 18:18:39.746 DEBUG oslo_concurrency.lockutils [req-0451a53d-ad65-4c6a-8fae-5fe16648b906 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] Releasing semaphore "refresh_cache-661718a1-3aa7-4077-822b-7d12466ea624" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:210 2015-08-07 18:18:40.528 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:18:40.529 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 28.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:18:48.215 DEBUG nova.compute.manager [req-0451a53d-ad65-4c6a-8fae-5fe16648b906 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] [instance: 661718a1-3aa7-4077-822b-7d12466ea624] Checking state _get_power_state /opt/stack/new/nova/nova/compute/manager.py:1296 2015-08-07 18:18:49.094 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.93 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:18:59.100 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.93 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:19:02.886 DEBUG nova.compute.manager [req-2583d641-30a4-4551-a329-1eb6e791b628 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] [instance: 661718a1-3aa7-4077-822b-7d12466ea624] Checking state _get_power_state /opt/stack/new/nova/nova/compute/manager.py:1296 2015-08-07 18:19:04.585 INFO nova.compute.manager [req-919e4b73-12ec-4f93-b9f0-8f807609f4dd tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] [instance: 661718a1-3aa7-4077-822b-7d12466ea624] Resuming 2015-08-07 18:19:04.587 DEBUG oslo_concurrency.lockutils [req-919e4b73-12ec-4f93-b9f0-8f807609f4dd tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] Acquired semaphore "refresh_cache-661718a1-3aa7-4077-822b-7d12466ea624" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:197 2015-08-07 18:19:04.785 DEBUG nova.network.base_api [req-919e4b73-12ec-4f93-b9f0-8f807609f4dd tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] [instance: 661718a1-3aa7-4077-822b-7d12466ea624] Updating instance_info_cache with network_info: [VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.2'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 
'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:96:12:ef', 'active': False, 'type': u'bridge', 'id': u'c3c08e61-20cd-4e87-b1df-6b5a69146a86', 'qbg_params': None})] update_instance_cache_with_nw_info /opt/stack/new/nova/nova/network/base_api.py:43 2015-08-07 18:19:04.821 DEBUG oslo_concurrency.lockutils [req-919e4b73-12ec-4f93-b9f0-8f807609f4dd tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] Releasing semaphore "refresh_cache-661718a1-3aa7-4077-822b-7d12466ea624" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:210 2015-08-07 18:19:08.527 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:19:08.528 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Starting heal instance info cache _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5315 2015-08-07 18:19:08.528 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Rebuilding the list of instances to heal _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5319 2015-08-07 18:19:08.595 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Acquired semaphore "refresh_cache-661718a1-3aa7-4077-822b-7d12466ea624" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:197 2015-08-07 18:19:08.595 DEBUG nova.objects.instance [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lazy-loading `flavor' on Instance uuid 661718a1-3aa7-4077-822b-7d12466ea624 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 18:19:09.167 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.86 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:19:09.333 DEBUG nova.network.base_api [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: 661718a1-3aa7-4077-822b-7d12466ea624] Updating instance_info_cache with network_info: [VIF({'profile': None, 'ovs_interfaceid': None, 'preserve_on_delete': False, 'network': Network({'bridge': u'vmnet', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.1.0.2'})], 'version': 4, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.1.0.0/20', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.1.0.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.1.0.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True, u'bridge_interface': u'eth3'}, 'id': u'5326014b-be72-4d73-b65f-ac2385d1690b', 'label': u'private'}), 'devname': None, 'vnic_type': u'normal', 'qbh_params': None, 'meta': {}, 'details': {}, 'address': 
u'fa:16:3e:96:12:ef', 'active': False, 'type': u'bridge', 'id': u'c3c08e61-20cd-4e87-b1df-6b5a69146a86', 'qbg_params': None})] update_instance_cache_with_nw_info /opt/stack/new/nova/nova/network/base_api.py:43 2015-08-07 18:19:09.373 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Releasing semaphore "refresh_cache-661718a1-3aa7-4077-822b-7d12466ea624" lock /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:210 2015-08-07 18:19:09.374 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] [instance: 661718a1-3aa7-4077-822b-7d12466ea624] Updated the network info_cache for instance _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5376 2015-08-07 18:19:09.375 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 2.99 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:19:12.368 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:19:12.369 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 5.16 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:19:14.732 DEBUG nova.compute.manager [req-919e4b73-12ec-4f93-b9f0-8f807609f4dd tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] [instance: 661718a1-3aa7-4077-822b-7d12466ea624] Checking state _get_power_state /opt/stack/new/nova/nova/compute/manager.py:1296 2015-08-07 18:19:16.152 DEBUG oslo_concurrency.lockutils [req-1daac1dc-77ce-4015-a98d-4c64dd8aae33 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] Lock "661718a1-3aa7-4077-822b-7d12466ea624" acquired by "do_terminate_instance" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 18:19:16.154 DEBUG oslo_concurrency.lockutils [req-1daac1dc-77ce-4015-a98d-4c64dd8aae33 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] Lock "661718a1-3aa7-4077-822b-7d12466ea624-events" acquired by "_clear_events" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 18:19:16.155 DEBUG oslo_concurrency.lockutils [req-1daac1dc-77ce-4015-a98d-4c64dd8aae33 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] Lock "661718a1-3aa7-4077-822b-7d12466ea624-events" released by "_clear_events" :: held 0.002s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 18:19:16.158 INFO nova.compute.manager [req-1daac1dc-77ce-4015-a98d-4c64dd8aae33 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] [instance: 661718a1-3aa7-4077-822b-7d12466ea624] Terminating instance 2015-08-07 18:19:16.159 INFO nova.virt.xenapi.vmops [req-1daac1dc-77ce-4015-a98d-4c64dd8aae33 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] [instance: 661718a1-3aa7-4077-822b-7d12466ea624] Destroying VM 2015-08-07 18:19:16.180 DEBUG nova.virt.xenapi.vm_utils [req-1daac1dc-77ce-4015-a98d-4c64dd8aae33 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] [instance: 
661718a1-3aa7-4077-822b-7d12466ea624] Shutting down VM (hard) hard_shutdown_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:348 2015-08-07 18:19:17.528 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:19:17.529 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 1.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:19:18.528 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:19:18.529 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 3.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:19:19.195 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.84 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:19:19.559 DEBUG nova.virt.xenapi.vmops [req-1daac1dc-77ce-4015-a98d-4c64dd8aae33 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] [instance: 661718a1-3aa7-4077-822b-7d12466ea624] Destroying VDIs _destroy_vdis /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1428 2015-08-07 18:19:19.572 DEBUG nova.virt.xenapi.vm_utils [req-1daac1dc-77ce-4015-a98d-4c64dd8aae33 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] VDI b7d70ef8-8252-4208-8cd5-b704785e2114 is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 18:19:19.594 DEBUG nova.virt.xenapi.vm_utils [req-1daac1dc-77ce-4015-a98d-4c64dd8aae33 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] VDI 009af4f8-d58d-40ce-be5d-a0b798d45149 is still available lookup_vm_vdis /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1683 2015-08-07 18:19:20.691 DEBUG nova.virt.xenapi.vmops [req-1daac1dc-77ce-4015-a98d-4c64dd8aae33 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] [instance: 661718a1-3aa7-4077-822b-7d12466ea624] Using RAW or VHD, skipping kernel and ramdisk deletion _destroy_kernel_ramdisk /opt/stack/new/nova/nova/virt/xenapi/vmops.py:1456 2015-08-07 18:19:20.704 DEBUG nova.virt.xenapi.vm_utils [req-1daac1dc-77ce-4015-a98d-4c64dd8aae33 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] [instance: 661718a1-3aa7-4077-822b-7d12466ea624] VM destroyed destroy_vm /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:324 2015-08-07 18:19:20.705 DEBUG nova.compute.manager [req-1daac1dc-77ce-4015-a98d-4c64dd8aae33 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] [instance: 661718a1-3aa7-4077-822b-7d12466ea624] Deallocating network for instance _deallocate_network /opt/stack/new/nova/nova/compute/manager.py:1792 2015-08-07 18:19:21.528 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:19:21.529 
DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /opt/stack/new/nova/nova/compute/manager.py:5976 2015-08-07 18:19:21.529 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 6.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:19:22.991 DEBUG nova.compute.manager [req-1daac1dc-77ce-4015-a98d-4c64dd8aae33 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] [instance: 661718a1-3aa7-4077-822b-7d12466ea624] terminating bdm BlockDeviceMapping(boot_index=0,connection_info=None,created_at=2015-08-07T18:17:55Z,delete_on_termination=True,deleted=False,deleted_at=None,destination_type='local',device_name='/dev/xvda',device_type='disk',disk_bus=None,guest_format=None,id=93,image_id='a69d8d55-1745-492f-be26-cc75d64fc94d',instance=,instance_uuid=661718a1-3aa7-4077-822b-7d12466ea624,no_device=False,snapshot_id=None,source_type='image',updated_at=2015-08-07T18:17:57Z,volume_id=None,volume_size=None) _cleanup_volumes /opt/stack/new/nova/nova/compute/manager.py:2272 2015-08-07 18:19:23.196 DEBUG oslo_concurrency.lockutils [req-1daac1dc-77ce-4015-a98d-4c64dd8aae33 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] Lock "compute_resources" acquired by "update_usage" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 18:19:23.197 DEBUG nova.objects.instance [req-1daac1dc-77ce-4015-a98d-4c64dd8aae33 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] Lazy-loading `numa_topology' on Instance uuid 661718a1-3aa7-4077-822b-7d12466ea624 obj_load_attr /opt/stack/new/nova/nova/objects/instance.py:877 2015-08-07 18:19:23.331 DEBUG oslo_concurrency.lockutils [req-1daac1dc-77ce-4015-a98d-4c64dd8aae33 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] Lock "compute_resources" released by "update_usage" :: held 0.134s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 18:19:23.869 DEBUG oslo_concurrency.lockutils [req-1daac1dc-77ce-4015-a98d-4c64dd8aae33 tempest-TestServerAdvancedOps-875032165 tempest-TestServerAdvancedOps-1109115352] Lock "661718a1-3aa7-4077-822b-7d12466ea624" released by "do_terminate_instance" :: held 7.716s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 18:19:27.527 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:19:27.528 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 2.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:19:29.132 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.90 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:19:29.529 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 
18:19:29.563 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Auditing locally available compute resources for node localhost.localdomain 2015-08-07 18:19:29.563 DEBUG nova.virt.xenapi.host [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Updating host stats update_status /opt/stack/new/nova/nova/virt/xenapi/host.py:234 2015-08-07 18:19:29.884 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" acquired by "do_scan" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 18:19:29.885 DEBUG nova.virt.xenapi.vm_utils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Scanning SR OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99 do_scan /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1839 2015-08-07 18:19:30.377 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" released by "do_scan" :: held 0.493s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 18:19:30.655 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor: free VCPUs: 2 _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:524 2015-08-07 18:19:30.656 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor: assignable PCI devices: [] _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:532 2015-08-07 18:19:30.656 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor/Node resource view: name=localhost.localdomain free_ram=989MB free_disk=12GB free_vcpus=2 pci_devices=[] _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:546 2015-08-07 18:19:30.657 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "compute_resources" acquired by "_update_available_resource" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 18:19:30.839 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Total usable vcpus: 8, total allocated vcpus: 6 2015-08-07 18:19:30.839 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Final resource view: name=localhost.localdomain phys_ram=8187MB used_ram=512MB phys_disk=27GB used_disk=0GB total_vcpus=8 used_vcpus=6 pci_stats=None 2015-08-07 18:19:30.963 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Compute_service record updated for devstack:localhost.localdomain 2015-08-07 18:19:30.964 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "compute_resources" released by "_update_available_resource" :: held 0.307s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 18:19:30.964 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 2.56 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:19:33.528 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks 
/usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:19:33.536 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 9.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:19:39.106 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.93 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:19:42.536 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:19:42.537 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 27.99 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:19:49.855 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.22 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:19:59.196 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.88 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:20:09.232 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.85 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:20:10.528 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:20:10.528 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Starting heal instance info cache _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5315 2015-08-07 18:20:10.529 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Rebuilding the list of instances to heal _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5319 2015-08-07 18:20:10.590 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Didn't find any instances for network info cache update. 
_heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5386 2015-08-07 18:20:10.591 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 0.99 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:20:11.583 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:20:11.584 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 5.94 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:20:17.529 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:20:17.530 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 1.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:20:18.529 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:20:18.530 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 3.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:20:19.389 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.69 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:20:21.528 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:20:21.529 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /opt/stack/new/nova/nova/compute/manager.py:5976 2015-08-07 18:20:21.529 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 5.99 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:20:27.521 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:20:27.589 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 1.01 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:20:28.602 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:20:28.604 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 1.92 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:20:29.381 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.70 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:20:30.530 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:20:30.563 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Auditing locally available compute resources for node localhost.localdomain 2015-08-07 18:20:30.564 DEBUG nova.virt.xenapi.host [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Updating host stats update_status /opt/stack/new/nova/nova/virt/xenapi/host.py:234 2015-08-07 18:20:31.247 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" acquired by "do_scan" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 18:20:31.248 DEBUG nova.virt.xenapi.vm_utils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Scanning SR OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99 do_scan /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1839 2015-08-07 18:20:31.925 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" released by "do_scan" :: held 0.678s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 18:20:32.349 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor: free VCPUs: 2 _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:524 2015-08-07 18:20:32.350 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor: assignable PCI devices: [] _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:532 2015-08-07 18:20:32.351 DEBUG nova.compute.resource_tracker 
[req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor/Node resource view: name=localhost.localdomain free_ram=989MB free_disk=12GB free_vcpus=2 pci_devices=[] _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:546 2015-08-07 18:20:32.351 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "compute_resources" acquired by "_update_available_resource" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 18:20:32.525 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Total usable vcpus: 8, total allocated vcpus: 6 2015-08-07 18:20:32.526 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Final resource view: name=localhost.localdomain phys_ram=8187MB used_ram=512MB phys_disk=27GB used_disk=0GB total_vcpus=8 used_vcpus=6 pci_stats=None 2015-08-07 18:20:32.634 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Compute_service record updated for devstack:localhost.localdomain 2015-08-07 18:20:32.635 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "compute_resources" released by "_update_available_resource" :: held 0.284s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 18:20:32.636 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 2.89 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:20:35.528 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:20:35.529 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 9.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:20:39.200 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.88 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:20:44.529 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:20:44.530 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 27.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:20:49.277 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.81 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:20:59.224 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.86 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:21:09.166 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.92 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:21:11.528 DEBUG oslo_service.periodic_task 
[req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:21:11.528 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Starting heal instance info cache _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5315 2015-08-07 18:21:11.529 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Rebuilding the list of instances to heal _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5319 2015-08-07 18:21:11.580 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Didn't find any instances for network info cache update. _heal_instance_info_cache /opt/stack/new/nova/nova/compute/manager.py:5386 2015-08-07 18:21:11.581 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 0.99 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:21:12.573 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:21:12.574 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 4.95 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:21:17.530 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:21:17.531 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 2.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:21:19.218 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.87 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:21:19.528 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:21:19.529 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 3.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:21:22.528 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:21:22.529 DEBUG nova.compute.manager [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /opt/stack/new/nova/nova/compute/manager.py:5976 2015-08-07 18:21:22.530 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 7.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:21:29.233 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.85 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:21:29.528 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:21:29.529 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 3.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:21:32.529 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:21:32.582 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Auditing locally available compute resources for node localhost.localdomain 2015-08-07 18:21:32.583 DEBUG nova.virt.xenapi.host [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Updating host stats update_status /opt/stack/new/nova/nova/virt/xenapi/host.py:234 2015-08-07 18:21:32.860 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" acquired by "do_scan" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 18:21:32.861 DEBUG nova.virt.xenapi.vm_utils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Scanning SR OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99 do_scan /opt/stack/new/nova/nova/virt/xenapi/vm_utils.py:1839 2015-08-07 18:21:33.337 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "sr-scan-OpaqueRef:d894db6d-a7db-ed22-27ed-3259530c5b99" released by "do_scan" :: held 0.477s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 18:21:33.630 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor: free VCPUs: 2 _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:524 2015-08-07 18:21:33.631 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor: assignable PCI devices: [] _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:532 2015-08-07 18:21:33.631 DEBUG nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Hypervisor/Node resource view: name=localhost.localdomain free_ram=989MB free_disk=12GB free_vcpus=2 pci_devices=[] _report_hypervisor_resource_view /opt/stack/new/nova/nova/compute/resource_tracker.py:546 2015-08-07 18:21:33.632 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "compute_resources" acquired by "_update_available_resource" :: waited 0.000s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:251 2015-08-07 
18:21:33.782 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Total usable vcpus: 8, total allocated vcpus: 6 2015-08-07 18:21:33.783 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Final resource view: name=localhost.localdomain phys_ram=8187MB used_ram=512MB phys_disk=27GB used_disk=0GB total_vcpus=8 used_vcpus=6 pci_stats=None 2015-08-07 18:21:33.875 INFO nova.compute.resource_tracker [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Compute_service record updated for devstack:localhost.localdomain 2015-08-07 18:21:33.875 DEBUG oslo_concurrency.lockutils [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Lock "compute_resources" released by "_update_available_resource" :: held 0.243s inner /usr/local/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:262 2015-08-07 18:21:33.877 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 1.65 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:21:35.528 DEBUG oslo_service.periodic_task [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/local/lib/python2.7/dist-packages/oslo_service/periodic_task.py:213 2015-08-07 18:21:35.529 DEBUG oslo_service.loopingcall [req-129cdac2-1a3a-458c-ab38-7eeebcea9672 None None] Dynamic interval looping call > sleeping for 10.00 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117 2015-08-07 18:21:39.278 13318 DEBUG oslo_service.loopingcall [-] Fixed interval looping call > sleeping for 9.81 seconds _run_loop /usr/local/lib/python2.7/dist-packages/oslo_service/loopingcall.py:117