<div dir="auto">Hi,<div dir="auto"><br></div><div dir="auto">Yes, it is....</div><div dir="auto"><br></div><div dir="auto">The qemu-kvm process is executed by the oneadmin user.</div><div dir="auto"><br></div><div dir="auto">When I cluster tries the live migration, what users do play?</div><div dir="auto"><br></div><div dir="auto">Oneadmin</div><div dir="auto">Root</div><div dir="auto">Hacluster</div><div dir="auto"><br></div><div dir="auto">I have just configured pasworless ssh connection with oneadmin.</div><div dir="auto"><br></div><div dir="auto">Do I need to configure any other passwordless ssh connection with any other user?</div><div dir="auto"><br></div><div dir="auto">What user executes the virsh migrate - - live?</div><div dir="auto"><br></div><div dir="auto">Is there any way to check ssk keys? </div><div dir="auto"><br></div><div dir="auto">Sorry for all theese questions. </div><div dir="auto"><br></div><div dir="auto">Thanks a lot </div><div dir="auto"><br></div><div dir="auto"><br></div><br><div class="gmail_extra" dir="auto"><br><div class="gmail_quote">El 1 sept. 2017 0:12, "Ken Gaillot" <<a href="mailto:kgaillot@redhat.com">kgaillot@redhat.com</a>> escribió:<br type="attribution"><blockquote class="quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex"><div class="elided-text">On Thu, 2017-08-31 at 23:45 +0200, Oscar Segarra wrote:<br>
> Hi Ken,<br>
><br>
><br>
> Thanks a lot for you quick answer:<br>
><br>
><br>
> Regarding to selinux, it is disabled. The FW is disabled as well.<br>
><br>
><br>
> [root@vdicnode01 ~]# sestatus<br>
> SELinux status:                 disabled<br>
><br>
><br>
> [root@vdicnode01 ~]# service firewalld status<br>
> Redirecting to /bin/systemctl status  firewalld.service<br>
> ● firewalld.service - firewalld - dynamic firewall daemon<br>
>    Loaded: loaded (/usr/lib/systemd/system/firewalld.service;<br>
> disabled; vendor preset: enabled)<br>
>    Active: inactive (dead)<br>
>      Docs: man:firewalld(1)<br>
><br>
><br>
> On migration, it performs a gracefully shutdown and a start on the new<br>
> node.<br>
><br>
><br>
> I attach the logs when trying to migrate from vdicnode02 to<br>
> vdicnode01:<br>
><br>
><br>
> vdicnode02 corosync.log:<br>
> Aug 31 23:38:17 [1521] vdicnode02        cib:     info:<br>
> cib_perform_op: Diff: --- 0.161.2 2<br>
> Aug 31 23:38:17 [1521] vdicnode02        cib:     info:<br>
> cib_perform_op: Diff: +++ 0.162.0 (null)<br>
> Aug 31 23:38:17 [1521] vdicnode02        cib:     info:<br>
> cib_perform_op:<br>
> -- /cib/configuration/constraints/rsc_location[@id='location-vm-vdicdb01-vdicnode01--INFINITY']<br>
> Aug 31 23:38:17 [1521] vdicnode02        cib:     info:<br>
> cib_perform_op: +  /cib:  @epoch=162, @num_updates=0<br>
> Aug 31 23:38:17 [1521] vdicnode02        cib:     info:<br>
> cib_process_request:    Completed cib_replace operation for section<br>
> configuration: OK (rc=0, origin=vdicnode01/cibadmin/2,<br>
> version=0.162.0)<br>
> Aug 31 23:38:17 [1521] vdicnode02        cib:     info:<br>
> cib_file_backup:        Archived previous version<br>
> as /var/lib/pacemaker/cib/cib-65.raw<br>
> Aug 31 23:38:17 [1521] vdicnode02        cib:     info:<br>
> cib_file_write_with_digest:     Wrote version 0.162.0 of the CIB to<br>
> disk (digest: 1f87611b60cd7c48b95b6b788b47f6<wbr>5f)<br>
> Aug 31 23:38:17 [1521] vdicnode02        cib:     info:<br>
> cib_file_write_with_digest:     Reading cluster configuration<br>
> file /var/lib/pacemaker/cib/cib.jt2KPw<br>
> (digest: /var/lib/pacemaker/cib/cib.Kwqfpl)<br>
> Aug 31 23:38:22 [1521] vdicnode02        cib:     info:<br>
> cib_process_ping:       Reporting our current digest to vdicnode01:<br>
> dace3a23264934279d439420d5a716<wbr>cc for 0.162.0 (0x7f96bb26c5c0 0)<br>
> Aug 31 23:38:27 [1521] vdicnode02        cib:     info:<br>
> cib_perform_op: Diff: --- 0.162.0 2<br>
> Aug 31 23:38:27 [1521] vdicnode02        cib:     info:<br>
> cib_perform_op: Diff: +++ 0.163.0 (null)<br>
> Aug 31 23:38:27 [1521] vdicnode02        cib:     info:<br>
> cib_perform_op: +  /cib:  @epoch=163<br>
> Aug 31 23:38:27 [1521] vdicnode02        cib:     info:<br>
> cib_perform_op: ++ /cib/configuration/constraints:  <rsc_location<br>
> id="location-vm-vdicdb01-vdicnode02--INFINITY" node="vdicnode02"<br>
> rsc="vm-vdicdb01" score="-INFINITY"/><br>
> Aug 31 23:38:27 [1521] vdicnode02        cib:     info:<br>
> cib_process_request:    Completed cib_replace operation for section<br>
> configuration: OK (rc=0, origin=vdicnode01/cibadmin/2,<br>
> version=0.163.0)<br>
> Aug 31 23:38:27 [1521] vdicnode02        cib:     info:<br>
> cib_file_backup:        Archived previous version<br>
> as /var/lib/pacemaker/cib/cib-66.raw<br>
> Aug 31 23:38:27 [1521] vdicnode02        cib:     info:<br>
> cib_file_write_with_digest:     Wrote version 0.163.0 of the CIB to<br>
> disk (digest: 47a548b36746de9275d66cc6aeb0fd<wbr>c4)<br>
> Aug 31 23:38:27 [1521] vdicnode02        cib:     info:<br>
> cib_file_write_with_digest:     Reading cluster configuration<br>
> file /var/lib/pacemaker/cib/cib.rcgXiT<br>
> (digest: /var/lib/pacemaker/cib/cib.7geMfi)<br>
> Aug 31 23:38:27 [1523] vdicnode02       lrmd:     info:<br>
> cancel_recurring_action:        Cancelling ocf operation<br>
> vm-vdicdb01_monitor_10000<br>
> Aug 31 23:38:27 [1526] vdicnode02       crmd:     info:<br>
> do_lrm_rsc_op:  Performing<br>
> key=6:6:0:fe1a9b0a-816c-4b97-<wbr>96cb-b90dbf71417a<br>
> op=vm-vdicdb01_migrate_to_0<br>
> Aug 31 23:38:27 [1523] vdicnode02       lrmd:     info: log_execute:<br>
>    executing - rsc:vm-vdicdb01 action:migrate_to call_id:9<br>
> Aug 31 23:38:27 [1526] vdicnode02       crmd:     info:<br>
> process_lrm_event:      Result of monitor operation for vm-vdicdb01 on<br>
> vdicnode02: Cancelled | call=7 key=vm-vdicdb01_monitor_10000<br>
> confirmed=true<br>
> Aug 31 23:38:27 [1521] vdicnode02        cib:     info:<br>
> cib_perform_op: Diff: --- 0.163.0 2<br>
> Aug 31 23:38:27 [1521] vdicnode02        cib:     info:<br>
> cib_perform_op: Diff: +++ 0.163.1 (null)<br>
> Aug 31 23:38:27 [1521] vdicnode02        cib:     info:<br>
> cib_perform_op: +  /cib:  @num_updates=1<br>
> Aug 31 23:38:27 [1521] vdicnode02        cib:     info:<br>
> cib_perform_op: +<br>
>  /cib/status/node_state[@id='2'<wbr>]/lrm[@id='2']/lrm_resources/<wbr>lrm_resource[@id='vm-vdicdb01'<wbr>]/lrm_rsc_op[@id='vm-vdicdb01_<wbr>last_0']:  @operation_key=vm-vdicdb01_<wbr>migrate_to_0, @operation=migrate_to, @crm-debug-origin=cib_action_<wbr>update, @transition-key=6:6:0:<wbr>fe1a9b0a-816c-4b97-96cb-<wbr>b90dbf71417a, @transition-magic=-1:193;6:6:<wbr>0:fe1a9b0a-816c-4b97-96cb-<wbr>b90dbf71417a, @call-id=-1, @rc-code=193, @op-status=-1, @last-run=1504215507, @last-rc-cha<br>
> Aug 31 23:38:27 [1521] vdicnode02        cib:     info:<br>
> cib_process_request:    Completed cib_modify operation for section<br>
> status: OK (rc=0, origin=vdicnode01/crmd/41, version=0.163.1)<br>
> VirtualDomain(vm-vdicdb01)[5241]:       2017/08/31_23:38:27 INFO:<br>
> vdicdb01: Starting live migration to vdicnode01 (using: virsh<br>
> --connect=qemu:///system --quiet migrate --live  vdicdb01 qemu<br>
> +ssh://vdicnode01/system ).<br>
> VirtualDomain(vm-vdicdb01)[5241]:       2017/08/31_23:38:27 ERROR:<br>
> vdicdb01: live migration to vdicnode01 failed: 1<br>
> Aug 31 23:38:27 [1523] vdicnode02       lrmd:   notice:<br>
> operation_finished:     vm-vdicdb01_migrate_to_0:5241:stderr [ error:<br>
> Cannot recv data: Host key verification failed.: Connection reset by<br>
> peer ]<br>
<br>
</div>^^^ There you go. Sounds like the ssh key isn't being accepted. No idea<br>
why though.<br>
<div class="elided-text"><br>
<br>
<br>
> Aug 31 23:38:27 [1523] vdicnode02       lrmd:   notice:<br>
> operation_finished:     vm-vdicdb01_migrate_to_0:5241:stderr<br>
> [ ocf-exit-reason:vdicdb01: live migration to vdicnode01 failed: 1 ]<br>
> Aug 31 23:38:27 [1523] vdicnode02       lrmd:     info: log_finished:<br>
> finished - rsc:vm-vdicdb01 action:migrate_to call_id:9 pid:5241<br>
> exit-code:1 exec-time:78ms queue-time:0ms<br>
> Aug 31 23:38:27 [1526] vdicnode02       crmd:   notice:<br>
> process_lrm_event:      Result of migrate_to operation for vm-vdicdb01<br>
> on vdicnode02: 1 (unknown error) | call=9 key=vm-vdicdb01_migrate_to_0<br>
> confirmed=true cib-update=14<br>
> Aug 31 23:38:27 [1526] vdicnode02       crmd:   notice:<br>
> process_lrm_event:      vdicnode02-vm-vdicdb01_migrate_to_0:9 [ error:<br>
> Cannot recv data: Host key verification failed.: Connection reset by<br>
> peer\nocf-exit-reason:vdicdb01: live migration to vdicnode01 failed: 1<br>
> \n ]<br>
> Aug 31 23:38:27 [1521] vdicnode02        cib:     info:<br>
> cib_process_request:    Forwarding cib_modify operation for section<br>
> status to all (origin=local/crmd/14)<br>
> Aug 31 23:38:27 [1521] vdicnode02        cib:     info:<br>
> cib_perform_op: Diff: --- 0.163.1 2<br>
> Aug 31 23:38:27 [1521] vdicnode02        cib:     info:<br>
> cib_perform_op: Diff: +++ 0.163.2 (null)<br>
> Aug 31 23:38:27 [1521] vdicnode02        cib:     info:<br>
> cib_perform_op: +  /cib:  @num_updates=2<br>
> Aug 31 23:38:27 [1521] vdicnode02        cib:     info:<br>
> cib_perform_op: +<br>
>  /cib/status/node_state[@id='2'<wbr>]/lrm[@id='2']/lrm_resources/<wbr>lrm_resource[@id='vm-vdicdb01'<wbr>]/lrm_rsc_op[@id='vm-vdicdb01_<wbr>last_0']:  @crm-debug-origin=do_update_<wbr>resource, @transition-magic=0:1;6:6:0:<wbr>fe1a9b0a-816c-4b97-96cb-<wbr>b90dbf71417a, @call-id=9, @rc-code=1, @op-status=0, @exec-time=78, @exit-reason=vdicdb01: live migration to vdicnode01 failed: 1<br>
> Aug 31 23:38:27 [1521] vdicnode02        cib:     info:<br>
> cib_perform_op:<br>
> ++ /cib/status/node_state[@id='2'<wbr>]/lrm[@id='2']/lrm_resources/<wbr>lrm_resource[@id='vm-vdicdb01'<wbr>]:  <lrm_rsc_op id="vm-vdicdb01_last_failure_<wbr>0" operation_key="vm-vdicdb01_<wbr>migrate_to_0" operation="migrate_to" crm-debug-origin="do_update_<wbr>resource" crm_feature_set="3.0.10" transition-key="6:6:0:<wbr>fe1a9b0a-816c-4b97-96cb-<wbr>b90dbf71417a" transition-magic="0:1;6:6:0:<wbr>fe1a9b0a-816c-4b97-96cb-<wbr>b90dbf71417a" exit-reason="vdicdb01: live migration to vdicn<br>
> Aug 31 23:38:27 [1521] vdicnode02        cib:     info:<br>
> cib_process_request:    Completed cib_modify operation for section<br>
> status: OK (rc=0, origin=vdicnode02/crmd/14, version=0.163.2)<br>
> Aug 31 23:38:27 [1526] vdicnode02       crmd:     info:<br>
> do_lrm_rsc_op:  Performing<br>
> key=2:7:0:fe1a9b0a-816c-4b97-<wbr>96cb-b90dbf71417a op=vm-vdicdb01_stop_0<br>
> Aug 31 23:38:27 [1523] vdicnode02       lrmd:     info: log_execute:<br>
>    executing - rsc:vm-vdicdb01 action:stop call_id:10<br>
> VirtualDomain(vm-vdicdb01)[5285]:       2017/08/31_23:38:27 INFO:<br>
> Issuing graceful shutdown request for domain vdicdb01.<br>
> Aug 31 23:38:27 [1521] vdicnode02        cib:     info:<br>
> cib_perform_op: Diff: --- 0.163.2 2<br>
> Aug 31 23:38:27 [1521] vdicnode02        cib:     info:<br>
> cib_perform_op: Diff: +++ 0.163.3 (null)<br>
> Aug 31 23:38:27 [1521] vdicnode02        cib:     info:<br>
> cib_perform_op: +  /cib:  @num_updates=3<br>
> Aug 31 23:38:27 [1521] vdicnode02        cib:     info:<br>
> cib_perform_op: +<br>
>  /cib/status/node_state[@id='1'<wbr>]/lrm[@id='1']/lrm_resources/<wbr>lrm_resource[@id='vm-vdicdb01'<wbr>]/lrm_rsc_op[@id='vm-vdicdb01_<wbr>last_0']:  @operation_key=vm-vdicdb01_<wbr>stop_0, @operation=stop, @transition-key=4:7:0:<wbr>fe1a9b0a-816c-4b97-96cb-<wbr>b90dbf71417a, @transition-magic=0:0;4:7:0:<wbr>fe1a9b0a-816c-4b97-96cb-<wbr>b90dbf71417a, @call-id=6, @rc-code=0, @last-run=1504215507, @last-rc-change=1504215507, @exec-time=57<br>
> Aug 31 23:38:27 [1521] vdicnode02        cib:     info:<br>
> cib_process_request:    Completed cib_modify operation for section<br>
> status: OK (rc=0, origin=vdicnode01/crmd/43, version=0.163.3)<br>
> Aug 31 23:38:30 [1523] vdicnode02       lrmd:     info: log_finished:<br>
> finished - rsc:vm-vdicdb01 action:stop call_id:10 pid:5285 exit-code:0<br>
> exec-time:3159ms queue-time:0ms<br>
> Aug 31 23:38:30 [1526] vdicnode02       crmd:   notice:<br>
> process_lrm_event:      Result of stop operation for vm-vdicdb01 on<br>
> vdicnode02: 0 (ok) | call=10 key=vm-vdicdb01_stop_0 confirmed=true<br>
> cib-update=15<br>
> Aug 31 23:38:30 [1521] vdicnode02        cib:     info:<br>
> cib_process_request:    Forwarding cib_modify operation for section<br>
> status to all (origin=local/crmd/15)<br>
> Aug 31 23:38:30 [1521] vdicnode02        cib:     info:<br>
> cib_perform_op: Diff: --- 0.163.3 2<br>
> Aug 31 23:38:30 [1521] vdicnode02        cib:     info:<br>
> cib_perform_op: Diff: +++ 0.163.4 (null)<br>
> Aug 31 23:38:30 [1521] vdicnode02        cib:     info:<br>
> cib_perform_op: +  /cib:  @num_updates=4<br>
> Aug 31 23:38:30 [1521] vdicnode02        cib:     info:<br>
> cib_perform_op: +<br>
>  /cib/status/node_state[@id='2'<wbr>]/lrm[@id='2']/lrm_resources/<wbr>lrm_resource[@id='vm-vdicdb01'<wbr>]/lrm_rsc_op[@id='vm-vdicdb01_<wbr>last_0']:  @operation_key=vm-vdicdb01_<wbr>stop_0, @operation=stop, @transition-key=2:7:0:<wbr>fe1a9b0a-816c-4b97-96cb-<wbr>b90dbf71417a, @transition-magic=0:0;2:7:0:<wbr>fe1a9b0a-816c-4b97-96cb-<wbr>b90dbf71417a, @call-id=10, @rc-code=0, @exec-time=3159<br>
> Aug 31 23:38:30 [1521] vdicnode02        cib:     info:<br>
> cib_process_request:    Completed cib_modify operation for section<br>
> status: OK (rc=0, origin=vdicnode02/crmd/15, version=0.163.4)<br>
> Aug 31 23:38:31 [1521] vdicnode02        cib:     info:<br>
> cib_perform_op: Diff: --- 0.163.4 2<br>
> Aug 31 23:38:31 [1521] vdicnode02        cib:     info:<br>
> cib_perform_op: Diff: +++ 0.163.5 (null)<br>
> Aug 31 23:38:31 [1521] vdicnode02        cib:     info:<br>
> cib_perform_op: +  /cib:  @num_updates=5<br>
> Aug 31 23:38:31 [1521] vdicnode02        cib:     info:<br>
> cib_perform_op: +<br>
>  /cib/status/node_state[@id='1'<wbr>]/lrm[@id='1']/lrm_resources/<wbr>lrm_resource[@id='vm-vdicdb01'<wbr>]/lrm_rsc_op[@id='vm-vdicdb01_<wbr>last_0']:  @operation_key=vm-vdicdb01_<wbr>start_0, @operation=start, @transition-key=5:7:0:<wbr>fe1a9b0a-816c-4b97-96cb-<wbr>b90dbf71417a, @transition-magic=0:0;5:7:0:<wbr>fe1a9b0a-816c-4b97-96cb-<wbr>b90dbf71417a, @call-id=7, @last-run=1504215510, @last-rc-change=1504215510, @exec-time=528<br>
> Aug 31 23:38:31 [1521] vdicnode02        cib:     info:<br>
> cib_process_request:    Completed cib_modify operation for section<br>
> status: OK (rc=0, origin=vdicnode01/crmd/44, version=0.163.5)<br>
> Aug 31 23:38:31 [1521] vdicnode02        cib:     info:<br>
> cib_perform_op: Diff: --- 0.163.5 2<br>
> Aug 31 23:38:31 [1521] vdicnode02        cib:     info:<br>
> cib_perform_op: Diff: +++ 0.163.6 (null)<br>
> Aug 31 23:38:31 [1521] vdicnode02        cib:     info:<br>
> cib_perform_op: +  /cib:  @num_updates=6<br>
> Aug 31 23:38:31 [1521] vdicnode02        cib:     info:<br>
> cib_perform_op:<br>
> ++ /cib/status/node_state[@id='1'<wbr>]/lrm[@id='1']/lrm_resources/<wbr>lrm_resource[@id='vm-vdicdb01'<wbr>]:  <lrm_rsc_op id="vm-vdicdb01_monitor_10000" operation_key="vm-vdicdb01_<wbr>monitor_10000" operation="monitor" crm-debug-origin="do_update_<wbr>resource" crm_feature_set="3.0.10" transition-key="6:7:0:<wbr>fe1a9b0a-816c-4b97-96cb-<wbr>b90dbf71417a" transition-magic="0:0;6:7:0:<wbr>fe1a9b0a-816c-4b97-96cb-<wbr>b90dbf71417a" on_node="vdicnode01" call-id="8" rc-code="0" op-s<br>
> Aug 31 23:38:31 [1521] vdicnode02        cib:     info:<br>
> cib_process_request:    Completed cib_modify operation for section<br>
> status: OK (rc=0, origin=vdicnode01/crmd/45, version=0.163.6)<br>
> Aug 31 23:38:36 [1521] vdicnode02        cib:     info:<br>
> cib_process_ping:       Reporting our current digest to vdicnode01:<br>
> 9141ea9880f5a44b133003982d863b<wbr>c8 for 0.163.6 (0x7f96bb2625a0 0)<br>
><br>
><br>
><br>
><br>
><br>
><br>
> vdicnode01 - corosync.log<br>
> Aug 31 23:38:27 [1531] vdicnode01        cib:     info:<br>
> cib_process_request:    Forwarding cib_replace operation for section<br>
> configuration to all (origin=local/cibadmin/2)<br>
> Aug 31 23:38:27 [1531] vdicnode01        cib:     info:<br>
> cib_perform_op: Diff: --- 0.162.0 2<br>
> Aug 31 23:38:27 [1531] vdicnode01        cib:     info:<br>
> cib_perform_op: Diff: +++ 0.163.0 (null)<br>
> Aug 31 23:38:27 [1531] vdicnode01        cib:     info:<br>
> cib_perform_op: +  /cib:  @epoch=163<br>
> Aug 31 23:38:27 [1531] vdicnode01        cib:     info:<br>
> cib_perform_op: ++ /cib/configuration/constraints:  <rsc_location<br>
> id="location-vm-vdicdb01-vdicnode02--INFINITY" node="vdicnode02"<br>
> rsc="vm-vdicdb01" score="-INFINITY"/><br>
> Aug 31 23:38:27 [1531] vdicnode01        cib:     info:<br>
> cib_process_request:    Completed cib_replace operation for section<br>
> configuration: OK (rc=0, origin=vdicnode01/cibadmin/2,<br>
> version=0.163.0)<br>
> Aug 31 23:38:27 [1536] vdicnode01       crmd:     info:<br>
> abort_transition_graph: Transition aborted by<br>
> rsc_location.location-vm-vdicdb01-vdicnode02--INFINITY 'create':<br>
> Non-status change | cib=0.163.0 source=te_update_diff:436<br>
> path=/cib/configuration/constraints complete=true<br>
> Aug 31 23:38:27 [1536] vdicnode01       crmd:   notice:<br>
> do_state_transition:    State transition S_IDLE -> S_POLICY_ENGINE |<br>
> input=I_PE_CALC cause=C_FSA_INTERNAL origin=abort_transition_graph<br>
> Aug 31 23:38:27 [1531] vdicnode01        cib:     info:<br>
> cib_file_backup:        Archived previous version<br>
> as /var/lib/pacemaker/cib/cib-85.raw<br>
> Aug 31 23:38:27 [1531] vdicnode01        cib:     info:<br>
> cib_file_write_with_digest:     Wrote version 0.163.0 of the CIB to<br>
> disk (digest: 47a548b36746de9275d66cc6aeb0fd<wbr>c4)<br>
> Aug 31 23:38:27 [1531] vdicnode01        cib:     info:<br>
> cib_file_write_with_digest:     Reading cluster configuration<br>
> file /var/lib/pacemaker/cib/cib.npBIW2<br>
> (digest: /var/lib/pacemaker/cib/cib.bDogoB)<br>
> Aug 31 23:38:27 [1535] vdicnode01    pengine:     info:<br>
> determine_online_status:        Node vdicnode02 is online<br>
> Aug 31 23:38:27 [1535] vdicnode01    pengine:     info:<br>
> determine_online_status:        Node vdicnode01 is online<br>
> Aug 31 23:38:27 [1535] vdicnode01    pengine:     info: native_print:<br>
> vm-vdicdb01     (ocf::heartbeat:VirtualDomain): Started vdicnode02<br>
> Aug 31 23:38:27 [1535] vdicnode01    pengine:     info: RecurringOp:<br>
> Start recurring monitor (10s) for vm-vdicdb01 on vdicnode01<br>
> Aug 31 23:38:27 [1535] vdicnode01    pengine:   notice: LogActions:<br>
> Migrate vm-vdicdb01     (Started vdicnode02 -> vdicnode01)<br>
> Aug 31 23:38:27 [1535] vdicnode01    pengine:   notice:<br>
> process_pe_message:     Calculated transition 6, saving inputs<br>
> in /var/lib/pacemaker/pengine/pe-input-96.bz2<br>
> Aug 31 23:38:27 [1536] vdicnode01       crmd:     info:<br>
> do_state_transition:    State transition S_POLICY_ENGINE -><br>
> S_TRANSITION_ENGINE | input=I_PE_SUCCESS cause=C_IPC_MESSAGE<br>
> origin=handle_response<br>
> Aug 31 23:38:27 [1536] vdicnode01       crmd:     info: do_te_invoke:<br>
> Processing graph 6 (ref=pe_calc-dc-1504215507-24) derived<br>
> from /var/lib/pacemaker/pengine/pe-input-96.bz2<br>
> Aug 31 23:38:27 [1536] vdicnode01       crmd:   notice:<br>
> te_rsc_command: Initiating migrate_to operation<br>
> vm-vdicdb01_migrate_to_0 on vdicnode02 | action 6<br>
> Aug 31 23:38:27 [1536] vdicnode01       crmd:     info:<br>
> create_operation_update:        cib_action_update: Updating resource<br>
> vm-vdicdb01 after migrate_to op pending (interval=0)<br>
> Aug 31 23:38:27 [1531] vdicnode01        cib:     info:<br>
> cib_process_request:    Forwarding cib_modify operation for section<br>
> status to all (origin=local/crmd/41)<br>
> Aug 31 23:38:27 [1531] vdicnode01        cib:     info:<br>
> cib_perform_op: Diff: --- 0.163.0 2<br>
> Aug 31 23:38:27 [1531] vdicnode01        cib:     info:<br>
> cib_perform_op: Diff: +++ 0.163.1 (null)<br>
> Aug 31 23:38:27 [1531] vdicnode01        cib:     info:<br>
> cib_perform_op: +  /cib:  @num_updates=1<br>
> Aug 31 23:38:27 [1531] vdicnode01        cib:     info:<br>
> cib_perform_op: +<br>
>  /cib/status/node_state[@id='2'<wbr>]/lrm[@id='2']/lrm_resources/<wbr>lrm_resource[@id='vm-vdicdb01'<wbr>]/lrm_rsc_op[@id='vm-vdicdb01_<wbr>last_0']:  @operation_key=vm-vdicdb01_<wbr>migrate_to_0, @operation=migrate_to, @crm-debug-origin=cib_action_<wbr>update, @transition-key=6:6:0:<wbr>fe1a9b0a-816c-4b97-96cb-<wbr>b90dbf71417a, @transition-magic=-1:193;6:6:<wbr>0:fe1a9b0a-816c-4b97-96cb-<wbr>b90dbf71417a, @call-id=-1, @rc-code=193, @op-status=-1, @last-run=1504215507, @last-rc-cha<br>
> Aug 31 23:38:27 [1531] vdicnode01        cib:     info:<br>
> cib_process_request:    Completed cib_modify operation for section<br>
> status: OK (rc=0, origin=vdicnode01/crmd/41, version=0.163.1)<br>
> Aug 31 23:38:27 [1531] vdicnode01        cib:     info:<br>
> cib_perform_op: Diff: --- 0.163.1 2<br>
> Aug 31 23:38:27 [1531] vdicnode01        cib:     info:<br>
> cib_perform_op: Diff: +++ 0.163.2 (null)<br>
> Aug 31 23:38:27 [1531] vdicnode01        cib:     info:<br>
> cib_perform_op: +  /cib:  @num_updates=2<br>
> Aug 31 23:38:27 [1531] vdicnode01        cib:     info:<br>
> cib_perform_op: +<br>
>  /cib/status/node_state[@id='2'<wbr>]/lrm[@id='2']/lrm_resources/<wbr>lrm_resource[@id='vm-vdicdb01'<wbr>]/lrm_rsc_op[@id='vm-vdicdb01_<wbr>last_0']:  @crm-debug-origin=do_update_<wbr>resource, @transition-magic=0:1;6:6:0:<wbr>fe1a9b0a-816c-4b97-96cb-<wbr>b90dbf71417a, @call-id=9, @rc-code=1, @op-status=0, @exec-time=78, @exit-reason=vdicdb01: live migration to vdicnode01 failed: 1<br>
> Aug 31 23:38:27 [1531] vdicnode01        cib:     info:<br>
> cib_perform_op:<br>
> ++ /cib/status/node_state[@id='2'<wbr>]/lrm[@id='2']/lrm_resources/<wbr>lrm_resource[@id='vm-vdicdb01'<wbr>]:  <lrm_rsc_op id="vm-vdicdb01_last_failure_<wbr>0" operation_key="vm-vdicdb01_<wbr>migrate_to_0" operation="migrate_to" crm-debug-origin="do_update_<wbr>resource" crm_feature_set="3.0.10" transition-key="6:6:0:<wbr>fe1a9b0a-816c-4b97-96cb-<wbr>b90dbf71417a" transition-magic="0:1;6:6:0:<wbr>fe1a9b0a-816c-4b97-96cb-<wbr>b90dbf71417a" exit-reason="vdicdb01: live migration to vdicn<br>
> Aug 31 23:38:27 [1531] vdicnode01        cib:     info:<br>
> cib_process_request:    Completed cib_modify operation for section<br>
> status: OK (rc=0, origin=vdicnode02/crmd/14, version=0.163.2)<br>
> Aug 31 23:38:27 [1536] vdicnode01       crmd:  warning:<br>
> status_from_rc: Action 6 (vm-vdicdb01_migrate_to_0) on vdicnode02<br>
> failed (target: 0 vs. rc: 1): Error<br>
> Aug 31 23:38:27 [1536] vdicnode01       crmd:   notice:<br>
> abort_transition_graph: Transition aborted by operation<br>
> vm-vdicdb01_migrate_to_0 'modify' on vdicnode02: Event failed |<br>
> magic=0:1;6:6:0:fe1a9b0a-816c-<wbr>4b97-96cb-b90dbf71417a cib=0.163.2<br>
> source=match_graph_event:310 complete=false<br>
> Aug 31 23:38:27 [1536] vdicnode01       crmd:     info:<br>
> match_graph_event:      Action vm-vdicdb01_migrate_to_0 (6) confirmed<br>
> on vdicnode02 (rc=1)<br>
> Aug 31 23:38:27 [1536] vdicnode01       crmd:     info:<br>
> process_graph_event:    Detected action (6.6)<br>
> vm-vdicdb01_migrate_to_0.9=unknown error: failed<br>
> Aug 31 23:38:27 [1536] vdicnode01       crmd:  warning:<br>
> status_from_rc: Action 6 (vm-vdicdb01_migrate_to_0) on vdicnode02<br>
> failed (target: 0 vs. rc: 1): Error<br>
> Aug 31 23:38:27 [1536] vdicnode01       crmd:     info:<br>
> abort_transition_graph: Transition aborted by operation<br>
> vm-vdicdb01_migrate_to_0 'create' on vdicnode02: Event failed |<br>
> magic=0:1;6:6:0:fe1a9b0a-816c-<wbr>4b97-96cb-b90dbf71417a cib=0.163.2<br>
> source=match_graph_event:310 complete=false<br>
> Aug 31 23:38:27 [1536] vdicnode01       crmd:     info:<br>
> match_graph_event:      Action vm-vdicdb01_migrate_to_0 (6) confirmed<br>
> on vdicnode02 (rc=1)<br>
> Aug 31 23:38:27 [1536] vdicnode01       crmd:     info:<br>
> process_graph_event:    Detected action (6.6)<br>
> vm-vdicdb01_migrate_to_0.9=unknown error: failed<br>
> Aug 31 23:38:27 [1536] vdicnode01       crmd:   notice: run_graph:<br>
>    Transition 6 (Complete=1, Pending=0, Fired=0, Skipped=0,<br>
> Incomplete=5, Source=/var/lib/pacemaker/pengine/pe-input-96.bz2):<br>
> Complete<br>
> Aug 31 23:38:27 [1536] vdicnode01       crmd:     info:<br>
> do_state_transition:    State transition S_TRANSITION_ENGINE -><br>
> S_POLICY_ENGINE | input=I_PE_CALC cause=C_FSA_INTERNAL<br>
> origin=notify_crmd<br>
> Aug 31 23:38:27 [1535] vdicnode01    pengine:     info:<br>
> determine_online_status:        Node vdicnode02 is online<br>
> Aug 31 23:38:27 [1535] vdicnode01    pengine:     info:<br>
> determine_online_status:        Node vdicnode01 is online<br>
> Aug 31 23:38:27 [1535] vdicnode01    pengine:  warning:<br>
> unpack_rsc_op_failure:  Processing failed op migrate_to for<br>
> vm-vdicdb01 on vdicnode02: unknown error (1)<br>
> Aug 31 23:38:27 [1535] vdicnode01    pengine:  warning:<br>
> unpack_rsc_op_failure:  Processing failed op migrate_to for<br>
> vm-vdicdb01 on vdicnode02: unknown error (1)<br>
> Aug 31 23:38:27 [1535] vdicnode01    pengine:     info: native_print:<br>
> vm-vdicdb01     (ocf::heartbeat:VirtualDomain): FAILED<br>
> Aug 31 23:38:27 [1535] vdicnode01    pengine:     info: native_print:<br>
> 1 : vdicnode01<br>
> Aug 31 23:38:27 [1535] vdicnode01    pengine:     info: native_print:<br>
> 2 : vdicnode02<br>
> Aug 31 23:38:27 [1535] vdicnode01    pengine:    error:<br>
> native_create_actions:  Resource vm-vdicdb01 (ocf::VirtualDomain) is<br>
> active on 2 nodes attempting recovery<br>
> Aug 31 23:38:27 [1535] vdicnode01    pengine:  warning:<br>
> native_create_actions:  See<br>
> <a href="http://clusterlabs.org/wiki/FAQ#Resource_is_Too_Active" rel="noreferrer" target="_blank">http://clusterlabs.org/wiki/<wbr>FAQ#Resource_is_Too_Active</a> for more<br>
> information.<br>
> Aug 31 23:38:27 [1535] vdicnode01    pengine:     info: RecurringOp:<br>
> Start recurring monitor (10s) for vm-vdicdb01 on vdicnode01<br>
> Aug 31 23:38:27 [1535] vdicnode01    pengine:   notice: LogActions:<br>
> Recover vm-vdicdb01     (Started vdicnode01)<br>
> Aug 31 23:38:27 [1535] vdicnode01    pengine:    error:<br>
> process_pe_message:     Calculated transition 7 (with errors), saving<br>
> inputs in /var/lib/pacemaker/pengine/pe-error-8.bz2<br>
> Aug 31 23:38:27 [1536] vdicnode01       crmd:     info:<br>
> do_state_transition:    State transition S_POLICY_ENGINE -><br>
> S_TRANSITION_ENGINE | input=I_PE_SUCCESS cause=C_IPC_MESSAGE<br>
> origin=handle_response<br>
> Aug 31 23:38:27 [1536] vdicnode01       crmd:     info: do_te_invoke:<br>
> Processing graph 7 (ref=pe_calc-dc-1504215507-26) derived<br>
> from /var/lib/pacemaker/pengine/pe-error-8.bz2<br>
> Aug 31 23:38:27 [1536] vdicnode01       crmd:   notice:<br>
> te_rsc_command: Initiating stop operation vm-vdicdb01_stop_0 locally<br>
> on vdicnode01 | action 4<br>
> Aug 31 23:38:27 [1536] vdicnode01       crmd:     info:<br>
> do_lrm_rsc_op:  Performing<br>
> key=4:7:0:fe1a9b0a-816c-4b97-<wbr>96cb-b90dbf71417a op=vm-vdicdb01_stop_0<br>
> Aug 31 23:38:27 [1533] vdicnode01       lrmd:     info: log_execute:<br>
>    executing - rsc:vm-vdicdb01 action:stop call_id:6<br>
> Aug 31 23:38:27 [1536] vdicnode01       crmd:   notice:<br>
> te_rsc_command: Initiating stop operation vm-vdicdb01_stop_0 on<br>
> vdicnode02 | action 2<br>
> VirtualDomain(vm-vdicdb01)[5268]:       2017/08/31_23:38:27 INFO:<br>
> Domain vdicdb01 already stopped.<br>
> Aug 31 23:38:27 [1533] vdicnode01       lrmd:     info: log_finished:<br>
> finished - rsc:vm-vdicdb01 action:stop call_id:6 pid:5268 exit-code:0<br>
> exec-time:57ms queue-time:0ms<br>
> Aug 31 23:38:27 [1536] vdicnode01       crmd:   notice:<br>
> process_lrm_event:      Result of stop operation for vm-vdicdb01 on<br>
> vdicnode01: 0 (ok) | call=6 key=vm-vdicdb01_stop_0 confirmed=true<br>
> cib-update=43<br>
> Aug 31 23:38:27 [1531] vdicnode01        cib:     info:<br>
> cib_process_request:    Forwarding cib_modify operation for section<br>
> status to all (origin=local/crmd/43)<br>
> Aug 31 23:38:27 [1531] vdicnode01        cib:     info:<br>
> cib_perform_op: Diff: --- 0.163.2 2<br>
> Aug 31 23:38:27 [1531] vdicnode01        cib:     info:<br>
> cib_perform_op: Diff: +++ 0.163.3 (null)<br>
> Aug 31 23:38:27 [1531] vdicnode01        cib:     info:<br>
> cib_perform_op: +  /cib:  @num_updates=3<br>
> Aug 31 23:38:27 [1531] vdicnode01        cib:     info:<br>
> cib_perform_op: +<br>
>  /cib/status/node_state[@id='1'<wbr>]/lrm[@id='1']/lrm_resources/<wbr>lrm_resource[@id='vm-vdicdb01'<wbr>]/lrm_rsc_op[@id='vm-vdicdb01_<wbr>last_0']:  @operation_key=vm-vdicdb01_<wbr>stop_0, @operation=stop, @transition-key=4:7:0:<wbr>fe1a9b0a-816c-4b97-96cb-<wbr>b90dbf71417a, @transition-magic=0:0;4:7:0:<wbr>fe1a9b0a-816c-4b97-96cb-<wbr>b90dbf71417a, @call-id=6, @rc-code=0, @last-run=1504215507, @last-rc-change=1504215507, @exec-time=57<br>
> Aug 31 23:38:27 [1536] vdicnode01       crmd:     info:<br>
> match_graph_event:      Action vm-vdicdb01_stop_0 (4) confirmed on<br>
> vdicnode01 (rc=0)<br>
> Aug 31 23:38:27 [1531] vdicnode01        cib:     info:<br>
> cib_process_request:    Completed cib_modify operation for section<br>
> status: OK (rc=0, origin=vdicnode01/crmd/43, version=0.163.3)<br>
> Aug 31 23:38:30 [1531] vdicnode01        cib:     info:<br>
> cib_perform_op: Diff: --- 0.163.3 2<br>
> Aug 31 23:38:30 [1531] vdicnode01        cib:     info:<br>
> cib_perform_op: Diff: +++ 0.163.4 (null)<br>
> Aug 31 23:38:30 [1531] vdicnode01        cib:     info:<br>
> cib_perform_op: +  /cib:  @num_updates=4<br>
> Aug 31 23:38:30 [1531] vdicnode01        cib:     info:<br>
> cib_perform_op: +<br>
>  /cib/status/node_state[@id='2'<wbr>]/lrm[@id='2']/lrm_resources/<wbr>lrm_resource[@id='vm-vdicdb01'<wbr>]/lrm_rsc_op[@id='vm-vdicdb01_<wbr>last_0']:  @operation_key=vm-vdicdb01_<wbr>stop_0, @operation=stop, @transition-key=2:7:0:<wbr>fe1a9b0a-816c-4b97-96cb-<wbr>b90dbf71417a, @transition-magic=0:0;2:7:0:<wbr>fe1a9b0a-816c-4b97-96cb-<wbr>b90dbf71417a, @call-id=10, @rc-code=0, @exec-time=3159<br>
> Aug 31 23:38:30 [1531] vdicnode01        cib:     info:<br>
> cib_process_request:    Completed cib_modify operation for section<br>
> status: OK (rc=0, origin=vdicnode02/crmd/15, version=0.163.4)<br>
> Aug 31 23:38:30 [1536] vdicnode01       crmd:     info:<br>
> match_graph_event:      Action vm-vdicdb01_stop_0 (2) confirmed on<br>
> vdicnode02 (rc=0)<br>
> Aug 31 23:38:30 [1536] vdicnode01       crmd:   notice:<br>
> te_rsc_command: Initiating start operation vm-vdicdb01_start_0 locally<br>
> on vdicnode01 | action 5<br>
> Aug 31 23:38:30 [1536] vdicnode01       crmd:     info:<br>
> do_lrm_rsc_op:  Performing<br>
> key=5:7:0:fe1a9b0a-816c-4b97-<wbr>96cb-b90dbf71417a op=vm-vdicdb01_start_0<br>
> Aug 31 23:38:30 [1533] vdicnode01       lrmd:     info: log_execute:<br>
>    executing - rsc:vm-vdicdb01 action:start call_id:7<br>
> Aug 31 23:38:31 [1533] vdicnode01       lrmd:     info: log_finished:<br>
> finished - rsc:vm-vdicdb01 action:start call_id:7 pid:5401 exit-code:0<br>
> exec-time:528ms queue-time:0ms<br>
> Aug 31 23:38:31 [1536] vdicnode01       crmd:     info:<br>
> action_synced_wait:     Managed VirtualDomain_meta-data_0 process 5486<br>
> exited with rc=0<br>
> Aug 31 23:38:31 [1536] vdicnode01       crmd:   notice:<br>
> process_lrm_event:      Result of start operation for vm-vdicdb01 on<br>
> vdicnode01: 0 (ok) | call=7 key=vm-vdicdb01_start_0 confirmed=true<br>
> cib-update=44<br>
> Aug 31 23:38:31 [1531] vdicnode01        cib:     info:<br>
> cib_process_request:    Forwarding cib_modify operation for section<br>
> status to all (origin=local/crmd/44)<br>
> Aug 31 23:38:31 [1531] vdicnode01        cib:     info:<br>
> cib_perform_op: Diff: --- 0.163.4 2<br>
> Aug 31 23:38:31 [1531] vdicnode01        cib:     info:<br>
> cib_perform_op: Diff: +++ 0.163.5 (null)<br>
> Aug 31 23:38:31 [1531] vdicnode01        cib:     info:<br>
> cib_perform_op: +  /cib:  @num_updates=5<br>
> Aug 31 23:38:31 [1531] vdicnode01        cib:     info:<br>
> cib_perform_op: +<br>
>  /cib/status/node_state[@id='1'<wbr>]/lrm[@id='1']/lrm_resources/<wbr>lrm_resource[@id='vm-vdicdb01'<wbr>]/lrm_rsc_op[@id='vm-vdicdb01_<wbr>last_0']:  @operation_key=vm-vdicdb01_<wbr>start_0, @operation=start, @transition-key=5:7:0:<wbr>fe1a9b0a-816c-4b97-96cb-<wbr>b90dbf71417a, @transition-magic=0:0;5:7:0:<wbr>fe1a9b0a-816c-4b97-96cb-<wbr>b90dbf71417a, @call-id=7, @last-run=1504215510, @last-rc-change=1504215510, @exec-time=528<br>
> Aug 31 23:38:31 [1531] vdicnode01        cib:     info:<br>
> cib_process_request:    Completed cib_modify operation for section<br>
> status: OK (rc=0, origin=vdicnode01/crmd/44, version=0.163.5)<br>
> Aug 31 23:38:31 [1536] vdicnode01       crmd:     info:<br>
> match_graph_event:      Action vm-vdicdb01_start_0 (5) confirmed on<br>
> vdicnode01 (rc=0)<br>
> Aug 31 23:38:31 [1536] vdicnode01       crmd:   notice:<br>
> te_rsc_command: Initiating monitor operation vm-vdicdb01_monitor_10000<br>
> locally on vdicnode01 | action 6<br>
> Aug 31 23:38:31 [1536] vdicnode01       crmd:     info:<br>
> do_lrm_rsc_op:  Performing<br>
> key=6:7:0:fe1a9b0a-816c-4b97-<wbr>96cb-b90dbf71417a<br>
> op=vm-vdicdb01_monitor_10000<br>
> Aug 31 23:38:31 [1536] vdicnode01       crmd:     info:<br>
> process_lrm_event:      Result of monitor operation for vm-vdicdb01 on<br>
> vdicnode01: 0 (ok) | call=8 key=vm-vdicdb01_monitor_10000<br>
> confirmed=false cib-update=45<br>
> Aug 31 23:38:31 [1531] vdicnode01        cib:     info:<br>
> cib_process_request:    Forwarding cib_modify operation for section<br>
> status to all (origin=local/crmd/45)<br>
> Aug 31 23:38:31 [1531] vdicnode01        cib:     info:<br>
> cib_perform_op: Diff: --- 0.163.5 2<br>
> Aug 31 23:38:31 [1531] vdicnode01        cib:     info:<br>
> cib_perform_op: Diff: +++ 0.163.6 (null)<br>
> Aug 31 23:38:31 [1531] vdicnode01        cib:     info:<br>
> cib_perform_op: +  /cib:  @num_updates=6<br>
> Aug 31 23:38:31 [1531] vdicnode01        cib:     info:<br>
> cib_perform_op:<br>
> ++ /cib/status/node_state[@id='1'<wbr>]/lrm[@id='1']/lrm_resources/<wbr>lrm_resource[@id='vm-vdicdb01'<wbr>]:  <lrm_rsc_op id="vm-vdicdb01_monitor_10000" operation_key="vm-vdicdb01_<wbr>monitor_10000" operation="monitor" crm-debug-origin="do_update_<wbr>resource" crm_feature_set="3.0.10" transition-key="6:7:0:<wbr>fe1a9b0a-816c-4b97-96cb-<wbr>b90dbf71417a" transition-magic="0:0;6:7:0:<wbr>fe1a9b0a-816c-4b97-96cb-<wbr>b90dbf71417a" on_node="vdicnode01" call-id="8" rc-code="0" op-s<br>
> Aug 31 23:38:31 [1531] vdicnode01        cib:     info:<br>
> cib_process_request:    Completed cib_modify operation for section<br>
> status: OK (rc=0, origin=vdicnode01/crmd/45, version=0.163.6)<br>
> Aug 31 23:38:31 [1536] vdicnode01       crmd:     info:<br>
> match_graph_event:      Action vm-vdicdb01_monitor_10000 (6) confirmed<br>
> on vdicnode01 (rc=0)<br>
> Aug 31 23:38:31 [1536] vdicnode01       crmd:   notice: run_graph:<br>
>    Transition 7 (Complete=5, Pending=0, Fired=0, Skipped=0,<br>
> Incomplete=0, Source=/var/lib/pacemaker/pengine/pe-error-8.bz2):<br>
> Complete<br>
> Aug 31 23:38:31 [1536] vdicnode01       crmd:     info: do_log: Input<br>
> I_TE_SUCCESS received in state S_TRANSITION_ENGINE from notify_crmd<br>
> Aug 31 23:38:31 [1536] vdicnode01       crmd:   notice:<br>
> do_state_transition:    State transition S_TRANSITION_ENGINE -> S_IDLE<br>
> | input=I_TE_SUCCESS cause=C_FSA_INTERNAL origin=notify_crmd<br>
> Aug 31 23:38:36 [1531] vdicnode01        cib:     info:<br>
> cib_process_ping:       Reporting our current digest to vdicnode01:<br>
> 9141ea9880f5a44b133003982d863b<wbr>c8 for 0.163.6 (0x7f61cec09270 0)<br>
><br>
><br>
> Thanks a lot<br>
><br>
> 2017-08-31 16:20 GMT+02:00 Ken Gaillot <<a href="mailto:kgaillot@redhat.com">kgaillot@redhat.com</a>>:<br>
>         On Thu, 2017-08-31 at 01:13 +0200, Oscar Segarra wrote:<br>
>         > Hi,<br>
>         ><br>
>         ><br>
>         > In my environment, I have just two hosts, where qemu-kvm<br>
>         process is<br>
>         > launched by a regular user (oneadmin) - open nebula -<br>
>         ><br>
>         ><br>
>         > I have created a VirtualDomain resource that starts and<br>
>         stops the VM<br>
>         > perfectly. Nevertheless, when I change the location weight<br>
>         in order to<br>
>         > force the migration, It raises a migration failure "error:<br>
>         1"<br>
>         ><br>
>         ><br>
>         > If I execute the virsh migrate command (that appears in<br>
>         corosync.log)<br>
>         > from command line, it works perfectly.<br>
>         ><br>
>         ><br>
>         > Anybody has experienced the same issue?<br>
>         ><br>
>         ><br>
>         > Thanks in advance for your help<br>
><br>
><br>
>         If something works from the command line but not when run by a<br>
>         daemon,<br>
>         my first suspicion is SELinux. Check the audit log for denials<br>
>         around<br>
>         that time.<br>
><br>
>         I'd also check the system log and Pacemaker detail log around<br>
>         that time<br>
>         to see if there is any more information.<br>
>         --<br>
>         Ken Gaillot <<a href="mailto:kgaillot@redhat.com">kgaillot@redhat.com</a>><br>
><br>
><br>
><br>
><br>
><br>
>         ______________________________<wbr>_________________<br>
>         Users mailing list: <a href="mailto:Users@clusterlabs.org">Users@clusterlabs.org</a><br>
>         <a href="http://lists.clusterlabs.org/mailman/listinfo/users" rel="noreferrer" target="_blank">http://lists.clusterlabs.org/<wbr>mailman/listinfo/users</a><br>
><br>
>         Project Home: <a href="http://www.clusterlabs.org" rel="noreferrer" target="_blank">http://www.clusterlabs.org</a><br>
>         Getting started:<br>
>         <a href="http://www.clusterlabs.org/doc/Cluster_from_Scratch.pdf" rel="noreferrer" target="_blank">http://www.clusterlabs.org/<wbr>doc/Cluster_from_Scratch.pdf</a><br>
>         Bugs: <a href="http://bugs.clusterlabs.org" rel="noreferrer" target="_blank">http://bugs.clusterlabs.org</a><br>
><br>
><br>
<br>
</div><font color="#888888">--<br>
Ken Gaillot <<a href="mailto:kgaillot@redhat.com">kgaillot@redhat.com</a>><br>
<br>
<br>
<br>
<br>
</font></blockquote></div><br></div></div>