On 02/18/2011 04:11 AM, Ulrich Weigand wrote:
>> > @@ -4338,10 +4341,15 @@ displaced_read_reg (struct regcache *regs, CORE_ADDR from, int regno)
>> >
>> >   if (regno == 15)
>> >     {
>> > +      if (displaced_in_arm_mode (regs))
>> > +	from += 8;
>> > +      else
>> > +	from += 6;
> I think the 6 is wrong, it should be 4.  From the ARM manual:
>
> - When executing an ARM instruction, PC reads as the address of the
>   current instruction plus 8.
> - When executing a Thumb instruction, PC reads as the address of the
>   current instruction plus 4.

Oh, yes.  Fixed.

>> > +/* Clean up branch instructions (actually perform the branch, by setting
>> > +   PC).  */
>> > +static void
>> > +cleanup_branch (struct gdbarch *gdbarch, struct regcache *regs,
>> > +		struct displaced_step_closure *dsc)
>> > +{
>> > +  ULONGEST from = dsc->insn_addr;
>> > +  uint32_t status = displaced_read_reg (regs, from, ARM_PS_REGNUM);
>> > +  int branch_taken = condition_true (dsc->u.branch.cond, status);
>> > +
>> > +  cleanup_branch_1 (gdbarch, regs, dsc, branch_taken);
>> > +}
>> > +
>> > +static void
>> > +cleanup_cbz_cbnz (struct gdbarch *gdbarch, struct regcache *regs,
>> > +		  struct displaced_step_closure *dsc)
>> > +{
>> > +  cleanup_branch_1 (gdbarch, regs, dsc, dsc->u.branch.cond);
>> > +}
> I think this is unnecessary: copy_cbz_cbnz ought to be able to use
> cleanup_branch as-is.  If the branch is taken, it should just set
> dsc->u.branch.cond to INSN_AL; if the branch is not taken, it
> should simply not use any cleanup at all since no further action
> is required.

Done as you suggested.

>> > @@ -4718,6 +4752,40 @@ copy_b_bl_blx (struct gdbarch *gdbarch, uint32_t insn,
>> >
>> >    B similar, but don't set r14 in cleanup.
>> >    */
>> >
>> > +
>> > +  dsc->u.branch.cond = cond;
>> > +  dsc->u.branch.link = link;
>> > +  dsc->u.branch.exchange = exchange;
>> > +
>> > +  if (arm_pc_is_thumb (gdbarch, from))
> You should never use arm_pc_is_thumb here; the heuristics it applies are
> completely unnecessary, since we know in which mode we are, and may just
> result in the wrong outcome.
>
> In any case, as discussed above, this ought to be two separate copy
> routines, one for ARM mode and one for Thumb mode anyway.

Yes, it is now split into two routines, arm_copy_b_bl_blx and
thumb2_copy_b_bl_blx.

>> > +    {
>> > +      /* Plus the size of THUMB_NOP and B/BL/BLX.  */
>> > +      dsc->u.branch.dest = from + 2 + 4 + offset;
>> > +      RECORD_MOD_16BIT_INSN (0, THUMB_NOP);
> The + 2 doesn't look right to me.  The offset is relative to the
> PC, which is -see above- "from + 8" in ARM mode and "from + 4" in
> Thumb mode.  I don't see how the size of the THUMB_NOP is involved
> at all here ...

Oh, yes.  Fixed.

>> > +static int
>> > +thumb_decode_dp (struct gdbarch *gdbarch, unsigned short insn,
>> > +		 struct displaced_step_closure *dsc)
>> > +{
>> > +  /* 16-bit data-processing insns are not related to PC.  */
>> > +  return thumb_copy_unmodified_16bit (gdbarch, insn, "data-processing", dsc);
>> > +}
> This doesn't need to be a separate function, I guess ...

OK, fixed.

>> > +      /* ADDS Rd, #imm8 */
>> > +      RECORD_MOD_32BIT_INSN (0, 0x3000 | (rd << 8) | imm8);
> Should be 16BIT (but see my earlier mail on the usefulness of
> those macros in the first place ...).

Fixed.

>> > +static void
>> > +thumb_process_displaced_16bit_insn (struct gdbarch *gdbarch,
>> > +				    unsigned short insn1, CORE_ADDR to,
> I don't think this needs TO.

Removed parameter TO.
>> > +	unsigned short op = bits (insn1, 7, 9);
>> > +	if (op == 6 || op == 7) /* BX or BLX */
>> > +	  err = thumb_copy_bx_blx_reg (gdbarch, insn1, regs, dsc);
>> > +	else
>> > +	  err = thumb_copy_unmodified_16bit (gdbarch, insn1, "special data",
>> > +					     dsc);
> These include the ADD / MOV / CMP high register instructions, which
> can access the PC, so they'd need special treatment.

They are processed by thumb_copy_alu_reg now.

>> > +      switch (bits (insn1, 8, 11))
>> > +	{
>> > +	case 1: case 3: case 9: case 11: /* CBNZ, CBZ */
>> > +	  err = thumb_copy_cbnz_cbz (gdbarch, insn1, regs, dsc);
>> > +	  break;
>> > +	default:
>> > +	  err = thumb_copy_unmodified_16bit (gdbarch, insn1, "", dsc);
> Hmm, what about IT?

So far, I don't have a good idea on supporting IT for displaced
stepping.  I may need more time to think about this.

>> > +    case 13: /* Conditional branch and supervisor call */
>> > +      if (bits (insn1, 9, 11) != 7) /* conditional branch */
>> > +	err = thumb_copy_b (gdbarch, insn1, dsc);
>> > +      else
>> > +	err = thumb_copy_unmodified_16bit (gdbarch, insn1, "svc", dsc);
> There is special handling in arm-linux-tdep.c for ARM SVC instructions.
> Don't we need this for Thumb SVCs as well?

Yes, we need that.  SVC and the Linux-related handling are new to me.
I suggest this patch go in first if it is OK; I'll think about the SVC
handling later.

>> > +
>> > +  if ((bits (insn1, 13, 15) == 7) && (bits (insn1, 11, 12)))
> You should just use thumb_insn_size ...

Fixed.

-- 
Yao (齐尧)