Merge rust-bitcoin/rust-bitcoin#1532: Improve Psbt error handling

e7bbfd3913 Improve Psbt error handling (DanGould)

Pull request description:

  ## Separate `encode::Error` and `psbt::Error` recursive dependency

  This initial work attempts to fix the first two points of #837:

  > - The current psbt::serialize::Deserialize has an error type of consensus::encode::Error. I think we should cleanly separate consensus encoding errors from application-level encoding errors like psbt.
  > - There is a recursive dependence between encode::Error and psbt::Error which would need to be cleanly dissected and separated so that there is no dependence or only one-way dependence.

  ## Better `ParseError(String)` types

  arturomf94, how compatible do your #1310 changes look for addressing #837's third point with this design?

  > - There are a lot ParseError(String) messages that could use a better type to downflow the information.

  I think your prior art would completely address this issue now.
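  For illustration only, here is a rough sketch of the direction a typed replacement for the stringly `ParseError(String)` variants could take (the variant names below are hypothetical and not taken from #1310):

  ```rust
  /// Hypothetical sketch: instead of carrying an opaque `String`, a parse
  /// failure could carry one variant per failure mode so callers can match
  /// on it programmatically.
  #[derive(Debug, Clone, PartialEq, Eq)]
  pub enum ParseError {
      /// The global xpub derivation data had an invalid length.
      InvalidXpubDerivationLength(usize),
      /// The PSBT version field was not exactly 4 bytes.
      InvalidVersionLength(usize),
      /// A Schnorr signature had an unexpected length.
      InvalidSchnorrSignatureLength(usize),
  }
  ```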

  ## On handling `io::Error` with an associated error

  `encode::Error` has an `Io` variant. Now that `Psbt::deserialize` returns `psbt::Error` and can produce an `io::Error`, we need an `Io` variant on `psbt::Error`. Except that doing so breaks `#[derive(Eq)]` and lots of tests for `psbt::Error`.
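  For context, a minimal sketch (not the actual crate code) of why this breaks: `std::io::Error` implements neither `PartialEq` nor `Eq`, so any enum embedding it can no longer derive them.

  ```rust
  use std::io;

  // Minimal sketch, not the real psbt::Error definition: once an `Io`
  // variant embeds `io::Error`, `#[derive(PartialEq, Eq)]` stops compiling
  // because `io::Error` implements neither trait.
  #[derive(Debug)]
  // #[derive(PartialEq, Eq)] // <- would fail to compile with the Io variant
  pub enum Error {
      Io(io::Error),
      InvalidMagic,
  }
  ```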

  Kixunil, I'm trying to understand your feedback regarding a solution to this problem.

  > I believe that the best error untangling would be to make decodable error associated.

  > I meant having associated `Error` type at `Decodable` trait. Encoding should only fail if the writer fails so we should have `io::Error` there (at least until we have something like `genio`).
  >
  > > [it] is a problem to instantiate consensus::encode::Error in [the psbt] module for `io::Error`?
  >
  > It certainly does look strange. Maybe we should have this shared type:
  >
  > ```rust
  > /// Error used when reading or decoding fails.
  > pub enum ReadError<Io, Decode> {
  >     /// Reading failed
  >     Io(Io),
  >     /// Decoding failed
  >     Decode(Decode), // consensus and PSBT error here
  > }
  > ```
  >
  > However this one will be annoying to use with `?` :( We could have `ResultExt` to provide `decode()` and `io()` methods to make it easier.
  >
  > If that's not acceptable then I think deduplicated IO error is better.

  Kixunil, didn't we just get rid of Psbt as `Decodable`? Would it make more sense to have this as an error associated with `Deserialize`? Or did we do the opposite of what we should have by making Psbt only `Serialize`/`Deserialize` because of #934, where only consensus objects are allowed to be `Decodable`? I wonder if we prioritized that strict categorization and are stuck with worse machinery because of it. My goal with #988 was to get to a point where we could address #837 and ultimately implement PSBTv2.
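  For reference, a rough sketch of how the `ResultExt` helper mentioned in the quote above might look; the trait and method signatures here are assumptions drawn from that comment, not an agreed design:

  ```rust
  /// Shared error for when reading or decoding fails, as proposed in the quote.
  pub enum ReadError<Io, Decode> {
      /// Reading failed.
      Io(Io),
      /// Decoding failed.
      Decode(Decode),
  }

  /// Hypothetical extension trait so `?` stays usable: `reader_result.io()?`
  /// wraps an I/O failure and `decoder_result.decode()?` wraps a decoding
  /// failure, both into the shared `ReadError` type.
  pub trait ResultExt<T, E> {
      fn io<D>(self) -> Result<T, ReadError<E, D>>;
      fn decode<I>(self) -> Result<T, ReadError<I, E>>;
  }

  impl<T, E> ResultExt<T, E> for Result<T, E> {
      fn io<D>(self) -> Result<T, ReadError<E, D>> { self.map_err(ReadError::Io) }
      fn decode<I>(self) -> Result<T, ReadError<I, E>> { self.map_err(ReadError::Decode) }
  }
  ```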

ACKs for top commit:
  tcharding:
    ACK e7bbfd3913
  apoelstra:
    ACK e7bbfd3913

Tree-SHA512: 32975594fde42727ea9030f46570a1403ae1a108570ab115519ebeddc28938f141e2134b04d6b29ce94817ed776c13815dea5647c463e4a13b47ba55f4e7858a
Commit f6d983b2ef by Andrew Poelstra, 2023-01-24 14:07:24 +00:00
10 changed files with 86 additions and 102 deletions


@ -26,7 +26,6 @@ use crate::hashes::{sha256d, Hash, sha256};
use crate::hash_types::{BlockHash, FilterHash, TxMerkleNode, FilterHeader}; use crate::hash_types::{BlockHash, FilterHash, TxMerkleNode, FilterHeader};
use crate::io::{self, Cursor, Read}; use crate::io::{self, Cursor, Read};
use crate::psbt;
use crate::bip152::{ShortId, PrefilledTransaction}; use crate::bip152::{ShortId, PrefilledTransaction};
use crate::taproot::TapLeafHash; use crate::taproot::TapLeafHash;
@ -40,8 +39,6 @@ use crate::network::{message_blockdata::Inventory, address::{Address, AddrV2Mess
pub enum Error { pub enum Error {
/// And I/O error. /// And I/O error.
Io(io::Error), Io(io::Error),
/// PSBT-related error.
Psbt(psbt::Error),
/// Tried to allocate an oversized vector. /// Tried to allocate an oversized vector.
OversizedVectorAllocation { OversizedVectorAllocation {
/// The capacity requested. /// The capacity requested.
@ -68,7 +65,6 @@ impl fmt::Display for Error {
fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result { fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
match *self { match *self {
Error::Io(ref e) => write_err!(f, "IO error"; e), Error::Io(ref e) => write_err!(f, "IO error"; e),
Error::Psbt(ref e) => write_err!(f, "PSBT error"; e),
Error::OversizedVectorAllocation { requested: ref r, max: ref m } => write!(f, Error::OversizedVectorAllocation { requested: ref r, max: ref m } => write!(f,
"allocation of oversized vector: requested {}, maximum {}", r, m), "allocation of oversized vector: requested {}, maximum {}", r, m),
Error::InvalidChecksum { expected: ref e, actual: ref a } => write!(f, Error::InvalidChecksum { expected: ref e, actual: ref a } => write!(f,
@ -89,7 +85,6 @@ impl std::error::Error for Error {
match self { match self {
Io(e) => Some(e), Io(e) => Some(e),
Psbt(e) => Some(e),
OversizedVectorAllocation { .. } OversizedVectorAllocation { .. }
| InvalidChecksum { .. } | InvalidChecksum { .. }
| NonMinimalVarInt | NonMinimalVarInt
@ -106,13 +101,6 @@ impl From<io::Error> for Error {
} }
} }
#[doc(hidden)]
impl From<psbt::Error> for Error {
fn from(e: psbt::Error) -> Error {
Error::Psbt(e)
}
}
/// Encodes an object into a vector. /// Encodes an object into a vector.
pub fn serialize<T: Encodable + ?Sized>(data: &T) -> Vec<u8> { pub fn serialize<T: Encodable + ?Sized>(data: &T) -> Vec<u8> {
let mut encoder = Vec::new(); let mut encoder = Vec::new();


```diff
@@ -372,7 +372,6 @@ enum DecodeError<E> {
 fn consensus_error_into_serde<E: serde::de::Error>(error: ConsensusError) -> E {
     match error {
         ConsensusError::Io(error) => panic!("unexpected IO error {:?}", error),
-        ConsensusError::Psbt(_) => panic!("PSBT shouldn't implement consensus encoding"),
         ConsensusError::OversizedVectorAllocation { requested, max } => E::custom(format_args!("the requested allocation of {} items exceeds maximum of {}", requested, max)),
         ConsensusError::InvalidChecksum { expected, actual } => E::invalid_value(Unexpected::Bytes(&actual), &DisplayExpected(format_args!("checksum {:02x}{:02x}{:02x}{:02x}", expected[0], expected[1], expected[2], expected[3]))),
         ConsensusError::NonMinimalVarInt => E::custom(format_args!("compact size was not encoded minimally")),
```


```diff
@@ -72,6 +72,8 @@ pub enum Error {
     /// Conflicting data during combine procedure:
     /// global extended public key has inconsistent key sources
     CombineInconsistentKeySources(Box<ExtendedPubKey>),
+    /// Parsing error.
+    ParseFailed(&'static str),
     /// Serialization error in bitcoin consensus-encoded structures
     ConsensusEncoding,
     /// Negative fee
@@ -104,6 +106,7 @@ impl fmt::Display for Error {
                 write!(f, "Preimage {:?} does not match {:?} hash {:?}", preimage, hash_type, hash )
             },
             Error::CombineInconsistentKeySources(ref s) => { write!(f, "combine conflict: {}", s) },
+            Error::ParseFailed(ref s) => write!(f, "parse failed: {}", s),
             Error::ConsensusEncoding => f.write_str("bitcoin consensus or BIP-174 encoding error"),
             Error::NegativeFee => f.write_str("PSBT has a negative fee which is not allowed"),
             Error::FeeOverflow => f.write_str("integer overflow in fee calculation"),
@@ -135,6 +138,7 @@ impl std::error::Error for Error {
             | NonStandardSighashType(_)
             | InvalidPreimageHashPair{ .. }
             | CombineInconsistentKeySources(_)
+            | ParseFailed(_)
             | ConsensusEncoding
             | NegativeFee
             | FeeOverflow => None,
@@ -150,10 +154,7 @@ impl From<hashes::Error> for Error {
 }
 impl From<encode::Error> for Error {
-    fn from(err: encode::Error) -> Self {
-        match err {
-            encode::Error::Psbt(err) => err,
-            _ => Error::ConsensusEncoding,
-        }
+    fn from(_: encode::Error) -> Self {
+        Error::ConsensusEncoding
     }
 }
```


```diff
@@ -23,8 +23,8 @@ macro_rules! impl_psbt_de_serialize {
 macro_rules! impl_psbt_deserialize {
     ($thing:ty) => {
         impl $crate::psbt::serialize::Deserialize for $thing {
-            fn deserialize(bytes: &[u8]) -> Result<Self, $crate::consensus::encode::Error> {
-                $crate::consensus::deserialize(&bytes[..])
+            fn deserialize(bytes: &[u8]) -> Result<Self, $crate::psbt::Error> {
+                $crate::consensus::deserialize(&bytes[..]).map_err(|e| $crate::psbt::Error::from(e))
             }
         }
     };
@@ -53,7 +53,7 @@ macro_rules! impl_psbtmap_serialize {
 macro_rules! impl_psbtmap_deserialize {
     ($thing:ty) => {
         impl $crate::psbt::serialize::Deserialize for $thing {
-            fn deserialize(bytes: &[u8]) -> Result<Self, $crate::consensus::encode::Error> {
+            fn deserialize(bytes: &[u8]) -> Result<Self, $crate::psbt::Error> {
                 let mut decoder = bytes;
                 Self::decode(&mut decoder)
             }
@@ -66,13 +66,13 @@ macro_rules! impl_psbtmap_decoding {
         impl $thing {
             pub(crate) fn decode<R: $crate::io::Read + ?Sized>(
                 r: &mut R,
-            ) -> Result<Self, $crate::consensus::encode::Error> {
+            ) -> Result<Self, $crate::psbt::Error> {
                 let mut rv: Self = core::default::Default::default();
                 loop {
                     match $crate::psbt::raw::Pair::decode(r) {
                         Ok(pair) => rv.insert_pair(pair)?,
-                        Err($crate::consensus::encode::Error::Psbt($crate::psbt::Error::NoMorePairs)) => return Ok(rv),
+                        Err($crate::psbt::Error::NoMorePairs) => return Ok(rv),
                         Err(e) => return Err(e),
                     }
                 }
@@ -156,9 +156,9 @@ macro_rules! impl_psbt_hash_de_serialize {
 macro_rules! impl_psbt_hash_deserialize {
     ($hash_type:ty) => {
         impl $crate::psbt::serialize::Deserialize for $hash_type {
-            fn deserialize(bytes: &[u8]) -> Result<Self, $crate::consensus::encode::Error> {
+            fn deserialize(bytes: &[u8]) -> Result<Self, $crate::psbt::Error> {
                 <$hash_type>::from_slice(&bytes[..]).map_err(|e| {
-                    $crate::psbt::Error::from(e).into()
+                    $crate::psbt::Error::from(e)
                 })
             }
         }
```


```diff
@@ -88,7 +88,7 @@ impl Map for PartiallySignedTransaction {
 }
 impl PartiallySignedTransaction {
-    pub(crate) fn decode_global<R: io::Read + ?Sized>(r: &mut R) -> Result<Self, encode::Error> {
+    pub(crate) fn decode_global<R: io::Read + ?Sized>(r: &mut R) -> Result<Self, Error> {
         let mut r = r.take(MAX_VEC_SIZE as u64);
         let mut tx: Option<Transaction> = None;
         let mut version: Option<u32> = None;
@@ -119,30 +119,30 @@ impl PartiallySignedTransaction {
                                     });
                                     if decoder.position() != vlen as u64 {
-                                        return Err(encode::Error::ParseFailed("data not consumed entirely when explicitly deserializing"))
+                                        return Err(Error::ParseFailed("data not consumed entirely when explicitly deserializing"))
                                     }
                                 } else {
-                                    return Err(Error::DuplicateKey(pair.key).into())
+                                    return Err(Error::DuplicateKey(pair.key))
                                 }
                             } else {
-                                return Err(Error::InvalidKey(pair.key).into())
+                                return Err(Error::InvalidKey(pair.key))
                             }
                         }
                         PSBT_GLOBAL_XPUB => {
                             if !pair.key.key.is_empty() {
                                 let xpub = ExtendedPubKey::decode(&pair.key.key)
-                                    .map_err(|_| encode::Error::ParseFailed(
+                                    .map_err(|_| Error::ParseFailed(
                                         "Can't deserialize ExtendedPublicKey from global XPUB key data"
                                     ))?;
                                 if pair.value.is_empty() || pair.value.len() % 4 != 0 {
-                                    return Err(encode::Error::ParseFailed("Incorrect length of global xpub derivation data"))
+                                    return Err(Error::ParseFailed("Incorrect length of global xpub derivation data"))
                                 }
                                 let child_count = pair.value.len() / 4 - 1;
                                 let mut decoder = Cursor::new(pair.value);
                                 let mut fingerprint = [0u8; 4];
-                                decoder.read_exact(&mut fingerprint[..])?;
+                                decoder.read_exact(&mut fingerprint[..]).map_err(|_| Error::ParseFailed("Can't read global xpub fingerprint"))?;
                                 let mut path = Vec::<ChildNumber>::with_capacity(child_count);
                                 while let Ok(index) = u32::consensus_decode(&mut decoder) {
                                     path.push(ChildNumber::from(index))
@@ -150,10 +150,10 @@ impl PartiallySignedTransaction {
                                 let derivation = DerivationPath::from(path);
                                 // Keys, according to BIP-174, must be unique
                                 if xpub_map.insert(xpub, (Fingerprint::from(fingerprint), derivation)).is_some() {
-                                    return Err(encode::Error::ParseFailed("Repeated global xpub key"))
+                                    return Err(Error::ParseFailed("Repeated global xpub key"))
                                 }
                             } else {
-                                return Err(encode::Error::ParseFailed("Xpub global key must contain serialized Xpub data"))
+                                return Err(Error::ParseFailed("Xpub global key must contain serialized Xpub data"))
                             }
                         }
                         PSBT_GLOBAL_VERSION => {
@@ -164,36 +164,36 @@ impl PartiallySignedTransaction {
                                     let vlen: usize = pair.value.len();
                                     let mut decoder = Cursor::new(pair.value);
                                     if vlen != 4 {
-                                        return Err(encode::Error::ParseFailed("Wrong global version value length (must be 4 bytes)"))
+                                        return Err(Error::ParseFailed("Wrong global version value length (must be 4 bytes)"))
                                     }
                                     version = Some(Decodable::consensus_decode(&mut decoder)?);
                                     // We only understand version 0 PSBTs. According to BIP-174 we
                                     // should throw an error if we see anything other than version 0.
                                     if version != Some(0) {
-                                        return Err(encode::Error::ParseFailed("PSBT versions greater than 0 are not supported"))
+                                        return Err(Error::ParseFailed("PSBT versions greater than 0 are not supported"))
                                     }
                                 } else {
-                                    return Err(Error::DuplicateKey(pair.key).into())
+                                    return Err(Error::DuplicateKey(pair.key))
                                 }
                             } else {
-                                return Err(Error::InvalidKey(pair.key).into())
+                                return Err(Error::InvalidKey(pair.key))
                             }
                         }
                         PSBT_GLOBAL_PROPRIETARY => match proprietary.entry(raw::ProprietaryKey::try_from(pair.key.clone())?) {
                             btree_map::Entry::Vacant(empty_key) => {
                                 empty_key.insert(pair.value);
                             },
-                            btree_map::Entry::Occupied(_) => return Err(Error::DuplicateKey(pair.key).into()),
+                            btree_map::Entry::Occupied(_) => return Err(Error::DuplicateKey(pair.key)),
                         }
                         _ => match unknowns.entry(pair.key) {
                             btree_map::Entry::Vacant(empty_key) => {
                                 empty_key.insert(pair.value);
                             },
-                            btree_map::Entry::Occupied(k) => return Err(Error::DuplicateKey(k.key().clone()).into()),
+                            btree_map::Entry::Occupied(k) => return Err(Error::DuplicateKey(k.key().clone())),
                         }
                     }
                 }
-                Err(crate::consensus::encode::Error::Psbt(crate::psbt::Error::NoMorePairs)) => break,
+                Err(crate::psbt::Error::NoMorePairs) => break,
                 Err(e) => return Err(e),
             }
         }
@@ -209,7 +209,7 @@ impl PartiallySignedTransaction {
                 outputs: vec![]
             })
         } else {
-            Err(Error::MustHaveUnsignedTx.into())
+            Err(Error::MustHaveUnsignedTx)
         }
     }
 }
```


```diff
@@ -11,7 +11,6 @@ use secp256k1::XOnlyPublicKey;
 use crate::blockdata::script::ScriptBuf;
 use crate::blockdata::witness::Witness;
 use crate::blockdata::transaction::{Transaction, TxOut};
-use crate::consensus::encode;
 use crate::crypto::{ecdsa, schnorr};
 use crate::crypto::key::PublicKey;
 use crate::hashes::{self, hash160, ripemd160, sha256, sha256d};
@@ -247,7 +246,7 @@ impl Input {
             .unwrap_or(Ok(SchnorrSighashType::Default))
     }
-    pub(super) fn insert_pair(&mut self, pair: raw::Pair) -> Result<(), encode::Error> {
+    pub(super) fn insert_pair(&mut self, pair: raw::Pair) -> Result<(), Error> {
         let raw::Pair {
             key: raw_key,
             value: raw_value,
@@ -347,14 +346,14 @@ impl Input {
                     btree_map::Entry::Vacant(empty_key) => {
                         empty_key.insert(raw_value);
                     },
-                    btree_map::Entry::Occupied(_) => return Err(Error::DuplicateKey(raw_key).into()),
+                    btree_map::Entry::Occupied(_) => return Err(Error::DuplicateKey(raw_key)),
                 }
             }
             _ => match self.unknown.entry(raw_key) {
                 btree_map::Entry::Vacant(empty_key) => {
                     empty_key.insert(raw_value);
                 }
-                btree_map::Entry::Occupied(k) => return Err(Error::DuplicateKey(k.key().clone()).into()),
+                btree_map::Entry::Occupied(k) => return Err(Error::DuplicateKey(k.key().clone())),
             },
         }
@@ -496,12 +495,12 @@ fn psbt_insert_hash_pair<H>(
     raw_key: raw::Key,
     raw_value: Vec<u8>,
     hash_type: error::PsbtHash,
-) -> Result<(), encode::Error>
+) -> Result<(), Error>
 where
     H: hashes::Hash + Deserialize,
 {
     if raw_key.key.is_empty() {
-        return Err(psbt::Error::InvalidKey(raw_key).into());
+        return Err(psbt::Error::InvalidKey(raw_key));
     }
     let key_val: H = Deserialize::deserialize(&raw_key.key)?;
     match map.entry(key_val) {
@@ -512,13 +511,12 @@ where
                     preimage: val.into_boxed_slice(),
                     hash: Box::from(key_val.borrow()),
                     hash_type,
-                }
-                .into());
+                });
             }
             empty_key.insert(val);
             Ok(())
         }
-        btree_map::Entry::Occupied(_) => Err(psbt::Error::DuplicateKey(raw_key).into()),
+        btree_map::Entry::Occupied(_) => Err(psbt::Error::DuplicateKey(raw_key)),
     }
 }
```


```diff
@@ -5,7 +5,6 @@ use core;
 use core::convert::TryFrom;
 use crate::blockdata::script::ScriptBuf;
-use crate::consensus::encode;
 use secp256k1::XOnlyPublicKey;
 use crate::bip32::KeySource;
 use secp256k1;
@@ -205,7 +204,7 @@ impl<'tree> Iterator for TapTreeIter<'tree> {
 }
 impl Output {
-    pub(super) fn insert_pair(&mut self, pair: raw::Pair) -> Result<(), encode::Error> {
+    pub(super) fn insert_pair(&mut self, pair: raw::Pair) -> Result<(), Error> {
         let raw::Pair {
             key: raw_key,
             value: raw_value,
@@ -233,7 +232,7 @@ impl Output {
                     btree_map::Entry::Vacant(empty_key) => {
                         empty_key.insert(raw_value);
                     },
-                    btree_map::Entry::Occupied(_) => return Err(Error::DuplicateKey(raw_key).into()),
+                    btree_map::Entry::Occupied(_) => return Err(Error::DuplicateKey(raw_key)),
                 }
             }
             PSBT_OUT_TAP_INTERNAL_KEY => {
@@ -255,7 +254,7 @@ impl Output {
                 btree_map::Entry::Vacant(empty_key) => {
                     empty_key.insert(raw_value);
                 }
-                btree_map::Entry::Occupied(k) => return Err(Error::DuplicateKey(k.key().clone()).into()),
+                btree_map::Entry::Occupied(k) => return Err(Error::DuplicateKey(k.key().clone())),
             }
         }
```


```diff
@@ -745,10 +745,9 @@ impl From<ecdsa::Error> for SignError {
 #[cfg(feature = "base64")]
 mod display_from_str {
-    use super::PartiallySignedTransaction;
+    use super::{PartiallySignedTransaction, Error};
     use core::fmt::{Display, Formatter, self};
     use core::str::FromStr;
-    use crate::consensus::encode::Error;
     use base64::display::Base64Display;
     use bitcoin_internals::write_err;
@@ -1396,9 +1395,9 @@ mod tests {
         assert_eq!(err.to_string(), "parse failed: Invalid xonly public key");
         let err = hex_psbt!("70736274ff01005e02000000019bd48765230bf9a72e662001f972556e54f0c6f97feb56bcb5600d817f6995260100000000ffffffff0148e6052a01000000225120030da4fce4f7db28c2cb2951631e003713856597fe963882cb500e68112cca63000000000001012b00f2052a01000000225120c2247efbfd92ac47f6f40b8d42d169175a19fa9fa10e4a25d7f35eb4dd85b6924214022cb13ac68248de806aa6a3659cf3c03eb6821d09c8114a4e868febde865bb6d2cd970e15f53fc0c82f950fd560ffa919b76172be017368a89913af074f400b094089756aa3739ccc689ec0fcf3a360be32cc0b59b16e93a1e8bb4605726b2ca7a3ff706c4176649632b2cc68e1f912b8a578e3719ce7710885c7a966f49bcd43cb0000").unwrap_err();
         #[cfg(feature = "std")]
-        assert_eq!(err.to_string(), "PSBT error");
+        assert_eq!(err.to_string(), "hash parse error");
         #[cfg(not(feature = "std"))]
-        assert_eq!(err.to_string(), "PSBT error: hash parse error: bad slice length 33 (expected 32)");
+        assert_eq!(err.to_string(), "hash parse error: bad slice length 33 (expected 32)");
         let err = hex_psbt!("70736274ff01005e02000000019bd48765230bf9a72e662001f972556e54f0c6f97feb56bcb5600d817f6995260100000000ffffffff0148e6052a01000000225120030da4fce4f7db28c2cb2951631e003713856597fe963882cb500e68112cca63000000000001012b00f2052a01000000225120c2247efbfd92ac47f6f40b8d42d169175a19fa9fa10e4a25d7f35eb4dd85b69241142cb13ac68248de806aa6a3659cf3c03eb6821d09c8114a4e868febde865bb6d2cd970e15f53fc0c82f950fd560ffa919b76172be017368a89913af074f400b094289756aa3739ccc689ec0fcf3a360be32cc0b59b16e93a1e8bb4605726b2ca7a3ff706c4176649632b2cc68e1f912b8a578e3719ce7710885c7a966f49bcd43cb01010000").unwrap_err();
         assert_eq!(err.to_string(), "parse failed: Invalid Schnorr signature length");
         let err = hex_psbt!("70736274ff01005e02000000019bd48765230bf9a72e662001f972556e54f0c6f97feb56bcb5600d817f6995260100000000ffffffff0148e6052a01000000225120030da4fce4f7db28c2cb2951631e003713856597fe963882cb500e68112cca63000000000001012b00f2052a01000000225120c2247efbfd92ac47f6f40b8d42d169175a19fa9fa10e4a25d7f35eb4dd85b69241142cb13ac68248de806aa6a3659cf3c03eb6821d09c8114a4e868febde865bb6d2cd970e15f53fc0c82f950fd560ffa919b76172be017368a89913af074f400b093989756aa3739ccc689ec0fcf3a360be32cc0b59b16e93a1e8bb4605726b2ca7a3ff706c4176649632b2cc68e1f912b8a578e3719ce7710885c7a966f49bcd43cb0000").unwrap_err();
```


```diff
@@ -70,11 +70,11 @@ impl fmt::Display for Key {
 }
 impl Key {
-    pub(crate) fn decode<R: io::Read + ?Sized>(r: &mut R) -> Result<Self, encode::Error> {
+    pub(crate) fn decode<R: io::Read + ?Sized>(r: &mut R) -> Result<Self, Error> {
         let VarInt(byte_size): VarInt = Decodable::consensus_decode(r)?;
         if byte_size == 0 {
-            return Err(Error::NoMorePairs.into());
+            return Err(Error::NoMorePairs);
         }
         let key_byte_size: u64 = byte_size - 1;
@@ -83,7 +83,7 @@ impl Key {
             return Err(encode::Error::OversizedVectorAllocation {
                 requested: key_byte_size as usize,
                 max: MAX_VEC_SIZE,
-            })
+            })?
         }
         let type_value: u8 = Decodable::consensus_decode(r)?;
@@ -123,14 +123,14 @@ impl Serialize for Pair {
 }
 impl Deserialize for Pair {
-    fn deserialize(bytes: &[u8]) -> Result<Self, encode::Error> {
+    fn deserialize(bytes: &[u8]) -> Result<Self, Error> {
         let mut decoder = bytes;
         Pair::decode(&mut decoder)
     }
 }
 impl Pair {
-    pub(crate) fn decode<R: io::Read + ?Sized>(r: &mut R) -> Result<Self, encode::Error> {
+    pub(crate) fn decode<R: io::Read + ?Sized>(r: &mut R) -> Result<Self, Error> {
         Ok(Pair {
             key: Key::decode(r)?,
             value: Decodable::consensus_decode(r)?,
```


```diff
@@ -20,7 +20,7 @@ use secp256k1::{self, XOnlyPublicKey};
 use crate::bip32::{ChildNumber, Fingerprint, KeySource};
 use crate::hashes::{hash160, ripemd160, sha256, sha256d, Hash};
 use crate::crypto::{ecdsa, schnorr};
-use crate::psbt::{self, Error, PartiallySignedTransaction};
+use crate::psbt::{Error, PartiallySignedTransaction};
 use crate::taproot::{TapNodeHash, TapLeafHash, ControlBlock, LeafVersion};
 use crate::crypto::key::PublicKey;
@@ -38,7 +38,7 @@ pub(crate) trait Serialize {
 /// A trait for deserializing a value from raw data in PSBT key-value maps.
 pub(crate) trait Deserialize: Sized {
     /// Deserialize a value from raw data.
-    fn deserialize(bytes: &[u8]) -> Result<Self, encode::Error>;
+    fn deserialize(bytes: &[u8]) -> Result<Self, Error>;
 }
 impl PartiallySignedTransaction {
@@ -71,15 +71,15 @@ impl PartiallySignedTransaction {
     /// Deserialize a value from raw binary data.
-    pub fn deserialize(bytes: &[u8]) -> Result<Self, encode::Error> {
+    pub fn deserialize(bytes: &[u8]) -> Result<Self, Error> {
         const MAGIC_BYTES: &[u8] = b"psbt";
         if bytes.get(0..MAGIC_BYTES.len()) != Some(MAGIC_BYTES) {
-            return Err(Error::InvalidMagic.into());
+            return Err(Error::InvalidMagic);
         }
         const PSBT_SERPARATOR: u8 = 0xff_u8;
         if bytes.get(MAGIC_BYTES.len()) != Some(&PSBT_SERPARATOR) {
-            return Err(Error::InvalidSeparator.into());
+            return Err(Error::InvalidSeparator);
         }
         let mut d = bytes.get(5..).ok_or(Error::NoMorePairs)?;
@@ -136,7 +136,7 @@ impl Serialize for ScriptBuf {
 }
 impl Deserialize for ScriptBuf {
-    fn deserialize(bytes: &[u8]) -> Result<Self, encode::Error> {
+    fn deserialize(bytes: &[u8]) -> Result<Self, Error> {
         Ok(Self::from(bytes.to_vec()))
     }
 }
@@ -150,9 +150,9 @@ impl Serialize for PublicKey {
 }
 impl Deserialize for PublicKey {
-    fn deserialize(bytes: &[u8]) -> Result<Self, encode::Error> {
+    fn deserialize(bytes: &[u8]) -> Result<Self, Error> {
         PublicKey::from_slice(bytes)
-            .map_err(|_| encode::Error::ParseFailed("invalid public key"))
+            .map_err(|_| Error::ParseFailed("invalid public key"))
     }
 }
@@ -163,9 +163,9 @@ impl Serialize for secp256k1::PublicKey {
 }
 impl Deserialize for secp256k1::PublicKey {
-    fn deserialize(bytes: &[u8]) -> Result<Self, encode::Error> {
+    fn deserialize(bytes: &[u8]) -> Result<Self, Error> {
         secp256k1::PublicKey::from_slice(bytes)
-            .map_err(|_| encode::Error::ParseFailed("invalid public key"))
+            .map_err(|_| Error::ParseFailed("invalid public key"))
     }
 }
@@ -176,7 +176,7 @@ impl Serialize for ecdsa::Signature {
 }
 impl Deserialize for ecdsa::Signature {
-    fn deserialize(bytes: &[u8]) -> Result<Self, encode::Error> {
+    fn deserialize(bytes: &[u8]) -> Result<Self, Error> {
         // NB: Since BIP-174 says "the signature as would be pushed to the stack from
         // a scriptSig or witness" we should ideally use a consensus deserialization and do
         // not error on a non-standard values. However,
@@ -193,13 +193,13 @@ impl Deserialize for ecdsa::Signature {
         ecdsa::Signature::from_slice(bytes)
             .map_err(|e| match e {
                 ecdsa::Error::EmptySignature => {
-                    encode::Error::ParseFailed("Empty partial signature data")
+                    Error::ParseFailed("Empty partial signature data")
                 }
                 ecdsa::Error::NonStandardSighashType(flag) => {
-                    encode::Error::from(psbt::Error::NonStandardSighashType(flag))
+                    Error::NonStandardSighashType(flag)
                 }
                 ecdsa::Error::Secp256k1(..) => {
-                    encode::Error::ParseFailed("Invalid Ecdsa signature")
+                    Error::ParseFailed("Invalid Ecdsa signature")
                 }
                 ecdsa::Error::HexEncoding(..) => {
                     unreachable!("Decoding from slice, not hex")
@@ -223,9 +223,9 @@ impl Serialize for KeySource {
 }
 impl Deserialize for KeySource {
-    fn deserialize(bytes: &[u8]) -> Result<Self, encode::Error> {
+    fn deserialize(bytes: &[u8]) -> Result<Self, Error> {
         if bytes.len() < 4 {
-            return Err(io::Error::from(io::ErrorKind::UnexpectedEof).into())
+            return Err(encode::Error::from(io::Error::from(io::ErrorKind::UnexpectedEof)).into())
         }
         let fprint: Fingerprint = bytes[0..4].try_into().expect("4 is the fingerprint length");
@@ -235,7 +235,7 @@ impl Deserialize for KeySource {
         while !d.is_empty() {
             match u32::consensus_decode(&mut d) {
                 Ok(index) => dpath.push(index.into()),
-                Err(e) => return Err(e),
+                Err(e) => return Err(e)?,
             }
         }
@@ -251,7 +251,7 @@ impl Serialize for Vec<u8> {
 }
 impl Deserialize for Vec<u8> {
-    fn deserialize(bytes: &[u8]) -> Result<Self, encode::Error> {
+    fn deserialize(bytes: &[u8]) -> Result<Self, Error> {
         Ok(bytes.to_vec())
     }
 }
@@ -263,7 +263,7 @@ impl Serialize for PsbtSighashType {
 }
 impl Deserialize for PsbtSighashType {
-    fn deserialize(bytes: &[u8]) -> Result<Self, encode::Error> {
+    fn deserialize(bytes: &[u8]) -> Result<Self, Error> {
         let raw: u32 = encode::deserialize(bytes)?;
         Ok(PsbtSighashType { inner: raw })
     }
@@ -277,9 +277,9 @@ impl Serialize for XOnlyPublicKey {
 }
 impl Deserialize for XOnlyPublicKey {
-    fn deserialize(bytes: &[u8]) -> Result<Self, encode::Error> {
+    fn deserialize(bytes: &[u8]) -> Result<Self, Error> {
         XOnlyPublicKey::from_slice(bytes)
-            .map_err(|_| encode::Error::ParseFailed("Invalid xonly public key"))
+            .map_err(|_| Error::ParseFailed("Invalid xonly public key"))
     }
@@ -290,17 +290,17 @@ impl Serialize for schnorr::Signature {
 }
 impl Deserialize for schnorr::Signature {
-    fn deserialize(bytes: &[u8]) -> Result<Self, encode::Error> {
+    fn deserialize(bytes: &[u8]) -> Result<Self, Error> {
         schnorr::Signature::from_slice(bytes)
             .map_err(|e| match e {
                 schnorr::Error::InvalidSighashType(flag) => {
-                    encode::Error::from(psbt::Error::NonStandardSighashType(flag as u32))
+                    Error::NonStandardSighashType(flag as u32)
                 }
                 schnorr::Error::InvalidSignatureSize(_) => {
-                    encode::Error::ParseFailed("Invalid Schnorr signature length")
+                    Error::ParseFailed("Invalid Schnorr signature length")
                 }
                 schnorr::Error::Secp256k1(..) => {
-                    encode::Error::ParseFailed("Invalid Schnorr signature")
+                    Error::ParseFailed("Invalid Schnorr signature")
                 }
             })
     }
@@ -317,9 +317,9 @@ impl Serialize for (XOnlyPublicKey, TapLeafHash) {
 }
 impl Deserialize for (XOnlyPublicKey, TapLeafHash) {
-    fn deserialize(bytes: &[u8]) -> Result<Self, encode::Error> {
+    fn deserialize(bytes: &[u8]) -> Result<Self, Error> {
         if bytes.len() < 32 {
-            return Err(io::Error::from(io::ErrorKind::UnexpectedEof).into())
+            return Err(encode::Error::from(io::Error::from(io::ErrorKind::UnexpectedEof)).into())
         }
         let a: XOnlyPublicKey = Deserialize::deserialize(&bytes[..32])?;
         let b: TapLeafHash = Deserialize::deserialize(&bytes[32..])?;
@@ -334,9 +334,9 @@ impl Serialize for ControlBlock {
 }
 impl Deserialize for ControlBlock {
-    fn deserialize(bytes: &[u8]) -> Result<Self, encode::Error> {
+    fn deserialize(bytes: &[u8]) -> Result<Self, Error> {
         Self::from_slice(bytes)
-            .map_err(|_| encode::Error::ParseFailed("Invalid control block"))
+            .map_err(|_| Error::ParseFailed("Invalid control block"))
     }
@@ -351,14 +351,14 @@ impl Serialize for (ScriptBuf, LeafVersion) {
 }
 impl Deserialize for (ScriptBuf, LeafVersion) {
-    fn deserialize(bytes: &[u8]) -> Result<Self, encode::Error> {
+    fn deserialize(bytes: &[u8]) -> Result<Self, Error> {
         if bytes.is_empty() {
-            return Err(io::Error::from(io::ErrorKind::UnexpectedEof).into())
+            return Err(encode::Error::from(io::Error::from(io::ErrorKind::UnexpectedEof)).into())
         }
         // The last byte is LeafVersion.
         let script = ScriptBuf::deserialize(&bytes[..bytes.len() - 1])?;
         let leaf_ver = LeafVersion::from_consensus(bytes[bytes.len() - 1])
-            .map_err(|_| encode::Error::ParseFailed("invalid leaf version"))?;
+            .map_err(|_| Error::ParseFailed("invalid leaf version"))?;
         Ok((script, leaf_ver))
     }
 }
@@ -375,7 +375,7 @@ impl Serialize for (Vec<TapLeafHash>, KeySource) {
 }
 impl Deserialize for (Vec<TapLeafHash>, KeySource) {
-    fn deserialize(bytes: &[u8]) -> Result<Self, encode::Error> {
+    fn deserialize(bytes: &[u8]) -> Result<Self, Error> {
         let (leafhash_vec, consumed) = deserialize_partial::<Vec<TapLeafHash>>(bytes)?;
         let key_source = KeySource::deserialize(&bytes[consumed..])?;
         Ok((leafhash_vec, key_source))
@@ -405,25 +405,25 @@ impl Serialize for TapTree {
 }
 impl Deserialize for TapTree {
-    fn deserialize(bytes: &[u8]) -> Result<Self, encode::Error> {
+    fn deserialize(bytes: &[u8]) -> Result<Self, Error> {
         let mut builder = TaprootBuilder::new();
         let mut bytes_iter = bytes.iter();
         while let Some(depth) = bytes_iter.next() {
-            let version = bytes_iter.next().ok_or(encode::Error::ParseFailed("Invalid Taproot Builder"))?;
+            let version = bytes_iter.next().ok_or(Error::ParseFailed("Invalid Taproot Builder"))?;
            let (script, consumed) = deserialize_partial::<ScriptBuf>(bytes_iter.as_slice())?;
            if consumed > 0 {
                bytes_iter.nth(consumed - 1);
            }
            let leaf_version = LeafVersion::from_consensus(*version)
-                .map_err(|_| encode::Error::ParseFailed("Leaf Version Error"))?;
+                .map_err(|_| Error::ParseFailed("Leaf Version Error"))?;
            builder = builder.add_leaf_with_ver(*depth, script, leaf_version)
-                .map_err(|_| encode::Error::ParseFailed("Tree not in DFS order"))?;
+                .map_err(|_| Error::ParseFailed("Tree not in DFS order"))?;
         }
         if builder.is_finalizable() && !builder.has_hidden_nodes() {
             Ok(TapTree(builder))
         } else {
-            Err(encode::Error::ParseFailed("Incomplete taproot Tree"))
+            Err(Error::ParseFailed("Incomplete taproot Tree"))
         }
     }
 }
```